[
  {
    "path": ".gitignore",
    "content": ".install.data\n.installed\n*.pyc\n*.pyo\n*~\ntesting/cache\n"
  },
  {
    "path": ".gitmodules",
    "content": "[submodule \"installation/externals/chosen\"]\n\tpath = installation/externals/chosen\n\turl = ../chosen.git\n\n[submodule \"installation/externals/v8-jsshell\"]\n\tpath = installation/externals/v8-jsshell\n\turl = ../v8-jsshell.git\n"
  },
  {
    "path": "CONTRIBUTORS",
    "content": "Author:\n\nJens Lindström <jl@critic-review.org>\n\nContributors:\n\nRafał Chłodnicki\nPhilip Jägenstedt\nLeif Arne Storset\nAlexey Feldgendler\nMichał Gawron\nJames Graham\nMartin Olsson\nDaniel Bratell\nOdin Hørthe Omdal\nFredrik Öhrn\nPeter Krefting\nPengfei Xue\nJohan Herland\nRyan Fowler\nJacob Rask\nFelix Ekblom\nAndreas Tolfsen\n"
  },
  {
    "path": "COPYING",
    "content": "Copyright 2012-2014 the Critic contributors, Opera Software ASA\n\nThe Critic code review system is licensed under the Apache License, version 2.0,\nwith exceptions noted below.  You may obtain a copy of the License at\n\n  http://www.apache.org/licenses/LICENSE-2.0\n\nAn list of contributors can be found in the CONTRIBUTORS file.\n\n\nThird-party software\n====================\n\nThe following third-party software is distributed with Critic (checked into\nCritic's Git repository.)\n\njQuery\n------\n\nThe jQuery library is licensed under the MIT License, available in the file\ninstallation/externals/MIT-LICENSE.jQuery.txt.  These files distributed with\nCritic are part of jQuery:\n\n  installation/externals/MIT-LICENSE.jQuery.txt\n  resources/jquery-2.0.0.min.js\n  resources/jquery-ui-1.10.2.custom.css\n  resources/jquery-ui-1.10.2.custom.min.js\n  resources/images/ui-*.png\n\nFor more information, see:\n\n  http://jquery.org/\n\nChosen\n------\n\nThe Chosen library is licensed under the MIT License, available in the file\ninstallation/externals/MIT-LICENSE.Chosen.md.  These files distributed with\nCritic are part of Chosen:\n\n  installation/externals/MIT-LICENSE.Chosen.md\n  resources/chosen*.js\n  resources/chosen*.css\n  resources/chosen-*.png\n\nFor more information, see:\n\n  http://harvesthq.github.io/chosen/\n\nCritic's version of Chosen is built from slightly modified fork of the main\nGitHub repository:\n\n  https://github.com/jensl/chosen\n"
  },
  {
    "path": "INSTALL",
    "content": "Installation\n============\n\nTo install Critic, run the script install.py as root.  It will ask a\nnumber of question and then perform the installation.  In short, what\nit does is:\n\n  * Check for and/or install required software packages.\n\n  * Create a system user (typically named \"critic\") and group (also\n    typically named \"critic\").\n\n  * Generate system configuration into /etc/critic/.\n\n  * Install the source code into /usr/share/critic/.\n\n  * Create a PostgreSQL user and database, both named \"critic\".\n\n  * Create a System-V style init script, /etc/init.d/critic-main, and\n    create links to it in the /etc/rcN.d/ directories.\n\n  * Enable the Apache modules mod_expires and mod_wsgi.\n\n  * Create an Apache site named \"critic-main\" and enable it.\n\n\nRequired Software Packages\n--------------------------\n\nCritic depends on the following software packages:\n\n  * PostgreSQL (9.1 or later), both client and server parts\n\n  * Apache (2.2 or later)\n\n  * Git\n\n  * Python (2.7 or later; 3.x not supported)\n\n  * Non-standard Python modules:\n\n    - passlib (if Critic does user authentication)\n\n    - psycopg2\n\n    - pygments\n\nNote that on Debian/Ubuntu systems, the install.py script can install\nall of these software packages automatically.\n"
  },
  {
    "path": "README.md",
    "content": "Critic\n======\n\nThis is the code review system, Critic.\n\nCritic has a few [concepts][concepts] that might be useful to know.\n\nInstallation\n------------\n\nTo install Critic, run the script `install.py` as root:\n\n    # python install.py\n\nIt will ask a number of questions and then perform the installation.\n\nYou should probably read the [INSTALL file][install] for all the information.\n\n\n[install]: https://github.com/jensl/critic/blob/master/INSTALL\n[concepts]: https://github.com/jensl/critic/blob/master/documentation/concepts.txt\n\nAdding a repository\n-------------------\n\nAfter installing you should be able to navigate to the hostname you\nspecified during installtion and see Critic running.  When using the\nadministrator account you will also see 'Repositories' and 'Services'\nas top level menu items, in addition to the usual menu items.  To add\na new repository, click the 'Repositories' menu item and then the\n'Add Repository' button in the top right corner.\n\nIf Critic only has ssh access to the upstream of your repository, you must\nset up the 'critic' system user (or whatever user name was chosen during\ninstallation) to have ssh access without the need of a password.  You can do\nthat by creating an ssh key without a password and using 'ssh-copy-id' to\ncopy the key across to the server.  
If you need to connect to the upstream\nserver using a different user name, you need to create a 'config' file in\nthe 'critic' system user's '.ssh' directory containing:\n\n    Host <upstream-host-name>\n    User <upstream-user-name>\n\nMake sure to verify that you can access the repository from the 'critic'\nuser by running something like this:\n\n    su -s /bin/bash -c \"ssh -v <upstream-host-name>\" critic\n\nThis should also ensure that the upstream server key is stored in the\n'critic' user's 'known_hosts' file.\n\nThis only needs to be done once per upstream server.\n\nAdding users\n------------\n\nIf you are using the 'host' authentication system, users authenticated by\nthe web server will be added automatically to the Critic user database.  If\nyou are using the 'critic' authentication system you can use the 'criticctl'\ntool to add users beyond the administrative user created by the install.py\nscript:\n\n    sudo criticctl adduser\n\nOnly the users added with this method will be able to sign in to the system\nwhen using the 'critic' authentication system.\n\nAdding push rights\n------------------\n\nBefore a user can push review branches to the newly created Critic repository\ntheir system account must be a member of the 'critic' system group (or whatever\ngroup name was chosen during installation).  In Debian/Ubuntu this can be done\nusing:\n\n    usermod -a -G critic <user-who-wants-push-rights>\n\nSetting up reviewers and requesting a review\n--------------------------------------------\n\nThe developers responsible for performing the code review can subscribe\nto new review requests either for a specific set of subdirectories or for\nthe entire source tree.  
This configuration is done by each reviewer under\nthe 'Home' top level menu item.\n\nFor information about how to request a new code review, click the 'Tutorial'\ntop level menu item, and then select 'Requesting a Review'.\n\nSee also\n--------\n\nThe [Critic user FAQ][faq] answers some common questions and gives some useful\ntips on how to use Critic efficiently.\n\nThere is a tutorial on basic system administration tasks available in the\ninstalled Critic system; click the 'Tutorial' top level menu item and select\n'Administration'.\n\n[faq]: https://github.com/jensl/critic/blob/master/documentation/user_faq.md\n"
  },
  {
    "path": "documentation/concepts.txt",
    "content": "Critic(al) Concepts\n===================\n\nBranches\n--------\n\nCritic maintains a view of branches that is slightly different from git's.  In\naddition to knowing the head of the branch, that is, the most recent commit, it\nalso keeps track of (after basically inventing the information itself) a base\nbranch and a \"tail\" commit.  A branch is considered to \"contain\" all commits\nthat are reachable from its head commit, except all commits that are reachable\nfrom the base branch.\n\nThis view of a branch is useful when the purpose is (possibly) to review the\nwork done on the branch.  In this case, one needs to limit the scope to where\nthe branch started, since in git's view, the branch contains everything back to\nthe beginning of time, which is not what one means to review.\n\nIt's important to note, however, that the \"base\" of a branch is not something\nthat is stored in git, and thus Critic needs to resort to fairly simple\nheuristics in determining the base of a branch, and can get it wrong.  In\nparticular, it will reverse the relationship between related branches in some\ncases.  If a branch A is created from master, and then later a branch B is\ncreated from branch A, and branches A and B diverge, the \"correct\" relationships\nare that A's base branch is master and B's base branch is A.  However, if B is\npushed to Critic's repository before A, Critic's will think that B's base branch\nis master and A's base branch is B.  If branches are pushed to Critic's\nrepository in the order which they are created, then Critic will get it right,\nthough.\n\nReviews\n-------\n\nA review in Critic is a branch in Critic's repository, and the changes to be\nreviewed are the commits (that, according to Critic, are) contained on that\nbranch.  
There are two basic ways to start a review: push a branch to Critic's\nrepository and create a review of all the changes on the branch, or select one\nor more commits and have Critic create a branch containing exactly those\ncommits.  In practice, the difference between the two alternatives is quite\ninsignificant.\n\nThe branch is like any other git branch.  It can be fetched from Critic's\nrepository into a local work repository, and additional commits can be pushed\nback to Critic's repository.  Commits pushed to Critic's repository are\nautomatically added to the review, as changes that need reviewing.\n\nAt this time, non-fast-forward updates of review branches in Critic's repository\nare not possible.\n\nReviewers\n---------\n\nReviewers are users of Critic that have registered themselves as reviewers of\nparts of the source code tree.  When changes are scheduled for review, reviewers\nare automatically assigned, based on this configuration.  Thus the user\nrequesting the review needn't assign reviewers manually.\n\nChunks\n------\n\nEach commit scheduled to be reviewed as part of a review is split into\nindividual change \"chunks\" (N lines deleted, M lines added,) individually\nrecorded in Critic's database along with a status.  Each such chunk of changes\nneeds to be approved by a registered reviewer for the modified file.  The\nlow-level details are typically hidden from the reviewers, however.  They don't\nneed to explicitly approve each chunk individually, for instance.\n\n\"Approval\" of chunks of changes is not thought to be final, and thus does not\nconflict with not accepting the changes as-is.  (And thus the term\n\"approve\" is slightly misleading; \"read\" or \"reviewed\" would perhaps be more\nappropriate.)\n\nAny chunk of changes not approved yet blocks the review from being \"accepted\".\n\nComments\n--------\n\nA vital part of reviewing is commenting on specific lines of code.  
In Critic,\nthere are two types of comments; \"issues\" and \"notes\".  An \"issue\" is a comment\nthat must be addressed, one way or another, before the review can be \"accepted.\"\nA \"note\" is simply a note; it has no formal significance.\n\nWhen a new version of the file that is commented is created (by adding\nadditional commits to the review,) one of two things may happen to existing\ncomments: they can be transferred to the new version, if all commented lines are\nidentical in the new version of the file, or marked as \"addressed,\" if any\ncommented line was modified.  When a comment is automatically marked as\n\"addressed,\" it could be because it was in fact properly addressed, or it could\nbe because some unrelated change was made to the commented lines.  In the latter\ncase, the author of the comment (or anyone else, for that matter) may \"reopen\"\nthe comment by manually transferring the comment to a sequence of lines in the\nnew version of the file.\n\nIt may seem as if \"issue\" comments might easily be \"lost\" by unrelated changes\ntouching the commented lines, but everyone involved in the comment (its author\nand anyone who replied to it) will be notified that the comment was marked as\naddressed, and of course, the new changes in the file have to be reviewed like\nany other changes; the fact that they \"addressed\" a comment does not\nautomatically mark the changes themselves as approved.\n\n\"Issue\" comments can also be explicitly closed by their authors (or reviewers of\nthe file in which the comment was made) after discussion, if the agreement is\nthat no changes to the commented code need to be made.\n\nReview Progress\n---------------\n\nThe ultimate goal of a review is to close it.  
A review can only be closed when\nit is in an \"accepted\" state, which, in turn, it is when each and every chunk of\nchanges has been approved by reviewers, and every \"issue\" comment has been\nmarked either as addressed or closed.\n\nReviews can also be dropped, meaning the changes are not meant to be merged in\ntheir current state.  To drop a currently accepted (but not closed) review,\nyou need to create an issue to \"un-accept\" it.  That issue is a suitable place\nto explain why the review is being dropped.\n"
  },
  {
    "path": "documentation/tutorials.txt",
    "content": "Critic Tutorials\n================\n\nMost of Critic's documentation is available as tutorials available in the\ninstalled system, accessed via the \"Tutorial\" link at the top of every page.\n\nThis documentation can also be read without installing Critic first, in the\npublic Critic system used to review Critic itself:\n\n  https://critic-review.org/tutorial\n\nThey can also be found in source form in the tutorials/ directory.  The source\nis in custom text-based format similar to the Markdown markup language.\n"
  },
  {
    "path": "documentation/user_faq.md",
    "content": "Critic User FAQ\n===============\n\n### How can I create a code review for a subset of my topic branch? ###\n\nSuppose that you've done 5 commits on a topic branch and that you\nwant to submit the first 3 to code review now and then continue work\non the topic branch.  After you've pushed the branch to Critic you\nfind it under the 'Branches' menu item in the top level menu.  Critic\nwill now show all the commits on your topic branch and there is a\nbutton called 'Create Review', in the top right corner, that you\ncan use if you want to push the entire branch into code review.\n\nIf you want to push just a single commit into code review, you can\nclick that commit, and then click 'Create Review' on the subsequent\npage.\n\nHowever, at the branch page it's also possible to select a range of\ncommits to be pushed into review.  This is done by selecting\nthe desired commits using the mouse (i.e. click the first commit, hold\ndown the mouse and release over the last commit in the range).\n\n### When reviewing code, how can I select some snippet or function name etc from the code and copy it to the clipboard? (i.e. it's not possible to select text because clicking to select triggers the 'New Issue' dialog) ###\n\nHold down the CTRL button while selecting.\n\n### Can Critic provide some statistics for how much is submitted/reviewed by whom etc? ###\n\nYes, there is a basic statistics page available at <code>http://your.critic.domain.tld/statistics</code>\n\n### How can I see a list of users that have registered as reviewers/watchers for a particular directory? ###\n\nNavigate to: <code>http://your.critic.domain.tld/showfilters?repository=INSERT_REPO_NAME&path=INSERT_PATH</code>\n\nSee for example: https://critic-review.org/showfilters?repository=critic&path=/\n"
  },
  {
    "path": "extend.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\n\n# To avoid accidentally creating files owned by root.\nsys.dont_write_bytecode = True\n\n# Python version check is done before imports below so that python\n# 2.6/2.5 users can see the error message.\nimport pythonversion\npythonversion.check()\n\nimport argparse\nimport subprocess\nimport multiprocessing\nimport tempfile\nimport pwd\n\nfrom distutils.version import LooseVersion\n\nimport installation\n\nparser = argparse.ArgumentParser(description=\"Critic extension support installation script\",\n                                 epilog=\"\"\"\\\nCritic extension support is activated by simply running (as root):\n\n    # python extend.py\n\nFor finer control over the script's operation you can invoke it with one\nor more of the action arguments:\n\n  --prereqs, --fetch, --build, --install and --enable\n\nThis can for instance be used to build the v8-jsshell executable on a\nsystem where Critic has not been installed.\"\"\",\n                                 formatter_class=argparse.RawDescriptionHelpFormatter)\n\n# Uses default values for everything that has a default value (and isn't\n# overridden by other command-line arguments) and signals an error for anything\n# that doesn't have a default value and isn't set by a command-line 
argument.\nparser.add_argument(\"--headless\", help=argparse.SUPPRESS, action=\"store_true\")\n\nclass DefaultBinDir:\n    pass\n\nv8_url = \"git://github.com/v8/v8.git\"\ndepot_tools_url = \"https://chromium.googlesource.com/chromium/tools/depot_tools.git\"\n\nbasic = parser.add_argument_group(\"basic options\")\nbasic.add_argument(\"--etc-dir\", help=\"directory where the Critic system configuration is stored [default=/etc/critic]\", action=\"store\", default=\"/etc/critic\")\nbasic.add_argument(\"--identity\", help=\"system identity to upgrade [default=main]\", action=\"store\", default=\"main\")\nbasic.add_argument(\"--bin-dir\", help=\"directory where the extension host executable is installed [default=/usr/lib/critic/$IDENTITY/bin]\", action=\"store\", default=DefaultBinDir)\nbasic.add_argument(\"--no-compiler-check\", help=\"disable compiler version check\", action=\"store_true\")\nbasic.add_argument(\"--dry-run\", \"-n\", help=\"produce output but don't modify the system at all\", action=\"store_true\")\nbasic.add_argument(\"--libcurl-flavor\", help=\"libcurl flavor (openssl, gnutls or nss) to install\", choices=[\"openssl\", \"gnutls\", \"nss\"])\n\nactions = parser.add_argument_group(\"actions\")\nactions.add_argument(\"--prereqs\", help=\"(check for and) install prerequisite software\", action=\"store_true\")\nactions.add_argument(\"--fetch\", help=\"fetch the extension host source code\", action=\"store_true\")\nactions.add_argument(\"--build\", help=\"build the extension host executable\", action=\"store_true\")\nactions.add_argument(\"--install\", help=\"install the extension host executable\", action=\"store_true\")\nactions.add_argument(\"--enable\", help=\"enable extension support in Critic's configuration\", action=\"store_true\")\n\nactions.add_argument(\"--with-v8-jsshell\", help=\"v8-jsshell repository URL [default=../v8-jsshell.git]\", metavar=\"URL\")\nactions.add_argument(\"--with-v8\", help=\"v8 repository URL [default=%s]\" % v8_url, 
metavar=\"URL\")\nactions.add_argument(\"--with-depot_tools\", help=\"depot_tools repository URL [default=%s]\" % depot_tools_url, metavar=\"URL\")\n\n# Useful to speed up repeated building from clean repositories; used\n# by the testing framework.\nactions.add_argument(\"--export-v8-dependencies\", help=argparse.SUPPRESS)\nactions.add_argument(\"--import-v8-dependencies\", help=argparse.SUPPRESS)\n\narguments = parser.parse_args()\n\nif arguments.headless:\n    installation.input.headless = True\n\nimport installation\n\nis_root = os.getuid() == 0\n\nprereqs = arguments.prereqs\nfetch = arguments.fetch\nbuild = arguments.build\ninstall = arguments.install\nenable = arguments.enable\n\nif not any([prereqs, fetch, build, install, enable]) \\\n        and arguments.export_v8_dependencies is None \\\n        and arguments.import_v8_dependencies is None:\n    prereqs = fetch = build = install = enable = True\n\nlibcurl = False\n\nif any([prereqs, install, enable]) and not is_root:\n    print \"\"\"\nERROR: You need to run this script as root.\n\"\"\"\n    sys.exit(1)\n\ngit = os.environ.get(\"GIT\", \"git\")\n\nif install or enable:\n    data = installation.utils.read_install_data(arguments)\n\n    if data is not None:\n        git = data[\"installation.prereqs.git\"]\n\n        installed_sha1 = data[\"sha1\"]\n        current_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                                  cwd=installation.root_dir).strip()\n\n        if installed_sha1 != current_sha1:\n            print \"\"\"\nERROR: You should to run upgrade.py to upgrade to the current commit before\n       using this script to enable extension support.\n\"\"\"\n            sys.exit(1)\n\nif arguments.bin_dir is DefaultBinDir:\n    bin_dir = os.path.join(\"/usr/lib/critic\", arguments.identity, \"bin\")\nelse:\n    bin_dir = arguments.bin_dir\n\nif \"CXX\" in os.environ:\n    compiler = os.environ[\"CXX\"]\n\n    try:\n        
subprocess.check_output([compiler, \"--help\"])\n    except OSError as error:\n        print \"\"\"\nERROR: %r (from $CXX) does not appear to be a valid compiler.\n\"\"\" % compiler\n        sys.exit(1)\nelse:\n    compiler = \"g++\"\n\ndef check_libcurl():\n    fd, empty_cc = tempfile.mkstemp(\".cc\")\n    os.close(fd)\n\n    try:\n        subprocess.check_output([compiler, \"-include\", \"curl/curl.h\", \"-c\", empty_cc, \"-o\", \"/dev/null\"],\n                                stderr=subprocess.STDOUT)\n        return True\n    except subprocess.CalledProcessError as error:\n        if \"curl/curl.h\" in error.output:\n            return False\n        raise\n    finally:\n        os.unlink(empty_cc)\n\ndef missing_packages():\n    packages = []\n\n    if not installation.prereqs.find_executable(\"svn\"):\n        packages.append(\"subversion\")\n    if not installation.prereqs.find_executable(\"make\"):\n        packages.append(\"make\")\n    if \"CXX\" not in os.environ and not installation.prereqs.find_executable(\"g++\"):\n        packages.append(\"g++\")\n    pg_config = installation.prereqs.find_executable(\"pg_config\")\n    if pg_config:\n        try:\n            subprocess.check_output([\"pg_config\"], stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError:\n            # Just installing the PostgreSQL database server might install\n            # a dummy pg_config that just outputs an error message.\n            pg_config = None\n    if not pg_config:\n        packages.append(\"libpq-dev\")\n\n    return packages\n\nif prereqs:\n    packages = missing_packages()\n\n    if packages:\n        installation.prereqs.install_packages(*packages)\n\n    if not check_libcurl():\n        if arguments.libcurl_flavor:\n            installation.prereqs.install_packages(\n                \"libcurl4-%s-dev\" % arguments.libcurl_flavor)\n        else:\n            print \"\"\"\nNo version of libcurl-dev appears to be installed.  
There are usually multiple\nversions available to install using different libraries (openssl, gnutls or nss)\nfor secure communication.  If curl is already installed, you probably need to\ninstall a matching version of libcurl-dev.\n\nThis script can install any one of them, or build the extension host executable\nwithout URL loading support (\"none\").\n\nAvailable choices are: \"openssl\", \"gnutls\", \"nss\"\nAlso: \"none\", \"abort\"\n\"\"\"\n\n            def check(string):\n                if string not in (\"openssl\", \"gnutls\", \"nss\", \"none\", \"abort\"):\n                    return 'please answer \"openssl\", \"gnutls\", \"nss\", \"none\" or \"abort\"'\n\n            choice = installation.input.string(\"Install libcurl-dev version?\", \"none\")\n\n            if choice in (\"openssl\", \"gnutls\", \"nss\"):\n                installation.prereqs.install_packages(\"libcurl4-%s-dev\" % choice)\n            elif choice == \"abort\":\n                print \"\"\"\nERROR: Installation aborted.\n\"\"\"\n                sys.exit(1)\n\nenv = os.environ.copy()\n\nif build and not arguments.no_compiler_check:\n    version = subprocess.check_output([compiler, \"--version\"])\n    if version.startswith(\"g++\"):\n        version = subprocess.check_output([compiler, \"-dumpversion\"]).strip()\n        if LooseVersion(version) < LooseVersion(\"4.7\"):\n            print \"\"\"\nERROR: GCC version 4.7 or later required to build v8-jsshell.\nHINT: Set $CXX to use a different compiler than '%s', or use\n  --no-compiler-check to try to build anyway.\n\"\"\" % compiler\n            sys.exit(1)\n    else:\n        if \"clang\" in version:\n            note_clang = \"NOTE: CLang (version 3.2 and earlier) is known not to work.\\n\"\n        else:\n            note_clang = \"\"\n\n        print \"\"\"\nERROR: GCC (version 4.7 or later) required to build v8-jsshell.\n%sHINT: Set $CXX to use a different compiler than '%s', or use\n  --no-compiler-check to try to build 
anyway.\n\"\"\" % (note_clang, compiler)\n        sys.exit(1)\n\nenv[\"compiler\"] = compiler\nenv[\"v8static\"] = \"yes\"\nenv[\"postgresql\"] = \"yes\"\n\nif check_libcurl():\n    env[\"libcurl\"] = \"yes\"\n\nenv[\"PATH\"] = (os.path.join(os.getcwd(), \"installation/externals/depot_tools\") +\n               \":\" + os.environ[\"PATH\"])\n\nroot = os.path.dirname(os.path.abspath(sys.argv[0]))\nv8_jsshell = os.path.join(root, \"installation/externals/v8-jsshell\")\n\ndef do_unprivileged_work():\n    global depot_tools_url\n\n    if is_root:\n        stat = os.stat(sys.argv[0])\n        os.environ[\"USER\"] = pwd.getpwuid(stat.st_uid).pw_name\n        os.environ[\"HOME\"] = pwd.getpwuid(stat.st_uid).pw_dir\n        os.setgid(stat.st_gid)\n        os.setuid(stat.st_uid)\n\n    if fetch:\n        if arguments.with_depot_tools:\n            depot_tools_url = arguments.with_depot_tools\n\n        if os.path.isdir('installation/externals/depot_tools'):\n            subprocess.check_call(\n                [git, \"pull\"],\n                cwd=\"installation/externals/depot_tools\")\n        else:\n            subprocess.check_call(\n                [git, \"clone\", depot_tools_url],\n                cwd=\"installation/externals\")\n\n        def fetch_submodule(cwd, submodule, url=None):\n            subprocess.check_call(\n                [git, \"submodule\", \"init\", submodule],\n                cwd=cwd)\n            if url:\n                subprocess.check_call(\n                    [git, \"config\", \"submodule.%s.url\" % submodule, url],\n                    cwd=cwd)\n            subprocess.check_call(\n                [git, \"submodule\", \"update\", submodule],\n                cwd=cwd)\n\n        fetch_submodule(root, \"installation/externals/v8-jsshell\",\n                        arguments.with_v8_jsshell)\n        fetch_submodule(v8_jsshell, \"v8\", arguments.with_v8)\n\n    if arguments.import_v8_dependencies or arguments.export_v8_dependencies:\n        
argv = [\"make\", \"v8dependencies\"]\n\n        if arguments.import_v8_dependencies:\n            argv.append(\"v8importdepsfrom=\" + arguments.import_v8_dependencies)\n        if arguments.export_v8_dependencies:\n            argv.append(\"v8exportdepsto=\" + arguments.export_v8_dependencies)\n\n        subprocess.check_call(argv, cwd=v8_jsshell)\n\n    if build:\n        subprocess.check_call(\n            [\"make\", \"-j%d\" % multiprocessing.cpu_count()],\n            cwd=v8_jsshell, env=env)\n\ndef checked_unprivileged_work(result):\n    try:\n        do_unprivileged_work()\n    except:\n        result.put(False)\n        raise\n    else:\n        result.put(True)\n\nif fetch or build \\\n        or arguments.import_v8_dependencies \\\n        or arguments.export_v8_dependencies:\n    if is_root:\n        unprivileged_result = multiprocessing.Queue()\n        unprivileged = multiprocessing.Process(target=checked_unprivileged_work,\n                                               args=(unprivileged_result,))\n        unprivileged.start()\n        unprivileged.join()\n\n        if not unprivileged_result.get():\n            sys.exit(1)\n    else:\n        do_unprivileged_work()\n\nif install or enable:\n    etc_path = os.path.join(arguments.etc_dir, arguments.identity)\n\n    sys.path.insert(0, etc_path)\n\n    import configuration\n\n    executable = configuration.extensions.FLAVORS.get(\"js/v8\", {}).get(\"executable\")\n\n    if not executable or not os.access(executable, os.X_OK):\n        executable = os.path.join(bin_dir, \"v8-jsshell\")\n\nif install:\n    if not os.path.isdir(os.path.dirname(executable)):\n        os.makedirs(os.path.dirname(executable))\n\n    subprocess.check_call(\n        [\"install\", os.path.join(v8_jsshell, \"out\", \"jsshell\"), executable])\n\nif enable and not configuration.extensions.ENABLED:\n    try:\n        subprocess.check_output(\n            [\"su\", \"-s\", \"/bin/bash\",\n             \"-c\", \"psql -q -c 'SELECT 1 
FROM extensions LIMIT 1'\",\n             configuration.base.SYSTEM_USER_NAME],\n            stderr=subprocess.STDOUT)\n    except subprocess.CalledProcessError:\n        installation.database.psql_import(\n            \"installation/data/dbschema.extensions.sql\",\n            configuration.base.SYSTEM_USER_NAME)\n\n    data = { \"installation.system.username\": configuration.base.SYSTEM_USER_NAME,\n             \"installation.system.groupname\": configuration.base.SYSTEM_GROUP_NAME,\n             \"installation.extensions.enabled\": True,\n             \"installation.extensions.critic_v8_jsshell\": executable,\n             \"installation.extensions.default_flavor\": \"js/v8\" }\n\n    installation.system.fetch_uid_gid()\n\n    installation.paths.mkdir(configuration.extensions.INSTALL_DIR)\n    installation.paths.mkdir(configuration.extensions.WORKCOPY_DIR)\n\n    compilation_failed = []\n\n    if installation.config.update_file(os.path.join(etc_path, \"configuration\"),\n                                       \"extensions.py\", data, arguments,\n                                       compilation_failed):\n        if compilation_failed:\n            print\n            print \"ERROR: Update aborted.\"\n            print\n\n            installation.config.undo()\n            sys.exit(1)\n\n        subprocess.check_call([\"criticctl\", \"restart\"])\n"
  },
  {
    "path": "install.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\nimport stat\n\n# To avoid accidentally creating files owned by root.\nsys.dont_write_bytecode = True\n\n# Python version check is done before imports below so\n# that python 2.6/2.5 users can see the error message.\nimport pythonversion\npythonversion.check(\"\"\"\\\nNOTE: This script must be run in the Python interpreter that will be\nused to run Critic.\n\"\"\")\n\nif sys.flags.optimize > 0:\n    print \"\"\"\nERROR: Please run this script without -O or -OO options.\n\"\"\"\n    sys.exit(1)\n\nimport argparse\nimport subprocess\nimport traceback\n\nimport installation\n\nparser = argparse.ArgumentParser(description=\"Critic installation script\")\n\n# Uses default values for everything that has a default value (and isn't\n# overridden by other command-line arguments) and signals an error for anything\n# that doesn't have a default value and isn't set by a command-line argument.\nparser.add_argument(\"--headless\", help=argparse.SUPPRESS, action=\"store_true\")\n\nparser.add_argument(\"--etc-dir\", help=\"directory where the Critic system configuration is stored\", action=\"store\")\nparser.add_argument(\"--install-dir\", help=\"directory where the Critic source code is installed\", action=\"store\")\nparser.add_argument(\"--data-dir\", help=\"directory where Critic's 
persistent data files are stored\", action=\"store\")\nparser.add_argument(\"--cache-dir\", help=\"directory where Critic's temporary data files are stored\", action=\"store\")\nparser.add_argument(\"--git-dir\", help=\"directory where the main Git repositories are stored\", action=\"store\")\nparser.add_argument(\"--log-dir\", help=\"directory where Critic's log files are stored\", action=\"store\")\nparser.add_argument(\"--run-dir\", help=\"directory where Critic's runtime files are stored\", action=\"store\")\n\nfor module in installation.modules:\n    if hasattr(module, \"add_arguments\"):\n        module.add_arguments(\"install\", parser)\n\narguments = parser.parse_args()\n\nif os.getuid() != 0:\n    print \"\"\"\nERROR: This script must be run as root.\n\"\"\"\n    sys.exit(1)\n\nif os.path.exists(os.path.join(installation.root_dir, \".installed\")):\n    print \"\"\"\nERROR: Found an .installed file in the directory you're installing from.\n\nThis typically means that Critic is already installed on this system, and if so\nthen the upgrade.py script should be used to upgrade the installation rather than\nre-running install.py.\n\"\"\"\n    sys.exit(1)\n\nif arguments.headless:\n    installation.input.headless = True\n\ndef abort():\n    print\n    print \"ERROR: Installation aborted.\"\n    print\n\n    for module in reversed(installation.modules):\n        try:\n            if hasattr(module, \"undo\"):\n                module.undo()\n        except:\n            print >>sys.stderr, \"FAILED: %s.undo()\" % module.__name__\n            traceback.print_exc()\n\n    sys.exit(1)\n\ntry:\n    lifecycle = installation.utils.read_lifecycle()\n\n    if not lifecycle[\"stable\"]:\n        print \"\"\"\nWARNING: You're about to install an unstable development version of Critic.\n\nIf you're setting up a production server, you're most likely better off\ninstalling from the latest stable branch.\n\nThe latest stable branch is the default branch (i.e. 
HEAD) in Critic's GitHub\nrepository at\n\n  https://github.com/jensl/critic.git\n\nTo interrogate it from the command-line, run\n\n  $ git ls-remote --symref https://github.com/jensl/critic.git HEAD\n\"\"\"\n\n        if not installation.input.yes_or_no(\n                \"Do you want to continue installing the unstable version?\",\n                default=True):\n            print\n            print \"Installation aborted.\"\n            print\n            sys.exit(1)\n\n    sha1 = \"0\" * 40\n\n    # If Git is already installed, check for local modifications.  If Git isn't\n    # installed (no 'git' executable in $PATH) then presumably we're not\n    # installing from a repository clone, but from an exported tree, and in that\n    # case we can't check for local modifications anyway.\n    if installation.prereqs.git.check():\n        git = installation.prereqs.git.path\n\n        try:\n            if installation.utils.run_git([git, \"status\", \"--porcelain\"],\n                                          cwd=installation.root_dir).strip():\n                print \"\"\"\nERROR: This Git repository has local modifications.\n\nInstalling from a Git repository with local changes is not supported.\nPlease commit or stash the changes and then try again.\n\"\"\"\n                sys.exit(1)\n\n            sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                              cwd=installation.root_dir).strip()\n        except subprocess.CalledProcessError:\n            # Probably not a Git repository at all.\n            pass\n\n    data = { \"sha1\": sha1 }\n\n    for module in installation.modules:\n        try:\n            if hasattr(module, \"prepare\") and not module.prepare(\"install\", arguments, data):\n                abort()\n        except KeyboardInterrupt:\n            abort()\n        except SystemExit:\n            raise\n        except:\n            print >>sys.stderr, \"FAILED: %s.prepare()\" % 
module.__name__\n            traceback.print_exc()\n            abort()\n\n    print\n\n    installed_file = os.path.join(installation.root_dir, \".installed\")\n    with open(installed_file, \"w\"):\n        pass\n    install_py_stat = os.stat(os.path.join(installation.root_dir, \"install.py\"))\n    os.chown(installed_file, install_py_stat.st_uid, install_py_stat.st_gid)\n\n    for module in installation.modules:\n        try:\n            if hasattr(module, \"install\") and not module.install(data):\n                abort()\n        except KeyboardInterrupt:\n            abort()\n        except SystemExit:\n            raise\n        except:\n            print >>sys.stderr, \"FAILED: %s.install()\" % module.__name__\n            traceback.print_exc()\n            abort()\n\n    for module in installation.modules:\n        try:\n            if hasattr(module, \"finish\"):\n                module.finish(\"install\", arguments, data)\n        except:\n            print >>sys.stderr, \"WARNING: %s.finish() failed\" % module.__name__\n            traceback.print_exc()\n\n    installation.utils.write_install_data(arguments, data)\n    installation.utils.clean_root_pyc_files()\n\n    print\n    print \"SUCCESS: Installation complete!\"\n    print\nexcept SystemExit:\n    raise\nexcept:\n    traceback.print_exc()\n    abort()\n"
  },
  {
    "path": "installation/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n__doc__ = \"Installation utilities.\"\n\nimport os\nimport sys\n\nquiet = False\nis_quick_start = False\n\nroot_dir = os.path.abspath(os.path.join(os.path.dirname(os.path.abspath(__file__)), \"..\"))\n\nsys.path.insert(0, os.path.join(root_dir, \"src\"))\n\n# Helpers.\nimport input\nimport process\nimport utils\n\n# Modules.\nimport prereqs\nimport system\nimport paths\nimport files\nimport database\nimport smtp\nimport config\nimport httpd\nimport criticctl\nimport admin\nimport initd\nimport prefs\nimport git\nimport migrate\nimport extensions\n\nmodules = [prereqs,\n           system,\n           paths,\n           files,\n           database,\n           extensions,\n           config,\n           httpd,\n           criticctl,\n           admin,\n           smtp,\n           initd,\n           git,\n           migrate,\n           prefs]\n"
  },
  {
    "path": "installation/admin.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport subprocess\n\nimport installation\n\nusername = None\nemail = None\nfullname = None\npassword = None\n\nsystem_recipients = None\n\ndef add_arguments(mode, parser):\n    if mode != \"install\":\n        return\n\n    parser.add_argument(\"--admin-username\", action=\"store\",\n                        help=\"name of Critic administrator user\")\n    parser.add_argument(\"--admin-email\", action=\"store\",\n                        help=\"email address to Critic administrator user\")\n    parser.add_argument(\"--admin-fullname\", action=\"store\",\n                        help=\"Critic administrator user's full name\")\n    parser.add_argument(\"--admin-password\", action=\"store\",\n                        help=\"Critic administrator user's password\")\n\ndef prepare(mode, arguments, data):\n    global username, email, fullname, password\n\n    if mode == \"install\":\n        print \"\"\"\nCritic Installation: Administrator\n==================================\n\nAn administrator user is a Critic user with some special privileges;\nthey can do various things using the Web interface that other users\nare not allowed to do.  
Additional administrator users can be added\npost-installation using the 'criticctl' utility.\n\nThis user does not need to match a system user on this machine.\n\"\"\"\n\n        if arguments.admin_username: username = arguments.admin_username\n        else: username = installation.input.string(prompt=\"Administrator user name:\")\n\n        if arguments.admin_email: email = arguments.admin_email\n        else: email = installation.input.string(prompt=\"Administrator email address:\")\n\n        if arguments.admin_fullname: fullname = arguments.admin_fullname\n        else: fullname = installation.input.string(prompt=\"Administrator full name:\")\n\n        if installation.config.auth_mode == \"critic\":\n            if arguments.admin_password: password = arguments.admin_password\n            else: password = installation.input.password(\"Password for '%s':\" % username)\n\n        print \"\"\"\nCritic Installation: System Messages\n====================================\n\nCritic sends out email notifications when unexpected errors (crashes)\noccur, and in various other cases when things happen that the system\nadministrators might need to know about right away.\n\"\"\"\n\n        if arguments.system_recipients:\n            system_recipients = arguments.system_recipients\n        else:\n            system_recipient = installation.input.string(\n                prompt=\"Where should system messages be sent?\",\n                default=\"%s <%s>\" % (fullname, email))\n            system_recipients = [system_recipient]\n\n        data[\"installation.admin.email\"] = email\n    else:\n        import configuration\n\n        try:\n            system_recipients = configuration.base.SYSTEM_RECIPIENTS\n        except AttributeError:\n            system_recipients = [\"%(fullname)s <%(email)s>\" % admin\n                                 for admin in configuration.base.ADMINISTRATORS]\n\n        # The --system-recipients argument, on upgrade, is mostly intended to be\n    
    # used by the testing framework.  It is checked after the code above has\n        # run, so that the code above is still exercised during testing.\n        if arguments.system_recipients:\n            system_recipients = arguments.system_recipients\n\n    data[\"installation.system.recipients\"] = system_recipients\n\n    return True\n\ndef install(data):\n    global password\n\n    try:\n        criticctl_argv = [installation.criticctl.criticctl_path, \"adduser\",\n                          \"--name\", username,\n                          \"--email\", email,\n                          \"--fullname\", fullname]\n        if not password:\n            criticctl_argv.extend([\"--no-password\"])\n        else:\n            criticctl_argv.extend([\"--password\", password])\n\n        subprocess.check_output(criticctl_argv)\n\n        for role in [\"administrator\", \"repositories\", \"newswriter\"]:\n            subprocess.check_output(\n                [installation.criticctl.criticctl_path, \"addrole\",\n                 \"--name\", username,\n                 \"--role\", role])\n    except subprocess.CalledProcessError:\n        return False\n\n    return True\n"
  },
  {
    "path": "installation/config.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport os.path\nimport py_compile\nimport argparse\nimport multiprocessing\n\nimport installation\n\nauth_mode = \"host\"\nsession_type = None\nallow_anonymous_user = None\nweb_server_integration = None\naccess_scheme = None\nrepository_url_types = [\"http\"]\nallow_user_registration = None\nverify_email_addresses = True\narchive_review_branches = True\n\npassword_hash_schemes = [\"pbkdf2_sha256\", \"bcrypt\"]\ndefault_password_hash_scheme = \"pbkdf2_sha256\"\nminimum_password_hash_time = 0.25\nminimum_rounds = {}\nauth_database = \"internal\"\nenable_access_tokens = True\n\nldap_url = \"ldap://ldap.example.com:389\"\nldap_search_base = \"dc=example,dc=com\"\nldap_create_user = True\nldap_username_attribute = \"uid\"\nldap_fullname_attribute = \"cn\"\nldap_email_attribute = \"mail\"\nldap_cache_max_age = 600\n\nis_development = False\nis_testing = False\ncoverage_dir = None\n\nclass Provider(object):\n    def __init__(self, name):\n        self.name = name\n        self.enabled = False\n        self.allow_user_registration = False\n        self.verify_email_addresses = False\n        self.client_id = None\n        self.client_secret = None\n        self.redirect_uri = None\n        self.bypass_createuser = False\n\n    def load(self, settings):\n        if self.name not in 
settings:\n            return\n        settings = settings[self.name]\n        self.enabled = settings.get(\"enabled\", self.enabled)\n        self.allow_user_registration = settings.get(\"allow_user_registration\",\n                                                    self.allow_user_registration)\n        self.verify_email_addresses = settings.get(\"verify_email_addresses\",\n                                                   self.verify_email_addresses)\n        self.client_id = settings.get(\"client_id\", self.client_id)\n        self.client_secret = settings.get(\"client_secret\", self.client_secret)\n        self.redirect_uri = settings.get(\"redirect_uri\", self.redirect_uri)\n        self.bypass_createuser = settings.get(\"bypass_createuser\",\n                                              self.bypass_createuser)\n\n    def readargs(self, arguments):\n        def getarg(name, default):\n            value = getattr(arguments, name, None)\n            if value is None:\n                return default\n            return value\n\n        self.enabled = getarg(\n            \"provider_%s_enabled\" % self.name, self.enabled)\n        self.allow_user_registration = getarg(\n            \"provider_%s_user_registration\" % self.name,\n            self.allow_user_registration)\n        self.verify_email_addresses = getarg(\n            \"provider_%s_verify_email_addresses\" % self.name,\n            self.verify_email_addresses)\n        self.client_id = getarg(\n            \"provider_%s_client_id\" % self.name, self.client_id)\n        self.client_secret = getarg(\n            \"provider_%s_client_secret\" % self.name, self.client_secret)\n        self.redirect_uri = getarg(\n            \"provider_%s_redirect_uri\" % self.name, self.redirect_uri)\n\n    def store(self, data):\n        base = \"installation.config.provider_%s.\" % self.name\n\n        data[base + \"enabled\"] = self.enabled\n        data[base + \"allow_user_registration\"] = 
self.allow_user_registration\n        data[base + \"verify_email_addresses\"] = self.verify_email_addresses\n        data[base + \"client_id\"] = self.client_id\n        data[base + \"client_secret\"] = self.client_secret\n        data[base + \"redirect_uri\"] = self.redirect_uri\n        data[base + \"bypass_createuser\"] = self.bypass_createuser\n\n    def scrub(self, data):\n        base = \"installation.config.provider_%s.\" % self.name\n\n        del data[base + \"client_id\"]\n        del data[base + \"client_secret\"]\n\nproviders = []\ndefault_provider_names = [\"github\", \"google\"]\n\ndef calibrate_minimum_rounds():\n    import time\n    import passlib.context\n\n    min_rounds_name = \"%s__min_rounds\" % default_password_hash_scheme\n    min_rounds_value = 100\n\n    while True:\n        calibration_context = passlib.context.CryptContext(\n            schemes=[default_password_hash_scheme],\n            default=default_password_hash_scheme,\n            **{ min_rounds_name: min_rounds_value })\n\n        before = time.time()\n\n        calibration_context.encrypt(\"password\")\n\n        # It's possible encryption was fast enough to measure as zero, or some\n        # other ridiculously small number.  \"Round\" it up to at least one\n        # millisecond for sanity.\n        hash_time = max(0.001, time.time() - before)\n\n        if hash_time >= minimum_password_hash_time:\n            break\n\n        # Multiplication factor.  
Make it at least 1.2, to ensure we\n        # eventually finish this loop, and at most 10, to ensure we don't over-shoot\n        # by too much.\n        factor = max(1.2, min(10.0, minimum_password_hash_time / hash_time))\n\n        min_rounds_value = int(factor * min_rounds_value)\n\n    # If we're upgrading and have a current calibrated value, only change it if\n    # the new value is significantly higher, indicating that the system's\n    # performance has increased, or the hash implementation has gotten faster.\n    if default_password_hash_scheme in minimum_rounds:\n        current_value = minimum_rounds[default_password_hash_scheme]\n        if current_value * 1.5 > min_rounds_value:\n            return\n\n    minimum_rounds[default_password_hash_scheme] = min_rounds_value\n\ndef add_arguments(mode, parser):\n    def H(help_string):\n        # Wrapper to hide arguments when upgrading, but still supporting them.\n        # Primarily we need to support arguments on upgrade for testing, which\n        # might upgrade from a commit that doesn't support an argument, and thus\n        # needs to provide the argument when upgrading to the tested commit.\n        if mode == \"install\":\n            return help_string\n        else:\n            return argparse.SUPPRESS\n\n    parser.add_argument(\n        \"--auth-mode\", choices=[\"host\", \"critic\"],\n        help=H(\"user authentication mode\"))\n    parser.add_argument(\n        \"--session-type\", choices=[\"httpauth\", \"cookie\"],\n        help=H(\"session type\"))\n    parser.add_argument(\n        \"--allow-anonymous-user\", dest=\"anonymous\", action=\"store_const\",\n        const=True, help=H(\"allow limited unauthenticated access\"))\n    parser.add_argument(\n        \"--no-allow-anonymous-user\", dest=\"anonymous\", action=\"store_const\",\n        const=False, help=H(\"do not allow unauthenticated access\"))\n    parser.add_argument(\n        \"--allow-user-registration\", 
dest=\"user_registration\",\n        action=\"store_const\", const=True,\n        help=H(\"allow unattended user registration\"))\n    parser.add_argument(\n        \"--no-allow-user-registration\", dest=\"user_registration\",\n        action=\"store_const\", const=False,\n        help=H(\"do not allow unattended user registration\"))\n    parser.add_argument(\n        \"--web-server-integration\", choices=[\"apache\", \"nginx+uwsgi\",\n                                             \"uwsgi\", \"none\"],\n        help=H(\"web server to set up and integrate with\"))\n    parser.add_argument(\n        \"--access-scheme\", choices=[\"http\", \"https\", \"both\"],\n        help=H(\"scheme used to access Critic\"))\n    parser.add_argument(\n        \"--repository-url-types\", default=\"http\",\n        help=H(\"comma-separated list of supported repository URL types \"\n               \"(valid types: git, http, ssh and host)\"))\n\n    for provider_name in default_provider_names:\n        if mode == \"install\":\n            group = parser.add_argument_group(\n                \"'%s' authentication provider\" % provider_name)\n        else:\n            group = parser\n\n        group.add_argument(\n            \"--provider-%s-enabled\" % provider_name, action=\"store_const\",\n            const=True, help=H(\"enable authentication provider\"))\n        group.add_argument(\n            \"--provider-%s-disabled\" % provider_name, action=\"store_const\",\n            const=False, dest=\"provider_%s_enabled\" % provider_name,\n            help=H(\"disable authentication provider\"))\n        group.add_argument(\n            \"--provider-%s-user-registration\" % provider_name,\n            action=\"store_const\", const=True,\n            help=H(\"enable new user registration\"))\n        group.add_argument(\n            \"--provider-%s-no-user-registration\" % provider_name,\n            action=\"store_const\", const=False,\n            dest=\"provider_%s_user_registration\" % 
provider_name,\n            help=H(\"disable new user registration\"))\n        group.add_argument(\n            \"--provider-%s-client-id\" % provider_name, action=\"store\",\n            help=H(\"OAuth2 client id\"))\n        group.add_argument(\n            \"--provider-%s-client-secret\" % provider_name, action=\"store\",\n            help=H(\"OAuth2 client secret\"))\n        group.add_argument(\n            \"--provider-%s-redirect-uri\" % provider_name, action=\"store\",\n            help=H(\"OAuth2 authentication callback URI\"))\n\n    parser.add_argument(\n        \"--minimum-password-hash-time\",\n        help=H(\"approximate minimum time to spend hashing a single password\"))\n\n    # Using argparse.SUPPRESS to not include these in --help output; they are\n    # not something a typical installer ought to want to use.\n    parser.add_argument(\n        \"--is-development\", action=\"store_true\", help=argparse.SUPPRESS)\n    parser.add_argument(\n        \"--is-testing\", action=\"store_true\", help=argparse.SUPPRESS)\n    parser.add_argument(\n        \"--coverage-dir\", help=argparse.SUPPRESS)\n\ndefault_encodings = [\"utf-8\", \"latin-1\"]\n\ndef prepare(mode, arguments, data):\n    global auth_mode, session_type, allow_anonymous_user\n    global web_server_integration, access_scheme\n    global repository_url_types, default_encodings, allow_user_registration\n    global verify_email_addresses, archive_review_branches\n    global password_hash_schemes, default_password_hash_scheme\n    global minimum_password_hash_time, minimum_rounds, auth_database\n    global enable_access_tokens\n    global is_development, is_testing, coverage_dir\n\n    global ldap_url, ldap_search_base, ldap_create_user, ldap_username_attribute\n    global ldap_fullname_attribute, ldap_email_attribute, ldap_cache_max_age\n\n    header_printed = False\n\n    if mode == \"install\":\n        if arguments.minimum_password_hash_time is not None:\n            try:\n                
minimum_password_hash_time = float(arguments.minimum_password_hash_time)\n            except ValueError:\n                print (\"Invalid --minimum-password-hash-time argument: %s (must be a number).\"\n                       % arguments.minimum_password_hash_time)\n                return False\n\n        if arguments.repository_url_types:\n            repository_url_types = filter(\n                None, arguments.repository_url_types.split(\",\"))\n            invalid_url_types = []\n            for url_type in repository_url_types:\n                if url_type not in [\"git\", \"http\", \"ssh\", \"host\"]:\n                    invalid_url_types.append(url_type)\n            if invalid_url_types or not repository_url_types:\n                print (\"Invalid --repository-url-types argument: %s\"\n                       % arguments.repository_url_types)\n                if invalid_url_types:\n                    print (\"These types are invalid: %s\"\n                           % \",\".join(invalid_url_types))\n                if not repository_url_types:\n                    print \"No URL types specified!\"\n                return False\n\n        def check_auth_mode(value):\n            if value.strip() not in (\"host\", \"critic\"):\n                return \"must be one of 'host' and 'critic'\"\n\n        if arguments.auth_mode:\n            error = check_auth_mode(arguments.auth_mode)\n            if error:\n                print \"Invalid --auth-mode argument: %s.\" % arguments.auth_mode\n                return False\n            auth_mode = arguments.auth_mode\n        else:\n            header_printed = True\n\n            print \"\"\"\nCritic Installation: Authentication\n===================================\n\nCritic needs to identify (via HTTP authentication) users who access\nthe Web front-end.  
This can be handled in two different ways:\n\n  host    The Web server (Apache) handles authentication and Critic\n          only makes use of the user name that it reports via the\n          WSGI API.\n\n  critic  Critic implements HTTP authentication itself using passwords\n          stored (encrypted) in its database.\n\"\"\"\n\n            auth_mode = installation.input.string(\n                \"Which authentication mode should be used?\",\n                default=\"critic\", check=check_auth_mode)\n\n        is_development = arguments.is_development\n        is_testing = arguments.is_testing\n        coverage_dir = arguments.coverage_dir\n    else:\n        import configuration\n\n        auth_mode = configuration.base.AUTHENTICATION_MODE\n\n        try: session_type = configuration.base.SESSION_TYPE\n        except AttributeError: pass\n\n        try: allow_anonymous_user = configuration.base.ALLOW_ANONYMOUS_USER\n        except AttributeError: pass\n\n        try: web_server_integration = configuration.base.WEB_SERVER_INTEGRATION\n        except AttributeError: web_server_integration = \"apache\"\n\n        try: access_scheme = configuration.base.ACCESS_SCHEME\n        except AttributeError: pass\n\n        try: repository_url_types = configuration.base.REPOSITORY_URL_TYPES\n        except AttributeError: pass\n\n        try: default_encodings = configuration.base.DEFAULT_ENCODINGS\n        except AttributeError: pass\n\n        try:\n            password_hash_schemes = configuration.auth.PASSWORD_HASH_SCHEMES\n            default_password_hash_scheme = configuration.auth.DEFAULT_PASSWORD_HASH_SCHEME\n            minimum_password_hash_time = configuration.auth.MINIMUM_PASSWORD_HASH_TIME\n            minimum_rounds = configuration.auth.MINIMUM_ROUNDS\n        except AttributeError:\n            pass\n\n        try:\n            auth_database = configuration.auth.DATABASE\n        except AttributeError:\n            pass\n\n        try:\n            
enable_access_tokens = configuration.auth.ENABLE_ACCESS_TOKENS\n        except AttributeError:\n            pass\n\n        try: is_development = configuration.debug.IS_DEVELOPMENT\n        except AttributeError:\n            # Was moved from configuration.base to configuration.debug.\n            try: is_development = configuration.base.IS_DEVELOPMENT\n            except AttributeError: pass\n\n        try: is_testing = configuration.debug.IS_TESTING\n        except AttributeError: is_testing = arguments.is_testing\n\n        try: coverage_dir = configuration.debug.COVERAGE_DIR\n        except AttributeError: pass\n\n        try: allow_user_registration = configuration.base.ALLOW_USER_REGISTRATION\n        except AttributeError: pass\n\n        try: verify_email_addresses = configuration.base.VERIFY_EMAIL_ADDRESSES\n        except AttributeError: pass\n\n        try: archive_review_branches = configuration.base.ARCHIVE_REVIEW_BRANCHES\n        except AttributeError: pass\n\n        try:\n            ldap = configuration.auth.DATABASES[\"ldap\"]\n        except (AttributeError, KeyError):\n            pass\n        else:\n            ldap_url = ldap.get(\"ldap_url\", ldap_url)\n            ldap_search_base = ldap.get(\"ldap_search_base\", ldap_search_base)\n            ldap_create_user = ldap.get(\"ldap_create_user\", ldap_create_user)\n            ldap_username_attribute = ldap.get(\"ldap_username_attribute\",\n                                               ldap_username_attribute)\n            ldap_fullname_attribute = ldap.get(\"ldap_fullname_attribute\",\n                                               ldap_fullname_attribute)\n            ldap_email_attribute = ldap.get(\"ldap_email_attribute\",\n                                            ldap_email_attribute)\n            ldap_cache_max_age = ldap.get(\"ldap_cache_max_age\",\n                                          ldap_cache_max_age)\n\n    if auth_mode == \"critic\":\n        if session_type is None:\n    
        def check_session_type(value):\n                if value.strip() not in (\"httpauth\", \"cookie\"):\n                    return \"must be one of 'httpauth' and 'cookie'\"\n\n            if arguments.session_type:\n                error = check_session_type(arguments.session_type)\n                if error:\n                    print \"Invalid --session-type argument: %s.\" % arguments.session_type\n                    return False\n                session_type = arguments.session_type\n            else:\n                if not header_printed:\n                    header_printed = True\n                    print \"\"\"\nCritic Installation: Authentication\n===================================\"\"\"\n\n                print \"\"\"\nCritic can authenticate users either via HTTP authentication or via a\n\"Sign in\" form and session cookies.  The major difference is that HTTP\nauthentication requires a valid login to access any page whereas the\nother type of authentication supports limited anonymous access.\n\n  httpauth  Use HTTP authentication.\n\n  cookie    Use session cookie based authentication.\n\"\"\"\n\n                session_type = installation.input.string(\n                    \"Which session type should be used?\",\n                    default=\"cookie\", check=check_session_type)\n\n        if allow_anonymous_user is None:\n            if session_type == \"httpauth\":\n                allow_anonymous_user = False\n            elif arguments.anonymous is not None:\n                allow_anonymous_user = arguments.anonymous\n            else:\n                if not header_printed:\n                    header_printed = True\n                    print \"\"\"\nCritic Installation: Authentication\n===================================\"\"\"\n\n                print \"\"\"\nWith cookie based authentication, Critic can support anonymous access.\nUsers still have to sign in in order to make any changes (such as\nwrite comments in reviews) but will be able to 
view most information\nin the system without signing in.\n\"\"\"\n\n                allow_anonymous_user = installation.input.yes_or_no(\n                    \"Do you want to allow anonymous access?\", default=True)\n\n        if allow_user_registration is None:\n            if session_type == \"httpauth\":\n                allow_user_registration = False\n            elif arguments.user_registration is not None:\n                allow_user_registration = arguments.user_registration\n            else:\n                if not header_printed:\n                    header_printed = True\n                    print \"\"\"\nCritic Installation: Authentication\n===================================\"\"\"\n\n                print \"\"\"\nWith cookie based authentication, Critic can support unattended user\nregistration.  With this enabled, the \"Sign in\" page has a link to a\npage where a new user can register a Critic user without needing to\ncontact the system administrator(s).\n\"\"\"\n\n                allow_user_registration = installation.input.yes_or_no(\n                    \"Do you want to allow user registration?\", default=False)\n    else:\n        session_type = \"cookie\"\n\n    if web_server_integration is None:\n        if arguments.web_server_integration:\n            web_server_integration = arguments.web_server_integration\n        else:\n            print \"\"\"\nCritic Installation: Web Server Integration\n===========================================\n\nThis installation script can install and do basic configuration of a\nfew different host web servers.  
Supported web servers are:\n\n  1) nginx + uWSGI\n\n     Use the nginx web server together with uWSGI as the WSGI\n     application server to actually run Critic.\n\n     This is the recommended option for new installs.\n\n  2) uWSGI\n\n     Use uWSGI as both HTTP(S) front-end and as the WSGI application\n     server to actually run Critic.\n\n  3) Apache + mod_wsgi\n\n     Use the Apache web server and its third-party WSGI module\n     (mod_wsgi) to actually run Critic.  This is the traditional\n     configuration used to run Critic, but mod_wsgi is not actively\n     maintained, and has some known issues.\n\n  4) no integration\n\n     Don't configure any web server.  The installation performed by\n     this script will be incomplete and the system administrator will\n     need to set the integration up themselves.\n\"\"\"\n\n            def check_web_server_integration(value):\n                if value not in (\"1\", \"nginx+uwsgi\",\n                                 \"2\", \"uwsgi\",\n                                 \"3\", \"apache\",\n                                 \"4\", \"none\"):\n                    return (\"must be one of '1'/'nginx+uwsgi', '2'/'uwsgi', \"\n                            \"'3'/'apache' and '4'/'none'\")\n\n            web_server_integration = installation.input.string(\n                \"What web server should be set up?\", default=\"nginx+uwsgi\",\n                check=check_web_server_integration)\n\n            aliases = { \"1\": \"nginx+uwsgi\",\n                        \"2\": \"uwsgi\",\n                        \"3\": \"apache\",\n                        \"4\": \"none\" }\n\n            if web_server_integration in aliases:\n                web_server_integration = aliases[web_server_integration]\n\n    if access_scheme is None:\n        if arguments.access_scheme:\n            access_scheme = arguments.access_scheme\n        else:\n            print \"\"\"\nCritic Installation: Scheme\n===========================\n\nCritic can be 
set up to be accessed over HTTP, HTTPS, or both.  This\ninstallation script will not do the actual configuration of the host\nweb server (Apache) necessary for it to support the desired schemes\n(in particular HTTPS, which is non-trivial,) but can at least set up\nCritic's Apache site declaration appropriately.\n\nYou have three choices:\n\n  http   Critic will be accessible only over HTTP.\n\n  https  Critic will be accessible only over HTTPS.\n\n  both   Critic will be accessible over both HTTP and HTTPS.\n\nIf you choose \"both\", Critic will redirect all authenticated accesses\nto HTTPS, to avoid sending credentials over plain text connections.\"\"\"\n\n            if allow_anonymous_user:\n                print \"\"\"\\\nAnonymous users will be allowed to access the site over HTTP, though.\nIf this is not desirable, you should select \"https\" and configure the\nweb server to redirect all HTTP accesses to HTTPS.\n\"\"\"\n            else:\n                print\n\n            def check_access_scheme(value):\n                if value not in (\"http\", \"https\", \"both\"):\n                    return \"must be one of 'http', 'https' and 'both'\"\n\n            access_scheme = installation.input.string(\n                \"How will Critic be accessed?\", default=\"http\",\n                check=check_access_scheme)\n\n    if mode == \"upgrade\" \\\n           and hasattr(configuration, \"auth\") \\\n           and hasattr(configuration.auth, \"PROVIDERS\"):\n        for provider_name in configuration.auth.PROVIDERS:\n            provider = Provider(provider_name)\n            provider.load(configuration.auth.PROVIDERS)\n            providers.append(provider)\n    else:\n        providers.extend(Provider(provider_name)\n                         for provider_name in default_provider_names)\n\n    if access_scheme == \"http\":\n        base_url = \"http\"\n    else:\n        base_url = \"https\"\n\n    base_url += \"://%s/oauth/\" % installation.system.hostname\n\n  
  for provider in providers:\n        provider.readargs(arguments)\n        if provider.redirect_uri is None:\n            provider.redirect_uri = base_url + provider.name\n\n    data[\"installation.config.auth_mode\"] = auth_mode\n    data[\"installation.config.session_type\"] = session_type\n    data[\"installation.config.allow_anonymous_user\"] = allow_anonymous_user\n    data[\"installation.config.web_server_integration\"] = web_server_integration\n    data[\"installation.config.access_scheme\"] = access_scheme\n    data[\"installation.config.repository_url_types\"] = repository_url_types\n    data[\"installation.config.default_encodings\"] = default_encodings\n    data[\"installation.config.allow_user_registration\"] = allow_user_registration\n    data[\"installation.config.verify_email_addresses\"] = verify_email_addresses\n    data[\"installation.config.archive_review_branches\"] = archive_review_branches\n\n    data[\"installation.config.password_hash_schemes\"] = password_hash_schemes\n    data[\"installation.config.default_password_hash_scheme\"] = default_password_hash_scheme\n    data[\"installation.config.minimum_password_hash_time\"] = minimum_password_hash_time\n    data[\"installation.config.auth_database\"] = auth_database\n    data[\"installation.config.enable_access_tokens\"] = enable_access_tokens\n\n    data[\"installation.config.is_quickstart\"] = False\n    data[\"installation.config.is_development\"] = is_development\n    data[\"installation.config.is_testing\"] = is_testing\n    data[\"installation.config.coverage_dir\"] = coverage_dir\n\n    if mode == \"upgrade\":\n        data[\"installation.config.highlight.max_workers\"] = \\\n            configuration.services.HIGHLIGHT[\"max_workers\"]\n        data[\"installation.config.changeset.max_workers\"] = \\\n            configuration.services.CHANGESET[\"max_workers\"]\n    else:\n        cpu_count = multiprocessing.cpu_count()\n\n        data[\"installation.config.highlight.max_workers\"] 
= cpu_count\n        data[\"installation.config.changeset.max_workers\"] = max(1, cpu_count / 2)\n\n    for provider in providers:\n        provider.store(data)\n\n    data[\"installation.config.ldap_url\"] = ldap_url\n    data[\"installation.config.ldap_search_base\"] = ldap_search_base\n    data[\"installation.config.ldap_create_user\"] = ldap_create_user\n    data[\"installation.config.ldap_username_attribute\"] = ldap_username_attribute\n    data[\"installation.config.ldap_fullname_attribute\"] = ldap_fullname_attribute\n    data[\"installation.config.ldap_email_attribute\"] = ldap_email_attribute\n    data[\"installation.config.ldap_cache_max_age\"] = ldap_cache_max_age\n\n    return True\n\ncreated_file = []\ncreated_dir = []\nrenamed = []\nmodified_files = 0\n\ndef compile_file(filename):\n    global created_file\n    try:\n        path = os.path.join(installation.paths.etc_dir, \"main\", filename)\n        with installation.utils.as_critic_system_user():\n            py_compile.compile(path, doraise=True)\n    except py_compile.PyCompileError as error:\n        print \"\"\"\nERROR: Failed to compile %s:\\n%s\n\"\"\" % (filename, error)\n        return False\n    else:\n        created_file.append(path + \"c\")\n        return True\n\ndef set_file_mode_and_owner(path):\n    uid = installation.system.uid\n    gid = installation.system.gid\n\n    filename = os.path.basename(path)\n    if filename in (\"database.py\", \"auth.py\", \"smtp-credentials.json\"):\n        # May contain sensitive information.\n        mode = 0600\n        if filename == \"smtp-credentials.json\":\n            uid = gid = 0\n    else:\n        mode = 0640\n\n    os.chmod(path, mode)\n\n    if not installation.is_quick_start:\n        os.chown(path, uid, gid)\n\ndef copy_file_mode_and_owner(src_path, dst_path):\n    status = os.stat(src_path)\n\n    os.chmod(dst_path, status.st_mode)\n    os.chown(dst_path, status.st_uid, status.st_gid)\n\ndef install(data):\n    if auth_mode == 
\"critic\":\n        calibrate_minimum_rounds()\n    data[\"installation.config.minimum_rounds\"] = minimum_rounds\n\n    source_dir = os.path.join(installation.root_dir, \"installation\", \"templates\", \"configuration\")\n    target_dir = os.path.join(installation.paths.etc_dir, \"main\", \"configuration\")\n    compilation_failed = False\n\n    os.mkdir(target_dir, 0750)\n    created_dir.append(target_dir)\n\n    os.chown(target_dir, installation.system.uid, installation.system.gid)\n\n    for entry in os.listdir(source_dir):\n        source_path = os.path.join(source_dir, entry)\n        target_path = os.path.join(target_dir, entry)\n\n        with open(target_path, \"w\") as target:\n            created_file.append(target_path)\n\n            with open(source_path, \"r\") as source:\n                target.write((source.read().decode(\"utf-8\") % data).encode(\"utf-8\"))\n\n        set_file_mode_and_owner(target_path)\n\n        if entry.endswith(\".py\"):\n            path = os.path.join(\"configuration\", entry)\n            if not compile_file(path):\n                compilation_failed = True\n            else:\n                copy_file_mode_and_owner(target_path, target_path + \"c\")\n\n    if compilation_failed:\n        return False\n\n    # Make the newly written 'configuration' module available to the rest of the\n    # installation script(s).\n    sys.path.insert(0, os.path.join(installation.paths.etc_dir, \"main\"))\n\n    return True\n\ndef update_file(target_dir, entry, data, arguments, compilation_failed):\n    global modified_files\n\n    import configuration\n\n    # Note: 'compilation_failed' is a list shared with the caller; it must not\n    # be reassigned here, since failed paths are appended to it below.\n    source_dir = os.path.join(installation.root_dir, \"installation\", \"templates\", \"configuration\")\n\n    source_path = os.path.join(source_dir, entry)\n    target_path = os.path.join(target_dir, entry)\n    backup_path = os.path.join(target_dir, \"_\" + entry)\n\n    source = open(source_path, \"r\").read().decode(\"utf-8\") % data\n\n    if not 
os.path.isfile(target_path):\n        write_target = True\n    else:\n        if open(target_path).read().decode(\"utf-8\") == source:\n            return False\n\n        def generateVersion(label, path):\n            if label == \"updated\":\n                with open(path, \"w\") as target:\n                    target.write(source.encode(\"utf-8\"))\n\n        update_query = installation.utils.UpdateModifiedFile(\n            arguments,\n            message=\"\"\"\\\nA configuration file is about to be updated.  Please check that no\nlocal modifications are being overwritten.\n\nCurrent version: %(current)s\nUpdated version: %(updated)s\n\nPlease note that if any configuration options were added in the\nupdated version, the system will most likely break if you do not\neither install the updated version or manually transfer the new\nconfiguration options to the existing version.\n\"\"\",\n            versions={ \"current\": target_path,\n                       \"updated\": target_path + \".new\" },\n            options=[ (\"i\", \"install the updated version\"),\n                      (\"k\", \"keep the current version\"),\n                      (\"d\", (\"current\", \"updated\")) ],\n            generateVersion=generateVersion)\n\n        write_target = update_query.prompt() == \"i\"\n\n    if write_target:\n        print \"Updated file: %s\" % target_path\n\n        if not arguments.dry_run:\n            if os.path.isfile(target_path):\n                os.rename(target_path, backup_path)\n                renamed.append((target_path, backup_path))\n\n            with open(target_path, \"w\") as target:\n                created_file.append(target_path)\n                target.write(source.encode(\"utf-8\"))\n\n            set_file_mode_and_owner(target_path)\n\n            if target_path.endswith(\".py\"):\n                path = os.path.join(\"configuration\", entry)\n                if not compile_file(path):\n                    
compilation_failed.append(path)\n                else:\n                    copy_file_mode_and_owner(target_path, target_path + \"c\")\n\n                    # The module's name (relative the 'configuration' package)\n                    # is the base name minus the trailing \".py\".\n                    module_name = os.path.basename(target_path)[:-3]\n\n                    if module_name != \"__init__\" \\\n                            and hasattr(configuration, module_name):\n                        # Reload the updated module so that code executing later\n                        # sees added configuration options.  (It will also see\n                        # removed configuration options, but that is unlikely to\n                        # be a problem.)\n                        reload(getattr(configuration, module_name))\n\n        modified_files += 1\n\n    return True\n\ndef upgrade(arguments, data):\n    global modified_files\n\n    import configuration\n\n    if auth_mode == \"critic\":\n        calibrate_minimum_rounds()\n    data[\"installation.config.minimum_rounds\"] = minimum_rounds\n\n    source_dir = os.path.join(installation.root_dir, \"installation\", \"templates\", \"configuration\")\n    target_dir = os.path.join(data[\"installation.paths.etc_dir\"], arguments.identity, \"configuration\")\n    compilation_failed = []\n\n    no_changes = True\n\n    for entry in os.listdir(source_dir):\n        if update_file(target_dir, entry, data, arguments, compilation_failed):\n            no_changes = False\n\n    if compilation_failed:\n        return False\n\n    if no_changes:\n        print \"No changed configuration files.\"\n\n    if modified_files:\n        reload(configuration)\n\n    return True\n\ndef undo():\n    map(os.unlink, reversed(created_file))\n    map(os.rmdir, reversed(created_dir))\n\n    for target, backup in renamed: os.rename(backup, target)\n\ndef finish(mode, arguments, data):\n    for target, backup in renamed: 
os.unlink(backup)\n\n    for provider in providers:\n        provider.scrub(data)\n"
  },
  {
    "path": "installation/criticctl.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport installation\nimport os\nimport os.path\n\ncriticctl_path = None\ncreated_file = []\nrenamed = []\n\ndef install(data):\n    global criticctl_path\n\n    source_path = os.path.join(installation.root_dir, \"installation\", \"templates\", \"criticctl\")\n    target_path = criticctl_path = os.path.join(installation.paths.bin_dir, \"criticctl\")\n\n    with open(target_path, \"w\") as target:\n        created_file.append(target_path)\n\n        os.chmod(target_path, 0755)\n\n        with open(source_path, \"r\") as source:\n            target.write((source.read().decode(\"utf-8\") % data).encode(\"utf-8\"))\n\n    return True\n\ndef upgrade(arguments, data):\n    target_path = os.path.join(installation.paths.bin_dir, \"criticctl\")\n    backup_path = installation.utils.update_from_template(\n        arguments, data,\n        template_path=\"installation/templates/criticctl\",\n        target_path=target_path,\n        message=\"\"\"\\\nThe criticctl utility is about to be updated.  
Please check that no local\nmodifications are being overwritten.\n\n%(versions)s\n\nPlease note that if the modifications are not installed, the criticctl utility\nis likely to stop working.\n\"\"\")\n\n    if backup_path:\n        created_file.append(target_path)\n        renamed.append((target_path, backup_path))\n\n    return True\n\ndef undo():\n    map(os.unlink, created_file)\n\n    for target, backup in renamed:\n        os.rename(backup, target)\n\ndef finish(mode, arguments, data):\n    for target, backup in renamed:\n        os.unlink(backup)\n"
  },
  {
    "path": "installation/data/comments.pgsql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2012 Jens Lindström, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\nCREATE OR REPLACE FUNCTION chaincomments(chain_id INTEGER) RETURNS INTEGER AS\n$$\nDECLARE\n  result INTEGER;\nBEGIN\n  SELECT COUNT(*) INTO STRICT result FROM comments WHERE chain=chain_id AND state='current';\n  RETURN result;\nEND;\n$$\nLANGUAGE 'plpgsql';\n\nCREATE OR REPLACE FUNCTION chainunread(chain_id INTEGER, user_id INTEGER) RETURNS INTEGER AS\n$$\nDECLARE\n  result INTEGER;\nBEGIN\n  SELECT COUNT(*) INTO STRICT result FROM commentstoread JOIN comments ON (comments.id=commentstoread.comment) WHERE comments.chain=chain_id AND comments.state='current' AND commentstoread.uid=user_id;\n  RETURN result;\nEND;\n$$\nLANGUAGE 'plpgsql';"
  },
  {
    "path": "installation/data/dbschema.base.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE systemidentities\n  ( key VARCHAR(32) PRIMARY KEY,\n    name VARCHAR(64) UNIQUE,\n    anonymous_scheme VARCHAR(5) NOT NULL,\n    authenticated_scheme VARCHAR(5) NOT NULL,\n    hostname VARCHAR(256) NOT NULL,\n    description VARCHAR(256) NOT NULL,\n    installed_sha1 CHAR(40) NOT NULL,\n    installed_at TIMESTAMP DEFAULT NOW() NOT NULL );\n\nCREATE TABLE files\n  ( id SERIAL PRIMARY KEY,\n    path TEXT NOT NULL );\n\n-- Index used to enforce uniqueness, and for quick lookup of single\n-- paths (using \"SELECT id FROM files WHERE MD5(path)=MD5(...)\").\nCREATE UNIQUE INDEX files_path_md5 ON files (MD5(path));\n\n-- Index used for path searches, for instance when searching for\n-- reviews that touch files in a certain directory.\nCREATE INDEX files_path_gin ON files USING gin (STRING_TO_ARRAY(path, '/'));\n\nCREATE TABLE knownremotes\n  ( url VARCHAR(256) PRIMARY KEY,\n    -- True if this remote has a post-update hook (or similar) that contacts the\n    -- branchtrackerhook service and triggers immediate updates of tracked\n    -- branches.\n    pushing BOOLEAN NOT NULL\n  );\n\nCREATE TABLE timezones\n  ( name VARCHAR(256) PRIMARY KEY,\n    abbrev VARCHAR(16) NOT NULL,\n    
utc_offset INTERVAL NOT NULL );\n\nINSERT INTO timezones (name, abbrev, utc_offset)\n     VALUES ('Universal/UTC', 'UTC', INTERVAL '0');\n"
  },
  {
    "path": "installation/data/dbschema.changesets.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TYPE changesettype AS ENUM\n  ( 'direct',     -- Plain diff between immediate parent and child (including\n                  -- cases where the child commit has other parents.)\n    'custom',     -- Plain diff between any other two commits.\n    'merge',      -- Relevance filtered merge diff between immediate parent and\n                  -- child where child has other parents.\n    'conflicts'); -- Diff between two merge commits, one automatically generated\n                  -- and one \"real.\"  The automatically generated merge commit\n                  -- is created without resolving any conflicts (the conflict\n                  -- markers inserted by \"git merge\" are committed as-is.)\nCREATE TABLE changesets\n  ( id SERIAL PRIMARY KEY,\n    parent INTEGER REFERENCES commits ON DELETE CASCADE,\n    child INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE,\n    type changesettype NOT NULL,\n\n    UNIQUE (parent, child, type) );\nCREATE INDEX changesets_child ON changesets (child);\n\nCREATE TABLE customchangesets\n  ( changeset INTEGER PRIMARY KEY REFERENCES changesets ON DELETE CASCADE,\n    time TIMESTAMP );\n\nCREATE TABLE mergereplays\n  ( original INTEGER PRIMARY KEY 
REFERENCES commits ON DELETE CASCADE,\n    replay INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE );\n\nCREATE TABLE fileversions\n  ( changeset INTEGER NOT NULL REFERENCES changesets ON DELETE CASCADE,\n    file INTEGER NOT NULL REFERENCES files,\n    old_sha1 CHAR(40),\n    new_sha1 CHAR(40),\n    old_mode CHAR(6),\n    new_mode CHAR(6),\n\n    PRIMARY KEY (changeset, file) );\nCREATE INDEX fileversions_old_sha1 ON fileversions (file, old_sha1);\nCREATE INDEX fileversions_new_sha1 ON fileversions (file, new_sha1);\n\nCREATE TABLE chunks\n  ( id SERIAL PRIMARY KEY,\n    changeset INTEGER NOT NULL REFERENCES changesets ON DELETE CASCADE,\n    file INTEGER NOT NULL REFERENCES files,\n\n    deleteOffset INTEGER NOT NULL,\n    deleteCount INTEGER NOT NULL,\n    insertOffset INTEGER NOT NULL,\n    insertCount INTEGER NOT NULL,\n    analysis TEXT,\n    whitespace INTEGER NOT NULL );\nCREATE INDEX chunks_changeset_file ON chunks (changeset, file);\n\nCREATE TABLE codecontexts\n  ( sha1 CHAR(40),\n    context VARCHAR(256) NOT NULL,\n    first_line INTEGER NOT NULL,\n    last_line INTEGER NOT NULL );\nCREATE INDEX codecontexts_sha1_first_last ON codecontexts (sha1, first_line, last_line);\n"
  },
  {
    "path": "installation/data/dbschema.comments.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2012 Jens Lindström, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TYPE commentchaintype AS ENUM\n  ( 'issue',  -- The comment chain, while open, blocks the review.\n    'note'    -- The comment chain doesn't block the review.\n  );\nCREATE TYPE commentchainstate AS ENUM\n  ( 'draft',    -- The comment chain (and all its comments) are drafts.\n    'open',     -- The comment chain is open.\n    'addressed',-- The commented code was changed by a later commit.\n    'closed',   -- The comment chain is closed.\n    'empty'     -- The comment chain has no comments.\n  );\nCREATE TYPE commentchainorigin AS ENUM\n  ( 'old',  -- The user commented the old/left-hand side in a diff.\n    'new'   -- The user commented the new/right-hand side in a diff.\n  );\n\nCREATE TABLE commentchains\n  ( id SERIAL PRIMARY KEY,\n    review INTEGER NOT NULL REFERENCES reviews,\n    batch INTEGER REFERENCES batches ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    type commentchaintype NOT NULL DEFAULT 'issue',\n    state commentchainstate NOT NULL DEFAULT 'draft',\n    origin commentchainorigin,\n    file INTEGER REFERENCES files,\n    first_commit INTEGER REFERENCES commits,\n    last_commit INTEGER REFERENCES commits,\n   
 closed_by INTEGER REFERENCES users,\n    addressed_by INTEGER REFERENCES commits,\n    first_comment INTEGER ); -- Foreign key constraint \"REFERENCES comments\" set up later.\nCREATE INDEX commentchains_review_file ON commentchains(review, file);\nCREATE INDEX commentchains_review_type_state ON commentchains(review, type, state);\nCREATE INDEX commentchains_batch ON commentchains(batch);\n\n-- FIXME: This circular relation is unnecessary.  Should have a separate table\n-- for mapping batches to comments instead.\nALTER TABLE batches ADD CONSTRAINT batches_comment_fkey FOREIGN KEY (comment) REFERENCES commentchains ON DELETE CASCADE;\n\nCREATE TYPE commentchainchangestate AS ENUM\n  ( 'draft',     -- This change hasn't been performed yet.\n    'performed', -- The change has been performed.\n    'rejected'   -- The change was rejected; affected comment chain wasn't in\n                 -- expected state.\n  );\nCREATE TABLE commentchainchanges\n  ( batch INTEGER REFERENCES batches ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    chain INTEGER NOT NULL REFERENCES commentchains ON DELETE CASCADE,\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    state commentchainchangestate NOT NULL DEFAULT 'draft',\n    from_type commentchaintype,\n    to_type commentchaintype,\n    from_state commentchainstate,\n    to_state commentchainstate,\n    from_last_commit INTEGER REFERENCES commits,\n    to_last_commit INTEGER REFERENCES commits,\n    from_addressed_by INTEGER REFERENCES commits,\n    to_addressed_by INTEGER REFERENCES commits );\nCREATE INDEX commentchainchanges_batch ON commentchainchanges(batch);\nCREATE INDEX commentchainchanges_chain ON commentchainchanges(chain);\n\nCREATE TYPE commentchainlinesstate AS ENUM\n  ( 'draft',\n    'current'\n  );\n\nCREATE TABLE commentchainlines\n  ( chain INTEGER NOT NULL REFERENCES commentchains ON DELETE CASCADE,\n    uid INTEGER REFERENCES users,\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    
state commentchainlinesstate NOT NULL DEFAULT 'draft',\n    sha1 CHAR(40) NOT NULL,\n    first_line INTEGER NOT NULL,\n    last_line INTEGER NOT NULL,\n\n    -- This UNIQUE constraint is a bit fishy; it means two different users\n    -- can't have a draft \"reopening\" of the commentchain at the same time,\n    -- which strictly speaking wouldn't necessarily be a problem.\n    UNIQUE (chain, sha1) );\nCREATE INDEX commentchainlines_chain_sha1 ON commentchainlines(chain, sha1);\n\nCREATE TABLE commentchainusers\n  ( chain INTEGER NOT NULL REFERENCES commentchains ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n\n    PRIMARY KEY (chain, uid) );\n\nCREATE TYPE commentstate AS ENUM\n  ( 'draft',   -- The comment is a draft.\n    'current', -- The comment is currently displayed.\n    'edited',  -- The comment was edited (that is, replaced by another\n               -- comment whose 'edit_of' field references this.)\n    'deleted'  -- The comment was deleted and is not displayed.\n  );\nCREATE TABLE comments\n  ( id SERIAL PRIMARY KEY,\n    chain INTEGER NOT NULL REFERENCES commentchains ON DELETE CASCADE,\n    batch INTEGER REFERENCES batches ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    state commentstate NOT NULL,\n\n    comment TEXT,\n    code TEXT );\nCREATE INDEX comments_chain_uid_state ON comments (chain, uid, state);\nCREATE INDEX comments_batch ON comments(batch);\nCREATE INDEX comments_id_chain ON comments(id, chain);\n\n-- FIXME: This is an unfortunate circular relation.  It's here to optimize\n-- accessing a group of comment chains and their first comment (i.e. accessing\n-- comments but not their replies.)  
This matters (supposedly) when loading\n-- review front-pages, but it's questionable whether this is really necessary.\nALTER TABLE commentchains ADD CONSTRAINT commentchains_first_comment_fkey FOREIGN KEY (first_comment) REFERENCES comments;\n\nCREATE TABLE commentstoread\n  ( uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    comment INTEGER NOT NULL REFERENCES comments ON DELETE CASCADE,\n\n    PRIMARY KEY (uid, comment) );\nCREATE INDEX commentstoread_comment ON commentstoread(comment);\n\nCREATE TABLE commentmessageids\n  ( uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    comment INTEGER NOT NULL REFERENCES comments ON DELETE CASCADE,\n    messageid CHAR(24) NOT NULL,\n\n    PRIMARY KEY (uid, comment) );\nCREATE INDEX commentmessageids_comment ON commentmessageids(comment);\n"
  },
  {
    "path": "installation/data/dbschema.extensions.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2012 Jens Lindström, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE extensions\n  ( id SERIAL PRIMARY KEY,\n    author INTEGER REFERENCES users, -- NULL means system extension\n    name VARCHAR(64) NOT NULL,\n\n    UNIQUE (author, name) );\n\nCREATE TABLE extensionversions\n  ( id SERIAL PRIMARY KEY,\n    extension INTEGER NOT NULL REFERENCES extensions,\n    name VARCHAR(256) NOT NULL,\n    sha1 CHAR(40) NOT NULL,\n\n    UNIQUE (sha1) );\n\n-- Installed extensions.\n-- If uid=NULL, it is a \"universal install\" (affecting all users.)\n-- If version=NULL, the \"LIVE\" version is installed.\nCREATE TABLE extensioninstalls\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER REFERENCES users,\n    extension INTEGER NOT NULL REFERENCES extensions,\n    version INTEGER REFERENCES extensionversions,\n\n    UNIQUE (uid, extension) );\n\nCREATE TABLE extensionroles\n  ( id SERIAL PRIMARY KEY,\n    version INTEGER NOT NULL REFERENCES extensionversions,\n    script VARCHAR(64) NOT NULL,\n    function VARCHAR(64) NOT NULL );\n\nCREATE TABLE extensionpageroles\n  ( role INTEGER NOT NULL REFERENCES extensionroles ON DELETE CASCADE,\n    path VARCHAR(64) NOT NULL );\n\nCREATE VIEW extensionroles_page AS\n  SELECT version, path, script, function\n    FROM extensionroles\n    
JOIN extensionpageroles ON (role=id);\n\nCREATE TABLE extensioninjectroles\n  ( role INTEGER NOT NULL REFERENCES extensionroles ON DELETE CASCADE,\n    path VARCHAR(64) NOT NULL );\n\nCREATE VIEW extensionroles_inject AS\n  SELECT version, path, script, function\n    FROM extensionroles\n    JOIN extensioninjectroles ON (role=id);\n\nCREATE TABLE extensionprocesscommitsroles\n  ( role INTEGER NOT NULL REFERENCES extensionroles ON DELETE CASCADE );\n\nCREATE TABLE extensionfilterhookroles\n  ( role INTEGER NOT NULL REFERENCES extensionroles ON DELETE CASCADE,\n    name VARCHAR(64) NOT NULL,\n    title VARCHAR(64) NOT NULL,\n    role_description TEXT,\n    data_description TEXT );\n\nCREATE TABLE extensionhookfilters\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    extension INTEGER NOT NULL REFERENCES extensions ON DELETE CASCADE,\n    repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    name VARCHAR(64) NOT NULL,\n    path TEXT NOT NULL,\n    data TEXT );\nCREATE INDEX extensionhookfilters_uid_extension\n          ON extensionhookfilters (uid, extension);\nCREATE INDEX extensionhookfilters_repository\n          ON extensionhookfilters (repository);\n\nCREATE TABLE extensionfilterhookevents\n  ( id SERIAL PRIMARY KEY,\n    filter INTEGER NOT NULL REFERENCES extensionhookfilters ON DELETE CASCADE,\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    data TEXT );\nCREATE TABLE extensionfilterhookcommits\n  ( event INTEGER NOT NULL REFERENCES extensionfilterhookevents ON DELETE CASCADE,\n    commit INTEGER NOT NULL REFERENCES commits );\nCREATE INDEX extensionfilterhookcommits_event\n          ON extensionfilterhookcommits (event);\nCREATE TABLE extensionfilterhookfiles\n  ( event INTEGER NOT NULL REFERENCES extensionfilterhookevents ON DELETE CASCADE,\n    file INTEGER NOT NULL REFERENCES files );\nCREATE INDEX 
extensionfilterhookfiles_event\n          ON extensionfilterhookfiles (event);\n\nCREATE TABLE extensionstorage\n  ( extension INTEGER NOT NULL REFERENCES extensions,\n    uid INTEGER NOT NULL REFERENCES users,\n    key VARCHAR(64) NOT NULL,\n    text TEXT NOT NULL,\n\n    PRIMARY KEY (extension, uid, key) );\n\nCREATE TABLE extensionlog\n  ( extension INTEGER NOT NULL REFERENCES extensions,\n    uid INTEGER NOT NULL REFERENCES users,\n    category VARCHAR(64) NOT NULL DEFAULT 'default',\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    text TEXT NOT NULL );\nCREATE INDEX extensionlog_extension_uid_category ON extensionlog(extension, uid, category);\n\nCREATE TYPE extensionaccesstype AS ENUM\n  ( 'install',\n    'execute' );\n\nCREATE TABLE accesscontrol_extensions\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- Type of extension access.  NULL means \"any type\".\n    access_type extensionaccesstype,\n    -- Extension key: <author username>/<extension name> for user extensions and\n    -- <extension name> for system extensions.  NULL means \"any extension\".\n    extension_key TEXT );\nCREATE INDEX accesscontrol_extensions_profile\n          ON accesscontrol_extensions (profile);\n"
  },
  {
    "path": "installation/data/dbschema.filters.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TYPE filtertype AS ENUM\n  ( 'reviewer',\n    'watcher',\n    'ignored' );\nCREATE TABLE filters\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER NOT NULL REFERENCES users,\n    repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    path TEXT NOT NULL,\n    type filtertype NOT NULL,\n    delegate TEXT );\n\n-- Index used to enforce uniqueness.\nCREATE UNIQUE INDEX filters_uid_repository_path_md5 ON filters (uid, repository, MD5(path));\n"
  },
  {
    "path": "installation/data/dbschema.git.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE repositories\n  ( id SERIAL PRIMARY KEY,\n    parent INTEGER REFERENCES repositories,\n    name VARCHAR(64) NOT NULL UNIQUE,\n    path VARCHAR(256) NOT NULL UNIQUE );\n\nCREATE TABLE gitusers\n  ( id SERIAL PRIMARY KEY,\n    email VARCHAR(256) NOT NULL,\n    fullname VARCHAR(256) NOT NULL,\n\n    UNIQUE (email, fullname) );\n\nCREATE TABLE commits\n  ( id SERIAL PRIMARY KEY,\n    sha1 CHAR(40) NOT NULL UNIQUE,\n    author_gituser INTEGER NOT NULL REFERENCES gitusers,\n    commit_gituser INTEGER NOT NULL REFERENCES gitusers,\n    author_time TIMESTAMP NOT NULL,\n    commit_time TIMESTAMP NOT NULL );\n\nCREATE TABLE edges\n  ( parent INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE,\n    child INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE );\nCREATE INDEX edges_parent ON edges (parent);\nCREATE INDEX edges_child ON edges (child);\n\nCREATE TYPE branchtype AS ENUM\n  ( 'normal',\n    'review' );\nCREATE TABLE branches\n  ( id SERIAL PRIMARY KEY,\n    name VARCHAR(256) NOT NULL,\n    repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    head INTEGER NOT NULL REFERENCES commits,\n    base INTEGER REFERENCES branches,\n    tail INTEGER REFERENCES 
commits,\n    type branchtype NOT NULL DEFAULT 'normal',\n    archived BOOLEAN NOT NULL DEFAULT FALSE,\n\n    UNIQUE (repository, name) );\n\nCREATE TABLE reachable\n  ( branch INTEGER NOT NULL REFERENCES branches ON DELETE CASCADE,\n    commit INTEGER NOT NULL REFERENCES commits,\n\n    PRIMARY KEY (branch, commit) );\nCREATE INDEX reachable_branch ON reachable (branch);\nCREATE INDEX reachable_commit ON reachable (commit);\n\nCREATE TABLE tags\n  ( id SERIAL PRIMARY KEY,\n    name VARCHAR(256) NOT NULL,\n    repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    sha1 CHAR(40) NOT NULL,\n\n    UNIQUE (repository, name) );\nCREATE INDEX tags_repository_sha1 ON tags (repository, sha1);\n\n-- Cached result of 'git merge-base <all parents>' for commits.\nCREATE TABLE mergebases\n  ( commit INTEGER PRIMARY KEY REFERENCES commits ON DELETE CASCADE,\n    mergebase CHAR(40) );\n\n-- Cached per-file-and-parent \"relevant\" commits, for a merge commit.\n--\n-- Each row says that for the merge commit |commit|'s |parent|th parent and the\n-- file |file|, |relevant| is a commit between the merge-base and the merge that\n-- also modifies the file, and that isn't an ancestor of that parent.\nCREATE TABLE relevantcommits\n  ( commit INTEGER REFERENCES commits ON DELETE CASCADE,\n    parent SMALLINT NOT NULL,\n    file INTEGER REFERENCES files,\n    relevant INTEGER REFERENCES commits ON DELETE CASCADE,\n\n    PRIMARY KEY (commit, parent, file, relevant) );\n\nCREATE TYPE repositoryaccesstype AS ENUM\n  ( 'read',\n    'modify' );\n\nCREATE TABLE accesscontrol_repositories\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- Type of access.  NULL means \"any type\".\n    access_type repositoryaccesstype,\n    -- Repository to access.  
NULL means \"any repository\".\n    repository INTEGER REFERENCES repositories ON DELETE CASCADE );\nCREATE INDEX accesscontrol_repositories_profile\n          ON accesscontrol_repositories (profile);\n"
  },
  {
    "path": "installation/data/dbschema.news.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE newsitems\n  ( id SERIAL PRIMARY KEY,\n    date DATE DEFAULT NOW(),\n    text TEXT NOT NULL );\n\nCREATE TABLE newsread\n  ( item INTEGER NOT NULL REFERENCES newsitems ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE );\n"
  },
  {
    "path": "installation/data/dbschema.preferences.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TYPE preferencetype AS ENUM\n  ( 'boolean',\n    'integer',\n    'string' );\nCREATE TABLE preferences\n  ( item VARCHAR(64) PRIMARY KEY,\n    type preferencetype NOT NULL,\n    description TEXT NOT NULL,\n\n    -- If TRUE, this preference is relevant to configure per system (IOW\n    -- globally), per user, per repository and/or per filter.  
This controls\n    -- whether the preference is displayed on the corresponding /config page\n    -- variant.\n    per_system BOOLEAN NOT NULL DEFAULT TRUE,\n    per_user BOOLEAN NOT NULL DEFAULT TRUE,\n    per_repository BOOLEAN NOT NULL DEFAULT FALSE,\n    per_filter BOOLEAN NOT NULL DEFAULT FALSE );\n\nCREATE TABLE userpreferences\n  ( item VARCHAR(64) NOT NULL REFERENCES preferences,\n    uid INTEGER REFERENCES users ON DELETE CASCADE,\n    repository INTEGER REFERENCES repositories ON DELETE CASCADE,\n    filter INTEGER REFERENCES filters ON DELETE CASCADE,\n\n    integer INTEGER,\n    string TEXT,\n\n    -- Invariant: If 'filter' is not NULL, then 'uid' must not be NULL.\n    CONSTRAINT check_uid_filter CHECK (filter IS NULL OR uid IS NOT NULL),\n\n    -- Invariant: At least one of 'repository' and 'filter' must be NULL.\n    CONSTRAINT check_repository_filter CHECK (repository IS NULL OR filter IS NULL) );\n\n-- These indexes are primarily used to enforce uniqueness.  The three columns\n-- 'uid', 'repository' and 'filter' can all be NULL (in various configurations)\n-- and from a uniqueness point of view, we want those NULLs to behave as if they\n-- compared equal.\nCREATE UNIQUE INDEX userpreferences_item\n                 ON userpreferences (item)\n              WHERE uid IS NULL\n                AND repository IS NULL\n                AND filter IS NULL;\nCREATE UNIQUE INDEX userpreferences_item_uid\n                 ON userpreferences (item, uid)\n              WHERE uid IS NOT NULL\n                AND repository IS NULL\n                AND filter IS NULL;\nCREATE UNIQUE INDEX userpreferences_item_repository\n                 ON userpreferences (item, repository)\n              WHERE uid IS NULL\n                AND repository IS NOT NULL\n                AND filter IS NULL;\nCREATE UNIQUE INDEX userpreferences_item_uid_repository\n                 ON userpreferences (item, uid, repository)\n              WHERE uid IS NOT NULL\n                AND 
repository IS NOT NULL\n                AND filter IS NULL;\nCREATE UNIQUE INDEX userpreferences_item_uid_filter\n                 ON userpreferences (item, uid, filter)\n              WHERE uid IS NOT NULL\n                AND repository IS NULL\n                AND filter IS NOT NULL;\n"
  },
  {
    "path": "installation/data/dbschema.reviews.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TYPE reviewtype AS ENUM\n  ( 'official',\n    'rfc',\n    'ad-hoc' );\nCREATE TYPE reviewstate AS ENUM\n  ( 'draft',\n    'open',\n    'closed',\n    'dropped' );\nCREATE TABLE reviews\n  ( id SERIAL PRIMARY KEY,\n    type reviewtype NOT NULL,\n    -- The review branch.\n    branch INTEGER NOT NULL REFERENCES branches,\n    -- The (non-review) branch from which this review was created, if any.\n    origin INTEGER REFERENCES branches ON DELETE SET NULL,\n    state reviewstate NOT NULL,\n    serial INTEGER NOT NULL DEFAULT 0,\n    closed_by INTEGER REFERENCES users,\n    dropped_by INTEGER REFERENCES users,\n    applyfilters BOOLEAN NOT NULL,\n    applyparentfilters BOOLEAN NOT NULL,\n\n    summary TEXT,\n    description TEXT );\nCREATE INDEX reviews_branch ON reviews (branch);\n\nCREATE TABLE scheduledreviewbrancharchivals\n  ( review INTEGER PRIMARY KEY REFERENCES reviews (id),\n    deadline TIMESTAMP NOT NULL );\n\nCREATE TABLE reviewfilters\n  ( id SERIAL PRIMARY KEY,\n\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    path TEXT NOT NULL,\n    type filtertype NOT NULL,\n    creator INTEGER NOT NULL 
REFERENCES users ON DELETE CASCADE );\n\n-- Index used to enforce uniqueness.\nCREATE UNIQUE INDEX reviewfilters_review_uid_path_md5 ON reviewfilters (review, uid, MD5(path));\n\nCREATE TABLE batches\n  ( id SERIAL PRIMARY KEY,\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    comment INTEGER, -- REFERENCES commentchains,\n    time TIMESTAMP NOT NULL DEFAULT NOW() );\nCREATE INDEX batches_review_uid ON batches (review, uid);\n\nCREATE TYPE reviewusertype AS ENUM\n  ( 'automatic',\n    'manual' );\nCREATE TABLE reviewusers\n  ( review INTEGER NOT NULL,\n    uid INTEGER NOT NULL,\n    owner BOOLEAN NOT NULL DEFAULT FALSE,\n    type reviewusertype NOT NULL DEFAULT 'automatic',\n\n    PRIMARY KEY (review, uid),\n    FOREIGN KEY (review) REFERENCES reviews(id) ON DELETE CASCADE,\n    FOREIGN KEY (uid) REFERENCES users(id) );\nCREATE INDEX reviewusers_uid ON reviewusers (uid);\n\nCREATE TABLE reviewchangesets\n  ( review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    changeset INTEGER NOT NULL REFERENCES changesets,\n\n    PRIMARY KEY (review, changeset) );\n\nCREATE TABLE reviewrebases\n  ( id SERIAL PRIMARY KEY,\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    old_head INTEGER NOT NULL REFERENCES commits,\n    new_head INTEGER REFERENCES commits,\n    old_upstream INTEGER REFERENCES commits,\n    new_upstream INTEGER REFERENCES commits,\n    equivalent_merge INTEGER REFERENCES commits,\n    replayed_rebase INTEGER REFERENCES commits,\n    uid INTEGER NOT NULL REFERENCES users,\n    branch VARCHAR(256),\n\n    UNIQUE (review, old_head) );\n\nCREATE TABLE previousreachable\n  ( rebase INTEGER NOT NULL REFERENCES reviewrebases ON DELETE CASCADE,\n    commit INTEGER NOT NULL REFERENCES commits );\nCREATE INDEX previousreachable_rebase ON previousreachable (rebase);\n\nCREATE TYPE reviewfilestate AS ENUM\n  ( 'pending',    -- No one has said anything.\n    
'reviewed'    -- The file has been reviewed.\n  );\nCREATE TABLE reviewfiles\n  ( id SERIAL PRIMARY KEY,\n\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    changeset INTEGER NOT NULL REFERENCES changesets ON DELETE CASCADE,\n    file INTEGER NOT NULL REFERENCES files ON DELETE CASCADE,\n\n    deleted INTEGER NOT NULL,\n    inserted INTEGER NOT NULL,\n\n    state reviewfilestate NOT NULL DEFAULT 'pending',\n    reviewer INTEGER REFERENCES users ON DELETE SET NULL,\n    time TIMESTAMP,\n\n    FOREIGN KEY (review, changeset) REFERENCES reviewchangesets ON DELETE CASCADE,\n    FOREIGN KEY (changeset, file) REFERENCES fileversions ON DELETE CASCADE );\n\nCREATE INDEX reviewfiles_review_changeset ON reviewfiles (review, changeset);\nCREATE INDEX reviewfiles_review_state ON reviewfiles (review, state);\n\nCREATE TABLE reviewassignmentstransactions\n  ( id SERIAL PRIMARY KEY,\n\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    assigner INTEGER NOT NULL REFERENCES users,\n    note TEXT,\n\n    time TIMESTAMP DEFAULT NOW() );\n\nCREATE TABLE reviewassignmentchanges\n  ( transaction INTEGER NOT NULL REFERENCES reviewassignmentstransactions,\n\n    file INTEGER NOT NULL REFERENCES reviewfiles ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n    assigned BOOLEAN NOT NULL,\n\n    PRIMARY KEY (transaction, file, uid) );\n\nCREATE TABLE reviewfilterchanges\n  ( transaction INTEGER NOT NULL REFERENCES reviewassignmentstransactions ON DELETE CASCADE,\n\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    path TEXT NOT NULL,\n    type filtertype NOT NULL,\n    created BOOLEAN NOT NULL );\n\nCREATE TABLE reviewuserfiles\n  ( file INTEGER NOT NULL REFERENCES reviewfiles ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n\n    time TIMESTAMP DEFAULT NOW(),\n\n    PRIMARY KEY (file, uid) );\n\nCREATE INDEX reviewuserfiles_uid ON reviewuserfiles (uid);\n\nCREATE VIEW reviewfilesharing\n  AS SELECT 
reviewfiles.review AS review, reviewfiles.id AS file, COUNT(reviewuserfiles.uid) AS reviewers\n       FROM reviewfiles\n       JOIN reviewuserfiles ON (reviewfiles.id=reviewuserfiles.file)\n       JOIN users ON (users.id=reviewuserfiles.uid)\n      WHERE users.status='current'\n   GROUP BY reviewfiles.review, reviewfiles.id;\n\nCREATE TYPE reviewfilechangestate AS ENUM\n  ( 'draft',     -- This change hasn't been performed yet.\n    'performed', -- The change has been performed.\n    'rejected'   -- The change was rejected; affected file wasn't in expected\n                 -- state (concurrent update.)\n  );\nCREATE TABLE reviewfilechanges\n  ( batch INTEGER REFERENCES batches,\n    file INTEGER NOT NULL REFERENCES reviewfiles,\n    uid INTEGER NOT NULL REFERENCES users,\n\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    state reviewfilechangestate NOT NULL DEFAULT 'draft',\n    from_state reviewfilestate NOT NULL,\n    to_state reviewfilestate NOT NULL,\n\n    FOREIGN KEY (file, uid) REFERENCES reviewuserfiles ON DELETE CASCADE );\n\nCREATE INDEX reviewfilechanges_batch ON reviewfilechanges (batch);\nCREATE INDEX reviewfilechanges_file ON reviewfilechanges (file);\nCREATE INDEX reviewfilechanges_uid_state ON reviewfilechanges (uid, state);\nCREATE INDEX reviewfilechanges_time ON reviewfilechanges (time);\n\nCREATE TABLE lockedreviews\n  ( review INTEGER PRIMARY KEY REFERENCES reviews );\n\nCREATE VIEW fullreviewuserfiles\n  AS SELECT reviewfiles.review as review,\n            reviewfiles.changeset as changeset,\n            reviewfiles.file as file,\n            reviewfiles.deleted as deleted,\n            reviewfiles.inserted as inserted,\n            reviewfiles.state as state,\n            reviewfiles.reviewer as reviewer,\n            reviewuserfiles.uid as assignee\n       FROM reviewfiles\n       JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id);\n\nCREATE TABLE reviewmessageids\n  ( uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    
review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    messageid CHAR(24) NOT NULL,\n\n    PRIMARY KEY (uid, review) );\n\nCREATE INDEX reviewmessageids_review ON reviewmessageids (review);\n\nCREATE TABLE reviewmergeconfirmations\n  ( id SERIAL PRIMARY KEY,\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    merge INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE,\n    tail INTEGER REFERENCES commits ON DELETE CASCADE,\n    confirmed BOOLEAN NOT NULL DEFAULT FALSE,\n\n    UNIQUE (review, uid, merge) );\n\nCREATE TABLE reviewmergecontributions\n  ( id INTEGER NOT NULL REFERENCES reviewmergeconfirmations ON DELETE CASCADE,\n    merged INTEGER NOT NULL REFERENCES commits ON DELETE CASCADE,\n\n    PRIMARY KEY (id, merged) );\n\nCREATE TABLE reviewrecipientfilters\n  ( review INTEGER NOT NULL REFERENCES reviews,\n    uid INTEGER REFERENCES users,\n    include BOOLEAN NOT NULL,\n\n    UNIQUE (review, uid) );\n\nCREATE TABLE checkbranchnotes\n  ( repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    branch VARCHAR(256) NOT NULL,\n    upstream VARCHAR(256) NOT NULL,\n    sha1 CHAR(40) NOT NULL,\n    uid INTEGER NOT NULL REFERENCES users,\n    review INTEGER REFERENCES reviews ON DELETE SET NULL,\n    text TEXT,\n\n    PRIMARY KEY (repository, branch, upstream, sha1) );\n"
  },
  {
    "path": "installation/data/dbschema.trackedbranches.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE trackedbranches\n  ( id SERIAL PRIMARY KEY,\n    repository INTEGER NOT NULL REFERENCES repositories,\n    local_name VARCHAR(256) NOT NULL,\n    remote VARCHAR(256) NOT NULL,\n    remote_name VARCHAR(256) NOT NULL,\n    forced BOOLEAN NOT NULL,\n    disabled BOOLEAN NOT NULL DEFAULT FALSE,\n    updating BOOLEAN NOT NULL DEFAULT FALSE,\n    delay INTERVAL NOT NULL,\n    previous TIMESTAMP,\n    next TIMESTAMP,\n\n    UNIQUE (repository, local_name) );\n\nCREATE TABLE trackedbranchusers\n  ( branch INTEGER NOT NULL REFERENCES trackedbranches ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users,\n\n    PRIMARY KEY (branch, uid) );\n\nCREATE TABLE trackedbranchlog\n  ( branch INTEGER NOT NULL REFERENCES trackedbranches ON DELETE CASCADE,\n    time TIMESTAMP NOT NULL DEFAULT NOW(),\n    from_sha1 CHAR(40),\n    to_sha1 CHAR(40) NOT NULL,\n    hook_output TEXT NOT NULL,\n    successful BOOLEAN NOT NULL );\nCREATE INDEX trackedbranchlog_branch ON trackedbranchlog (branch);\n"
  },
  {
    "path": "installation/data/dbschema.users.sql",
    "content": "-- -*- mode: sql -*-\n--\n-- Copyright 2015 the Critic contributors, Opera Software ASA\n--\n-- Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n-- use this file except in compliance with the License.  You may obtain a copy of\n-- the License at\n--\n--   http://www.apache.org/licenses/LICENSE-2.0\n--\n-- Unless required by applicable law or agreed to in writing, software\n-- distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n-- WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n-- License for the specific language governing permissions and limitations under\n-- the License.\n\n-- Disable notices about implicitly created indexes and sequences.\nSET client_min_messages TO WARNING;\n\nCREATE TABLE roles\n  ( name VARCHAR(64) PRIMARY KEY,\n    description TEXT );\n\nINSERT INTO roles (name, description)\n     VALUES ('administrator', 'Almighty system administrator.'),\n            ('repositories', 'Allowed to add and configure repositories.'),\n            ('developer', 'System developer.'),\n            ('newswriter', 'Allowed to add and edit news items.');\n\nCREATE TYPE userstatus AS ENUM\n  ( 'unknown',\n    'current',\n    'absent',\n    'retired' );\n\nCREATE TABLE users\n  ( id SERIAL PRIMARY KEY,\n    name VARCHAR(64) NOT NULL UNIQUE,\n    fullname VARCHAR(256),\n    password VARCHAR(256),\n    email INTEGER, -- Foreign key constraint \"REFERENCES useremails\" set up later.\n    status userstatus NOT NULL DEFAULT 'unknown' );\n\nCREATE TABLE useremails\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    email VARCHAR(256) NOT NULL,\n    verified BOOLEAN,\n    verification_token VARCHAR(256),\n\n    UNIQUE (uid, email) );\n\n-- FIXME: This circular relation is unnecessary.  
Should have a separate table\n-- for mapping a user's selected email, or just store it as a boolean in the\n-- useremails table instead.\nALTER TABLE users ADD CONSTRAINT users_email_fkey FOREIGN KEY (email) REFERENCES useremails;\n\nCREATE TABLE usersessions\n  ( key CHAR(28) PRIMARY KEY,\n    uid INTEGER NOT NULL REFERENCES users,\n    labels VARCHAR(256),\n    atime TIMESTAMP DEFAULT NOW() );\n\nCREATE TABLE usergitemails\n  ( email VARCHAR(256),\n    uid INTEGER REFERENCES users ON DELETE CASCADE,\n\n    PRIMARY KEY (email, uid) );\nCREATE INDEX usergitemails_uid ON usergitemails (uid);\n\nCREATE TABLE userabsence\n  ( uid INTEGER NOT NULL REFERENCES users,\n    until DATE );\nCREATE INDEX userabsence_uid_until ON userabsence (uid, until);\n\nCREATE TABLE userroles\n  ( uid INTEGER NOT NULL REFERENCES users,\n    role VARCHAR(64) NOT NULL REFERENCES roles );\n\nCREATE TABLE userresources\n  ( uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    name VARCHAR(32) NOT NULL,\n    revision INTEGER NOT NULL DEFAULT 0,\n    source TEXT NOT NULL,\n\n    PRIMARY KEY (uid, name, revision) );\n\nCREATE TABLE externalusers\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER REFERENCES users,\n    provider VARCHAR(16) NOT NULL,\n    account VARCHAR(256) NOT NULL,\n    email VARCHAR(256),\n    token VARCHAR(256),\n\n    UNIQUE (provider, account) );\n\nCREATE TABLE oauthstates\n  ( state VARCHAR(64) PRIMARY KEY,\n    url TEXT,\n    time TIMESTAMP NOT NULL DEFAULT NOW() );\n\nCREATE TYPE systemaccesstype AS ENUM\n  ( -- The system is accessed as a named user.\n    'user',\n    -- The system is accessed by a system service or similar.\n    'system',\n    -- The system is accessed anonymously.\n    'anonymous' );\n\nCREATE TABLE accesstokens\n  ( id SERIAL PRIMARY KEY,\n\n    -- The type of access granted by this access token.\n    access_type systemaccesstype NOT NULL DEFAULT 'user',\n    -- The user (when access_type='user') or NULL.\n    uid INTEGER REFERENCES users ON 
DELETE CASCADE,\n\n    -- First part of access token (\"username\").\n    part1 VARCHAR(32) NOT NULL,\n    -- Second part of access token (\"password\").\n    part2 VARCHAR(32) NOT NULL,\n\n    -- Access token title.\n    title VARCHAR(256),\n\n    UNIQUE (part1, part2),\n\n    CONSTRAINT valid_user CHECK ((access_type='user' AND uid IS NOT NULL) OR\n                                 (access_type!='user' AND uid IS NULL)) );\n\nCREATE TYPE accesscontrolrule AS ENUM\n  ( 'allow',\n    'deny' );\n\nCREATE TABLE accesscontrolprofiles\n  ( id SERIAL PRIMARY KEY,\n    title TEXT,\n\n    -- Access token that this profile belongs to.\n    access_token INTEGER REFERENCES accesstokens ON DELETE CASCADE,\n\n    http accesscontrolrule NOT NULL DEFAULT 'allow',\n    repositories accesscontrolrule NOT NULL DEFAULT 'allow',\n    extensions accesscontrolrule NOT NULL DEFAULT 'allow',\n\n    UNIQUE (access_token) );\n\nCREATE TYPE httprequestmethod AS ENUM\n  ( 'GET',\n    'HEAD',\n    'OPTIONS',\n    'POST',\n    'PUT',\n    'DELETE' );\n\n-- Exceptions for HTTP requests.\nCREATE TABLE accesscontrol_http\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- HTTP request method.  NULL means \"all methods\".\n    request_method httprequestmethod,\n    -- Python regular expression that must match the entire path.  NULL means\n    -- \"all paths\".\n    path_pattern TEXT );\nCREATE INDEX accesscontrol_http_profile\n          ON accesscontrol_http (profile);\n\nCREATE TABLE useraccesscontrolprofiles\n  ( -- The type of access that is controlled.\n    access_type systemaccesstype NOT NULL DEFAULT 'user',\n\n    -- The user (when access_type='user') or NULL.  
If access_type='user' and\n    -- this is NULL, then this is the default profile association.\n    uid INTEGER REFERENCES users,\n\n    -- Access control profile.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    CONSTRAINT valid_user CHECK (access_type='user' OR uid IS NULL) );\nCREATE INDEX useraccesscontrolprofiles_uid\n          ON useraccesscontrolprofiles (uid);\n\nCREATE TABLE labeledaccesscontrolprofiles\n  ( -- Authentication labels from user authentication, typically indicating some\n    -- type of group memberships. Labels should be sorted lexicographically and\n    -- separated by pipe ('|') characters.\n    labels VARCHAR(256) PRIMARY KEY,\n\n    -- Access control profile.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE );\n"
  },
  {
    "path": "installation/database.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport tempfile\nimport shutil\nimport os\nimport time\nimport errno\nimport subprocess\n\nimport installation\n\nuser_created = False\ndatabase_created = False\nlanguage_created = False\n\ndef psql_import(sql_file, as_user=None):\n    if as_user is None:\n        as_user = installation.system.username\n    temp_file = tempfile.mkstemp()[1]\n    shutil.copy(os.path.join(installation.root_dir, sql_file), temp_file)\n    # Make sure file is readable by postgres user\n    os.chmod(temp_file, 0644)\n    subprocess.check_output(\n        [\"su\", \"-s\", \"/bin/sh\", \"-c\", \"psql -v ON_ERROR_STOP=1 -f %s\" % temp_file, as_user])\n    os.unlink(temp_file)\n\ndef add_arguments(mode, parser):\n    if mode == \"upgrade\":\n        parser.add_argument(\"--backup-database\", dest=\"database_backup\", action=\"store_const\", const=True,\n                            help=\"backup database to default location without asking\")\n        parser.add_argument(\"--no-backup-database\", dest=\"database_backup\", action=\"store_const\", const=False,\n                            help=\"do not backup database before upgrading\")\n\ndef prepare(mode, arguments, data):\n    if mode == \"upgrade\":\n        default_path = os.path.join(data[\"installation.paths.data_dir\"],\n                                    \"backups\",\n  
                                  time.strftime(\"%Y%m%d_%H%M.dump\", time.localtime()))\n\n        if arguments.database_backup is False:\n            backup_database = False\n        elif arguments.database_backup is True:\n            backup_database = True\n            backup_path = default_path\n        else:\n            if installation.migrate.will_modify_dbschema(data):\n                print \"\"\"\nThe database schema will be modified by the upgrade.  Creating a\nbackup of the database first is strongly recommended.\n\"\"\"\n                default_backup = True\n            else:\n                default_backup = False\n\n            if installation.input.yes_or_no(\"Do you want to create a backup of the database?\",\n                                            default=default_backup):\n                backup_database = True\n                backup_path = installation.input.string(\"Where should the backup be stored?\",\n                                                        default=default_path)\n            else:\n                backup_database = False\n\n        if backup_database:\n            try: os.makedirs(os.path.dirname(backup_path), 0750)\n            except OSError as error:\n                if error.errno == errno.EEXIST: pass\n                else: raise\n\n            print\n            print \"Dumping database ...\"\n\n            with open(backup_path, \"w\") as output_file:\n                subprocess.check_call(\n                    [\"su\", \"-s\", \"/bin/sh\", \"-c\", \"pg_dump -Fc critic\",\n                     data[\"installation.system.username\"]],\n                    stdout=output_file)\n\n    data[\"installation.database.driver\"] = \"postgresql\"\n    data[\"installation.database.parameters\"] = { \"database\": \"critic\",\n                                                 \"user\": data[\"installation.system.username\"] }\n\n    return True\n\nSCHEMA_FILES = [\n    # No dependencies.\n    
\"installation/data/dbschema.base.sql\",\n    \"installation/data/dbschema.users.sql\",\n\n    # Depends on: base[files].\n    \"installation/data/dbschema.git.sql\",\n\n    # Depends on: users.\n    \"installation/data/dbschema.news.sql\",\n\n    # Depends on: git, users.\n    \"installation/data/dbschema.trackedbranches.sql\",\n\n    # Depends on: base[files], git.\n    \"installation/data/dbschema.changesets.sql\",\n\n    # Depends on: git, users.\n    \"installation/data/dbschema.filters.sql\",\n\n    # Depends on: git, users, filters.\n    \"installation/data/dbschema.preferences.sql\",\n\n    # Depends on: base[files], git, users, changesets.\n    \"installation/data/dbschema.reviews.sql\",\n\n    # Depends on: base[files], git, users, reviews.\n    \"installation/data/dbschema.comments.sql\",\n\n    # Depends on: base[files], git, users, reviews.\n    \"installation/data/dbschema.extensions.sql\",\n]\n\nPGSQL_FILES = [\"installation/data/comments.pgsql\"]\n\ndef install(data):\n    global user_created, database_created, language_created\n\n    postgresql_version_output = subprocess.check_output(\n        [installation.prereqs.psql.path, \"--version\"])\n\n    postgresql_version = postgresql_version_output.splitlines()[0].split()[-1]\n    postgresql_version_components = postgresql_version.split(\".\")\n\n    # Convert the components to integers; comparing the version strings\n    # against the integers below would never be true.\n    postgresql_major = int(postgresql_version_components[0])\n    postgresql_minor = int(postgresql_version_components[1])\n\n    if postgresql_major < 9 or (postgresql_major == 9 and postgresql_minor < 1):\n        print\n        print \"\"\"\\\nUnsupported PostgreSQL version: %s\n\nERROR: Critic requires PostgreSQL 9.1.x or later!\n\"\"\" % postgresql_version\n        return False\n\n    print \"Creating database ...\"\n\n    # Several subsequent commands will run as Critic system user or \"postgres\"\n    # user, and these users typically don't have read access to the installation\n    # 'root_dir', so set cwd to something that Critic system / \"postgres\" users\n   
 # have access to.\n    with installation.utils.temporary_cwd():\n        subprocess.check_output([\"su\", \"-c\", \"psql -v ON_ERROR_STOP=1 -c 'CREATE USER \\\"%s\\\";'\" % installation.system.username, \"postgres\"])\n        user_created = True\n\n        subprocess.check_output([\"su\", \"-c\", \"psql -v ON_ERROR_STOP=1 -c 'CREATE DATABASE \\\"critic\\\";'\", \"postgres\"])\n        database_created = True\n\n        try:\n            subprocess.check_output([\"su\", \"-c\", \"createlang plpgsql critic\", \"postgres\"],\n                                    stderr=subprocess.STDOUT)\n            language_created = True\n        except subprocess.CalledProcessError:\n            # The 'createlang' command fails if the language is already enabled\n            # in the database, and we want to ignore such failures.  It might\n            # also fail for other reasons that we really don't mean to ignore,\n            # but in that case importing the *.pgsql files below would fail,\n            # since they define PL/pgSQL functions.\n            pass\n\n        subprocess.check_output([\"su\", \"-c\", \"psql -v ON_ERROR_STOP=1 -c 'GRANT ALL ON DATABASE \\\"critic\\\" TO \\\"%s\\\";'\" % installation.system.username, \"postgres\"])\n\n        for schema_file in SCHEMA_FILES:\n            psql_import(schema_file)\n        for pgsql_file in PGSQL_FILES:\n            psql_import(pgsql_file)\n\n        import psycopg2\n\n        def adapt(value): return psycopg2.extensions.adapt(value).getquoted()\n\n        if installation.config.access_scheme in (\"http\", \"https\"):\n            anonymous_scheme = authenticated_scheme = installation.config.access_scheme\n        else:\n            anonymous_scheme = \"http\"\n            authenticated_scheme = \"https\"\n\n        add_systemidentity_query = (\n            \"\"\"INSERT INTO systemidentities (key, name, anonymous_scheme,\n                                             authenticated_scheme, hostname,\n                     
                        description, installed_sha1)\n                    VALUES ('main', 'main', %s, %s, %s, 'Main', %s);\"\"\"\n            % (adapt(anonymous_scheme), adapt(authenticated_scheme),\n               adapt(installation.system.hostname), adapt(data[\"sha1\"])))\n\n        installation.process.check_input(\n            [\"su\", \"-s\", \"/bin/sh\", \"-c\", \"psql -q -v ON_ERROR_STOP=1 -f -\", installation.system.username],\n            stdin=add_systemidentity_query)\n\n    return True\n\ndef upgrade(arguments, data):\n    git = data[\"installation.prereqs.git\"]\n\n    old_sha1 = data[\"sha1\"]\n    new_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                          cwd=installation.root_dir).strip()\n\n    for pgsql_file in PGSQL_FILES:\n        old_file_sha1 = installation.utils.get_file_sha1(\n            git, old_sha1, pgsql_file)\n        new_file_sha1 = installation.utils.get_file_sha1(\n            git, new_sha1, pgsql_file)\n\n        if old_file_sha1 == new_file_sha1:\n            continue\n\n        with installation.utils.temporary_cwd():\n            # We assume that these files use CREATE OR REPLACE syntax, so that\n            # we can simply re-import them when they change, and they'll update.\n            # If they need more than that to update (for instance if a function\n            # is removed) we'll need to use a migration script for that.\n            print \"Reloading: %s\" % pgsql_file\n\n            if not arguments.dry_run:\n                psql_import(pgsql_file)\n\n    return True\n\ndef undo():\n    if language_created:\n        subprocess.check_output([\"su\", \"-c\", \"droplang plpgsql critic\", \"postgres\"])\n    if database_created:\n        subprocess.check_output([\"su\", \"-c\", \"psql -v ON_ERROR_STOP=1 -c 'DROP DATABASE \\\"critic\\\";'\", \"postgres\"])\n    if user_created:\n        subprocess.check_output([\"su\", \"-c\", \"psql -v ON_ERROR_STOP=1 -c 'DROP USER 
\\\"%s\\\";'\" % installation.system.username, \"postgres\"])\n"
  },
  {
    "path": "installation/extensions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ndef prepare(mode, arguments, data):\n    data[\"installation.extensions.enabled\"] = False\n    data[\"installation.extensions.critic_v8_jsshell\"] = \"NOT_INSTALLED\"\n    data[\"installation.extensions.default_flavor\"] = \"js/v8\"\n\n    if mode == \"upgrade\":\n        import configuration\n        data[\"installation.extensions.enabled\"] = \\\n            configuration.extensions.ENABLED\n        try:\n            data[\"installation.extensions.critic_v8_jsshell\"] = \\\n                configuration.extensions.FLAVORS[\"js/v8\"][\"executable\"]\n        except (KeyError, AttributeError):\n            pass\n        try:\n            data[\"installation.extensions.default_flavor\"] = \\\n                configuration.extensions.DEFAULT_FLAVOR\n        except AttributeError:\n            pass\n\n    return True\n"
  },
  {
    "path": "installation/externals/.gitignore",
    "content": "depot_tools/\n"
  },
  {
    "path": "installation/externals/MIT-LICENSE.Chosen.md",
    "content": "#### Chosen\n- by Patrick Filler for [Harvest](http://getharvest.com)\n- Copyright (c) 2011-2013 by Harvest\n\nAvailable for use under the [MIT License](http://en.wikipedia.org/wiki/MIT_License)\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n"
  },
  {
    "path": "installation/externals/MIT-LICENSE.jQuery.txt",
    "content": "Copyright 2012 jQuery Foundation and other contributors\nhttp://jquery.com/\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n\"Software\"), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\nNONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE\nLIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\nOF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION\nWITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n"
  },
  {
    "path": "installation/files.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport shutil\nimport errno\nimport py_compile\n\nimport installation\n\ncreated_dir = []\ncreated_file = []\nrenamed = []\ncopied_files = 0\nmodified_files = 0\nsources_modified = False\nresources_modified = False\n\ndef compile_file(filename):\n    global created_file\n    if not filename.endswith(\".py\"):\n        return True\n    try:\n        path = os.path.join(installation.paths.install_dir, filename)\n        with installation.utils.as_critic_system_user():\n            py_compile.compile(path, doraise=True)\n    except py_compile.PyCompileError as error:\n        print \"\"\"\nERROR: Failed to compile %s:\\n%s\n\"\"\" % (filename, error)\n        return False\n    else:\n        created_file.append(path + \"c\")\n        return True\n\ndef copyfile(source, destination):\n    if os.path.islink(source):\n        if os.path.lexists(destination):\n            os.unlink(destination)\n        os.symlink(os.readlink(source), destination)\n    else:\n        shutil.copyfile(source, destination)\n\ndef skip(path):\n    filename = os.path.basename(path)\n    if filename == \"unittest.py\" or filename.endswith(\"_unittest.py\"):\n        return not installation.config.is_testing\n    return False\n\ndef install(data):\n    source_dir = os.path.join(installation.root_dir, 
\"src\")\n    target_dir = installation.paths.install_dir\n\n    # Note: this is an array since it's modified in a nested scope.\n    compilation_failed = []\n\n    def copy(path):\n        global copied_files\n\n        source = os.path.join(source_dir, path)\n        target = os.path.join(target_dir, path)\n\n        if os.path.isdir(source):\n            os.mkdir(target, 0755)\n            os.chown(target, installation.system.uid, installation.system.gid)\n            created_dir.append(target)\n            return True\n        else:\n            copyfile(source, target)\n            created_file.append(target)\n            if not os.path.islink(target):\n                if path.startswith(\"hooks/\"):\n                    mode = 0755\n                else:\n                    mode = 0644\n                os.chmod(target, mode)\n            os.lchown(target, installation.system.uid, installation.system.gid)\n            copied_files += 1\n            if not compile_file(path):\n                compilation_failed.append(path)\n            return False\n\n    def process(path=\"\"):\n        for entry in os.listdir(os.path.join(source_dir, path)):\n            name = os.path.join(path, entry)\n\n            if skip(name):\n                continue\n\n            if copy(name):\n                process(name)\n\n    process()\n    sys.path.insert(0, installation.paths.install_dir)\n\n    if compilation_failed:\n        return False\n\n    print \"Copied %d files into %s ...\" % (copied_files, target_dir)\n\n    return True\n\ndef upgrade(arguments, data):\n    source_dir = installation.root_dir\n    target_dir = data[\"installation.paths.install_dir\"]\n\n    # Note: this is an array since it's modified in a nested scope.\n    compilation_failed = []\n\n    uid = installation.system.uid\n    gid = installation.system.gid\n\n    def chown(directory):\n        os.chown(directory, uid, gid)\n        for name in os.listdir(directory):\n            path = 
os.path.join(directory, name)\n            if os.path.isdir(path):\n                chown(path)\n            elif path.endswith(\".pyc\"):\n                os.chown(path, uid, gid)\n            elif path.endswith(\".pyo\"):\n                os.unlink(path)\n\n    chown(target_dir)\n\n    git = data[\"installation.prereqs.git\"]\n\n    old_sha1 = data[\"sha1\"]\n    new_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                          cwd=installation.root_dir).strip()\n\n    old_has_src = installation.utils.get_tree_sha1(git, old_sha1, \"src\")\n\n    def isResource(path):\n        return path.endswith(\".css\") or path.endswith(\".js\") or path.endswith(\".txt\")\n\n    def remove(old_source_path, target_path):\n        full_target_path = os.path.join(target_dir, target_path)\n        backup_path = os.path.join(os.path.dirname(full_target_path),\n                                   \"_\" + os.path.basename(target_path))\n\n        if not os.path.isfile(full_target_path):\n            return\n\n        old_file_sha1 = installation.utils.get_file_sha1(\n            git, old_sha1, old_source_path)\n        current_file_sha1 = installation.utils.hash_file(\n            git, full_target_path)\n\n        assert old_file_sha1 is not None\n\n        if old_file_sha1 != current_file_sha1:\n            def generateVersion(label, path):\n                if label == \"installed\":\n                    source = installation.utils.run_git(\n                        [git, \"cat-file\", \"blob\", old_file_sha1],\n                        cwd=installation.root_dir)\n                    with open(path, \"w\") as target:\n                        target.write(source)\n\n            update_query = installation.utils.UpdateModifiedFile(\n                arguments,\n                message=\"\"\"\\\nA source file is about to be removed, but the existing source file\nappears to have been edited since it was installed.\n\n  Installed version: 
%(installed)s\n  Current version  : %(current)s\n\nNot removing the file can cause unpredictable results.\n\"\"\",\n                versions={ \"installed\": full_target_path + \".org\",\n                           \"current\": full_target_path },\n                options=[ (\"r\", \"remove the file\"),\n                          (\"k\", \"keep the file\"),\n                          (\"d\", (\"installed\", \"current\")) ],\n                generateVersion=generateVersion)\n\n            if update_query.prompt() == \"r\":\n                remove_file = True\n            else:\n                remove_file = False\n        else:\n            remove_file = True\n\n        if remove_file:\n            print \"Removing file: %s\" % target_path\n            if not arguments.dry_run:\n                os.rename(full_target_path, backup_path)\n                renamed.append((full_target_path, backup_path))\n\n                if target_path.endswith(\".py\"):\n                    if os.path.isfile(full_target_path + \"c\"):\n                        os.unlink(full_target_path + \"c\")\n                    if os.path.isfile(full_target_path + \"o\"):\n                        os.unlink(full_target_path + \"o\")\n\n    def copy(old_source_path, new_source_path, target_path):\n        global copied_files, modified_files\n        global resources_modified, sources_modified\n\n        full_source_path = os.path.join(source_dir, new_source_path)\n        full_target_path = os.path.join(target_dir, target_path)\n        backup_path = os.path.join(os.path.dirname(full_target_path),\n                                   \"_\" + os.path.basename(target_path))\n\n        if skip(new_source_path) or not os.path.isfile(full_source_path):\n            remove(old_source_path, target_path)\n            return\n\n        if os.path.isfile(full_source_path) and os.path.isdir(full_target_path):\n            print \"\"\"\nThe directory\n\n  %s\n\nis about to be deleted because a file is about to be 
installed in its\nplace.  Please make sure it doesn't contain anything that shouldn't be\ndeleted.\n\"\"\" % full_target_path\n\n            if not installation.input.yes_or_no(\"Do you want to delete the directory?\", default=False):\n                return False\n\n            print \"Removing directory: %s\" % target_path\n            if not arguments.dry_run:\n                os.rename(full_target_path, backup_path)\n                renamed.append((full_target_path, backup_path))\n\n        if not os.path.isfile(full_target_path):\n            print \"New file: %s\" % target_path\n            if not arguments.dry_run:\n                try: os.makedirs(os.path.dirname(full_target_path), 0755)\n                except OSError as error:\n                    if error.errno == errno.EEXIST: pass\n                    else: raise\n                copyfile(full_source_path, full_target_path)\n                created_file.append(full_target_path)\n            copied_files += 1\n            if isResource(target_path):\n                resources_modified = True\n            else:\n                sources_modified = True\n        else:\n            old_file_sha1 = installation.utils.get_file_sha1(\n                git, old_sha1, old_source_path)\n            new_file_sha1 = installation.utils.get_file_sha1(\n                git, new_sha1, new_source_path)\n\n            assert old_file_sha1 is not None\n            assert new_file_sha1 is not None\n\n            current_file_sha1 = installation.utils.hash_file(\n                git, full_target_path)\n\n            if current_file_sha1 != new_file_sha1:\n                if current_file_sha1 != old_file_sha1:\n                    def generateVersion(label, path):\n                        if label == \"installed\":\n                            source = installation.utils.run_git(\n                                [git, \"cat-file\", \"blob\", old_file_sha1],\n                                cwd=installation.root_dir)\n         
                   with open(full_target_path + \".org\", \"w\") as target:\n                                target.write(source)\n                        elif label == \"updated\":\n                            copyfile(full_source_path,\n                                     full_target_path + \".new\")\n\n                    update_query = installation.utils.UpdateModifiedFile(\n                        arguments,\n                        message=\"\"\"\\\nA source file is about to be updated, but the existing source file\nappears to have been edited since it was installed.\n\n  Installed version: %(installed)s\n  Current version  : %(current)s\n  Updated version  : %(updated)s\n\nNot installing the updated version can cause unpredictable results.\n\"\"\",\n                        versions={ \"installed\": full_target_path + \".org\",\n                                   \"current\": full_target_path,\n                                   \"updated\": full_target_path + \".new\" },\n                        options=[ (\"i\", \"install the updated version\"),\n                                  (\"k\", \"keep the current version\"),\n                                  (\"do\", (\"installed\", \"current\")),\n                                  (\"dn\", (\"current\", \"updated\")) ],\n                        generateVersion=generateVersion)\n\n                    install_file = update_query.prompt() == \"i\"\n                else:\n                    install_file = True\n\n                if install_file:\n                    print \"Updated file: %s\" % target_path\n                    if not arguments.dry_run:\n                        os.rename(full_target_path, backup_path)\n                        renamed.append((full_target_path, backup_path))\n                        copyfile(full_source_path, full_target_path)\n                        created_file.append(full_target_path)\n                        if not compile_file(target_path):\n                            
compilation_failed.append(target_path)\n                    modified_files += 1\n                    if isResource(target_path):\n                        resources_modified = True\n                    else:\n                        sources_modified = True\n\n        if target_path.startswith(\"hooks/\"):\n            mode = 0755\n        else:\n            mode = 0644\n        if not arguments.dry_run:\n            if not os.path.islink(full_target_path):\n                os.chmod(full_target_path, mode)\n            os.lchown(full_target_path, installation.system.uid, installation.system.gid)\n\n    differences = installation.utils.run_git(\n        [git, \"diff\", \"--numstat\", \"%s..%s\" % (old_sha1, new_sha1)],\n        cwd=installation.root_dir)\n\n    changed_paths = set()\n\n    for line in differences.splitlines():\n        _, _, path = map(str.strip, line.split(None, 2))\n        if path.startswith(\"src/\"):\n            changed_paths.add(path[len(\"src/\"):])\n        elif not old_has_src:\n            if os.path.isfile(os.path.join(target_dir, path)):\n                changed_paths.add(path)\n\n    for path in sorted(changed_paths):\n        if old_has_src:\n            old_source_path = os.path.join(\"src\", path)\n        else:\n            old_source_path = path\n        if copy(old_source_path=old_source_path,\n                new_source_path=os.path.join(\"src\", path),\n                target_path=path) is False:\n            return False\n\n    if compilation_failed:\n        return False\n\n    if copied_files == 0 and modified_files == 0:\n        print \"No new or modified source files.\"\n\n    return True\n\ndef undo():\n    map(os.unlink, reversed(created_file))\n    map(os.rmdir, reversed(created_dir))\n\n    for target, backup in renamed:\n        os.rename(backup, target)\n\ndef finish(mode, arguments, data):\n    for target, backup in renamed:\n        if os.path.isdir(backup): shutil.rmtree(backup)\n        else: os.unlink(backup)\n"
  },
  {
    "path": "installation/git.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport subprocess\n\nimport installation\n\ndef install(data):\n    socket_path = os.path.join(installation.paths.run_dir,\n                               \"main\", \"sockets\", \"githook.unix\")\n    subprocess.check_call([installation.prereqs.git.path,\n                           \"config\", \"--system\", \"critic.socket\", socket_path])\n    return True\n"
  },
  {
    "path": "installation/httpd.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport re\nimport subprocess\nimport time\n\nimport installation\n\narguments = None\ninstance = None\n\ncreated_file = []\nrenamed = []\n\ndef backup_path(path):\n    return os.path.join(os.path.dirname(path), \"_\" + os.path.basename(path))\n\ndef undoable_remove(path):\n    os.rename(path, backup_path(path))\n    renamed.append((path, backup_path(path)))\n\ndef process_configuration_file(\n        mode, data, template_path, target_path, message=None):\n    global created_file, renamed\n\n    with open(template_path, \"r\") as template_file:\n        template = template_file.read().decode(\"utf-8\")\n        source = template % data\n\n    if mode == \"install\":\n        write_target = True\n    else:\n        with open(target_path, \"r\") as target_file:\n            target = target_file.read().decode(\"utf-8\")\n\n        if source != target:\n            def generateVersion(label, path):\n                if label == \"updated\":\n                    with open(path, \"w\") as target:\n                        target.write(source.encode(\"utf-8\"))\n\n            update_query = installation.utils.UpdateModifiedFile(\n                arguments,\n                message=message,\n                versions={ \"current\": target_path,\n                           \"updated\": 
target_path + \".new\" },\n                options=[ (\"i\", \"install the updated version\"),\n                          (\"k\", \"keep the current version\"),\n                          (\"d\", (\"current\", \"updated\")) ],\n                generateVersion=generateVersion)\n\n            write_target = update_query.prompt() == \"i\"\n        else:\n            write_target = False\n\n        if write_target:\n            if not getattr(arguments, \"dry_run\", False):\n                undoable_remove(target_path)\n\n            print \"Updated file: %s\" % target_path\n\n    if write_target and not getattr(arguments, \"dry_run\", False):\n        with open(target_path, \"w\") as target_file:\n            created_file.append(target_path)\n            os.chmod(target_path, 0640)\n            target_file.write(source.encode(\"utf-8\"))\n\nclass Service(object):\n    def __init__(self):\n        self.stopped = False\n\n    def service_command(self, command, errors_are_fatal):\n        print\n        try:\n            subprocess.check_call([\"service\", self.service_name, command])\n        except subprocess.CalledProcessError:\n            print \"WARNING: The %s service failed to %s.\" % (self.display_name,\n                                                             command)\n\n            if errors_are_fatal:\n                print \"\"\"\nYou can now either abort this Critic installation/upgrade, or you can\ngo ahead anyway, fix the configuration problem manually (now or\nlater), and then make sure the %(name)s service is running yourself\nusing the command\n\n  service %(name)s (start|restart)\n\nNote that if you don't abort, the Critic system will most likely not\nbe accessible until the configuration problem has been fixed.\n\"\"\" % { \"name\": self.service_name }\n                return not installation.input.yes_or_no(\n                    \"Do you want to abort this Critic installation/upgrade?\")\n        return True\n\n    def start(self, 
errors_are_fatal=True):\n        print\n        if not self.service_command(\"start\", errors_are_fatal):\n            return False\n        self.stopped = False\n        return True\n\n    def stop(self, errors_are_fatal=False):\n        print\n        if not self.service_command(\"stop\", errors_are_fatal):\n            return False\n        self.stopped = True\n        return True\n\n    def restart(self):\n        print\n        if not self.stop():\n            return False\n        return self.start()\n\n    def prepare(self, mode, arguments, data):\n        return True\n    def install(self, data):\n        return True\n    def upgrade(self, arguments, data):\n        return True\n\n    def undo(self):\n        if self.stopped:\n            self.start()\n\nclass Apache(Service):\n    display_name = \"Apache\"\n    service_name = \"apache2\"\n\n    etc_dir = \"/etc/apache2\"\n    template_dir = \"installation/templates/apache\"\n\n    def __init__(self):\n        self.template_path = os.path.join(\n            installation.root_dir, self.template_dir,\n            \"site.%s\" % installation.config.access_scheme)\n        self.site_enabled = False\n        self.default_site_disabled = False\n\n    def get_version(self):\n        output = subprocess.check_output([installation.prereqs.apache2ctl.path, \"-v\"])\n        match = re.search(\"Server version:\\s*Apache/([^\\s\\n]*)\", output, re.M)\n        if not match:\n            return None\n        return match.group(1)\n\n    def prepare(self, mode, arguments, data):\n        if installation.config.auth_mode == \"critic\":\n            pass_auth = \"On\"\n        else:\n            pass_auth = \"Off\"\n\n        data[\"installation.apache.pass_auth\"] = pass_auth\n\n        return True\n\n    def restart(self):\n        if not self.stop():\n            return False\n        time.sleep(1)\n        return self.start()\n\n    def setup(self):\n        version = self.get_version()\n        if version and 
version.startswith(\"2.2.\"):\n            self.site_suffix = \"\"\n            self.default_site = \"default\"\n        else:\n            self.site_suffix = \".conf\"\n            self.default_site = \"000-default\"\n\n        self.target_path = os.path.join(\n            self.etc_dir, \"sites-available\", \"critic-main%s\" % self.site_suffix)\n\n    def install(self, data):\n        self.setup()\n\n        process_configuration_file(\n            \"install\", data, self.template_path, self.target_path)\n\n        subprocess.check_call([installation.prereqs.a2enmod.path, \"expires\"])\n        subprocess.check_call([installation.prereqs.a2enmod.path, \"rewrite\"])\n        subprocess.check_call([installation.prereqs.a2enmod.path, \"wsgi\"])\n\n        subprocess.check_call([installation.prereqs.a2ensite.path, \"critic-main\"])\n        self.site_enabled = True\n\n        output = subprocess.check_output(\n            [installation.prereqs.a2dissite.path, self.default_site],\n            env={ \"LANG\": \"C\" })\n        if (\"Site %s disabled.\" % self.default_site) in output:\n            self.default_site_disabled = True\n\n        return self.restart()\n\n    def upgrade(self, arguments, data):\n        self.setup()\n\n        # If the configuration file doesn't exist, we're probably migrating the\n        # system from one web server to another, so run the whole installation\n        # procedure instead.\n        if not os.path.isfile(self.target_path):\n            return install(data)\n\n        process_configuration_file(\n            \"upgrade\", data, self.template_path, self.target_path, \"\"\"\\\nThe Apache site definition is about to be updated.  
Please check that no local\nmodifications are being overwritten.\n\n  Current version: %(current)s\n  Updated version: %(updated)s\n\nPlease note that if the modifications are not installed, the system is likely\nto break.\n\"\"\")\n\n        return True\n\n    def undo(self):\n        if self.site_enabled:\n            subprocess.check_call(\n                [installation.prereqs.a2dissite.path, \"critic-main\"])\n\n            if self.default_site_disabled:\n                subprocess.check_call(\n                    [installation.prereqs.a2ensite.path, self.default_site])\n\n            self.restart()\n\nclass nginx(Service):\n    display_name = service_name = \"nginx\"\n\n    etc_dir = \"/etc/nginx\"\n    template_dir = \"installation/templates/nginx\"\n\n    def __init__(self):\n        self.site_enabled = False\n        self.default_site_disabled = False\n        self.template_path = os.path.join(\n            installation.root_dir, self.template_dir,\n            \"site.%s\" % installation.config.access_scheme)\n        self.target_path = os.path.join(\n            self.etc_dir, \"sites-available/critic-main\")\n        self.enabled_path = os.path.join(\n            self.etc_dir, \"sites-enabled/critic-main\")\n        self.default_site_path = os.path.join(\n            self.etc_dir, \"sites-enabled/default\")\n\n    def install(self, data):\n        process_configuration_file(\n            \"install\", data, self.template_path, self.target_path)\n\n        os.symlink(self.target_path, self.enabled_path)\n        self.site_enabled = True\n\n        if os.path.islink(self.default_site_path):\n            os.unlink(self.default_site_path)\n            self.default_site_disabled = True\n\n        return self.restart()\n\n    def upgrade(self, arguments, data):\n        # If the configuration file doesn't exist, we're probably migrating the\n        # system from one web server to another, so run the whole installation\n        # procedure instead.\n        if 
not os.path.isfile(self.target_path):\n            return install(data)\n\n        process_configuration_file(\n            \"upgrade\", data, self.template_path, self.target_path, \"\"\"\\\nThe nginx site definition is about to be updated.  Please check that no local\nmodifications are being overwritten.\n\n  Current version: %(current)s\n  Updated version: %(updated)s\n\nPlease note that if the modifications are not installed, the system is likely\nto break.\n\"\"\")\n\n        return True\n\n    def undo(self):\n        if self.site_enabled:\n            os.unlink(self.enabled_path)\n\n            if self.default_site_disabled:\n                os.symlink(\n                    os.path.join(self.etc_dir, \"sites-available/default\"),\n                    self.default_site_path)\n\n            self.restart()\n\nclass uWSGIBackend(Service):\n    display_name = \"uWSGI\"\n    service_name = \"uwsgi\"\n\n    etc_dir = \"/etc/uwsgi\"\n    template_dir = \"installation/templates/uwsgi\"\n\n    def __init__(self):\n        self.app_enabled = False\n        self.template_path = os.path.join(\n            installation.root_dir, self.template_dir, \"app.backend.ini\")\n        self.target_path = os.path.join(\n            self.etc_dir, \"apps-available/critic-backend-main.ini\")\n        self.enabled_path = os.path.join(\n            self.etc_dir, \"apps-enabled/critic-backend-main.ini\")\n\n    def install(self, data):\n        process_configuration_file(\n            \"install\", data, self.template_path, self.target_path)\n\n        os.symlink(self.target_path, self.enabled_path)\n        self.app_enabled = True\n\n        return self.restart()\n\n    def upgrade(self, arguments, data):\n        # If the configuration file doesn't exist, we're probably migrating the\n        # system from one web server to another, so run the whole installation\n        # procedure instead.\n        if not os.path.isfile(self.target_path):\n            return install(data)\n\n        
process_configuration_file(\n            \"upgrade\", data, self.template_path, self.target_path, \"\"\"\\\nThe uWSGI back-end app definition is about to be updated.  Please check that\nno local modifications are being overwritten.\n\n  Current version: %(current)s\n  Updated version: %(updated)s\n\nPlease note that if the modifications are not installed, the system is likely\nto break.\n\"\"\")\n\n        return True\n\n    def undo(self):\n        if self.app_enabled:\n            os.unlink(self.enabled_path)\n            self.restart()\n\nclass uWSGIFrontend(Service):\n    display_name = \"uWSGI\"\n    service_name = \"uwsgi\"\n\n    etc_dir = \"/etc/uwsgi\"\n    template_dir = \"installation/templates/uwsgi\"\n\n    def __init__(self):\n        self.app_enabled = False\n        self.template_path = os.path.join(\n            installation.root_dir, self.template_dir,\n            \"app.frontend.ini.%s\" % installation.config.access_scheme)\n        self.target_path = os.path.join(\n            self.etc_dir, \"apps-available/critic-frontend-main.ini\")\n        self.enabled_path = os.path.join(\n            self.etc_dir, \"apps-enabled/critic-frontend-main.ini\")\n\n    def install(self, data):\n        process_configuration_file(\n            \"install\", data, self.template_path, self.target_path)\n\n        os.symlink(self.target_path, self.enabled_path)\n        self.app_enabled = True\n\n        return self.restart()\n\n    def upgrade(self, arguments, data):\n        # If the configuration file doesn't exist, we're probably migrating the\n        # system from one web server to another, so run the whole installation\n        # procedure instead.\n        if not os.path.isfile(self.target_path):\n            return install(data)\n\n        process_configuration_file(\n            \"upgrade\", data, self.template_path, self.target_path, \"\"\"\\\nThe uWSGI front-end app definition is about to be updated.  
Please check that\nno local modifications are being overwritten.\n\n  Current version: %(current)s\n  Updated version: %(updated)s\n\nPlease note that if the modifications are not installed, the system is likely\nto break.\n\"\"\")\n\n        return True\n\n    def undo(self):\n        if self.app_enabled:\n            os.unlink(self.enabled_path)\n            self.restart()\n\nclass Multiple():\n    def __init__(self, *services):\n        self.services = services\n\n    def prepare(self, *args):\n        return all(service.prepare(*args) for service in self.services)\n\n    def install(self, *args):\n        return all(service.install(*args) for service in self.services)\n\n    def upgrade(self, *args):\n        return all(service.upgrade(*args) for service in self.services)\n\n    def undo(self):\n        for service in self.services:\n            service.undo()\n\n    def start(self):\n        return all(service.start() for service in self.services)\n    def stop(self):\n        return all(service.stop() for service in self.services)\n    def restart(self):\n        return all(service.restart() for service in self.services)\n\ndef prepare(mode, args, data):\n    global arguments, instance\n\n    arguments = args\n\n    data[\"installation.httpd.username\"] = \"www-data\"\n    data[\"installation.httpd.groupname\"] = \"www-data\"\n\n    if installation.config.web_server_integration == \"apache\":\n        instance = Apache()\n        backend_service = Apache.service_name\n    elif installation.config.web_server_integration == \"uwsgi\":\n        instance = Multiple(uWSGIFrontend(), uWSGIBackend())\n        backend_service = uWSGIBackend.service_name\n    elif installation.config.web_server_integration == \"nginx+uwsgi\":\n        instance = Multiple(nginx(), uWSGIBackend())\n        backend_service = uWSGIBackend.service_name\n    else:\n        return True\n\n    data[\"installation.httpd.backend_service\"] = backend_service\n\n    return instance.prepare(mode, 
arguments, data)\n\ndef install(data):\n    if instance:\n        return instance.install(data)\n    return True\n\ndef upgrade(arguments, data):\n    if instance:\n        return instance.upgrade(arguments, data)\n    return True\n\ndef undo():\n    if instance:\n        instance.undo()\n\n    map(os.unlink, created_file)\n\n    for target, backup in renamed:\n        os.rename(backup, target)\n\ndef finish(mode, arguments, data):\n    for target, backup in renamed:\n        os.unlink(backup)\n\ndef start():\n    if instance:\n        return instance.start()\n    return True\n\ndef stop():\n    if instance:\n        return instance.stop()\n    return True\n"
  },
  {
    "path": "installation/initd.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport os.path\nimport pwd\nimport grp\nimport subprocess\n\nimport installation\n\ncreated_file = []\nrenamed = []\n\nrclinks_added = False\nservicemanager_started = False\nservicemanager_stopped = False\n\ndef stop(identity=\"main\"):\n    global servicemanager_stopped\n    servicemanager_stopped = True\n    print\n    try:\n        subprocess.check_call([\"service\", \"critic-%s\" % identity, \"stop\"])\n    except subprocess.CalledProcessError:\n        return False\n\n    return True\n\ndef start(identity=\"main\"):\n    print\n    try:\n        subprocess.check_call([\"service\", \"critic-%s\" % identity, \"start\"])\n    except subprocess.CalledProcessError:\n        return False\n\n    global servicemanager_started\n    servicemanager_started = True\n\n    return True\n\ndef restart(identity=\"main\"):\n    print\n    try:\n        subprocess.check_call([\"service\", \"critic-%s\" % identity, \"restart\"])\n    except subprocess.CalledProcessError:\n        return False\n\n    return True\n\ndef install(data):\n    global servicemanager_started, rclinks_added\n\n    source_path = os.path.join(installation.root_dir, \"installation\", \"templates\", \"initd\")\n    target_path = os.path.join(\"/etc\", \"init.d\", \"critic-main\")\n\n    with open(target_path, \"w\") as target:\n       
 created_file.append(target_path)\n\n        os.chmod(target_path, 0755)\n        os.chown(target_path, installation.system.uid, installation.system.gid)\n\n        with open(source_path, \"r\") as source:\n            target.write((source.read().decode(\"utf-8\") % data).encode(\"utf-8\"))\n\n    subprocess.check_call([\"update-rc.d\", \"critic-main\", \"defaults\"])\n    rclinks_added = True\n\n    start()\n\n    return True\n\ndef upgrade(arguments, data):\n    source_path = os.path.join(installation.root_dir, \"installation\", \"templates\", \"initd\")\n    target_path = os.path.join(\"/etc\", \"init.d\", \"critic-main\")\n    backup_path = os.path.join(os.path.dirname(target_path), \"_\" + os.path.basename(target_path))\n\n    source = open(source_path, \"r\").read().decode(\"utf-8\") % data\n    target = open(target_path, \"r\").read().decode(\"utf-8\")\n\n    system_uid = pwd.getpwnam(data[\"installation.system.username\"]).pw_uid\n    system_gid = grp.getgrnam(data[\"installation.system.groupname\"]).gr_gid\n\n    if source != target:\n        def generateVersion(label, path):\n            if label == \"updated\":\n                with open(path, \"w\") as target:\n                    target.write(source.encode(\"utf-8\"))\n\n        update_query = installation.utils.UpdateModifiedFile(\n            arguments,\n            message=\"\"\"\\\nThe SysV init script is about to be updated.  
Please check that no local\nmodifications are being overwritten.\n\n  Current version: %(current)s\n  Updated version: %(updated)s\n\nPlease note that if the modifications are not installed, the system is\nlikely to break.\n\"\"\",\n            versions={ \"current\": target_path,\n                       \"updated\": target_path + \".new\" },\n            options=[ (\"i\", \"install the updated version\"),\n                      (\"k\", \"keep the current version\"),\n                      (\"d\", (\"current\", \"updated\")) ],\n            generateVersion=generateVersion)\n\n        write_target = update_query.prompt() == \"i\"\n    else:\n        write_target = False\n\n    if write_target:\n        print \"Updated file: %s\" % target_path\n\n        if not arguments.dry_run:\n            os.rename(target_path, backup_path)\n            renamed.append((target_path, backup_path))\n\n            with open(target_path, \"w\") as target:\n                created_file.append(target_path)\n                os.chmod(target_path, 0755)\n                os.chown(target_path, system_uid, system_gid)\n                target.write(source.encode(\"utf-8\"))\n\n    return True\n\ndef undo():\n    if servicemanager_started:\n        stop()\n    elif servicemanager_stopped:\n        start()\n\n    map(os.unlink, created_file)\n\n    for target, backup in renamed: os.rename(backup, target)\n\n    if rclinks_added:\n        subprocess.check_call([\"update-rc.d\", \"critic-main\", \"remove\"])\n\ndef finish(mode, arguments, data):\n    for target, backup in renamed: os.unlink(backup)\n"
  },
  {
    "path": "installation/input.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\n\nimport inpututils\n\nheadless = False\n\ndef yes_or_no(prompt, default=None):\n    if headless:\n        if default is None:\n            print \"\"\"\nERROR: yes/no input requested in headless mode!\n  Prompt: %s\n\"\"\" % prompt\n            sys.exit(1)\n        else:\n            print \"%s %s\" % (prompt, \"y\" if default else \"n\")\n            return default\n\n    return inpututils.yes_or_no(prompt, default)\n\ndef string(prompt, default=None, check=None):\n    if headless:\n        if default is None:\n            print \"\"\"\nERROR: string input requested in headless mode!\n  Prompt: %s\n\"\"\" % prompt\n            sys.exit(1)\n        else:\n            print \"%s %s\" % (prompt, default)\n            if not check or inpututils.apply_check(check, default):\n                return default\n            else:\n                sys.exit(1)\n\n    return inpututils.string(prompt, default, check)\n\ndef password(prompt, default=None, twice=True):\n    if headless:\n        if default is None:\n            print \"\"\"\nERROR: password input requested in headless mode!\n  Prompt: %s\n\"\"\" % prompt\n            sys.exit(1)\n        else:\n            print \"%s %s\" % (prompt, \"****\")\n            return default\n\n    return inpututils.password(prompt, default, twice)\n"
  },
  {
    "path": "installation/lifecycle.json",
    "content": "{\n\t\"branch\": \"stable/1\",\n\t\"stable\": true\n}\n"
  },
  {
    "path": "installation/migrate.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\nimport json\n\nimport installation\n\ndef scripts_to_run(data):\n    git = data[\"installation.prereqs.git\"]\n    old_sha1 = data[\"sha1\"]\n    performed_migrations = data.get(\"migrations\", [])\n    scripts = []\n\n    if os.path.exists(\"installation/migrations\"):\n        for script in os.listdir(\"installation/migrations\"):\n            if not script.endswith(\".py\"):\n                continue\n            if script in performed_migrations:\n                continue\n\n            script_path = os.path.join(\"installation/migrations\", script)\n\n            if installation.utils.get_file_sha1(git, old_sha1, script_path) is not None:\n                # The migration script already existed when Critic was installed\n                # and there's thus no point in running it now.\n                continue\n\n            date_added = installation.utils.get_initial_commit_date(git, script_path)\n\n            scripts.append((date_added, script))\n\n    scripts.sort()\n    scripts = [script for (date_added, script) in scripts]\n\n    return scripts\n\ndef will_modify_dbschema(data):\n    for script in scripts_to_run(data):\n        if script.startswith(\"dbschema.\"):\n            return True\n    return False\n\ndef upgrade(arguments, data):\n    if \"migrations\" not in 
data:\n        data[\"migrations\"] = []\n\n    for script in scripts_to_run(data):\n        script_path = os.path.join(\"installation/migrations\", script)\n\n        print\n        print \"Running %s ...\" % script\n\n        if arguments.dry_run:\n            continue\n\n        env = os.environ.copy()\n\n        # This is \"/etc/critic/main\", set by upgrade.py, or something else\n        # if the --etc-dir/--identity arguments were used.\n        env[\"PYTHONPATH\"] = sys.path[0] + \":\" + installation.root_dir\n\n        installation.process.check_input([sys.executable, script_path,\n                                          \"--uid=%d\" % installation.system.uid,\n                                          \"--gid=%d\" % installation.system.gid],\n                                         stdin=json.dumps(data), env=env)\n\n        data[\"migrations\"].append(script)\n\n    return True\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.branches.add.archived.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'archived' column already exists.\n    cursor.execute(\"SELECT archived FROM branches\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\n    cursor.execute(\"\"\"ALTER TABLE branches\n                          ADD archived BOOLEAN NOT NULL DEFAULT FALSE\"\"\")\n    db.commit()\n\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.changesets.parent.dropnotnull.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\n# This command doesn't fail if the column already doesn't have a NOT\n# NULL constraint, so no reason to catch errors or try to determine\n# whether the constraint is there.\ncursor.execute(\"ALTER TABLE changesets ALTER parent DROP NOT NULL\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.commentchainchanges.addressed_by.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Make sure the columns don't already exist.\n    cursor.execute(\"SELECT from_addressed_by, to_addressed_by FROM commentchainchanges\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError:\n    db.rollback()\nexcept:\n    raise\n\ncursor.execute(\"\"\"ALTER TABLE commentchainchanges\n                          ADD from_addressed_by INTEGER REFERENCES commits,\n                          ADD to_addressed_by INTEGER REFERENCES commits\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.commentchainchanges.drop.review.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ncursor.execute(\"\"\"ALTER TABLE commentchainchanges\n                    DROP COLUMN IF EXISTS review CASCADE\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.commentchainlines.drop.commit.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Check if the 'commit' column already doesn't exist.\n    cursor.execute(\"SELECT commit FROM commentchainlines\")\nexcept psycopg2.ProgrammingError:\n    # Seems it doesn't, so just exit.\n    sys.exit(0)\n\ncursor.execute(\"\"\"ALTER TABLE commentchainlines DROP commit\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.comments.time.setdefaultnow.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2016 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\n# This command doesn't fail if the column already has a DEFAULT, so no reason to\n# catch errors or try to determine whether the constraint is there.\ncursor.execute(\"ALTER TABLE comments ALTER time SET DEFAULT NOW()\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.previousreachable.rebase.ondeletecascade.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\n# This command doesn't fail if the foreign key constraint already has\n# \"on delete cascade\", and there's really no reason to try to figure\n# if it has; easier to just drop it and re-add it.\ncursor.execute(\"\"\"ALTER TABLE previousreachable\n                    DROP CONSTRAINT previousreachable_rebase_fkey,\n                    ADD FOREIGN KEY (rebase) REFERENCES reviewrebases (id) ON DELETE CASCADE\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.repositories.drop.branch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Check if the 'branch' column already doesn't exist.\n    cursor.execute(\"SELECT branch FROM repositories\")\nexcept psycopg2.ProgrammingError:\n    # Seems it doesn't, so just exit.\n    sys.exit(0)\n\ncursor.execute(\"\"\"ALTER TABLE repositories DROP branch\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.repositories.drop.relay.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\nimport shutil\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'relay' column already doesn't exist (and also fetch all the\n    # relay paths for use below.)\n    cursor.execute(\"SELECT relay FROM repositories\")\nexcept psycopg2.ProgrammingError:\n    # Seems it doesn't exist, so just exit.\n    sys.exit(0)\n\nfailed = False\n\nfor (relay_path,) in cursor:\n    try:\n        shutil.rmtree(relay_path)\n    except OSError as error:\n        print (\"WARNING: Failed to remove directory: %s\\n  Error: %s\"\n               % (relay_path, error))\n        failed = True\n\nif failed:\n    print \"\"\"\nSome obsolete directories could not be removed.  They will no longer be used by\nCritic, so you probably want to look into deleting them manually.\n\"\"\"\n\ncursor.execute(\"ALTER TABLE repositories DROP relay\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviewfilechanges.rename-columns.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Make sure the columns don't already exist.\n    cursor.execute(\"SELECT from_state, to_state FROM reviewfilechanges\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError:\n    db.rollback()\nexcept:\n    raise\n\ncursor.execute(\"\"\"ALTER TABLE reviewfilechanges\n                       RENAME \"from\" TO from_state\"\"\")\ncursor.execute(\"\"\"ALTER TABLE reviewfilechanges\n                       RENAME \"to\" TO to_state\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviewmergeconfirmations.add.tail.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'tail' column already exists.\n    cursor.execute(\"SELECT tail FROM reviewmergeconfirmations\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\n    cursor.execute(\"\"\"ALTER TABLE reviewmergeconfirmations\n                          ADD tail INTEGER REFERENCES commits ON DELETE CASCADE\"\"\")\n    db.commit()\n\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviewrebases.add.equivalent_merge.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\nimport subprocess\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'equivalent_merge' column already exists.\n    cursor.execute(\"SELECT equivalent_merge FROM reviewrebases\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\nelse:\n    # No error; change appears to have been made already.\n    db.close()\n    sys.exit(0)\n\ndef fetch_commit_sha1(commit_id):\n    cursor.execute(\"SELECT sha1 FROM commits WHERE id=%s\", (commit_id,))\n    (sha1,) = cursor.fetchone()\n    return sha1\n\ndef get_parent_sha1s(repository_path, sha1):\n    output = subprocess.check_output(\n        [configuration.executables.GIT, \"log\", \"-1\", \"--format=%P\", sha1],\n        cwd=repository_path)\n    return output.strip().split()\n\ndef is_ancestor_of(repository_path, ancestor_sha1, descendant_sha1):\n    try:\n        merge_base_sha1 = subprocess.check_output(\n            [configuration.executables.GIT, \"merge-base\",\n             ancestor_sha1, descendant_sha1],\n            cwd=repository_path).strip()\n    except subprocess.CalledProcessError:\n        return False\n    else:\n        return merge_base_sha1 == ancestor_sha1\n\ncursor.execute(\"\"\"ALTER TABLE reviewrebases\n                          ADD equivalent_merge INTEGER REFERENCES commits\"\"\")\n\n#\n# Move all references to equivalent merges stored in the |old_head| column of\n# existing review rebases over to the new |equivalent_merge| column, and restore\n# the value of the |old_head| to be the actual head of the review branch before\n# the rebase.\n#\n\ncursor.execute(\"\"\"SELECT repositories.path,\n                         reviewrebases.id,\n                         reviewrebases.old_head,\n                         reviewrebases.old_upstream,\n                         reviewrebases.new_upstream\n                    FROM reviewrebases\n                    JOIN reviews ON (reviews.id=reviewrebases.review)\n                    JOIN branches ON (branches.id=reviews.branch)\n                    JOIN repositories ON (repositories.id=branches.repository)\n                   WHERE new_head IS NOT NULL\n                     AND old_upstream IS NOT NULL\n                     AND new_upstream IS NOT NULL\"\"\")\n\nfor repository_path, rebase_id, old_head_id, old_upstream_id, new_upstream_id in cursor.fetchall():\n    old_head_sha1 = fetch_commit_sha1(old_head_id)\n    old_head_parent_sha1s = get_parent_sha1s(repository_path, old_head_sha1)\n\n    if len(old_head_parent_sha1s) != 2:\n        # Old head is not a merge commit (or is a 3-or-more-way merge,) so\n        # can't be an equivalent merge.\n        continue\n\n    old_upstream_sha1 = fetch_commit_sha1(old_upstream_id)\n    new_upstream_sha1 = fetch_commit_sha1(new_upstream_id)\n\n    if old_head_parent_sha1s[1] != new_upstream_sha1:\n        # An equivalent merge should be a merge with the real old head as the\n        # first parent and the new upstream as the second parent.  We can't\n        # really check the first parent in a meaningful way, but if the second\n        # parent is \"wrong\", then this can't be an equivalent merge.\n        continue\n\n    if not is_ancestor_of(repository_path, old_upstream_sha1, new_upstream_sha1):\n        # Old upstream is not an ancestor of the new upstream, meaning this is\n        # not a \"fast-forward\" rebase.  Such rebases don't have an equivalent\n        # merge, but rather a \"replayed rebase\".  The replayed rebase however\n        # isn't stored in the |old_head| column, so there is nothing to restore.\n        continue\n\n    # Alright, we're pretty sure that the old head is in fact an equivalent\n    # merge commit.  Store it in the new |equivalent_merge| column and restore\n    # the |old_head| column to the equivalent merge's first parent.\n\n    cursor.execute(\"SELECT id FROM commits WHERE sha1=%s\",\n                   (old_head_parent_sha1s[0],))\n    (real_old_head_id,) = cursor.fetchone()\n\n    cursor.execute(\"\"\"UPDATE reviewrebases\n                         SET old_head=%s,\n                             equivalent_merge=%s\n                       WHERE id=%s\"\"\",\n                   (real_old_head_id, old_head_id, rebase_id))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviewrebases.add.replayed_rebase.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\nimport subprocess\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'replayed_rebase' column already exists.\n    cursor.execute(\"SELECT replayed_rebase FROM reviewrebases\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\nelse:\n    # No error; change appears to have been made already.\n    db.close()\n    sys.exit(0)\n\ncursor.execute(\"\"\"ALTER TABLE reviewrebases\n                          ADD replayed_rebase INTEGER REFERENCES commits\"\"\")\n\n#\n# Find all replayed rebases and store them in the new |replayed_rebase| column.\n# We identify them via 'conflicts' changesets added for review, whose child\n# (right-hand side) is the new head of a rebase.  The parent (left-hand side) of\n# such a changeset will be the replayed rebase commit.\n#\n# Note: It is theoretically possible for such a 'conflicts' changeset to exist\n# that is not actually indicative of a replayed rebase, if the rebase's new head\n# is a merge commit, and that merge commit is an equivalent merge commit created\n# for an earlier rebase of the same review.\n#\n# Also note: While theoretically possible, the aforementioned possibility is not\n# likely to have happened in practice.\n#\n\ncursor.execute(\"\"\"SELECT DISTINCT changesets.parent, reviewrebases.id\n                    FROM reviewrebases\n                    JOIN reviewchangesets ON (reviewchangesets.review=reviewrebases.review)\n                    JOIN changesets ON (changesets.id=reviewchangesets.changeset\n                                    AND changesets.child=reviewrebases.new_head)\n                   WHERE changesets.type='conflicts'\"\"\")\n\ncursor.executemany(\"\"\"UPDATE reviewrebases\n                         SET replayed_rebase=%s\n                       WHERE id=%s\"\"\",\n                   cursor.fetchall())\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviewrecipientfilters.uid-can-be-null.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ncursor.execute(\"\"\"ALTER TABLE reviewrecipientfilters\n                      DROP CONSTRAINT IF EXISTS reviewrecipientfilters_pkey,\n                      DROP CONSTRAINT IF EXISTS reviewrecipientfilters_review_uid_key,\n                      ALTER uid DROP NOT NULL,\n                      ADD UNIQUE (review, uid)\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.reviews.add.origin.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Check if the 'origin' column already exists.\n    cursor.execute(\"SELECT origin FROM reviews\")\nexcept psycopg2.ProgrammingError:\n    # Seems it doesn't.\n    db.rollback()\nelse:\n    sys.exit(0)\n\n# Add the reviews.origin column.\ncursor.execute(\n    \"\"\"ALTER TABLE reviews\n         ADD origin INTEGER REFERENCES branches ON DELETE SET NULL\"\"\")\n\n# Copy the information in branches.review over to reviews.origin.\ncursor.execute(\"\"\"SELECT id, review\n                    FROM branches\n                   WHERE review IS NOT NULL\"\"\")\nrows = cursor.fetchall()\ncursor.executemany(\"\"\"UPDATE reviews\n                         SET origin=%s\n                       WHERE id=%s\"\"\",\n                   rows)\n\n# Drop the old branches.review column.\ncursor.execute(\"\"\"ALTER TABLE branches DROP review\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.systemidentities.add.installed_sha1.installed_at.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Martin Olsson\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Check if the 'installed_sha1' column already exists.\n    cursor.execute(\"SELECT installed_sha1 FROM systemidentities\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\n    cursor.execute(\"ALTER TABLE systemidentities ADD installed_sha1 CHAR(40)\")\n    cursor.execute(\"UPDATE systemidentities SET installed_sha1=''\")\n    cursor.execute(\"ALTER TABLE systemidentities ALTER installed_sha1 SET NOT NULL\")\n    db.commit()\n\ntry:\n    # Check if the 'installed_at' column already exists.\n    cursor.execute(\"SELECT installed_at FROM systemidentities\")\nexcept psycopg2.ProgrammingError:\n    db.rollback()\n    cursor.execute(\"ALTER TABLE systemidentities ADD installed_at TIMESTAMP DEFAULT NOW()\")\n    cursor.execute(\"ALTER TABLE systemidentities ALTER installed_at SET NOT NULL\")\n    db.commit()\n\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.systemidentities.url-prefix.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    cursor.execute(\"SELECT key, url_prefix FROM systemidentities\")\nexcept psycopg2.ProgrammingError:\n    # We seem to have converted the table already, so just exit.\n    sys.exit(0)\n\nurl_prefixes = cursor.fetchall()\n\ncursor.execute(\"\"\"ALTER TABLE systemidentities\n                      DROP url_prefix,\n                      ADD anonymous_scheme VARCHAR(5),\n                      ADD authenticated_scheme VARCHAR(5),\n                      ADD hostname VARCHAR(256)\"\"\")\n\nif configuration.base.ACCESS_SCHEME in (\"http\", \"https\"):\n    anonymous_scheme = authenticated_scheme = configuration.base.ACCESS_SCHEME\nelse:\n    anonymous_scheme = \"http\"\n    authenticated_scheme = \"https\"\n\nfor key, url_prefix in url_prefixes:\n    if url_prefix.lower().startswith(\"https://\"):\n        hostname = url_prefix[len(\"https://\"):]\n    elif url_prefix.lower().startswith(\"http://\"):\n        hostname = url_prefix[len(\"http://\"):]\n    else:\n        # This would only happen if the system administrator manually\n        # modified the 'systemidentities' table, and any URL constructed\n        # with this URL prefix in the past would most likely have been\n        # broken already.\n\n        print \"\"\"\\\nWARNING: System identity %s's URL prefix was not recognized as either\n         HTTP or HTTPS.  It's assumed to be a plain hostname.\n\nThe URL prefix was: %r\"\"\" % (key, url_prefix)\n\n        hostname = url_prefix\n\n    cursor.execute(\"\"\"UPDATE systemidentities\n                         SET anonymous_scheme=%s,\n                             authenticated_scheme=%s,\n                             hostname=%s\n                       WHERE key=%s\"\"\",\n                   (anonymous_scheme, authenticated_scheme, hostname, key))\n\ncursor.execute(\"\"\"ALTER TABLE systemidentities\n                      ALTER anonymous_scheme SET NOT NULL,\n                      ALTER authenticated_scheme SET NOT NULL,\n                      ALTER hostname SET NOT NULL\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.usergitemails.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\n# It's tricky to check whether constraints and indexes already exist,\n# so this script simply attempts to run commands that won't fail even\n# if run multiple times.\n\ncursor.execute(\"\"\"ALTER TABLE usergitemails\n                    DROP CONSTRAINT IF EXISTS usergitemails_pkey,\n                    ADD PRIMARY KEY (email, uid)\"\"\")\n\ncursor.execute(\"DROP INDEX IF EXISTS usergitemails_uid\")\ncursor.execute(\"CREATE INDEX usergitemails_uid ON usergitemails (uid)\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.altertable.usersessions.add.labels.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport installation\n\n# Handles command line arguments and sets uid/gid.\ninstallation.utils.start_migration()\n\ndbschema = installation.utils.DatabaseSchema()\ndbschema.create_column(\"usersessions\", \"labels\", \"VARCHAR(256)\")\n"
  },
  {
    "path": "installation/migrations/dbschema.createindex.misc.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ndef create_index(table, columns):\n    name = \"%s_%s\" % (table, \"_\".join(columns))\n    cursor.execute(\"DROP INDEX IF EXISTS %s\" % name)\n    cursor.execute(\"CREATE INDEX %s ON %s (%s)\" % (name, table, \", \".join(columns)))\n\n# Replaced by index over 'uid' and 'state'.\ncursor.execute(\"DROP INDEX IF EXISTS reviewfilechanges_uid\")\n\ncreate_index(\"reviewfiles\", [\"review\", \"state\"])\ncreate_index(\"reviewfilechanges\", [\"uid\", \"state\"])\ncreate_index(\"commentchains\", [\"review\", \"type\", \"state\"])\ncreate_index(\"comments\", [\"id\", \"chain\"])\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.accesstokens.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport installation\n\n# Handles command line arguments and sets uid/gid.\ninstallation.utils.start_migration()\n\ndbschema = installation.utils.DatabaseSchema()\n\n# New definitions in dbschema.user.git.\ndbschema.update(\"\"\"\n\nCREATE TYPE systemaccesstype AS ENUM\n  ( -- The system is accessed as a named user.\n    'user',\n    -- The system is accessed by a system service or similar.\n    'system',\n    -- The system is accessed anonymously.\n    'anonymous' );\n\nCREATE TABLE accesstokens\n  ( id SERIAL PRIMARY KEY,\n\n    -- The type of access granted by this access token.\n    access_type systemaccesstype NOT NULL DEFAULT 'user',\n    -- The user (when access_type='user') or NULL.\n    uid INTEGER REFERENCES users ON DELETE CASCADE,\n\n    -- First part of access token (\"username\").\n    part1 VARCHAR(32) NOT NULL,\n    -- Second part of access token (\"password\").\n    part2 VARCHAR(32) NOT NULL,\n\n    -- Access token title.\n    title VARCHAR(256),\n\n    UNIQUE (part1, part2),\n\n    CONSTRAINT valid_user CHECK ((access_type='user' AND uid IS NOT NULL) OR\n                                 (access_type!='user' AND uid IS NULL)) );\n\n\"\"\")\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.knownremotes.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Make sure the table doesn't already exist.\n    cursor.execute(\"SELECT 1 FROM knownremotes\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError:\n    db.rollback()\nexcept:\n    raise\n\ncursor.execute(\"\"\"CREATE TABLE knownremotes\n                    ( url VARCHAR(256) PRIMARY KEY,\n                      pushing BOOLEAN NOT NULL\n                    )\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.scheduledreviewbrancharchivals.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Make sure the table doesn't already exist.\n    cursor.execute(\"SELECT 1 FROM scheduledreviewbrancharchivals\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError: db.rollback()\nexcept: raise\n\n# Create the table.\ncursor.execute(\"\"\"\n\nCREATE TABLE scheduledreviewbrancharchivals\n  ( review INTEGER PRIMARY KEY REFERENCES reviews (id),\n    deadline TIMESTAMP NOT NULL );\n\n\"\"\")\n\n# For each closed or dropped review, schedule an archival of the review branch.\n# These archivals may end up being ignored, for instance because review branch\n# archiving was disabled altogether.\n#\n# The archiving is randomly distributed over a four week period starting two\n# weeks from now.\ncursor.execute(\"\"\"INSERT INTO scheduledreviewbrancharchivals (review, deadline)\n                       SELECT id, NOW() + INTERVAL '2 weeks' + random() * INTERVAL '4 weeks'\n                         FROM reviews\n                        WHERE state IN ('closed', 'dropped')\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.timezones.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\nimport datetime\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ntry:\n    # Make sure the table doesn't already exist.\n    cursor.execute(\"SELECT 1 FROM timezones\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError:\n    db.rollback()\n\ncursor.execute(\"\"\"CREATE TABLE timezones\n                    ( name VARCHAR(256) PRIMARY KEY,\n                      abbrev VARCHAR(16) NOT NULL,\n                      utc_offset INTERVAL NOT NULL )\"\"\")\n\n# Additional timezones are copied from 'pg_timezone_names' by the Watchdog\n# service on startup.\n\ncursor.execute(\"INSERT INTO timezones (name, abbrev, utc_offset) VALUES (%s, %s, %s)\",\n               (\"Universal/UTC\", \"UTC\", datetime.timedelta()))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.useremails.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Make sure the table doesn't already exist.\n    cursor.execute(\"SELECT 1 FROM useremails\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError: db.rollback()\nexcept: raise\n\n# Create the table.\ncursor.execute(\"\"\"CREATE TABLE useremails\n                    ( id SERIAL PRIMARY KEY,\n                      uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n                      email VARCHAR(256) NOT NULL,\n                      verified BOOLEAN,\n                      verification_token VARCHAR(256),\n\n                      UNIQUE (uid, email) )\"\"\")\n\n# Create records for all current email addresses in the system.  
Set verified to\n# NULL which means the addresses can be used, but that they haven't gone through\n# the verification process.\ncursor.execute(\"\"\"INSERT INTO useremails (uid, email)\n                       SELECT id, email\n                         FROM users\n                        WHERE email IS NOT NULL\"\"\")\n\n# Drop the old 'users.email' column.\ncursor.execute(\"ALTER TABLE users DROP email\")\n\n# And create a new one based on the information we copied over to the new table.\ncursor.execute(\"ALTER TABLE users ADD email INTEGER REFERENCES useremails\")\ncursor.execute(\"SELECT id, uid FROM useremails\")\ncursor.executemany(\"UPDATE users SET email=%s WHERE id=%s\", cursor.fetchall())\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtable.usersessions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # Make sure the table doesn't already exist.\n    cursor.execute(\"SELECT 1 FROM usersessions\")\n\n    # Above statement should have thrown a psycopg2.ProgrammingError, but it\n    # didn't, so just exit.\n    sys.exit(0)\nexcept psycopg2.ProgrammingError: db.rollback()\nexcept: raise\n\ncursor.execute(\"\"\"CREATE TABLE usersessions\n                    ( key CHAR(28) PRIMARY KEY,\n                      uid INTEGER NOT NULL REFERENCES users,\n                      atime TIMESTAMP DEFAULT NOW() )\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.createtables.accesscontrol.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport installation\n\n# Handles command line arguments and sets uid/gid.\ninstallation.utils.start_migration()\n\ndbschema = installation.utils.DatabaseSchema()\n\n# New definitions in dbschema.user.sql.\ndbschema.update(\"\"\"\n\nCREATE TYPE accesscontrolrule AS ENUM\n  ( 'allow',\n    'deny' );\n\nCREATE TABLE accesscontrolprofiles\n  ( id SERIAL PRIMARY KEY,\n    title TEXT,\n\n    -- Access token that this profile belongs to.\n    access_token INTEGER REFERENCES accesstokens ON DELETE CASCADE,\n\n    http accesscontrolrule NOT NULL DEFAULT 'allow',\n    repositories accesscontrolrule NOT NULL DEFAULT 'allow',\n    extensions accesscontrolrule NOT NULL DEFAULT 'allow',\n\n    UNIQUE (access_token) );\n\nCREATE TYPE httprequestmethod AS ENUM\n  ( 'GET',\n    'HEAD',\n    'OPTIONS',\n    'POST',\n    'PUT',\n    'DELETE' );\n\n-- Exceptions for HTTP requests.\nCREATE TABLE accesscontrol_http\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- HTTP request method.  NULL means \"all methods\".\n    request_method httprequestmethod,\n    -- Python regular expression that must match the entire path.  
NULL means\n    -- \"all paths\".\n    path_pattern TEXT );\nCREATE INDEX accesscontrol_http_profile\n          ON accesscontrol_http (profile);\n\nCREATE TABLE useraccesscontrolprofiles\n  ( -- The type of access that is controlled.\n    access_type systemaccesstype NOT NULL DEFAULT 'user',\n\n    -- The user (when access_type='user') or NULL.  If access_type='user' and\n    -- this is NULL, then this is the default profile association.\n    uid INTEGER REFERENCES users,\n\n    -- Access control profile.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    CONSTRAINT valid_user CHECK (access_type='user' OR uid IS NULL) );\nCREATE INDEX useraccesscontrolprofiles_uid\n          ON useraccesscontrolprofiles (uid);\n\nCREATE TABLE labeledaccesscontrolprofiles\n  ( -- Authentication labels from user authentication, typically indicating some\n    -- type of group memberships. Labels should be sorted lexicographically and\n    -- separated by pipe ('|') characters.\n    labels VARCHAR(256) PRIMARY KEY,\n\n    -- Access control profile.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE );\n\n\"\"\")\n\n# New definitions in dbschema.git.sql.\ndbschema.update(\"\"\"\n\nCREATE TYPE repositoryaccesstype AS ENUM\n  ( 'read',\n    'modify' );\n\nCREATE TABLE accesscontrol_repositories\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- Type of access.  NULL means \"any type\".\n    access_type repositoryaccesstype,\n    -- Repository to access.  NULL means \"any repository\".\n    repository INTEGER REFERENCES repositories ON DELETE CASCADE );\nCREATE INDEX accesscontrol_repositories_profile\n          ON accesscontrol_repositories (profile);\n\n\"\"\")\n\n# Check if dbschema.extensions.sql has been loaded at all.  It wasn't until\n# extension support (the extend.py script) was fully added.  
If the 'extensions'\n# table doesn't exist, it obviously hasn't, and the tables below would be added\n# along with everything else when dbschema.extensions.sql is loaded by\n# extend.py.\nif dbschema.table_exists(\"extensions\"):\n    # New definitions in dbschema.extensions.sql.\n    dbschema.update(\"\"\"\n\nCREATE TYPE extensionaccesstype AS ENUM\n  ( 'install',\n    'execute' );\n\nCREATE TABLE accesscontrol_extensions\n  ( id SERIAL PRIMARY KEY,\n\n    -- The profile this exception belongs to.\n    profile INTEGER NOT NULL REFERENCES accesscontrolprofiles ON DELETE CASCADE,\n\n    -- Type of extension access.  NULL means \"any type\".\n    access_type extensionaccesstype,\n    -- Extension key: <author username>/<extension name> for user extensions and\n    -- <extension name> for system extensions.  NULL means \"any extension\".\n    extension_key TEXT );\nCREATE INDEX accesscontrol_extensions_profile\n          ON accesscontrol_extensions (profile);\n\n    \"\"\")\n"
  },
  {
    "path": "installation/migrations/dbschema.droptable.knownhosts.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    cursor.execute(\"SELECT 1 FROM knownhosts\")\nexcept psycopg2.ProgrammingError:\n    # Seems it doesn't exist, so just exit.\n    sys.exit(0)\n\ncursor.execute(\"DROP TABLE knownhosts\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.extension-filterhook-role.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\nimport re\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ndef table_exists(table_name):\n    try:\n        cursor.execute(\"SELECT 1 FROM %s\" % table_name)\n\n        # Above statement would have thrown a psycopg2.ProgrammingError if the\n        # table didn't exist, but it didn't, so the table must exist.\n        return True\n    except psycopg2.ProgrammingError:\n        db.rollback()\n        return False\n\ndef createtable(statement):\n    (table_name,) = re.search(\"CREATE TABLE (\\w+)\", statement).groups()\n\n    # Make sure the table doesn't already exist.\n    if not table_exists(table_name):\n        cursor.execute(statement)\n        db.commit()\n\ndef createindex(statement):\n    (index_name,) = re.search(\"CREATE INDEX (\\w+)\", statement).groups()\n\n    cursor.execute(\"DROP INDEX IF EXISTS %s\" % index_name)\n    cursor.execute(statement)\n    db.commit()\n\ndef run_statements(statements):\n    for statement in statements.split(\";\"):\n        statement = 
statement.strip()\n\n        if not statement:\n            pass\n        elif statement.startswith(\"CREATE TABLE\"):\n            createtable(statement)\n        elif statement.startswith(\"CREATE INDEX\"):\n            createindex(statement)\n        else:\n            print >>sys.stderr, \"Unexpected SQL statement: %r\" % statement\n            sys.exit(1)\n\n# First check if dbschema.extensions.sql has been loaded at all.  It wasn't\n# until extension support (the extend.py script) was fully added.  If the\n# 'extensions' table doesn't exist, it obviously hasn't, and the tables below\n# would be added along with everything else when dbschema.extensions.sql is\n# loaded by extend.py.\n#\n# Also, the statements below depend on the basic extensions tables existing due\n# to foreign keys they set up.\nif not table_exists(\"extensions\"):\n    sys.exit(0)\n\nrun_statements(\"\"\"\n\nCREATE TABLE extensionfilterhookroles\n  ( role INTEGER NOT NULL REFERENCES extensionroles ON DELETE CASCADE,\n    name VARCHAR(64) NOT NULL,\n    title VARCHAR(64) NOT NULL,\n    role_description TEXT,\n    data_description TEXT );\n\nCREATE TABLE extensionhookfilters\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    extension INTEGER NOT NULL REFERENCES extensions ON DELETE CASCADE,\n    repository INTEGER NOT NULL REFERENCES repositories ON DELETE CASCADE,\n    name VARCHAR(64) NOT NULL,\n    path TEXT NOT NULL,\n    data TEXT );\nCREATE INDEX extensionhookfilters_uid_extension\n          ON extensionhookfilters (uid, extension);\nCREATE INDEX extensionhookfilters_repository\n          ON extensionhookfilters (repository);\n\nCREATE TABLE extensionfilterhookevents\n  ( id SERIAL PRIMARY KEY,\n    filter INTEGER NOT NULL REFERENCES extensionhookfilters ON DELETE CASCADE,\n    review INTEGER NOT NULL REFERENCES reviews ON DELETE CASCADE,\n    uid INTEGER NOT NULL REFERENCES users ON DELETE CASCADE,\n    data TEXT );\nCREATE TABLE 
extensionfilterhookcommits\n  ( event INTEGER NOT NULL REFERENCES extensionfilterhookevents ON DELETE CASCADE,\n    commit INTEGER NOT NULL REFERENCES commits );\nCREATE INDEX extensionfilterhookcommits_event\n          ON extensionfilterhookcommits (event);\nCREATE TABLE extensionfilterhookfiles\n  ( event INTEGER NOT NULL REFERENCES extensionfilterhookevents ON DELETE CASCADE,\n    file INTEGER NOT NULL REFERENCES files );\nCREATE INDEX extensionfilterhookfiles_event\n          ON extensionfilterhookfiles (event);\n\n\"\"\")\n\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.external-authentication.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ndef create(table_name, statement):\n    try:\n        # Make sure the table doesn't already exist.\n        cursor.execute(\"SELECT 1 FROM %s\" % table_name)\n\n        # Above statement would have thrown a psycopg2.ProgrammingError if the\n        # table didn't exist, but it didn't, so assume the table doesn't need to\n        # be added.\n        return\n    except psycopg2.ProgrammingError:\n        db.rollback()\n\n    cursor.execute(statement)\n    db.commit()\n\ncreate(\"externalusers\", \"\"\"\n\nCREATE TABLE externalusers\n  ( id SERIAL PRIMARY KEY,\n    uid INTEGER REFERENCES users,\n    provider VARCHAR(16) NOT NULL,\n    account VARCHAR(256) NOT NULL,\n    email VARCHAR(256),\n    token VARCHAR(256),\n\n    UNIQUE (provider, account) );\n\n\"\"\")\n\ncreate(\"oauthstates\", \"\"\"\n\nCREATE TABLE oauthstates\n  ( state VARCHAR(64) PRIMARY KEY,\n    url TEXT,\n    time TIMESTAMP NOT NULL DEFAULT NOW() 
);\n\n\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.files-and-directories.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ndef column_exists(table, column):\n    try:\n        cursor.execute(\"SELECT %s FROM %s LIMIT 1\" % (column, table))\n        return True\n    except psycopg2.ProgrammingError:\n        db.rollback()\n        return False\n\nadded = [column_exists(\"files\", \"path\"),\n         column_exists(\"filters\", \"path\"),\n         column_exists(\"reviewfilters\", \"path\")]\n\nremoved = [column_exists(\"files\", \"directory\"),\n           column_exists(\"files\", \"name\"),\n           column_exists(\"filters\", \"directory\"),\n           column_exists(\"filters\", \"file\"),\n           column_exists(\"reviewfilters\", \"directory\"),\n           column_exists(\"reviewfilters\", \"file\"),\n           column_exists(\"directories\", \"id\")]\n\nif all(added) and not any(removed):\n    # All expected modifications appear to have taken place already.\n    sys.exit(0)\nelif any(added) or not 
all(removed):\n    # Some modifications appear to have taken place already, but not\n    # all.  This is bad, and possibly unrecoverable.  It's probably\n    # not a good idea to just run the commands below.\n    sys.stderr.write(\"\"\"\\\nThe database schema appears to be in an inconsistent state!\n\nPlease see installation/migrations/dbschema.files-and-directories.py\nand try to figure out which of the commands in it to run.\n\nAlternatively, restore a database backup from before the previous\nupgrade attempt, and then try running upgrade.py again.\n\"\"\")\n    sys.exit(1)\n\n# Add 'path' column to 'files' table.\ncursor.execute(\"ALTER TABLE files ADD path TEXT\")\ncursor.execute(\"UPDATE files SET path=fullfilename(id)\")\ncursor.execute(\"ALTER TABLE files ALTER path SET NOT NULL\")\ncursor.execute(\"CREATE UNIQUE INDEX files_path_md5 ON files (MD5(path))\")\ncursor.execute(\"CREATE INDEX files_path_gin ON files USING gin (STRING_TO_ARRAY(path, '/'))\")\n\n# Modify 'filters' table similarly.\ncursor.execute(\"ALTER TABLE filters ADD path TEXT\")\ncursor.execute(\"UPDATE filters SET path=fullfilename(file) WHERE file>0\")\ncursor.execute(\"UPDATE filters SET path=COALESCE(NULLIF(fulldirectoryname(directory), ''), '/') WHERE file=0\")\ncursor.execute(\"ALTER TABLE filters ALTER path SET NOT NULL\")\ncursor.execute(\"CREATE UNIQUE INDEX filters_repository_uid_path_md5 ON filters (repository, uid, MD5(path))\")\n\n# Modify 'reviewfilters' table similarly.\ncursor.execute(\"ALTER TABLE reviewfilters ADD path TEXT\")\ncursor.execute(\"UPDATE reviewfilters SET path=fullfilename(file) WHERE file>0\")\ncursor.execute(\"UPDATE reviewfilters SET path=COALESCE(NULLIF(fulldirectoryname(directory), ''), '/') WHERE file=0\")\ncursor.execute(\"ALTER TABLE reviewfilters ALTER path SET NOT NULL\")\ncursor.execute(\"CREATE UNIQUE INDEX reviewfilters_review_uid_path_md5 ON reviewfilters (review, uid, MD5(path))\")\n\n# Modify 'reviewfilterchanges' table 
similarly.\ncursor.execute(\"ALTER TABLE reviewfilterchanges ADD path TEXT\")\ncursor.execute(\"UPDATE reviewfilterchanges SET path=fullfilename(file) WHERE file>0\")\ncursor.execute(\"UPDATE reviewfilterchanges SET path=fulldirectoryname(directory) WHERE file=0\")\ncursor.execute(\"ALTER TABLE reviewfilterchanges ALTER path SET NOT NULL\")\n\n# Drop the now redundant 'directories' table.\ncursor.execute(\"ALTER TABLE files DROP directory, DROP name\")\ncursor.execute(\"ALTER TABLE filters DROP directory, DROP file, DROP specificity\")\ncursor.execute(\"ALTER TABLE reviewfilters DROP directory, DROP file\")\ncursor.execute(\"ALTER TABLE reviewfilterchanges DROP directory, DROP file\")\ncursor.execute(\"DROP TABLE directories\")\n\n# Drop various utility functions that are no longer necessary.\ncursor.execute(\"DROP FUNCTION IF EXISTS filepath()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS directorypath()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS subdirectories()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS containedfiles()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS fullfilename()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS fulldirectoryname()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS findfile()\")\ncursor.execute(\"DROP FUNCTION IF EXISTS finddirectory()\")\n\ndb.commit()\n\n# ALTER TYPE ... ADD VALUE cannot be executed inside a transaction block.\ndb.autocommit = True\n# Add filter type \"ignored\".\ncursor.execute(\"ALTER TYPE filtertype ADD VALUE 'ignored'\")\n\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.fixup-extensionroles.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    # The extensions part of the database schema might not have been loaded at\n    # all; it isn't until extend.py is used to enable extensions support.\n    cursor.execute(\"SELECT 1 FROM extensions\")\nexcept psycopg2.ProgrammingError:\n    sys.exit(0)\n\ncursor.execute(\"\"\"SELECT id, version, script, function,\n                         extensionpageroles.path,\n                         extensioninjectroles.path,\n                         extensionprocesscommitsroles.role IS NULL\n                    FROM extensionroles\n         LEFT OUTER JOIN extensionpageroles\n                      ON (extensionpageroles.role=id)\n         LEFT OUTER JOIN extensioninjectroles\n                      ON (extensioninjectroles.role=id)\n         LEFT OUTER JOIN extensionprocesscommitsroles\n                      ON (extensionprocesscommitsroles.role=id)\"\"\")\n\nroles = set()\nduplicates = []\n\nfor row in cursor:\n    role_id = 
row[0]\n    role_key = row[1:]\n\n    if role_key in roles:\n        duplicates.append(role_id)\n    else:\n        roles.add(role_key)\n\nif duplicates:\n    print \"Removing %d duplicate rows from extensionroles.\" % len(duplicates)\n    cursor.execute(\"DELETE FROM extensionroles WHERE id=ANY (%s)\", (duplicates,))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.per-repository-or-filter-preferences.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ndef column_exists(table, column):\n    try:\n        cursor.execute(\"SELECT %s FROM %s LIMIT 1\" % (column, table))\n        return True\n    except psycopg2.ProgrammingError:\n        db.rollback()\n        return False\n\nadded = [column_exists(\"preferences\", \"per_system\"),\n         column_exists(\"preferences\", \"per_user\"),\n         column_exists(\"preferences\", \"per_repository\"),\n         column_exists(\"preferences\", \"per_filter\"),\n         column_exists(\"userpreferences\", \"repository\"),\n         column_exists(\"userpreferences\", \"filter\")]\n\nremoved = [column_exists(\"preferences\", \"default_string\"),\n           column_exists(\"preferences\", \"default_integer\")]\n\nif all(added) and not any(removed):\n    # All expected modifications appear to have taken place already.\n    sys.exit(0)\nelif any(added) or not all(removed):\n    # Some modifications appear to have taken place already, but not\n 
   # all.  This is bad, and possibly unrecoverable.  It's probably\n    # not a good idea to just run the commands below.\n    sys.stderr.write(\"\"\"\\\nThe database schema appears to be in an inconsistent state!\n\nPlease see\n  installation/migrations/dbschema.per-repository-or-filter-preferences.py\nand try to figure out which of the commands in it to run.\n\nAlternatively, restore a database backup from before the previous\nupgrade attempt, and then try running upgrade.py again.\n\"\"\")\n    sys.exit(1)\n\n# Drop the existing 'userpreferences' PRIMARY KEY, since it conflicts with having\n# multiple settings for different repositories and filters.\ncursor.execute(\"\"\"ALTER TABLE userpreferences\n                      DROP CONSTRAINT userpreferences_pkey\"\"\")\n\n# Add new columns to 'preferences'.\ncursor.execute(\"\"\"ALTER TABLE preferences\n                      ADD per_system BOOLEAN NOT NULL DEFAULT TRUE,\n                      ADD per_user BOOLEAN NOT NULL DEFAULT TRUE,\n                      ADD per_repository BOOLEAN NOT NULL DEFAULT FALSE,\n                      ADD per_filter BOOLEAN NOT NULL DEFAULT FALSE\"\"\")\n\n# Add new columns to 'userpreferences'.\ncursor.execute(\"\"\"ALTER TABLE userpreferences\n                      ALTER uid DROP NOT NULL,\n                      ADD repository INTEGER REFERENCES repositories ON DELETE CASCADE,\n                      ADD filter INTEGER REFERENCES filters ON DELETE CASCADE\"\"\")\n\n# Move current system-wide default values over to the 'userpreferences' table as\n# rows with uid=NULL.\ncursor.execute(\"\"\"INSERT INTO userpreferences (item, integer, string)\n                       SELECT item, default_integer, default_string\n                         FROM preferences\"\"\")\n\n# Drop old default value columns from 'preferences'.\ncursor.execute(\"\"\"ALTER TABLE preferences\n                      DROP default_string,\n                      DROP default_integer\"\"\")\n\n# Add new constraints to 
'userpreferences'.\ncursor.execute(\"\"\"ALTER TABLE userpreferences\n                      ADD CONSTRAINT check_uid_filter\n                               CHECK (filter IS NULL OR uid IS NOT NULL),\n                      ADD CONSTRAINT check_repository_filter\n                               CHECK (repository IS NULL OR filter IS NULL)\"\"\")\n\n# Add indexes used to check various uniqueness requirements involving NULL\n# values.\ncursor.execute(\"\"\"CREATE UNIQUE INDEX userpreferences_item\n                                   ON userpreferences (item)\n                                WHERE uid IS NULL\n                                  AND repository IS NULL\n                                  AND filter IS NULL\"\"\")\ncursor.execute(\"\"\"CREATE UNIQUE INDEX userpreferences_item_uid\n                                   ON userpreferences (item, uid)\n                                WHERE uid IS NOT NULL\n                                  AND repository IS NULL\n                                  AND filter IS NULL\"\"\")\ncursor.execute(\"\"\"CREATE UNIQUE INDEX userpreferences_item_repository\n                                   ON userpreferences (item, repository)\n                                WHERE uid IS NULL\n                                  AND repository IS NOT NULL\n                                  AND filter IS NULL\"\"\")\ncursor.execute(\"\"\"CREATE UNIQUE INDEX userpreferences_item_uid_repository\n                                   ON userpreferences (item, uid, repository)\n                                WHERE uid IS NOT NULL\n                                  AND repository IS NOT NULL\n                                  AND filter IS NULL\"\"\")\ncursor.execute(\"\"\"CREATE UNIQUE INDEX userpreferences_item_uid_filter\n                                   ON userpreferences (item, uid, filter)\n                                WHERE uid IS NOT NULL\n                                  AND repository IS NULL\n                                  AND 
filter IS NOT NULL\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/dbschema.review-constraints-tweaking.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport psycopg2\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntry:\n    cursor.execute(\"CREATE INDEX reviewmessageids_review ON reviewmessageids (review)\")\nexcept psycopg2.ProgrammingError:\n    # The index probably already exists.\n    db.rollback()\nelse:\n    db.commit()\n\ncursor.execute(\"\"\"ALTER TABLE branches\n                    DROP CONSTRAINT IF EXISTS branches_review_fkey,\n                    ADD CONSTRAINT branches_review_fkey\n                      FOREIGN KEY (review) REFERENCES reviews ON DELETE CASCADE\"\"\")\n\ncursor.execute(\"\"\"ALTER TABLE checkbranchnotes\n                    DROP CONSTRAINT IF EXISTS checkbranchnotes_review_fkey,\n                    ADD CONSTRAINT checkbranchnotes_review_fkey\n                      FOREIGN KEY (review) REFERENCES reviews ON DELETE CASCADE\"\"\")\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/git.check-keepalive-references.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\nimport subprocess\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nos.environ[\"HOME\"] = data[\"installation.paths.data_dir\"]\nos.chdir(os.environ[\"HOME\"])\n\ndb = psycopg2.connect(database=\"critic\")\n\ncursor = db.cursor()\ncursor.execute(\"\"\"SELECT repositories.path, reviews.id, branches.id, branches.name, commits.sha1\n                    FROM repositories\n                    JOIN branches ON (branches.repository=repositories.id)\n                    JOIN reviews ON (reviews.branch=branches.id)\n                    JOIN commits ON (commits.id=branches.head)\n                ORDER BY repositories.id, reviews.id\"\"\")\n\ncurrent_repository_path = None\nkeepalive_refs = None\nbranch_heads = None\n\nsys.stdout.write(\"Verifying keepalive references ...\\n\")\nsys.stdout.flush()\n\nfor repository_path, review_id, branch_id, branch_name, head_sha1 in cursor.fetchall():\n    if repository_path != current_repository_path:\n        keepalive_refs = set(subprocess.check_output(\n            [data[\"installation.prereqs.git\"],\n          
   \"--git-dir=\" + repository_path,\n             \"for-each-ref\", \"--format=%(objectname)\",\n             \"refs/keepalive/\"]).splitlines())\n        branch_heads = dict(\n            line.rsplit(\":\", 1) for line in\n            subprocess.check_output(\n                [data[\"installation.prereqs.git\"],\n                 \"--git-dir=\" + repository_path,\n                 \"for-each-ref\", \"--format=%(refname):%(objectname)\",\n                 \"refs/heads/\"]).splitlines())\n\n        current_repository_path = repository_path\n\n        sys.stdout.write(\"\\r\\x1b[K  %s\\n\" % current_repository_path)\n\n    sys.stdout.write(\"\\r\\x1b[K    r/%d\" % review_id)\n    sys.stdout.flush()\n\n    def add_keepalive(sha1, message):\n        keepalive_refs.add(sha1)\n\n        try:\n            subprocess.check_output(\n                [data[\"installation.prereqs.git\"],\n                 \"--git-dir=\" + repository_path,\n                 \"update-ref\", \"refs/keepalive/\" + sha1, sha1, \"0\" * 40],\n                stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError:\n            what = \"failed to add\"\n        else:\n            what = \"added\"\n\n        sys.stdout.write(\"\\r\\x1b[Kr/%d: %s keepalive ref: %s (%s)\\n\"\n                         % (review_id, what, sha1, message))\n        sys.stdout.flush()\n\n    ####################################################################\n    # Make sure the review branch in the Git repository references the #\n    # expected commit, or that the commit it ought to reference is at  #\n    # least kept alive.                                                
#\n    ####################################################################\n\n    if branch_heads.get(\"refs/heads/\" + branch_name) != head_sha1:\n        if head_sha1 not in keepalive_refs:\n            if \"refs/heads/\" + branch_name not in branch_heads:\n                message = \"missing review branch\"\n            else:\n                message = \"incorrect review branch\"\n            add_keepalive(head_sha1, message)\n\n    ####################################################################\n    # Make sure all \"old head\" commits from all past rebases of the    #\n    # review are properly referenced by a keepalive ref.               #\n    ####################################################################\n\n    cursor.execute(\"\"\"SELECT commits.id, commits.sha1\n                        FROM commits\n                        JOIN reviewrebases ON (reviewrebases.old_head=commits.id)\n                       WHERE reviewrebases.review=%s\n                         AND reviewrebases.new_head IS NOT NULL\"\"\",\n                   (review_id,))\n\n    for old_head_id, old_head_sha1 in cursor.fetchall():\n        if old_head_sha1 not in keepalive_refs:\n            # There might exist an \"equivalent merge commit\" that has\n            # the recorded old head commit as one of its parents, and\n            # that is kept alive.  
Normally, that merge commit would be\n            # recorded as the old head instead, but in old reviews this\n            # is not the case if the merge was \"clean\".\n            cursor.execute(\"\"\"SELECT commits.sha1\n                                FROM commits\n                                JOIN edges ON (edges.child=commits.id)\n                               WHERE edges.parent=%s\"\"\",\n                           (old_head_id,))\n            for candidate_sha1, in cursor:\n                if candidate_sha1 in keepalive_refs:\n                    # Note: Won't bother verifying that this actually is\n                    # an equivalent merge commit, and not some random\n                    # other commit with our old head as its parent.  If\n                    # it's kept alive it's kept alive.\n                    break\n            else:\n                add_keepalive(old_head_sha1, \"rebase old head\")\n\nsys.stdout.write(\"\\r\\x1b[K\")\nsys.stdout.flush()\n"
  },
  {
    "path": "installation/migrations/git.clean-up-temporary-references.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\nimport subprocess\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nos.environ[\"HOME\"] = data[\"installation.paths.data_dir\"]\nos.chdir(os.environ[\"HOME\"])\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ncursor.execute(\"SELECT path FROM repositories\")\n\nfor (path,) in cursor:\n    temporary_refs = subprocess.check_output(\n        [data[\"installation.prereqs.git\"],\n         \"--git-dir=%s\" % path,\n         \"for-each-ref\",\n         \"--format=%(refname)\",\n         \"refs/temporary/\",\n         \"refs/commit/\"]).splitlines()\n\n    for temporary_ref in temporary_refs:\n        subprocess.check_call(\n            [data[\"installation.prereqs.git\"],\n             \"--git-dir=%s\" % path,\n             \"update-ref\",\n             \"-d\", temporary_ref])\n\n    if temporary_refs:\n        print \"%s: purged %d temporary refs\" % (path, len(temporary_refs))\n"
  },
  {
    "path": "installation/migrations/git.convert-replays-into-keepalives.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\nimport subprocess\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nos.environ[\"HOME\"] = data[\"installation.paths.data_dir\"]\nos.chdir(os.environ[\"HOME\"])\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ncursor.execute(\"SELECT path FROM repositories\")\n\nfor (path,) in cursor:\n    output = subprocess.check_output(\n        [data[\"installation.prereqs.git\"],\n         \"--git-dir=%s\" % path,\n         \"for-each-ref\",\n         \"--format=%(objectname):%(refname)\",\n         \"refs/replays/\"])\n\n    replay_refs = [line.split(\":\") for line in output.splitlines()]\n\n    for sha1, ref_name in replay_refs:\n        subprocess.check_call(\n            [data[\"installation.prereqs.git\"],\n             \"--git-dir=%s\" % path,\n             \"update-ref\",\n             \"refs/keepalive/\" + sha1, sha1])\n\n        subprocess.check_call(\n            [data[\"installation.prereqs.git\"],\n             \"--git-dir=%s\" % path,\n             \"update-ref\",\n             \"-d\", ref_name, sha1])\n\n    if 
replay_refs:\n        print (\"%s: converted %d replay refs into keepalives\"\n               % (path, len(replay_refs)))\n"
  },
  {
    "path": "installation/migrations/git.rename-keepalive-chain.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\nimport subprocess\n\nOLD_KEEPALIVE_REF_CHAIN = \"refs/keepalive-chain\"\nNEW_KEEPALIVE_REF_CHAIN = \"refs/internal/keepalive-chain\"\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nos.environ[\"HOME\"] = data[\"installation.paths.data_dir\"]\nos.chdir(os.environ[\"HOME\"])\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\ncursor.execute(\"SELECT path FROM repositories\")\n\nfor (path,) in cursor:\n    try:\n        # Create the new keepalive chain ref.\n        subprocess.check_output(\n            [data[\"installation.prereqs.git\"],\n             \"--git-dir=%s\" % path,\n             \"update-ref\", NEW_KEEPALIVE_REF_CHAIN, OLD_KEEPALIVE_REF_CHAIN],\n            stderr=subprocess.STDOUT)\n    except subprocess.CalledProcessError:\n        # Assume this was because OLD_KEEPALIVE_REF_CHAIN didn't exist.\n        pass\n    else:\n        # Delete the old keepalive chain ref.  
This command fails if the old\n        # ref's value changed between this command and the previous one.\n        subprocess.check_call(\n            [data[\"installation.prereqs.git\"],\n             \"--git-dir=%s\" % path,\n             \"update-ref\", \"-d\", OLD_KEEPALIVE_REF_CHAIN, NEW_KEEPALIVE_REF_CHAIN])\n"
  },
  {
    "path": "installation/migrations/installation.config-pyc-file-permissions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Martin Olsson\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport argparse\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\nimport configuration\n\nconfig_dir = os.path.dirname(configuration.__file__)\nfor entry in os.listdir(config_dir):\n    if entry.endswith(\".py\"):\n        if entry.startswith(\"_\") and os.path.exists(os.path.join(config_dir, entry[1:])):\n            # If the upgrade modifies a configuration file, say file.py, it\n            # will keep a backup of the file stored as _file.py (also in the\n            # configuration directory) and there won't be a pyc file for the\n            # backup, so skip ahead to avoid unnecessarily printing the below warning.\n            continue\n        config_file = os.path.join(config_dir, entry)\n        pyc_file = config_file + \"c\"\n        try:\n            os.chmod(pyc_file, os.stat(config_file).st_mode)\n        except Exception as e:\n            print(\"WARNING: installation.config-pyc-file-permissions.py \"\n                  \"failed to restrict file permissions for '%s'. 
Please make \"\n                  \"sure all .pyc files in the Critic configuration directory \"\n                  \"exists, belongs to critic:critic and are chmod'd similar \"\n                  \"to the corresponding .py file. The specific error \"\n                  \"reported was: %s\" % (pyc_file, e))\n"
  },
  {
    "path": "installation/migrations/news.filter-system-rewrite.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntext = \"\"\"\\\nImproved Filters\n================\n\nCritic's Filters mechanism has been improved, in two significant ways:\n\n* filter paths can now contain wildcards, and\n* a third filter type, <b>Ignored</b>, has been added, that can be\n  used to exclude some files or directories otherwise matched by other\n  filters.\n\nFor more details, see the (new)\n  <a href='/tutorial?item=filters'>tutorial on the subject of filters</a>.\n\nThe UI for managing filters on your\n  <a href='/home'>Home page</a>\nhas also been significantly changed; now displaying all filters in all\nrepositories instead of only filters in a selected repository.\"\"\"\n\ncursor.execute(\"SELECT id FROM newsitems WHERE text=%s\", (text,))\n\nif cursor.fetchone():\n    # Identical news item already exists.\n    sys.exit(0)\n\ncursor.execute(\"INSERT INTO newsitems (text) VALUES (%s)\", 
(text,))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/news.review-branch-archival.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Widell, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntext = \"\"\"\\\nReview branch archival\n======================\n\nThis Critic system now supports automatic archival of obsolete review branches.\nThis means that review branch refs that belong to reviews that are finished and\nclosed, or dropped, are eventually deleted from the repository.\n\nFor more information, see the [Review branch archival][tutorial]\ntutorial.\n\nFrom now on, archival of review branches is scheduled when reviews are closed or\ndropped.  
For each review in this system that is already closed or dropped,\narchival will have been scheduled at a random time 2-6 weeks after the upgrade.\nThis news item's timestamp indicates when the upgrade took place.\n\n[tutorial]: /tutorial?item=archival\n\"\"\"\n\ncursor.execute(\"SELECT id FROM newsitems WHERE text=%s\", (text,))\n\nif cursor.fetchone():\n    # Identical news item already exists.\n    sys.exit(0)\n\ncursor.execute(\"INSERT INTO newsitems (text) VALUES (%s)\", (text,))\n\ndb.commit()\ndb.close()\n\n# Also print a \"news\" bulletin to the system administrator that\n# performs the upgrade:\n\nprint \"\"\"\nNOTE: This update adds a review branch archival mechanism, enabled by\n      default.  To find out more about it, including how to disable\n      it, please see the administration tutorial:\n\n  http://<this-system>/tutorial?item=administration#review_branch_archival\n\"\"\"\n"
  },
  {
    "path": "installation/migrations/news.review-quick-search.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\nimport configuration\n\ndb = psycopg2.connect(**configuration.database.PARAMETERS)\ncursor = db.cursor()\n\ntext = \"\"\"\\\nReview Quick Search\n===================\n\nCritic's mechanism for searching for reviews has been upgraded.  The existing\n  <a href=/search>search page</a>\nhas been made somewhat more user-friendly and capable.\n\nMore significantly, a new \"quick search\" feature has been added, which is a\nsearch dialog activated by pressing the <code>F</code> key on any Critic page\n(for instance this one.)  
This dialog allows input of a search query and can be\nused to perform the same searches as the main search page.\n\nFor more details, see the (new)\n  <a href='/tutorial?item=search'>tutorial on the subject of searching</a>.\"\"\"\n\ncursor.execute(\"SELECT id FROM newsitems WHERE text=%s\", (text,))\n\nif cursor.fetchone():\n    # Identical news item already exists.\n    sys.exit(0)\n\ncursor.execute(\"INSERT INTO newsitems (text) VALUES (%s)\", (text,))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/migrations/preference.commit.diff.rulerColumn.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Rafał Chłodnicki, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport psycopg2\nimport json\nimport argparse\nimport os\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--uid\", type=int)\nparser.add_argument(\"--gid\", type=int)\n\narguments = parser.parse_args()\n\nos.setgid(arguments.gid)\nos.setuid(arguments.uid)\n\ndata = json.load(sys.stdin)\n\ndb = psycopg2.connect(database=\"critic\")\ncursor = db.cursor()\n\n# Make sure the preference doesn't already exist.\ncursor.execute(\"SELECT 1 FROM preferences WHERE item=%s\", (\"commit.diff.rulerColumn\",))\n\nif cursor.fetchone():\n    sys.exit(0)\n\ncursor.execute(\"INSERT INTO preferences (item, type, default_integer, description) VALUES (%s, %s, %s, %s)\",\n               (\"commit.diff.rulerColumn\", \"integer\", 0,\n                \"The column at which a ruler is shown. Can be set to 0 to disable the ruler.\"))\n\ndb.commit()\ndb.close()\n"
  },
  {
    "path": "installation/paths.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport shutil\n\nimport installation\n\netc_dir = \"/etc/critic\"\nbin_dir = \"/usr/bin\"\ninstall_dir = \"/usr/share/critic\"\ndata_dir = \"/var/lib/critic\"\ncache_dir = \"/var/cache/critic\"\ngit_dir = \"/var/git\"\nlog_dir = \"/var/log/critic\"\nrun_dir = \"/var/run/critic\"\n\ndef prepare(mode, arguments, data):\n    global etc_dir, install_dir, data_dir, cache_dir, git_dir, log_dir, run_dir\n\n    if mode == \"install\":\n        all_ok = True\n\n        print \"\"\"\nCritic Installation: Paths\n==========================\n\"\"\"\n\n        def is_good_dir(path):\n            if not path: return \"empty path\"\n            elif not path.startswith(\"/\"): return \"must be an absolute path\"\n            elif os.path.exists(path) and not os.path.isdir(path):\n                return \"exists and is not a directory\"\n\n        def is_new_dir(path):\n            error = is_good_dir(path)\n            if error: return error\n            if os.path.exists(path):\n                return \"directory already exists (NOTE: if Critic is already \" \\\n                       \"installed and you want to upgrade to the latest \" \\\n                       \"version of Critic, then run upgrade.py rather than \" \\\n                       \"re-running install.py)\"\n\n        if arguments.etc_dir:\n 
           error = is_new_dir(arguments.etc_dir)\n            if error:\n                print \"Invalid --etc-dir argument: %s.\" % error\n                return False\n            etc_dir = arguments.etc_dir\n        else:\n            all_ok = False\n            etc_dir = installation.input.string(prompt=\"Where should Critic's configuration files be installed?\",\n                                                default=etc_dir,\n                                                check=is_new_dir)\n\n        if arguments.install_dir:\n            error = is_new_dir(arguments.install_dir)\n            if error:\n                print \"Invalid --install-dir argument: %s.\" % error\n                return False\n            install_dir = arguments.install_dir\n        else:\n            all_ok = False\n            install_dir = installation.input.string(prompt=\"Where should Critic's source code be installed?\",\n                                                    default=install_dir,\n                                                    check=is_new_dir)\n\n        if arguments.data_dir:\n            error = is_new_dir(arguments.data_dir)\n            if error:\n                print \"Invalid --data-dir argument: %s.\" % error\n                return False\n            data_dir = arguments.data_dir\n        else:\n            all_ok = False\n            data_dir = installation.input.string(prompt=\"Where should Critic's persistent data files live?\",\n                                                 default=data_dir,\n                                                 check=is_new_dir)\n\n        if arguments.cache_dir:\n            error = is_new_dir(arguments.cache_dir)\n            if error:\n                print \"Invalid --cache-dir argument: %s.\" % error\n                return False\n            cache_dir = arguments.cache_dir\n        else:\n            all_ok = False\n            cache_dir = installation.input.string(prompt=\"Where should Critic's temporary 
data files live?\",\n                                                  default=cache_dir,\n                                                  check=is_new_dir)\n\n        if arguments.git_dir:\n            error = is_new_dir(arguments.git_dir)\n            if error:\n                print \"Invalid --git-dir argument: %s.\" % error\n                return False\n            git_dir = arguments.git_dir\n        else:\n            all_ok = False\n            git_dir = installation.input.string(prompt=\"Where should Critic's Git repositories live?\",\n                                                default=git_dir,\n                                                check=is_new_dir)\n\n        if arguments.log_dir:\n            error = is_new_dir(arguments.log_dir)\n            if error:\n                print \"Invalid --log-dir argument: %s.\" % error\n                return False\n            log_dir = arguments.log_dir\n        else:\n            all_ok = False\n            log_dir = installation.input.string(prompt=\"Where should Critic's log files live?\",\n                                                default=log_dir,\n                                                check=is_good_dir)\n\n        if arguments.run_dir:\n            error = is_new_dir(arguments.run_dir)\n            if error:\n                print \"Invalid --run-dir argument: %s.\" % error\n                return False\n            run_dir = arguments.run_dir\n        else:\n            all_ok = False\n            run_dir = installation.input.string(prompt=\"Where should Critic's runtime files live?\",\n                                                default=run_dir,\n                                                check=is_good_dir)\n\n        if all_ok: print \"All okay.\"\n    else:\n        import configuration\n\n        def strip_identity(path):\n            if os.path.basename(path) == configuration.base.SYSTEM_IDENTITY:\n                return os.path.dirname(path)\n            else:\n    
            return path\n\n        etc_dir = strip_identity(configuration.paths.CONFIG_DIR)\n        install_dir = configuration.paths.INSTALL_DIR\n        data_dir = configuration.paths.DATA_DIR\n        cache_dir = strip_identity(configuration.paths.CACHE_DIR)\n        git_dir = configuration.paths.GIT_DIR\n        log_dir = strip_identity(configuration.paths.LOG_DIR)\n        run_dir = strip_identity(configuration.paths.RUN_DIR)\n\n    data[\"installation.paths.etc_dir\"] = etc_dir\n    data[\"installation.paths.install_dir\"] = install_dir\n    data[\"installation.paths.data_dir\"] = data_dir\n    data[\"installation.paths.cache_dir\"] = cache_dir\n    data[\"installation.paths.git_dir\"] = git_dir\n    data[\"installation.paths.log_dir\"] = log_dir\n    data[\"installation.paths.run_dir\"] = run_dir\n\n    return True\n\ncreated = []\n\ndef mkdir(path, mode=0750):\n    global created\n    if not os.path.isdir(path):\n        if not os.path.isdir(os.path.dirname(path)):\n            mkdir(os.path.dirname(path), mode)\n\n        if not installation.quiet:\n            print \"Creating directory '%s' ...\" % path\n\n        os.mkdir(path, mode)\n        created.append(path)\n        os.chown(path, installation.system.uid, installation.system.gid)\n\ndef mkdirs():\n    import stat\n\n    mkdir(os.path.join(etc_dir, \"main\"))\n    mkdir(bin_dir)\n    mkdir(install_dir, 0755)\n    mkdir(os.path.join(data_dir, \"relay\"))\n    mkdir(os.path.join(data_dir, \"temporary\"))\n    mkdir(os.path.join(data_dir, \"outbox\", \"sent\"), mode=0700)\n    mkdir(os.path.join(cache_dir, \"main\", \"highlight\"))\n    mkdir(git_dir)\n    mkdir(os.path.join(log_dir, \"main\"))\n    mkdir(os.path.join(run_dir, \"main\", \"sockets\"), mode=0755)\n    mkdir(os.path.join(run_dir, \"main\", \"wsgi\"))\n\n    if installation.config.coverage_dir:\n        mkdir(installation.config.coverage_dir)\n\n    os.chmod(git_dir, 0770 | stat.S_ISUID | stat.S_ISGID)\n\ndef install(data):\n    
mkdirs()\n    return True\n\ndef upgrade(arguments, data):\n    mkdirs()\n    return True\n\ndef undo():\n    map(shutil.rmtree, reversed(created))\n"
  },
  {
    "path": "installation/prefs.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport json\nimport textwrap\nimport subprocess\n\nimport installation\n\ndef add_preference(db, item, data, silent=False):\n    relevance = data.get(\"relevance\", {})\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"INSERT INTO preferences (item, type, description,\n                                               per_system, per_user,\n                                               per_repository, per_filter)\n                           VALUES (%s, %s, %s, %s, %s, %s, %s)\"\"\",\n                   (item, data[\"type\"], data[\"description\"],\n                    relevance.get(\"system\", True), relevance.get(\"user\", True),\n                    relevance.get(\"repository\", False), relevance.get(\"filter\", False)))\n\n    if data[\"type\"] == \"string\":\n        cursor.execute(\"\"\"INSERT INTO userpreferences (item, string)\n                               VALUES (%s, %s)\"\"\",\n                       (item, data[\"default\"]))\n    else:\n        cursor.execute(\"\"\"INSERT INTO userpreferences (item, integer)\n                               VALUES (%s, %s)\"\"\",\n                       (item, int(data[\"default\"])))\n\n    if not silent and not installation.quiet:\n        print \"Added preference: '%s'\" % item\n\ndef update_preference(db, item, data, type_changed):\n    
relevance = data.get(\"relevance\", {})\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"UPDATE preferences\n                         SET type=%s,\n                             description=%s,\n                             per_system=%s,\n                             per_user=%s,\n                             per_repository=%s,\n                             per_filter=%s\n                       WHERE item=%s\"\"\",\n                   (data[\"type\"], data[\"description\"],\n                    relevance.get(\"system\", True), relevance.get(\"user\", True),\n                    relevance.get(\"repository\", False), relevance.get(\"filter\", False),\n                    item))\n\n    if data[\"type\"] == \"string\":\n        cursor.execute(\"\"\"UPDATE userpreferences\n                             SET integer=NULL,\n                                 string=%s\n                           WHERE item=%s\n                             AND uid IS NULL\n                             AND repository IS NULL\"\"\",\n                       (data[\"default\"], item))\n    else:\n        cursor.execute(\"\"\"UPDATE userpreferences\n                             SET integer=%s,\n                                 string=NULL\n                           WHERE item=%s\n                             AND uid IS NULL\n                             AND repository IS NULL\"\"\",\n                       (int(data[\"default\"]), item))\n\n    if type_changed:\n        # Delete all per-user or per-repository overrides; they will be of an\n        # incorrect type.\n        cursor.execute(\"\"\"DELETE FROM userpreferences\n                                WHERE item=%s\n                                  AND (uid IS NOT NULL\n                                    OR repository IS NOT NULL)\"\"\",\n                       (item,))\n\ndef remove_preference(db, item):\n    cursor = db.cursor()\n    cursor.execute(\"DELETE FROM userpreferences WHERE item=%s\", (item,))\n    cursor.execute(\"DELETE 
FROM preferences WHERE item=%s\", (item,))\n\ndef load_preferences(db):\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT preferences.item, type, integer, string, description\n                        FROM preferences\n                        JOIN userpreferences USING (item)\n                       WHERE uid IS NULL\n                         AND repository IS NULL\"\"\")\n    preferences = {}\n    for item, item_type, default_integer, default_string, description in cursor:\n        data = { \"type\": item_type,\n                 \"description\": description }\n        if item_type == \"string\":\n            data[\"default\"] = default_string\n        elif item_type == \"boolean\":\n            data[\"default\"] = bool(default_integer)\n        else:\n            data[\"default\"] = default_integer\n        preferences[item] = data\n    return preferences\n\ndef install(data):\n    path = os.path.join(installation.root_dir, \"src\", \"data\",\n                        \"preferences.json\")\n\n    with open(path) as preferences_file:\n        preferences = json.load(preferences_file)\n\n    import dbutils\n\n    with installation.utils.as_critic_system_user():\n        with dbutils.Database() as db:\n            for item in sorted(preferences.keys()):\n                add_preference(db, item, preferences[item], silent=True)\n\n            db.commit()\n\n            if not installation.quiet:\n                print \"Added %d preferences.\" % len(preferences)\n\n    return True\n\ndef upgrade(arguments, data):\n    git = data[\"installation.prereqs.git\"]\n    path = \"src/data/preferences.json\"\n\n    old_sha1 = data[\"sha1\"]\n    old_file_sha1 = installation.utils.get_file_sha1(git, old_sha1, path)\n\n    new_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                          cwd=installation.root_dir).strip()\n    new_file_sha1 = installation.utils.get_file_sha1(git, new_sha1, path)\n\n    if old_file_sha1:\n      
  old_source = installation.utils.run_git([git, \"cat-file\", \"blob\", old_file_sha1],\n                                                cwd=installation.root_dir)\n        old_preferences = json.loads(old_source)\n    else:\n        old_preferences = {}\n\n    preferences_path = os.path.join(installation.root_dir, path)\n\n    with open(preferences_path) as preferences_file:\n        new_preferences = json.load(preferences_file)\n\n    def update_preferences(old_preferences, new_preferences, db_preferences):\n        for item in new_preferences.keys():\n            if item not in db_preferences:\n                add_preference(db, item, new_preferences[item])\n            elif db_preferences[item] != new_preferences[item]:\n                type_changed = False\n                if db_preferences[item][\"type\"] != new_preferences[item][\"type\"]:\n                    # If the type has changed, we really have to update it; code\n                    # will depend on it having the right type.\n                    update = True\n                    type_changed = True\n                elif item in old_preferences \\\n                        and db_preferences[item] == old_preferences[item]:\n                    # The preference in the database is identical to what we\n                    # originally installed; there should be no harm in updating\n                    # it.\n                    update = True\n                elif db_preferences[item][\"default\"] == new_preferences[item][\"default\"]:\n                    # The default value is the same => only description or flags\n                    # has changed.  
Probably safe to silently update.\n                    update = True\n                else:\n                    if item in old_preferences \\\n                            and db_preferences[item][\"default\"] != old_preferences[item][\"default\"]:\n                        # The default value appears to have been changed in the\n                        # database.  Ask the user before overwriting it with an\n                        # updated default value.\n                        print\n                        print textwrap.fill(\n                            \"The default value for the preference '%s' has been \"\n                            \"changed in this version of Critic, but it appears to \"\n                            \"also have been modified in the database.\"\n                            % item)\n                        default = False\n                    else:\n                        # The default value has changed, and we don't know if\n                        # the value is what was originally installed, because we\n                        # don't know what was originally installed.  
Ask the\n                        # user before overwriting the current value.\n                        print\n                        print textwrap.fill(\n                            \"The default value for the preference '%s' has been \"\n                            \"changed in this version of Critic.\"\n                            % item)\n                        default = True\n\n                    print\n                    print \"  Value in database: %r\" % db_preferences[item][\"default\"]\n                    print \"  New/updated value: %r\" % new_preferences[item][\"default\"]\n                    print\n\n                    update = installation.input.yes_or_no(\n                        \"Would you like to update the database with the new value?\",\n                        default=default)\n\n                if update:\n                    update_preference(db, item, new_preferences[item], type_changed)\n\n        # Only check for preferences to remove if the preference data has\n        # changed.  
Otherwise, every upgrade would ask to remove any extra\n        # preferences in the database.\n        if old_file_sha1 != new_file_sha1:\n            for item in db_preferences.keys():\n                if item not in new_preferences:\n                    if item in old_preferences \\\n                            and db_preferences[item] == old_preferences[item]:\n                        # The preference in the database is identical to what we\n                        # originally installed; there should be no harm in\n                        # removing it.\n                        remove = True\n                    else:\n                        print\n                        print textwrap.fill(\n                            \"The preference '%s' exists in the database but \"\n                            \"not in the installation data, meaning it would \"\n                            \"not have been added to the database if this \"\n                            \"version of Critic was installed from scratch.\"\n                            % item)\n                        print\n\n                        remove = installation.input.yes_or_no(\n                            \"Would you like to remove it from the database?\",\n                            default=True)\n\n                    if remove:\n                        remove_preference(db, item)\n\n        db.commit()\n\n    import dbutils\n\n    with installation.utils.as_critic_system_user():\n        with dbutils.Database() as db:\n            update_preferences(old_preferences,\n                               new_preferences,\n                               load_preferences(db))\n\n    return True\n"
  },
  {
    "path": "installation/prereqs.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport os.path\nimport re\nimport subprocess\n\nimport installation\n\nthis_module = sys.modules[__name__]\n\ndef find_executable(name):\n    for search_path in os.environ[\"PATH\"].split(\":\"):\n        path = os.path.join(search_path, name)\n        if os.path.isfile(path) and os.access(path, os.X_OK):\n            return path\n    return None\n\nheadless = False\n\naptget = None\naptget_approved = False\naptget_updated = False\n\nneed_blankline = False\n\ninstalled_packages = []\n\ndef blankline():\n    global need_blankline\n    if need_blankline:\n        print\n        need_blankline = False\n\ndef install_packages(*packages):\n    global aptget, aptget_approved, aptget_updated, need_blankline, all_ok\n    if aptget is None:\n        aptget = find_executable(\"apt-get\")\n    if aptget and not aptget_approved:\n        all_ok = False\n        print \"\"\"\\\nFound 'apt-get' executable in your $PATH.  
This script can attempt to install\nmissing software using it.\n\"\"\"\n        aptget_approved = installation.input.yes_or_no(\n            prompt=\"Do you want to use 'apt-get' to install missing packages?\",\n            default=True)\n        if not aptget_approved: aptget = False\n    if aptget:\n        aptget_env = os.environ.copy()\n        if headless:\n            aptget_env[\"DEBIAN_FRONTEND\"] = \"noninteractive\"\n        if not aptget_updated:\n            subprocess.check_output(\n                [aptget, \"-qq\", \"update\"],\n                env=aptget_env)\n            aptget_updated = True\n        aptget_output = subprocess.check_output(\n            [aptget, \"-qq\", \"-y\", \"install\"] + list(packages),\n            env=aptget_env)\n        installed = {}\n        for line in aptget_output.splitlines():\n            match = re.match(r\"^Setting up ([^ ]+) \\(([^)]+)\\) \\.\\.\\.\", line)\n            if match:\n                package_name, version = match.groups()\n                if package_name in packages:\n                    need_blankline = True\n                    installed_packages.append((package_name, version))\n                    installed[package_name] = version\n                    print \"Installed: %s (%s)\" % (package_name, version)\n        return installed\n    else:\n        return False\n\nclass Prerequisite(object):\n    def __init__(self, name, packages, message):\n        self.name = name\n        self.packages = packages\n        self.message = message\n\n        setattr(this_module, name, self)\n\n    def install(self):\n        if self.check():\n            return True\n        if self.packages is None:\n            print \"ERROR: Installing '%s' is not supported!\" % self.name\n            return False\n        if aptget_approved and install_packages(*self.packages):\n            if self.check():\n                return True\n        blankline()\n        print self.message\n        print\n        if not 
aptget_approved:\n            install_packages(*self.packages)\n        if self.check():\n            return True\n        print \"ERROR: Installing '%s' failed!\" % self.name\n        return False\n\nclass Executable(Prerequisite):\n    def __init__(self, name, packages, message):\n        super(Executable, self).__init__(name, packages, message)\n        self.path = None\n\n    def check(self):\n        if not self.path:\n            self.path = find_executable(self.name)\n        return bool(self.path)\n\n    def install(self):\n        if self.check():\n            return True\n        blankline()\n        print \"No '%s' executable found in $PATH\" % self.name\n        print\n        return super(Executable, self).install()\n\nclass PythonLibrary(Prerequisite):\n    def __init__(self, name, packages, message):\n        super(PythonLibrary, self).__init__(name, packages, message)\n        self.available = False\n\n    def check(self):\n        if not self.available:\n            try:\n                subprocess.check_output(\n                    [sys.executable, \"-c\", \"import \" + self.name],\n                    stderr=subprocess.STDOUT)\n                self.available = True\n            except subprocess.CalledProcessError:\n                pass\n        return self.available\n\n    def install(self):\n        if self.check():\n            return True\n        blankline()\n        print \"Failed to import '%s'\" % self.name\n        print\n        return super(PythonLibrary, self).install()\n\nclass CustomCheck(Prerequisite):\n    \"\"\"Perform a custom check, and otherwise install packages\"\"\"\n\n    def __init__(self, callback, name, packages, message):\n        super(CustomCheck, self).__init__(name, packages, message)\n        self.callback = callback\n        self.available = False\n\n    def check(self):\n        if not self.available:\n            if self.callback():\n                self.available = True\n        return self.available\n\n    def 
install(self):\n        if self.check():\n            return True\n        return super(CustomCheck, self).install()\n\n# This one is hardcoded to the running interpreter (rather than what we might\n# find in the search path.)\nExecutable(\"python\", None, None).path = sys.executable\n\nprerequisites = [\n\n    # We won't bother trying to install this; it won't be missing.\n    Executable(\"tar\", None, None),\n\n    Executable(\"git\", [\"git-core\"], \"\"\"\\\nMake sure the Git version control system is installed.  In Debian/Ubuntu, the\npackage you need to install is 'git-core' (or 'git' in newer versions, but\n'git-core' typically still works.)  The source code can be downloaded here:\n\n  https://github.com/git/git\"\"\"),\n\n    Executable(\"psql\", [\"postgresql\", \"postgresql-client\"], \"\"\"\\\nMake sure the PostgreSQL database server and its client utilities are installed.\nIn Debian/Ubuntu, the packages you need to install are 'postgresql' and\n'postgresql-client'.\"\"\"),\n\n    PythonLibrary(\"psycopg2\", [\"python-psycopg2\"], \"\"\"\\\nFailed to import the 'psycopg2' module, which is used to access the PostgreSQL\ndatabase from Python.  In Debian/Ubuntu, the module is provided by the\n'python-psycopg2' package.  The source code can be downloaded here:\n\n  http://www.initd.org/psycopg/download/\"\"\"),\n\n    PythonLibrary(\"requests\", [\"python-requests\"], \"\"\"\\\nFailed to import the 'requests' module, which is used to perform URL requests.\nIn Debian/Ubuntu, the module is provided by the 'python-requests' package.  The\nsource code can be downloaded here:\n\n  https://github.com/kennethreitz/requests\"\"\"),\n\n    PythonLibrary(\"pygments\", [\"python-pygments\"], \"\"\"\\\nFailed to import the 'pygments' module, which is used for syntax highlighting.\nIn Debian/Ubuntu, the module is provided by the 'python-pygments' package.  
The\nsource code can be downloaded here:\n\n  http://pygments.org/download/\"\"\"),\n\n]\n\n# The passlib library is only needed if Critic is configured to do\n# authentication, so doesn't go into the list above yet.\npasslib_library = PythonLibrary(\"passlib\", [\"python-passlib\"], \"\"\"\\\nFailed to import the 'passlib' module, which is required when Critic is\nconfigured to handle user authentication itself.  In Debian/Ubuntu, the module\nis provided by the 'python-passlib' package.  The source code can be downloaded\nhere:\n\n  https://pypi.python.org/pypi/passlib\"\"\")\n\ndef check_mod_wsgi():\n    return os.path.isfile(\"/etc/apache2/mods-available/wsgi.load\")\n\n# Things to check and install if Apache is to be used.\napache_prerequisites = [\n\n    Executable(\"apache2ctl\", [\"apache2\", \"libapache2-mod-wsgi\"], \"\"\"\\\nMake sure the Apache web server is installed.  In Debian/Ubuntu, the package you\nneed to install is 'apache2'.\n\nIn addition, the mod_wsgi Apache module needs to be installed.  In\nDebian/Ubuntu, the package you need to install is 'libapache2-mod-wsgi'.\"\"\"),\n\n    # Additional executables that we use but that should have been installed\n    # along with apache2ctl.\n    Executable(\"a2enmod\", None, None),\n    Executable(\"a2ensite\", None, None),\n    Executable(\"a2dismod\", None, None),\n    Executable(\"a2dissite\", None, None),\n\n    # This extra check is really only needed if Apache was already installed\n    # (and thus not installed by the prerequisite above).\n    CustomCheck(check_mod_wsgi, \"mod_wsgi\", [\"libapache2-mod-wsgi\"], \"\"\"\\\nThe WSGI Apache module (mod_wsgi) doesn't appear to be installed.  Make sure\nit's installed.  In Debian/Ubuntu, the package you need to install is\n'libapache2-mod-wsgi'.  
The source code can be downloaded here:\n\n  http://code.google.com/p/modwsgi/wiki/DownloadTheSoftware?tm=2\"\"\"),\n\n]\n\n# Things to check and install if nginx is to be used.\nnginx_prerequisites = [\n\n    Executable(\"nginx\", [\"nginx\"], \"\"\"\\\nMake sure the nginx web server is installed.  In Debian/Ubuntu, the package you\nneed to install is 'nginx'.\"\"\"),\n\n]\n\n# Things to check and install if uWSGI is to be used.\nuwsgi_prerequisites = [\n\n    Executable(\"uwsgi\", [\"uwsgi\", \"uwsgi-plugin-python\"], \"\"\"\\\nMake sure the uWSGI application container server is installed.  In\nDebian/Ubuntu, the package you need to install is 'uwsgi'.\n\nIn addition, the uWSGI Python plugin needs to be installed.  In Debian/Ubuntu,\nthe package you need to install is 'uwsgi-plugin-python'.\"\"\"),\n\n    # This extra check is really only needed if uWSGI was already installed (and\n    # thus not installed by the prerequisite above).\n    Executable(\"uwsgi_python\", [\"uwsgi-plugin-python\"], \"\"\"\\\nThe uWSGI Python plugin doesn't appear to be installed.  Make sure it's\ninstalled.  
In Debian/Ubuntu, the package you need to install is\n'uwsgi-plugin-python'.\"\"\"),\n\n]\n\ndef resolve_prerequisites():\n    if installation.config.auth_mode == \"critic\":\n        prerequisites.append(passlib_library)\n\n    if installation.config.web_server_integration == \"apache\":\n        prerequisites.extend(apache_prerequisites)\n    if \"nginx\" in installation.config.web_server_integration:\n        prerequisites.extend(nginx_prerequisites)\n    if \"uwsgi\" in installation.config.web_server_integration:\n        prerequisites.extend(uwsgi_prerequisites)\n\ndef prepare(mode, arguments, data):\n    global headless\n    headless = arguments.headless\n    return True\n\ndef install(data):\n    resolve_prerequisites()\n\n    print \"\"\"\nCritic Installation: Prerequisites\n==================================\n\"\"\"\n\n    if not all(prerequisite.install() for prerequisite in prerequisites):\n        return False\n\n    if installed_packages:\n        blankline()\n        print \"Installed %d packages.\" % len(installed_packages)\n        print\n    else:\n        print \"All prerequisites available.\"\n\n    data[\"installation.prereqs.python\"] = python.path\n    data[\"installation.prereqs.git\"] = git.path\n    data[\"installation.prereqs.tar\"] = tar.path\n\n    return True\n\ndef upgrade(arguments, data):\n    import configuration\n\n    python.path = configuration.executables.PYTHON\n    git.path = configuration.executables.GIT\n    tar.path = configuration.executables.TAR\n\n    resolve_prerequisites()\n\n    if not all(prerequisite.install() for prerequisite in prerequisites):\n        return False\n\n    data[\"installation.prereqs.python\"] = python.path\n    data[\"installation.prereqs.git\"] = git.path\n    data[\"installation.prereqs.tar\"] = tar.path\n\n    return True\n"
  },
  {
    "path": "installation/process.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport subprocess\n\ndef check_input(args, stdin, **kwargs):\n    assert isinstance(stdin, str)\n\n    child = subprocess.Popen(args, stdin=subprocess.PIPE, **kwargs)\n    stdout, stderr = child.communicate(stdin)\n\n    if child.returncode != 0:\n        raise subprocess.CalledProcessError(child.returncode, args, None)\n\n    return stdout\n"
  },
  {
    "path": "installation/qs/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n__doc__ = \"Quick-start utilities.\"\n\nimport sqlite\nimport data\n"
  },
  {
    "path": "installation/qs/data.py",
    "content": "import sys\nimport os\nimport pwd\nimport grp\nimport subprocess\nimport json\nimport multiprocessing\n\nimport installation\n\ndef config(key):\n    return \"installation.config.\" + key\ndef system(key):\n    return \"installation.system.\" + key\ndef admin(key):\n    return \"installation.admin.\" + key\ndef database(key):\n    return \"installation.database.\" + key\ndef prereqs(key):\n    return \"installation.prereqs.\" + key\ndef paths(key):\n    return \"installation.paths.\" + key\ndef smtp(key):\n    return \"installation.smtp.\" + key\ndef extensions(key):\n    return \"installation.extensions.\" + key\n\ndef username():\n    return pwd.getpwuid(os.getuid()).pw_name\ndef groupname():\n    return grp.getgrgid(pwd.getpwuid(os.getuid()).pw_gid).gr_name\n\ndef which(name):\n    return subprocess.check_output(\"which \" + name, shell=True).strip()\n\ndef generate(arguments, database_path):\n    data = { config(\"password_hash_schemes\"): [installation.config.default_password_hash_scheme],\n             config(\"default_password_hash_scheme\"): installation.config.default_password_hash_scheme,\n             config(\"minimum_password_hash_time\"): installation.config.minimum_password_hash_time,\n             config(\"minimum_rounds\"): { installation.config.default_password_hash_scheme: 100 },\n             config(\"auth_database\"): \"internal\",\n             system(\"username\"): username(),\n             system(\"email\"): username() + \"@localhost\",\n             system(\"groupname\"): groupname(),\n             admin(\"username\"): username(),\n             admin(\"email\"): username() + \"@localhost\",\n             admin(\"fullname\"): username(),\n             system(\"hostname\"): \"localhost\",\n             system(\"recipients\"): arguments.system_recipient or [username() + \"@localhost\"],\n             config(\"auth_mode\"): \"critic\",\n             config(\"session_type\"): \"cookie\",\n             
config(\"allow_anonymous_user\"): True,\n             config(\"allow_user_registration\"): True,\n             config(\"verify_email_addresses\"): arguments.testing,\n             config(\"access_scheme\"): \"http\",\n             config(\"enable_access_tokens\"): True,\n             config(\"repository_url_types\"): [\"http\"],\n             config(\"default_encodings\"): [\"utf-8\", \"latin-1\"],\n             database(\"driver\"): \"sqlite\",\n             database(\"parameters\"): { \"database\": database_path },\n             config(\"is_development\"): True,\n             config(\"coverage_dir\"): None,\n             prereqs(\"python\"): sys.executable,\n             prereqs(\"git\"): which(\"git\"),\n             prereqs(\"tar\"): which(\"tar\"),\n             paths(\"etc_dir\"): installation.paths.etc_dir,\n             paths(\"install_dir\"): installation.paths.install_dir,\n             paths(\"data_dir\"): installation.paths.data_dir,\n             paths(\"cache_dir\"): installation.paths.cache_dir,\n             paths(\"log_dir\"): installation.paths.log_dir,\n             paths(\"run_dir\"): installation.paths.run_dir,\n             paths(\"git_dir\"): installation.paths.git_dir,\n             smtp(\"host\"): arguments.smtp_host,\n             smtp(\"port\"): arguments.smtp_port,\n             smtp(\"username\"): json.dumps(arguments.smtp_username),\n             smtp(\"password\"): json.dumps(arguments.smtp_password),\n             smtp(\"use_ssl\"): False,\n             smtp(\"use_starttls\"): False,\n             config(\"is_quickstart\"): True,\n             config(\"is_testing\"): arguments.testing,\n             config(\"ldap_url\"): \"\",\n             config(\"ldap_search_base\"): \"\",\n             config(\"ldap_create_user\"): False,\n             config(\"ldap_username_attribute\"): \"\",\n             config(\"ldap_fullname_attribute\"): \"\",\n             config(\"ldap_email_attribute\"): \"\",\n             
config(\"ldap_cache_max_age\"): 600,\n             extensions(\"enabled\"): False,\n             extensions(\"critic_v8_jsshell\"): \"NOT_INSTALLED\",\n             extensions(\"default_flavor\"): \"js/v8\",\n             config(\"highlight.max_workers\"): multiprocessing.cpu_count(),\n             # Setting changeset.max_workers to 1 is a workaround for some race\n             # conditions causing duplicate rows in (at least) the files table.\n             config(\"changeset.max_workers\"): 1,\n             config(\"archive_review_branches\"): True,\n             config(\"web_server_integration\"): \"none\" }\n\n    def provider(name):\n        prefix = \"provider_%s.\" % name\n        return { config(prefix + \"enabled\"): False,\n                 config(prefix + \"allow_user_registration\"): False,\n                 config(prefix + \"verify_email_addresses\"): False,\n                 config(prefix + \"client_id\"): None,\n                 config(prefix + \"client_secret\"): None,\n                 config(prefix + \"bypass_createuser\"): False,\n                 config(prefix + \"redirect_uri\"): None }\n\n    data.update(provider(\"github\"))\n    data.update(provider(\"google\"))\n\n    return data\n"
  },
  {
    "path": "installation/qs/sqlite.py",
    "content": "import sqlite3\nimport os\nimport re\nimport datetime\n\nimport installation\n\nIntegrityError = sqlite3.IntegrityError\nProgrammingError = sqlite3.ProgrammingError\nOperationalError = sqlite3.OperationalError\n\ndef convert_date(value):\n    try:\n        return datetime.datetime.fromtimestamp(int(value))\n    except ValueError:\n        return datetime.datetime.strptime(value, \"%Y-%m-%d\")\n\ndef convert_datetime(value):\n    try:\n        return datetime.datetime.fromtimestamp(int(value))\n    except ValueError:\n        return datetime.datetime.strptime(value, \"%Y-%m-%d %H:%M:%S\")\n\ndef convert_interval(value):\n    try:\n        return datetime.timedelta(seconds=int(value))\n    except ValueError:\n        return 0\n\ndef convert_boolean(value):\n    return bool(int(value))\n\nsqlite3.register_converter(\"DATE\", convert_date)\nsqlite3.register_converter(\"TIMESTAMP\", convert_datetime)\nsqlite3.register_converter(\"INTERVAL\", convert_interval)\nsqlite3.register_converter(\"BOOLEAN\", convert_boolean)\n\ndef sqltokens(command):\n    return re.findall(r\"\"\"\\$\\d+|!=|<>|<=|>=|'(?:''|[^'])*'|\"(?:[^\"])*\"|\\w+|[^\\s]\"\"\", command)\n\ndef sqlcommands(filename):\n    path = os.path.join(installation.root_dir, filename)\n    script = []\n    with open(path) as script_file:\n        for line in script_file:\n            fragment, _, comment = line.strip().partition(\"--\")\n            fragment = fragment.strip()\n            if fragment:\n                script.append(fragment)\n    script = \" \".join(script)\n    return filter(None, map(str.strip, script.split(\";\")))\n\ndef replace(query, old, new):\n    tokens = query if isinstance(query, list) else sqltokens(query)\n    old = sqltokens(old)\n    new = sqltokens(new)\n    start = 0\n    try:\n        while True:\n            for anchor_offset, anchor_token in enumerate(old):\n                if anchor_token[0] != \"$\":\n                    break\n            offset = map(str.upper, 
tokens).index(old[anchor_offset].upper(), start) - anchor_offset\n            data = {}\n            for index in range(len(old)):\n                if old[index][0] == \"$\":\n                    data[old[index]] = tokens[offset + index]\n                elif tokens[offset + index].upper() != old[index].upper():\n                    start = offset + 1\n                    break\n            else:\n                if data:\n                    use_new = map(lambda token: data.get(token, token), new)\n                else:\n                    use_new = new\n                tokens[offset:offset + len(old)] = use_new\n                start = offset + len(use_new)\n    except (IndexError, ValueError):\n        return \" \".join(tokens)\n\nclass Cursor(object):\n    def __init__(self, connection):\n        self.cursor = connection.cursor()\n\n    def massage(self, query, parameters):\n        self.flags = set()\n        while \"=ANY (%s)\" in query:\n            for index, parameter in enumerate(parameters):\n                if isinstance(parameter, (list, set, tuple)):\n                    query = query.replace(\"=ANY (%s)\", \" IN (%s)\" % \", \".join([\"?\"] * len(parameter)), 1)\n                    parameters[index:index + 1] = parameter\n                    break\n            else:\n                assert False, \"Failed to translate all occurrences of '=ANY (%s)' in query!\"\n        if query.endswith(\" RETURNING id\"):\n            self.flags.add(\"returning_id\")\n            query = query[:-len(\" RETURNING id\")]\n        tokens = sqltokens(query.replace(\"%s\", \"?\"))\n        replace(\n            tokens,\n            \"EXTRACT ('epoch' FROM NOW() - $1)\",\n            \"strftime('%s', 'now') - $1\")\n        replace(\n            tokens,\n            \"EXTRACT ('epoch' FROM (MIN($1) - NOW()))\",\n            \"MIN($1) - strftime('%s', 'now')\")\n        replace(tokens, \"NOW()\", \"cast(strftime('%s', 'now') as integer)\")\n        replace(tokens, 
\"TRUE\", \"1\")\n        replace(tokens, \"FALSE\", \"0\")\n        replace(tokens, \"'1 day'\", str(24 * 60 * 60))\n        replace(tokens, \"next::text\", \"datetime(next, 'unixepoch')\")\n        replace(tokens, \"commit\", '\"commit\"')\n        replace(tokens, \"transaction\", '\"transaction\"')\n        replace(tokens, \"MD5($1)\", \"$1\")\n        replace(tokens, \"FETCH FIRST ROW ONLY\", \"\")\n        replace(tokens, \"ASC NULLS FIRST\", \"ASC\")\n        replace(tokens, \"DESC NULLS LAST\", \"DESC\")\n        replace(tokens, \"chaincomments(commentchains.id)\",\n                \"\"\"(SELECT COUNT(*) FROM comments\n                                   WHERE chain=commentchains.id\n                                     AND state='current')\"\"\")\n        replace(tokens, \"chainunread(commentchains.id, ?)\",\n                \"\"\"(SELECT COUNT(*) FROM commentstoread\n                                    JOIN comments ON (comments.id=commentstoread.comment)\n                                   WHERE comments.chain=commentchains.id\n                                     AND comments.state='current'\n                                     AND commentstoread.uid=?)\"\"\")\n        replace(tokens, \"character_length(\", \"length(\")\n        replace(tokens, \"FOR UPDATE NOWAIT\", \"\")\n        replace(tokens, \"FOR UPDATE\", \"\")\n        replace(tokens, \"~\", \"regexp\")\n        replace(tokens, \"INTERVAL ?\", \"interval_seconds(?)\")\n        return \" \".join(tokens)\n\n    def execute(self, query, parameters=()):\n        parameters = list(parameters)\n        query = self.massage(query, parameters)\n        try:\n            self.cursor.execute(query, parameters)\n        except sqlite3.OperationalError as error:\n            raise Exception(\"Invalid query: %r %r\" % (error.message, query))\n        except sqlite3.InterfaceError as error:\n            raise Exception(\"Invalid parameters: %r %r for %r\" % (error.message, parameters, query))\n        if 
\"returning_id\" in self.flags:\n            self.cursor.execute(\"SELECT last_insert_rowid()\")\n\n    def executemany(self, query, parameters=()):\n        if \"=ANY (%s)\" in query:\n            # We'll rewrite the query string depending on the parameters, so\n            # must use execute() for each set of parameters.\n            for values in parameters:\n                self.execute(query, values)\n            return\n\n        parameters = [list(values) for values in parameters]\n        query = self.massage(query, None)\n\n        self.cursor.executemany(query, parameters)\n\n    def fetchone(self):\n        return self.cursor.fetchone()\n    def fetchall(self):\n        return self.cursor.fetchall()\n\n    def __iter__(self):\n        return iter(self.cursor)\n\ndef regexp(pattern, string):\n    return re.search(pattern, string) is not None\n\ndef interval_seconds(string):\n    match = re.match(r\"(?:(-?\\d+)\\s+days?)?\\s*\"\n                     r\"(?:(-?\\d+)\\s+hours?)?\\s*\"\n                     r\"(?:(-?\\d+)\\s+minutes?)?\\s*\"\n                     r\"(?:(-?\\d+)\\s+seconds?)?\",\n                     string, re.I)\n    days, hours, minutes, seconds = match.groups()\n    result = 0\n    if days is not None:\n        result += 86400 * int(days)\n    if hours is not None:\n        result += 3600 * int(hours)\n    if minutes is not None:\n        result += 60 * int(minutes)\n    if seconds is not None:\n        result += int(seconds)\n    return result\n\nclass Connection(object):\n    def __init__(self, **parameters):\n        self.connection = sqlite3.connect(\n            detect_types=sqlite3.PARSE_DECLTYPES,\n            **parameters)\n        self.connection.create_function(\"regexp\", 2, regexp)\n        self.connection.create_function(\"interval_seconds\", 1, interval_seconds)\n        self.connection.text_factory = str\n        # Foreign keys are disabled by default by SQLite; this enables them.\n        # This is a safe-guard against 
incorrect inserts or updates, but most\n        # importantly, it makes cascading deletes work, which we depend on.\n        self.connection.execute(\"PRAGMA foreign_keys=ON\")\n    def cursor(self):\n        return Cursor(self.connection)\n    def commit(self):\n        return self.connection.commit()\n    def rollback(self):\n        return self.connection.rollback()\n    def close(self):\n        return self.connection.close()\n\ndef connect(**parameters):\n    return Connection(**parameters)\n\ndef import_schema(database_path, filenames, quiet=False, verbose=False):\n    failed = False\n    enumerations = {}\n    commands = []\n    db = sqlite3.connect(database_path)\n\n    for filename in filenames:\n        commands.extend(sqlcommands(filename))\n\n    for command in commands:\n        if command.startswith(\"SET \"):\n            # Skip SET; only used to control the output from psql.\n            continue\n        elif re.match(r\"CREATE (?:UNIQUE )?INDEX \\w+_(?:md5|gin)\", command) \\\n                or re.match(r\"CREATE (?:UNIQUE )?INDEX .* WHERE \", command):\n            # Fancy index stuff not supported by sqlite.  
Since they are\n            # optional (sans performance requirements) we just skip them.\n            continue\n        elif command.startswith(\"CREATE TABLE \") \\\n                or command.startswith(\"CREATE INDEX \") \\\n                or command.startswith(\"CREATE UNIQUE INDEX \") \\\n                or command.startswith(\"CREATE VIEW \") \\\n                or command.startswith(\"INSERT INTO \"):\n            tokens = sqltokens(command)\n            replace(tokens, \"DEFAULT NOW()\", \"DEFAULT (cast(strftime('%s', 'now') as integer))\")\n            replace(tokens, \"TRUE\", \"1\")\n            replace(tokens, \"FALSE\", \"0\")\n            replace(tokens, \"INTERVAL '0'\", \"0\")\n            replace(tokens, \"SERIAL PRIMARY KEY\", \"INTEGER PRIMARY KEY\")\n            replace(tokens, \"commit\", '\"commit\"')\n            replace(tokens, \"transaction\", '\"transaction\"')\n            for name, values in enumerations.items():\n                replace(tokens, \"$1 \" + name, \"$1 text check ($1 in (%s))\" % \", \".join(values))\n            command = \" \".join(tokens)\n        elif re.match(r\"CREATE TYPE \\w+ AS ENUM\", command):\n            tokens = sqltokens(command)\n            name = tokens[2]\n            values = filter(lambda token: re.match(\"'.*'$\", token),\n                            tokens[tokens.index(\"(\") + 1:tokens.index(\")\")])\n            enumerations[name] =  values\n            continue\n        elif command.startswith(\"ALTER TABLE \"):\n            # Used to add constraints after table creation, which sqlite doesn't\n            # support.\n            continue\n        else:\n            print \"Unrecognized:\", command\n            failed = True\n\n        if verbose:\n            words = command.split()\n            for word in words:\n                if word.upper() != word:\n                    print word\n                    break\n                print word,\n\n        try:\n            db.execute(command)\n      
  except Exception as error:\n            print \"Failed:\", command\n            print \"  \" + str(error)\n            failed = True\n\n    if not failed:\n        db.commit()\n\n    return not failed\n"
  },
  {
    "path": "installation/smtp.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport json\n\nimport installation\n\nhost = None\nport = None\nusername = None\npassword = None\nuse_ssl = None\nuse_starttls = None\n\ndef add_arguments(mode, parser):\n    if mode == \"install\":\n        parser.add_argument(\"--smtp-host\", help=\"SMTP server hostname (or IP)\")\n        parser.add_argument(\"--smtp-port\", help=\"SMTP server port\")\n        parser.add_argument(\"--smtp-no-auth\", action=\"store_true\", help=\"no SMTP authentication required\")\n        parser.add_argument(\"--smtp-username\", help=\"SMTP authentication username\")\n        parser.add_argument(\"--smtp-password\", help=\"SMTP authentication password\")\n\n        # Using smtplib.SMTP_SSL()\n        parser.add_argument(\"--smtp-ssl\", dest=\"smtp_use_ssl\", action=\"store_const\", const=True,\n                            help=\"use SSL(/TLS) when connecting to SMTP server\")\n        parser.add_argument(\"--smtp-no-ssl-tls\", dest=\"smtp_use_ssl\", action=\"store_const\", const=False,\n                            help=\"don't use SSL(/TLS) when connecting to SMTP server\")\n\n        # Using smtplib.SMTP() + starttls()\n        parser.add_argument(\"--smtp-starttls\", dest=\"smtp_use_starttls\", action=\"store_const\", const=True,\n                            help=\"use STARTTLS when connecting to SMTP 
 server\")\n        parser.add_argument(\"--smtp-no-starttls\", dest=\"smtp_use_starttls\", action=\"store_const\", const=False,\n                            help=\"don't use STARTTLS when connecting to SMTP server\")\n\n        parser.add_argument(\"--skip-testmail\", action=\"store_true\",\n                            help=\"do not send a test e-mail to verify that given SMTP settings actually work\")\n        parser.add_argument(\"--skip-testmail-check\", action=\"store_true\",\n                            help=\"do not ask whether the test e-mail arrived correctly\")\n\ndef prepare(mode, arguments, data):\n    global host, port, username, password, use_ssl, use_starttls\n\n    if mode == \"install\" or \"installation.smtp.host\" not in data:\n        print \"\"\"\nCritic Installation: SMTP\n=========================\n\nCritic needs an SMTP server to use for outgoing email traffic.  Emails\nare sent to regular Critic users to notify about changes in reviews, as\nwell as to the system administrator to alert about problems.\n\"\"\"\n\n        host = \"localhost\"\n        use_ssl = False\n        use_starttls = False\n\n        def valid_port(value):\n            try:\n                if not (0 < int(value) < 65536):\n                    raise ValueError\n            except ValueError:\n                return \"must be a valid TCP port number\"\n\n\n        if mode == \"install\":\n            if arguments.smtp_use_ssl and arguments.smtp_use_starttls:\n                print \"Invalid arguments: only one of --smtp-ssl and --smtp-starttls can be enabled.\"\n                return False\n            first = True\n        else:\n            # This case, an upgrade where installation.smtp.host is not recorded\n            # in \"data\", happens when upgrading from a pre-5f0389f commit to\n            # 5f0389f or later.  
Since upgrade.py doesn't have --smtp-* command\n            # line arguments, ignore \"arguments\" variable and go straight to\n            # manual input.\n            first = False\n\n        while True:\n            if first and arguments.smtp_use_ssl is not None:\n                use_ssl = arguments.smtp_use_ssl\n            else:\n                use_ssl = installation.input.yes_or_no(\"Use SSL when connecting to the SMTP server?\", default=use_ssl)\n\n            if not use_ssl:\n                if first and arguments.smtp_use_starttls is not None:\n                    use_starttls = arguments.smtp_use_starttls\n                else:\n                    use_starttls = installation.input.yes_or_no(\"Use STARTTLS when connecting to the SMTP server?\", default=use_starttls)\n\n            if first and arguments.smtp_host:\n                host = arguments.smtp_host\n            else:\n                host = installation.input.string(\"SMTP host:\", default=host)\n\n            if first and arguments.smtp_port:\n                error = valid_port(arguments.smtp_port)\n                if error:\n                    print \"Invalid --smtp-port argument: %s.\" % error\n                    return False\n\n                port = arguments.smtp_port\n            else:\n                if port is None:\n                    if use_ssl: port = \"465\"\n                    else: port = \"25\"\n\n                port = installation.input.string(\"SMTP port:\", default=port, check=valid_port)\n\n            need_password = False\n\n            if first and arguments.smtp_username:\n                username = arguments.smtp_username\n                need_password = True\n            elif (not first or not arguments.smtp_no_auth) \\\n                    and installation.input.yes_or_no(\"Does the SMTP server require authentication?\",\n                                                     default=username is not None):\n                username = 
installation.input.string(\"SMTP username:\", default=username)\n                need_password = True\n\n            if need_password:\n                if first and arguments.smtp_password:\n                    password = arguments.smtp_password\n                else:\n                    password = installation.input.password(\"SMTP password:\", default=password, twice=False)\n\n            print\n\n            if (not first or not arguments.skip_testmail) \\\n                    and installation.input.yes_or_no(\"Do you want to send a test email to verify the SMTP configuration?\",\n                                                     default=True if first else None):\n                import smtplib\n                import email.mime.text\n                import email.header\n\n                recipient = installation.input.string(\"To which email address?\", default=installation.admin.email)\n                failed = None\n\n                try:\n                    try:\n                        if use_ssl:\n                            connection = smtplib.SMTP_SSL(host, port, timeout=5)\n                        else:\n                            connection = smtplib.SMTP(host, port, timeout=5)\n                    except:\n                        failed = \"Couldn't connect to the SMTP server.\"\n                        raise\n\n                    if use_starttls:\n                        try:\n                            connection.starttls()\n                        except:\n                            failed = \"Failed to start TLS.\"\n                            raise\n\n                    if username is not None:\n                        try:\n                            connection.login(username, password)\n                        except:\n                            failed = \"Failed to login.\"\n                            raise\n\n                    message = email.mime.text.MIMEText(\"This is the configuration test email from Critic.\",\n           
                                            \"plain\", \"us-ascii\")\n\n                    message[\"From\"] = email.header.Header(\"Critic System <%s>\" % installation.system.email)\n                    message[\"To\"] = email.header.Header(recipient)\n                    message[\"Subject\"] = email.header.Header(\"Test email from Critic\")\n\n                    try:\n                        connection.sendmail(installation.system.email, [recipient], message.as_string())\n                    except:\n                        failed = \"Failed to send the email.\"\n                        raise\n\n                    try:\n                        connection.quit()\n                    except:\n                        failed = \"Failed to close connection.\"\n                        raise\n\n                    print\n                    print \"Test email sent to %s.\" % recipient\n                    print\n                except Exception as exception:\n                    if not failed:\n                        failed = str(exception)\n\n                if failed:\n                    print \"\"\"\nCouldn't send the test email:\n\n  %s\n\nPlease check the configuration!\n\"\"\" % failed\n                elif (first and arguments.skip_testmail_check) \\\n                         or installation.input.yes_or_no(\"Did the test email arrive correctly?\") \\\n                         or not installation.input.yes_or_no(\"Do you want to modify the configuration?\", default=True):\n                    break\n            else:\n                break\n\n            first = False\n\n        port = int(port)\n    else:\n        import configuration\n\n        host = configuration.smtp.HOST\n        port = configuration.smtp.PORT\n        use_ssl = configuration.smtp.USE_SSL\n        use_starttls = configuration.smtp.USE_STARTTLS\n\n        credentials_path = os.path.join(configuration.paths.CONFIG_DIR,\n                                        
\"configuration/smtp-credentials.json\")\n        try:\n            with open(credentials_path) as credentials_file:\n                credentials = json.load(credentials_file)\n\n            username = credentials[\"username\"]\n            password = credentials[\"password\"]\n        except:\n            username = getattr(configuration.smtp, \"USERNAME\")\n            password = getattr(configuration.smtp, \"PASSWORD\")\n\n    data[\"installation.smtp.host\"] = host\n    data[\"installation.smtp.port\"] = port\n    data[\"installation.smtp.username\"] = json.dumps(username)\n    data[\"installation.smtp.password\"] = json.dumps(password)\n    data[\"installation.smtp.use_ssl\"] = use_ssl\n    data[\"installation.smtp.use_starttls\"] = use_starttls\n\n    return True\n\ndef finish(mode, arguments, data):\n    del data[\"installation.smtp.username\"]\n    del data[\"installation.smtp.password\"]\n"
  },
  {
    "path": "installation/system.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport pwd\nimport grp\nimport subprocess\nimport argparse\n\nimport installation\n\nhostname = None\nusername = \"critic\"\nemail = None\nuid = None\n\ngroupname = \"critic\"\ngid = None\n\ncreate_system_user = None\ncreated_system_user = False\ncreate_system_group = None\ncreated_system_group = False\n\ndef fetch_uid_gid():\n    global uid, gid\n\n    uid = pwd.getpwnam(username).pw_uid\n    gid = grp.getgrnam(groupname).gr_gid\n\ndef add_arguments(mode, parser):\n    if mode != \"install\":\n        parser.add_argument(\"--system-recipient\", action=\"append\",\n                            dest=\"system_recipients\", help=argparse.SUPPRESS)\n        return\n\n    parser.add_argument(\"--system-hostname\", action=\"store\",\n                        help=\"FQDN of the system\")\n    parser.add_argument(\"--system-username\", action=\"store\",\n                        help=\"name of system user to run as\")\n    parser.add_argument(\"--force-create-system-user\", action=\"store_true\",\n                        help=(\"don't prompt for permission to create a new \"\n                              \"system user if it doesn't exist\"))\n    parser.add_argument(\"--system-email\", action=\"store\",\n                        help=\"address used as sender of emails\")\n    
parser.add_argument(\"--system-groupname\", action=\"store\",\n                        help=\"name of system group to run as\")\n    parser.add_argument(\"--force-create-system-group\", action=\"store_true\",\n                        help=(\"don't prompt for permission to create a new \"\n                              \"system group if it doesn't exist\"))\n    parser.add_argument(\"--system-recipient\", action=\"append\",\n                        dest=\"system_recipients\", metavar=\"SYSTEM_RECIPIENT\",\n                        help=(\"email recipient of automatic messages from \"\n                              \"the system\"))\n\ndef prepare(mode, arguments, data):\n    global hostname, username, email, create_system_user\n    global groupname, create_system_group\n    global uid, gid\n\n    if mode == \"install\":\n        print \"\"\"\nCritic Installation: System\n===========================\n\"\"\"\n\n        if arguments.system_hostname: hostname = arguments.system_hostname\n        else:\n            try: hostname = subprocess.check_output([\"hostname\", \"--fqdn\"]).strip()\n            except: pass\n\n            hostname = installation.input.string(prompt=\"What is the machine's FQDN?\",\n                                                        default=hostname)\n\n        while True:\n            if arguments.system_username: username = arguments.system_username\n            else:\n                username = installation.input.string(prompt=\"What system user should Critic run as?\",\n                                                            default=username)\n\n            try:\n                pwd.getpwnam(username)\n                user_exists = True\n            except:\n                user_exists = False\n\n            if user_exists:\n                print \"\"\"\nThe system user '%s' already exists.\n\"\"\" % username\n\n                if installation.input.yes_or_no(prompt=\"Use the existing system user '%s'?\" % username,\n                    
                            default=True):\n                    create_system_user = False\n                    break\n            else:\n                print \"\"\"\nThe system user '%s' doesn't exist.\n\"\"\" % username\n\n                if arguments.force_create_system_user or installation.input.yes_or_no(prompt=\"Create a system user named '%s'?\" % username,\n                                                default=True):\n                    create_system_user = True\n                    break\n\n        while True:\n            if arguments.system_groupname: groupname = arguments.system_groupname\n            else:\n                groupname = installation.input.string(prompt=\"What system group should Critic run as?\",\n                                                            default=groupname)\n\n            try:\n                grp.getgrnam(groupname)\n                group_exists = True\n            except:\n                group_exists = False\n\n            if group_exists:\n                print \"\"\"\nThe system group '%s' already exists.\n\"\"\" % groupname\n\n                if installation.input.yes_or_no(prompt=\"Use the existing system group '%s'?\" % groupname,\n                                                default=True):\n                    create_system_group = False\n                    break\n            else:\n                print \"\"\"\nThe system group '%s' doesn't exist.\n\"\"\" % groupname\n\n                if arguments.force_create_system_group or installation.input.yes_or_no(prompt=\"Create a system group named '%s'?\" % groupname,\n                                                default=True):\n                    create_system_group = True\n                    break\n\n        if arguments.system_email: email = arguments.system_email\n        else:\n            email = installation.input.string(prompt=\"What address should be used as the sender of emails from the system?\",\n                                            
  default=(\"%s@%s\" % (username, hostname)))\n    else:\n        import configuration\n\n        hostname = configuration.base.HOSTNAME\n        username = configuration.base.SYSTEM_USER_NAME\n        email = configuration.base.SYSTEM_USER_EMAIL\n\n        try: groupname = configuration.base.SYSTEM_GROUP_NAME\n        except AttributeError: groupname = data[\"installation.system.groupname\"]\n\n        fetch_uid_gid()\n\n    data[\"installation.system.hostname\"] = hostname\n    data[\"installation.system.username\"] = username\n    data[\"installation.system.email\"] = email\n    data[\"installation.system.groupname\"] = groupname\n\n    return True\n\ndef install(data):\n    global uid, gid\n\n    if create_system_group:\n        print \"Creating group '%s' ...\" % groupname\n\n        subprocess.check_call([\"addgroup\", \"--quiet\", \"--system\", groupname])\n\n    if create_system_user:\n        print \"Creating user '%s' ...\" % username\n\n        subprocess.check_call(\n            [\"adduser\", \"--quiet\", \"--system\", \"--ingroup=%s\" % groupname,\n             \"--home=%s\" % installation.paths.data_dir, \"--disabled-login\",\n             username])\n\n    uid = pwd.getpwnam(username).pw_uid\n    gid = grp.getgrnam(groupname).gr_gid\n\n    return True\n\ndef undo():\n    if created_system_user:\n        print \"Deleting user '%s' ...\" % username\n\n        subprocess.check_call([\"deluser\", \"--system\", username])\n\n    if created_system_group:\n        print \"Deleting group '%s' ...\" % groupname\n\n        subprocess.check_call([\"delgroup\", \"--system\", groupname])\n"
  },
  {
    "path": "installation/templates/apache/site.both",
    "content": "WSGIApplicationGroup %%{GLOBAL}\nWSGIProcessGroup critic-main\nWSGIDaemonProcess critic-main processes=2 \\\n                              threads=25 \\\n                              home=%(installation.paths.install_dir)s \\\n                              python-path=%(installation.paths.etc_dir)s/main:%(installation.paths.install_dir)s \\\n                              user=%(installation.system.username)s \\\n                              group=%(installation.system.groupname)s\n\n<VirtualHost *:80>\n\tServerAdmin %(installation.admin.email)s\n\tServerName %(installation.system.hostname)s\n\n\tWSGIImportScript %(installation.paths.install_dir)s/wsgistartup.py \\\n\t                 process-group=critic-main \\\n\t                 application-group=%%{GLOBAL}\n\n\tWSGIScriptAlias / %(installation.paths.install_dir)s/wsgi.py\n\n        WSGIPassAuthorization %(installation.apache.pass_auth)s\n\n\t# Possible values include: debug, info, notice, warn, error, crit,\n\t# alert, emerg.\n\tLogLevel warn\n\n\tErrorLog %(installation.paths.log_dir)s/main/error.log\n\tCustomLog %(installation.paths.log_dir)s/main/access.log combined\n\n\tAlias /static-resource/ \"%(installation.paths.install_dir)s/resources/\"\n\t<Directory \"%(installation.paths.install_dir)s/resources\">\n\t\tOptions FollowSymLinks\n\t\tAllowOverride None\n\t\tOrder allow,deny\n\t\tAllow from all\n\t\tExpiresActive On\n\t\tExpiresDefault A2592000\n\t</Directory>\n</VirtualHost>\n\n<VirtualHost *:443>\n\tServerAdmin %(installation.admin.email)s\n\tServerName %(installation.system.hostname)s\n\n\t# Uncomment these lines, and create a suitable ssl.conf file, to\n\t# actually enable SSL support.\n\t#SSLEngine on\n\t#Include /etc/apache2/ssl/ssl.conf\n\n\tWSGIImportScript %(installation.paths.install_dir)s/wsgistartup.py \\\n\t                 process-group=critic-main \\\n\t                 application-group=%%{GLOBAL}\n\n\tWSGIScriptAlias / %(installation.paths.install_dir)s/wsgi.py\n\n      
  WSGIPassAuthorization %(installation.apache.pass_auth)s\n\n\t# Possible values include: debug, info, notice, warn, error, crit,\n\t# alert, emerg.\n\tLogLevel warn\n\n\tErrorLog %(installation.paths.log_dir)s/main/error.log\n\tCustomLog %(installation.paths.log_dir)s/main/access.log combined\n\n\tAlias /static-resource/ \"%(installation.paths.install_dir)s/resources/\"\n\t<Directory \"%(installation.paths.install_dir)s/resources\">\n\t\tOptions FollowSymLinks\n\t\tAllowOverride None\n\t\tOrder allow,deny\n\t\tAllow from all\n\t\tExpiresActive On\n\t\tExpiresDefault A2592000\n\t</Directory>\n</VirtualHost>\n"
  },
  {
    "path": "installation/templates/apache/site.http",
    "content": "<VirtualHost *:80>\n\tServerAdmin %(installation.admin.email)s\n\tServerName %(installation.system.hostname)s\n\n\tWSGIApplicationGroup %%{GLOBAL}\n\tWSGIProcessGroup critic-main\n\tWSGIDaemonProcess critic-main processes=2 \\\n\t                              threads=25 \\\n\t                              home=%(installation.paths.install_dir)s \\\n\t                              python-path=%(installation.paths.etc_dir)s/main:%(installation.paths.install_dir)s \\\n\t                              user=%(installation.system.username)s \\\n\t                              group=%(installation.system.groupname)s\n\n\tWSGIImportScript %(installation.paths.install_dir)s/wsgistartup.py \\\n\t                 process-group=critic-main \\\n\t                 application-group=%%{GLOBAL}\n\n\tWSGIScriptAlias / %(installation.paths.install_dir)s/wsgi.py\n\n        WSGIPassAuthorization %(installation.apache.pass_auth)s\n\n\t# Possible values include: debug, info, notice, warn, error, crit,\n\t# alert, emerg.\n\tLogLevel warn\n\n\tErrorLog %(installation.paths.log_dir)s/main/error.log\n\tCustomLog %(installation.paths.log_dir)s/main/access.log combined\n\n\tAlias /static-resource/ \"%(installation.paths.install_dir)s/resources/\"\n\t<Directory \"%(installation.paths.install_dir)s/resources\">\n\t\tOptions FollowSymLinks\n\t\tAllowOverride None\n\t\tOrder allow,deny\n\t\tAllow from all\n\t\tExpiresActive On\n\t\tExpiresDefault A2592000\n\t</Directory>\n</VirtualHost>\n"
  },
  {
    "path": "installation/templates/apache/site.https",
    "content": "<VirtualHost *:80>\n\tServerAdmin %(installation.admin.email)s\n\tServerName %(installation.system.hostname)s\n\n\tRewriteEngine on\n\tRewriteCond %%{HTTP_HOST} (.*)\n\tRewriteRule ^(.*) https://%%1$1 [L,R,NE]\n</VirtualHost>\n\n<VirtualHost *:443>\n\tServerAdmin %(installation.admin.email)s\n\tServerName %(installation.system.hostname)s\n\n\t# Uncomment these lines, and create a suitable ssl.conf file, to\n\t# actually enable SSL support.\n\t#SSLEngine on\n\t#Include /etc/apache2/ssl/ssl.conf\n\n\tWSGIApplicationGroup %%{GLOBAL}\n\tWSGIProcessGroup critic-main\n\tWSGIDaemonProcess critic-main processes=2 \\\n\t                              threads=25 \\\n\t                              home=%(installation.paths.install_dir)s \\\n\t                              python-path=%(installation.paths.etc_dir)s/main:%(installation.paths.install_dir)s \\\n\t                              user=%(installation.system.username)s \\\n\t                              group=%(installation.system.groupname)s\n\n\tWSGIImportScript %(installation.paths.install_dir)s/wsgistartup.py \\\n\t                 process-group=critic-main \\\n\t                 application-group=%%{GLOBAL}\n\n\tWSGIScriptAlias / %(installation.paths.install_dir)s/wsgi.py\n\n        WSGIPassAuthorization %(installation.apache.pass_auth)s\n\n\t# Possible values include: debug, info, notice, warn, error, crit,\n\t# alert, emerg.\n\tLogLevel warn\n\n\tErrorLog %(installation.paths.log_dir)s/main/error.log\n\tCustomLog %(installation.paths.log_dir)s/main/access.log combined\n\n\tAlias /static-resource/ \"%(installation.paths.install_dir)s/resources/\"\n\t<Directory \"%(installation.paths.install_dir)s/resources\">\n\t\tOptions FollowSymLinks\n\t\tAllowOverride None\n\t\tOrder allow,deny\n\t\tAllow from all\n\t\tExpiresActive On\n\t\tExpiresDefault A2592000\n\t</Directory>\n</VirtualHost>\n"
  },
  {
    "path": "installation/templates/configuration/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport base\nimport auth\nimport paths\nimport services\nimport mimetypes\nimport executables\nimport limits\nimport extensions\nimport database\nimport smtp\nimport debug\n"
  },
  {
    "path": "installation/templates/configuration/auth.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# Accepted password hash schemes.  They need to be supported by the passlib\n# Python package; see http://packages.python.org/passlib for details.\nPASSWORD_HASH_SCHEMES = %(installation.config.password_hash_schemes)r\n\n# Default password hash scheme.  Must be included in PASSWORD_HASH_SCHEMES.\nDEFAULT_PASSWORD_HASH_SCHEME = %(installation.config.default_password_hash_scheme)r\n\n# (Approximate) minimum password hash time in seconds.  Higher means safer\n# passwords (more difficult to decrypt using brute-force) but slower sign-in\n# operation.\nMINIMUM_PASSWORD_HASH_TIME = %(installation.config.minimum_password_hash_time)r\n\n# Calibrated minimum rounds per password hash scheme.\nMINIMUM_ROUNDS = %(installation.config.minimum_rounds)r\n\n# External authentication providers.\nPROVIDERS = {\n\n    # GitHub OAuth-based authentication.\n    \"github\": {\n        \"enabled\": %(installation.config.provider_github.enabled)r,\n\n        # Allow authenticated user to create a Critic user.\n        \"allow_user_registration\": %(installation.config.provider_github.allow_user_registration)r,\n        # Verify user email addresses provided by GitHub.\n        \"verify_email_addresses\": %(installation.config.provider_github.verify_email_addresses)r,\n\n        # Client ID and secret.  
These are generated by registering an\n        # application at https://github.com/settings/applications/new.\n        \"client_id\": %(installation.config.provider_github.client_id)r,\n        \"client_secret\": %(installation.config.provider_github.client_secret)r,\n\n        # Bypass /createuser on first sign in, creating a user automatically.\n        \"bypass_createuser\": %(installation.config.provider_github.bypass_createuser)r,\n\n        # Authentication callback URI.  This same URI must be provided\n        # to GitHub when registering the application.  The path\n        # component must be \"/oauth/github\".\n        \"redirect_uri\": %(installation.config.provider_github.redirect_uri)r\n    },\n\n    # Google OAuth-based authentication.\n    \"google\": {\n        \"enabled\": %(installation.config.provider_google.enabled)r,\n\n        # Allow authenticated user to create a Critic user.\n        \"allow_user_registration\": %(installation.config.provider_google.allow_user_registration)r,\n        # Verify user email addresses provided by Google.\n        \"verify_email_addresses\": %(installation.config.provider_google.verify_email_addresses)r,\n\n        # Client ID and secret.  These are generated by creating a project at\n        # https://cloud.google.com/console/project, and then creating an OAuth2\n        # client id using the project administration UI.\n        \"client_id\": %(installation.config.provider_google.client_id)r,\n        \"client_secret\": %(installation.config.provider_google.client_secret)r,\n\n        # Bypass /createuser on first sign in, creating a user automatically.\n        \"bypass_createuser\": %(installation.config.provider_google.bypass_createuser)r,\n\n        # Authentication callback URI.  This same URI must be provided\n        # to Google when creating the OAuth2 client id.  
The path\n        # component must be \"/oauth/google\".\n        \"redirect_uri\": %(installation.config.provider_google.redirect_uri)r\n    },\n\n}\n\n# Authentication databases.\nDATABASES = {\n\n    # Using Critic's own user database for authentication.\n    \"internal\": {},\n\n    # Using an LDAP database for authentication.\n    \"ldap\": {\n        # Input fields.\n        #\n        # Each element is a tuple containing:\n        #  [0]: True if the field should use <input type=password>\n        #  [1]: Internal field identifier\n        #  [2]: Field label\n        #  [3]: (Optional) Longer description / help text\n        \"fields\": [\n            (False, \"username\", \"Username:\"),\n            (True, \"password\", \"Password:\"),\n        ],\n\n        # LDAP server URL.\n        \"url\": \"%(installation.config.ldap_url)s\",\n\n        # Use TLS when connecting to LDAP server.\n        \"use_tls\": True,\n\n        # Credentials field.\n        #\n        # Identifier of the field whose value will be used as the credentials\n        # (e.g. password) in the bind request used for authentication.\n        \"credentials\": \"password\",\n\n        # The following two values are all interpreted as Python format strings\n        # that can reference field values, e.g. using \"%%(username)s\".  
The input\n        # values will have been escaped for safe usage in LDAP expressions.\n\n        # LDAP search base.\n        \"search_base\": \"%(installation.config.ldap_search_base)s\",\n\n        # LDAP search filter.\n        \"search_filter\": \"(uid=%%(username)s)\",\n\n        # The following settings control if and how Critic user records are\n        # created after successful authentication of a user.\n\n        # If true, Critic user records are created automatically if\n        # authentication succeeds but a matching record is not found.\n        \"create_user\": %(installation.config.ldap_create_user)r,\n\n        # User name LDAP attribute.\n        #\n        # This is the LDAP attribute whose value is used as the Critic username,\n        # both when looking for an existing user record and when creating a new\n        # one (if one isn't found.)\n        #\n        # If the attribute is missing or empty it will be considered an\n        # authentication error.\n        \"username_attribute\": \"%(installation.config.ldap_username_attribute)s\",\n\n        # Full name LDAP attribute.\n        #\n        # This is the LDAP attribute to use as the (initial) full name when\n        # creating a new Critic user record.  It is not used if an existing user\n        # record is found.\n        #\n        # If the attribute is missing or empty, the user is created with the\n        # username as full name.\n        \"fullname_attribute\": \"%(installation.config.ldap_fullname_attribute)s\",\n\n        # Email LDAP attribute.\n        #\n        # This is the LDAP attribute to use as the (initial) primary email\n        # address when creating a new Critic user record.  
It is not used if an\n        # existing user record is found.\n        #\n        # If the attribute is missing or empty, the user is created with no\n        # primary email address.\n        \"email_attribute\": \"%(installation.config.ldap_email_attribute)s\",\n\n        # List of required LDAP groups.\n        #\n        # If the list is empty, no group membership is required.\n        \"require_groups\": [\n\n            # {\n            #     # Distinguished name of the required group.\n            #     \"dn\": \"cn=SomeGroup,ou=Groups,dc=example,dc=com\",\n            #\n            #     # Group attribute containing the list of members.\n            #     \"members_attribute\": \"memberUid\",\n            #\n            #     # Value to search for in the list of members.\n            #     #\n            #     # The value is interpreted as a Python format string, and can\n            #     # reference field values.  It can also reference the\n            #     # distinguished name of the user signing in as \"%%(dn)s\".\n            #     \"member_value\": \"%%(username)s\",\n            # },\n\n        ],\n\n        # Maximum age of cached successful authentication attempts, in seconds.\n        # If set to zero, caching is disabled altogether.\n        \"cache_max_age\": %(installation.config.ldap_cache_max_age)r,\n    },\n\n}\n\nDATABASE = %(installation.config.auth_database)r\n\nENABLE_ACCESS_TOKENS = %(installation.config.enable_access_tokens)r\n"
  },
  {
    "path": "installation/templates/configuration/base.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# The name of the system identity which this configuration applies to.\nSYSTEM_IDENTITY = \"main\"\n\n# The name of the system user that Critic runs as.\nSYSTEM_USER_NAME = \"%(installation.system.username)s\"\n\n# The email address to use in the \"Sender:\" header in all generated\n# emails, and in the \"From:\" header in emails unless there's a real\n# user whose email address it makes sense to use instead.\nSYSTEM_USER_EMAIL = \"%(installation.system.email)s\"\n\n# The name of the system group that Critic runs as.\nSYSTEM_GROUP_NAME = \"%(installation.system.groupname)s\"\n\n# List of recipients of system messages such as automatic error\n# reports generated when unexpected errors occur.\nSYSTEM_RECIPIENTS = %(installation.system.recipients)r\n\n# The primary FQDN of the server.  This is used when generating message IDs for\n# emails, and should *not* be different in different system identities, since\n# then email threading will not work properly.\nHOSTNAME = \"%(installation.system.hostname)s\"\n\n# The way Critic identifies/authenticates users: \"host\", \"critic\" or the name of\n# one of the supported external authentication providers.\nAUTHENTICATION_MODE = \"%(installation.config.auth_mode)s\"\n\n# If AUTHENTICATION_MODE=\"critic\", type of session: \"httpauth\" or \"cookie\".  
If\n# AUTHENTICATION_MODE=<external>, this must be \"cookie\".\nSESSION_TYPE = \"%(installation.config.session_type)s\"\n\n# If AUTHENTICATION_MODE!=\"host\" and SESSION_TYPE=\"cookie\", maximum age of\n# session in seconds.  Zero means no maximum age; session is valid until user\n# logs out.\nSESSION_MAX_AGE = 0\n\n# Allow (restricted) anonymous access to the system.  Only supported if\n# AUTHENTICATION_MODE!=\"host\" and SESSION_TYPE=\"cookie\".\nALLOW_ANONYMOUS_USER = %(installation.config.allow_anonymous_user)r\n\n# Web server (HTTPD) integration.\n#\n# Supported alternatives:\n#\n#  \"apache\"      => Apache (2.2 or 2.4) with mod_wsgi.\n#  \"nginx+uwsgi\" => nginx with uWSGI as WSGI back-end.\n#  \"uwsgi\"       => uWSGI as both HTTP front-end and WSGI back-end.\n#  \"none\"        => no supported integration (manual by system administrator).\nWEB_SERVER_INTEGRATION = \"%(installation.config.web_server_integration)s\"\n\n# Access scheme: \"http\", \"https\" or \"both\".\nACCESS_SCHEME = \"%(installation.config.access_scheme)s\"\n\n# Supported repository URL types (when displayed in UI and in emails):\n#\n#  \"git\"  => \"git://hostname/path.git\"\n#  \"http\" => \"http://hostname/path.git\" or \"https://hostname/path.git\"\n#  \"ssh\"  => \"ssh://hostname/path.git\"\n#  \"host\" => \"hostname:/path.git\"\n#\n# where 'hostname' is the system's FQDN and 'path.git' is the repository's path\n# relative configuration.paths.GIT_DIR.\n#\n# The 'http' choice means HTTP or HTTPS depending on the ACCESS_SCHEME setting\n# and whether the user is anonymous or not.\n#\n# Note: Only 'http' is currently supported natively by Critic.  For 'git' to\n# work, the system administrator must configure 'git daemon' to run manually.\n# For 'ssh' and 'host' to work (they mean the same thing, only with different\n# syntax) system user accounts must be created, and SSH access provided.  
See\n# the system administration tutorial for more information.\nREPOSITORY_URL_TYPES = %(installation.config.repository_url_types)r\n\n# Default encodings to attempt to decode text (such as source code)\n# as, in order of decreasing precedence.  The encoding names should be\n# valid for use as the encoding argument to Python's str.decode()\n# function.\nDEFAULT_ENCODINGS = %(installation.config.default_encodings)r\n\n# Allow (unattended) user registration.  If False, user registration can still\n# be enabled for a specific external user authentication provider; see auth.py.\nALLOW_USER_REGISTRATION = %(installation.config.allow_user_registration)r\n\n# Regular expression (source) that user names provided by new users must match.\n# A None value is equivalent to a pattern that matches all strings.  Empty user\n# names or user names containing only white-space characters are not allowed,\n# regardless this setting.\n#\n# Note that users created by the system administrator using criticctl are\n# not subjected to this restriction.\nUSER_NAME_PATTERN = r\"^\\w[-\\._\\w]*\\w$\"\n\n# Description of above pattern, shown to the user if a provided user name fails\n# to match the pattern.\nUSER_NAME_PATTERN_DESCRIPTION = (\n    \"Must contain only alpha-numerics, periods, underscores or dashes, and \"\n    \"must start and end with alpha-numerics, and be at least two characters \"\n    \"long.\")\n\n# Require verification of email addresses provided by users before sending\n# emails to them.  This does not affect email addresses set by the system\n# administrator via the 'criticctl' utility or via the web interface.\nVERIFY_EMAIL_ADDRESSES = %(installation.config.verify_email_addresses)r\n\n# Archive review branches belonging to closed or dropped reviews a configurable\n# amount of time after the review was closed or dropped.  Archiving a review\n# branch means deleting the review branch ref, but the commits on the branch are\n# kept alive for all eternity.  
An archived branch can be resurrected at any\n# time, and is resurrected automatically should the review be reopened.\n#\n# Archiving branches avoids an ever increasing number of refs in repositories\n# that over time leads to performance degradation.\nARCHIVE_REVIEW_BRANCHES = %(installation.config.archive_review_branches)r\n"
  },
  {
    "path": "installation/templates/configuration/database.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# The database server: \"postgresql\" for a real install, \"sqlite\" for a quick-\n# start.\nDRIVER = %(installation.database.driver)r\n\n# Dictionary whose members are passed as keyword arguments to\n# psycopg2.connect().\nPARAMETERS = %(installation.database.parameters)r\n"
  },
  {
    "path": "installation/templates/configuration/debug.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# True if this is a development installation of the system; False if it is a\n# production installation.  Causes stack traces from unexpected exceptions to be\n# displayed to all users rather than only to those with the \"developer\" role.\n# Also changes the favicon and the color of the \"Opera\" text in page headers\n# from red to black.\nIS_DEVELOPMENT = %(installation.config.is_development)r\n\n# True if this is an installation by the automatic testing framework.\nIS_TESTING = %(installation.config.is_testing)r\n\n# True if this is an instance started using the installation/quickstart.py\n# script.\nIS_QUICKSTART = %(installation.config.is_quickstart)r\n\n# Directory to write code coverage results to.  If None, code coverage is not\n# written, and more importantly, not measured in the first place.\nCOVERAGE_DIR = %(installation.config.coverage_dir)r\n"
  },
  {
    "path": "installation/templates/configuration/executables.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# Python executable.\nPYTHON = \"%(installation.prereqs.python)s\"\n\n# Git executable.\nGIT = \"%(installation.prereqs.git)s\"\n\n# Add these to the environment when running Git commands\nGIT_ENV = { }\n\n# Tar executable.\nTAR = \"%(installation.prereqs.tar)s\"\n\n# JSShell executable (only needed for extensions support.)\nJSSHELL = None\n"
  },
  {
    "path": "installation/templates/configuration/extensions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\nimport os.path\n\n# Whether extension support is enabled.  If False, the rest of the\n# configuration in this file is irrelevant.\nENABLED = %(installation.extensions.enabled)s\n\n# Where to search for system extensions.\nSYSTEM_EXTENSIONS_DIR = os.path.join(configuration.paths.DATA_DIR, \"extensions\")\n\n# Name of directory under users' $HOME in which to search for user extensions.\n# If set to None, user extensions support is disabled.\nUSER_EXTENSIONS_DIR = \"CriticExtensions\"\n\nFLAVORS = {\n    \"js/v8\":\n        { \"executable\": \"%(installation.extensions.critic_v8_jsshell)s\",\n          \"library\": os.path.join(configuration.paths.INSTALL_DIR, \"library\", \"js\", \"v8\") }\n    }\n\nDEFAULT_FLAVOR = \"%(installation.extensions.default_flavor)s\"\n\n# Directory into which extension version snapshots are installed.\nINSTALL_DIR = os.path.join(configuration.paths.DATA_DIR, \"extension-snapshots\")\n\n# Directory into which extension repository work copies are created.\nWORKCOPY_DIR = os.path.join(configuration.paths.DATA_DIR, \"temporary\", \"EXTENSIONS\")\n\n# Long timeout, in seconds.  Used for extension \"Page\" roles.\nLONG_TIMEOUT = 300\n# Short timeout, in seconds.  Used for all other roles.\nSHORT_TIMEOUT = 5\n"
  },
  {
    "path": "installation/templates/configuration/limits.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# When a file is added, and it has more lines than these limits,\n# output a \"File was added.\" placeholder instead of the entire file,\n# with an option to fetch all the lines.  The _RECOGNIZED option is\n# for a file in a recognized (and syntax highlighted) language, the\n# _UNRECOGNIZED option is for other files.\nMAXIMUM_ADDED_LINES_RECOGNIZED = 8000\nMAXIMUM_ADDED_LINES_UNRECOGNIZED = 2000\n\n# Reject any single ref update that causes more than this many new\n# commits to be added to the repository.  The likely cause for hitting\n# this limit is pushing a branch to the wrong repository, which just\n# causes bloat in the receiving repository.\nPUSH_COMMIT_LIMIT = 10000\n\n# For branches containing more commits than this, fall back to simpler\n# branch log rendering for performance reasons.\nMAXIMUM_REACHABLE_COMMITS = 4000\n\n# Maximum number of commits when /createreview is loaded with the\n# 'branch' URI parameter to create a review of all commits on a branch.\nMAXIMUM_REVIEW_COMMITS = 2000\n"
  },
  {
    "path": "installation/templates/configuration/mimetypes.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nMIMETYPES = { \"txt\": \"text/plain\",\n              \"html\": \"text/html\",\n              \"xml\": \"application/xml\",\n              \"xsl\": \"application/xml\",\n              \"svg\": \"image/svg+xml\",\n              \"css\": \"text/css\",\n              \"js\": \"text/javascript\",\n              \"json\": \"application/json\",\n              \"png\": \"image/png\",\n              \"gif\": \"image/gif\",\n              \"jpg\": \"image/jpeg\",\n              \"ico\": \"image/vnd.microsoft.icon\",\n              \"bmp\": \"image/x-bmp\" }\n"
  },
  {
    "path": "installation/templates/configuration/paths.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\nimport os.path\n\n# Directory where system configuration is stored.\nCONFIG_DIR = os.path.join(\"%(installation.paths.etc_dir)s\", configuration.base.SYSTEM_IDENTITY)\n\n# Directory where the main system is installed.\nINSTALL_DIR = \"%(installation.paths.install_dir)s\"\n\n# Directory under which data files of a more permanent nature are\n# stored.\nDATA_DIR = \"%(installation.paths.data_dir)s\"\n\n# Directory under which cache files that can be discarded at any time\n# are stored.\nCACHE_DIR = os.path.join(\"%(installation.paths.cache_dir)s\", configuration.base.SYSTEM_IDENTITY)\n\n# Directory in which log files are stored.\nLOG_DIR = os.path.join(\"%(installation.paths.log_dir)s\", configuration.base.SYSTEM_IDENTITY)\n\n# Directory in which pid files are stored.\nRUN_DIR = os.path.join(\"%(installation.paths.run_dir)s\", configuration.base.SYSTEM_IDENTITY)\n\n# Directory in which WSGI daemon process pid files are stored.\nWSGI_PIDFILE_DIR = os.path.join(RUN_DIR, \"wsgi\")\n\n# Directory in which Unix socket files are created.\nSOCKETS_DIR = os.path.join(RUN_DIR, \"sockets\")\n\n# Directory where the main (public) git repositories are stored.\nGIT_DIR = \"%(installation.paths.git_dir)s\"\n\n# Directory in which emails are stored pending delivery.\nOUTBOX = 
os.path.join(DATA_DIR, \"outbox\")\n"
  },
  {
    "path": "installation/templates/configuration/services.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\nimport os.path\n\ndef service(name, address=0, module=0, pidfile_path=0, logfile_path=0, loglevel=0):\n    if address      == 0: address      = os.path.join(configuration.paths.SOCKETS_DIR, name + \".unix\")\n    if module       == 0: module       = \"background.\" + name\n    if pidfile_path == 0: pidfile_path = os.path.join(configuration.paths.RUN_DIR, name + \".pid\")\n    if logfile_path == 0: logfile_path = os.path.join(configuration.paths.LOG_DIR, name + \".log\")\n    if loglevel     == 0: loglevel     = \"info\"\n\n    return { \"name\": name,\n             \"address\": address,\n             \"module\": module,\n             \"pidfile_path\": pidfile_path,\n             \"logfile_path\": logfile_path,\n             \"loglevel\": loglevel }\n\nHIGHLIGHT         = service(name=\"highlight\")\nCHANGESET         = service(name=\"changeset\")\nGITHOOK           = service(name=\"githook\")\nBRANCHTRACKER     = service(name=\"branchtracker\",     address=None)\nMAILDELIVERY      = service(name=\"maildelivery\",      address=None)\nWATCHDOG          = service(name=\"watchdog\",          address=None)\nMAINTENANCE       = service(name=\"maintenance\",       address=None)\nEXTENSIONTASKS    = service(name=\"extensiontasks\",    address=None)\nEXTENSIONRUNNER   = 
service(name=\"extensionrunner\")\nSERVICEMANAGER    = service(name=\"servicemanager\")\n\nHIGHLIGHT[\"cache_dir\"] = os.path.join(configuration.paths.CACHE_DIR, \"highlight\")\nHIGHLIGHT[\"min_context_length\"] = 5\nHIGHLIGHT[\"max_context_length\"] = 256\nHIGHLIGHT[\"max_workers\"] = %(installation.config.highlight.max_workers)d\nHIGHLIGHT[\"compact_at\"] = (3, 15)\n\nCHANGESET[\"max_workers\"] = %(installation.config.changeset.max_workers)d\nCHANGESET[\"rss_limit\"] = 1024 ** 3\nCHANGESET[\"purge_at\"] = (2, 15)\n\n# Timeout (in seconds) passed to smtplib.SMTP().\nMAILDELIVERY[\"timeout\"] = 10\n\nWATCHDOG[\"rss_soft_limit\"] = 1024 ** 3\nWATCHDOG[\"rss_hard_limit\"] = 2 * WATCHDOG[\"rss_soft_limit\"]\n\nMAINTENANCE[\"maintenance_at\"] = (4, 0)\n\nEXTENSIONRUNNER[\"cached_processes\"] = 5\n\nSERVICEMANAGER[\"services\"] = [HIGHLIGHT,\n                              CHANGESET,\n                              GITHOOK,\n                              BRANCHTRACKER,\n                              MAILDELIVERY,\n                              WATCHDOG,\n                              MAINTENANCE,\n                              EXTENSIONTASKS,\n                              EXTENSIONRUNNER]\n"
  },
  {
    "path": "installation/templates/configuration/smtp-credentials.json",
    "content": "{ \"username\": %(installation.smtp.username)s,\n  \"password\": %(installation.smtp.password)s }"
  },
  {
    "path": "installation/templates/configuration/smtp.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nHOST = %(installation.smtp.host)r\nPORT = %(installation.smtp.port)r\nUSE_SSL = %(installation.smtp.use_ssl)r\nUSE_STARTTLS = %(installation.smtp.use_starttls)r\n\nMAX_ATTEMPTS = 10\n"
  },
  {
    "path": "installation/templates/criticctl",
    "content": "#!%(installation.prereqs.python)s\n# -*- mode: python -*-\n\nimport sys\nimport argparse\nimport os\nimport pwd\nimport grp\n\n# To avoid accidentally creating files owned by root.\nsys.dont_write_bytecode = True\n\nsystem_user_uid = pwd.getpwnam(\"%(installation.system.username)s\").pw_uid\nsystem_user_gid = grp.getgrnam(\"%(installation.system.groupname)s\").gr_gid\n\ntry:\n    os.setegid(system_user_gid)\n    os.seteuid(system_user_uid)\nexcept OSError:\n    print >>sys.stderr, \"ERROR: Failed to set UID = %(installation.system.username)s. Run as root?\"\n    sys.exit(1)\n\nargv = sys.argv[1:]\n\nglobal_argv = []\ncommand = None\ncommand_argv = []\n\nparser = argparse.ArgumentParser(\n    description=\"Critic administration interface\",\n    usage=\"%%(prog)s [-h] [--etc-dir ETC_DIR] [--identify IDENTITY] COMMAND [options]\",\n    add_help=False)\n\nparser.add_argument(\"--help\", \"-h\", action=\"store_true\",\n                    help=\"show this help message and exit\")\nparser.add_argument(\"--etc-dir\", \"-e\", default=\"%(installation.paths.etc_dir)s\",\n                    help=\"Critic configuration directory [default=%(installation.paths.etc_dir)s]\")\nparser.add_argument(\"--identity\", \"-i\", default=\"main\",\n                    help=\"system identity to manage [default=main]\")\n\nwhile argv:\n    argument = argv[0]\n    if argument in (\"--help\", \"-h\"):\n        global_argv.append(argument)\n        del argv[0]\n        continue\n    elif argument in (\"--etc-dir\", \"-e\",\n                      \"--identity\", \"-i\"):\n        global_argv.extend(argv[:2])\n        del argv[:2]\n        continue\n    elif argument.startswith(\"--etc-dir=\") \\\n            or argument.startswith(\"-e\") \\\n            or argument.startswith(\"--identity=\") \\\n            or argument.startswith(\"-i\"):\n        global_argv.append(argument)\n        del argv[0]\n        continue\n    elif argument.startswith(\"-\"):\n        # Invalid 
argument; add it to global_argv so that parser.parse_args()\n        # below fails.\n        global_argv.append(argument)\n\n    break\n\nif argv:\n    command = argv[0]\n    command_argv = argv[1:]\n\narguments = parser.parse_args(global_argv)\n\netc_path = os.path.join(arguments.etc_dir, arguments.identity)\n\nclass Error(Exception):\n    pass\n\ntry:\n    if not os.access(arguments.etc_dir, os.R_OK | os.X_OK):\n        raise Error(\"Directory is inaccessible: %%s\" %% arguments.etc_dir)\n\n    if not os.path.isdir(etc_path):\n        raise Error(\"Invalid identity: %%s\" %% arguments.identity)\n\n    sys.path.insert(0, etc_path)\n\n    try:\n        import configuration\n    except ImportError:\n        raise Error(\"Failed to import: configuration\")\n\n    sys.path.insert(1, configuration.paths.INSTALL_DIR)\n    sys.path.insert(2, configuration.paths.DATA_DIR)\n\n    try:\n        import maintenance.criticctl\n    except ImportError:\n        raise Error(\"Failed to import: maintenance.criticctl\")\n\n    sys.exit(maintenance.criticctl.main(parser, arguments.help,\n                                        command, command_argv))\nexcept Error as error:\n    if arguments.help:\n        parser.print_help()\n        print\n\n    print >>sys.stderr, \"ERROR: %%s\" %% error.message\n    sys.exit(1)\n"
  },
  {
    "path": "installation/templates/initd",
    "content": "#!/bin/sh\nset -e\n\n### BEGIN INIT INFO\n# Provides:\t\tcritic-main\n# Required-Start:\tpostgresql $local_fs $remote_fs $network $time\n# Required-Stop:\tpostgresql $local_fs $remote_fs $network $time\n# Should-Start:\n# Should-Stop:\n# X-Start-Before:\t%(installation.httpd.backend_service)s\n# X-Stop-After:\t\t%(installation.httpd.backend_service)s\n# Default-Start:\t2 3 4 5\n# Default-Stop:\t\t0 1 6\n# Short-Description:\tCritic code review system (main)\n### END INIT INFO\n\ncritic_etc=%(installation.paths.etc_dir)s/main\ncritic_root=%(installation.paths.install_dir)s\ncritic_run=%(installation.paths.run_dir)s\npidfile=%(installation.paths.run_dir)s/main/servicemanager.pid\n\n. /lib/lsb/init-functions\n\nstart () {\n    log_daemon_msg \"Starting Critic service manager\" \"servicemanager.py\"\n\n    PYTHONPATH=$critic_etc:$critic_root %(installation.prereqs.python)s -m background.servicemanager\n\n    log_end_msg $?\n}\n\nstop () {\n    log_daemon_msg \"Stopping Critic service manager\" \"servicemanager.py\"\n\n    if test -f $pidfile\n    then\n        pid=$(cat $pidfile)\n\n        if kill -TERM $pid\n        then\n            while test -f $pidfile\n            do\n                sleep 0.1\n            done\n        else\n            rm $pidfile\n        fi\n    fi\n\n    log_end_msg 0\n}\n\ncase \"$1\" in\n    start)\n        start\n        ;;\n\n    stop)\n        stop\n        ;;\n\n    restart)\n        stop\n        start\n        ;;\n\n    *)\n        echo \"Usage: $0 {start|stop|restart}\"\n        exit 1\n        ;;\nesac\n\nexit 0\n"
  },
  {
    "path": "installation/templates/nginx/site.both",
    "content": "server {\n\tlisten 80;\n\tlisten [::]:80;\n\tlisten 443 ssl;\n\tlisten [::]:443 ssl;\n\n\tserver_name %(installation.system.hostname)s;\n\n\t# Further SSL configuration is required!\n\t#\n\t#ssl_certificate ???.crt\n\t#ssl_certificate_key ???.key\n\t#ssl_protocols ???\n\t#ssl_ciphers ???\n\n\tlocation / {\n\t\tuwsgi_pass unix://%(installation.paths.run_dir)s/main/sockets/uwsgi.unix;\n\t\tinclude uwsgi_params;\n\t}\n\n\tlocation /static-resource/ {\n\t\talias %(installation.paths.install_dir)s/resources/;\n\t\texpires 30d;\n\n\t\ttypes {\n\t\t\ttext/css css;\n\t\t\ttext/javascript js;\n\t\t\timage/png png;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "installation/templates/nginx/site.http",
    "content": "server {\n\tlisten 80;\n\tlisten [::]:80;\n\n\tserver_name %(installation.system.hostname)s;\n\n\tlocation / {\n\t\tuwsgi_pass unix://%(installation.paths.run_dir)s/main/sockets/uwsgi.unix;\n\t\tinclude uwsgi_params;\n\t}\n\n\tlocation /static-resource/ {\n\t\talias %(installation.paths.install_dir)s/resources/;\n\t\texpires 30d;\n\n\t\ttypes {\n\t\t\ttext/css css;\n\t\t\ttext/javascript js;\n\t\t\timage/png png;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "installation/templates/nginx/site.https",
    "content": "server {\n\tlisten 80;\n\tserver_name %(installation.system.hostname)s;\n\treturn 301 https://$server_name$request_uri;\n}\n\nserver {\n\tlisten 443 ssl;\n\tlisten [::]:443 ssl;\n\n\tserver_name %(installation.system.hostname)s;\n\n\t# Further SSL configuration is required!\n\t#\n\t#ssl_certificate ???.crt\n\t#ssl_certificate_key ???.key\n\t#ssl_protocols ???\n\t#ssl_ciphers ???\n\n\tlocation / {\n\t\tuwsgi_pass unix://%(installation.paths.run_dir)s/main/sockets/uwsgi.unix;\n\t\tinclude uwsgi_params;\n\t}\n\n\tlocation /static-resource/ {\n\t\talias %(installation.paths.install_dir)s/resources/;\n\t\texpires 30d;\n\n\t\ttypes {\n\t\t\ttext/css css;\n\t\t\ttext/javascript js;\n\t\t\timage/png png;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "installation/templates/uwsgi/app.backend.ini",
    "content": "[uwsgi]\nplugins = python\n\nmaster = true\nsocket = %(installation.paths.run_dir)s/main/sockets/uwsgi.unix\n# Make %(installation.httpd.username)s the owner of the socket, so that it can connect.\nchown-socket = %(installation.httpd.username)s:%(installation.system.groupname)s\nchmod-socket = 660\n\npython-path = %(installation.paths.etc_dir)s/main\npython-path = %(installation.paths.install_dir)s\nwsgi-file = %(installation.paths.install_dir)s/wsgi.py\n\nprocesses = 2\nthreads = 25\n\n# Run as the Critic system user/group.\nuid = %(installation.system.username)s\ngid = %(installation.system.groupname)s\n"
  },
  {
    "path": "installation/templates/uwsgi/app.frontend.ini.both",
    "content": "[uwsgi]\nmaster = true\n\n# Use a shared socket to allow binding to a privileged port without running as\n# root.\nshared-socket = :80\nhttp = =0\n\n# Further SSL configuration is required!\n#shared-socket = :443\n#https = =1,???.crt,???.key,HIGH\n\n# Redirect to the Critic backend.\nhttp-to = %(installation.paths.run_dir)s/main/sockets/uwsgi.unix\n\n# Run as the \"web server\" user/group.\nuid = %(installation.httpd.username)s\ngid = %(installation.httpd.groupname)s\n"
  },
  {
    "path": "installation/templates/uwsgi/app.frontend.ini.http",
    "content": "[uwsgi]\nmaster = true\n\n# Use a shared socket to allow binding to a privileged port without running as\n# root.\nshared-socket = :80\nhttp = =0\n\n# Redirect to the Critic backend.\nhttp-to = %(installation.paths.run_dir)s/main/sockets/uwsgi.unix\n\n# Run as the \"web server\" user/group.\nuid = %(installation.httpd.username)s\ngid = %(installation.httpd.groupname)s\n"
  },
  {
    "path": "installation/templates/uwsgi/app.frontend.ini.https",
    "content": "[uwsgi]\nmaster = true\n\n# Use a shared socket to allow binding to a privileged port without running as\n# root.\nshared-socket = :80\nhttp-to-https = =0\n\n# Further SSL configuration is required!\n#shared-socket = :443\n#https = =1,???.crt,???.key,HIGH\n\n# Redirect to the Critic backend.\nhttp-to = %(installation.paths.run_dir)s/main/sockets/uwsgi.unix\n\n# Run as the \"web server\" user/group.\nuid = %(installation.httpd.username)s\ngid = %(installation.httpd.groupname)s\n"
  },
  {
    "path": "installation/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\nimport textwrap\nimport subprocess\nimport tempfile\nimport datetime\nimport hashlib\nimport contextlib\nimport json\n\nimport installation\n\nclass UpdateModifiedFile:\n    def __init__(self, arguments, message, versions, options, generateVersion):\n        \"\"\"\\\n        Constructor.\n\n        Arguments:\n\n          arguments  Command-line arguments.\n          message    Printed once.\n          versions   Dictionary (label => path) of file versions involved.\n          options    List of options to present to the user.\n          prompt     Prompt printed when asking what to do.\n        \"\"\"\n\n        self.__arguments = arguments\n        self.__message = message\n        self.__versions = versions\n        self.__options = options\n        self.__option_keys = [key for key, action in options]\n        self.__option_map = dict((key, action) for key, action in options)\n        self.__generateVersion = generateVersion\n        self.__generated = []\n\n    def printMessage(self):\n        print self.__message % self.__versions\n\n    def printOptions(self):\n        alternatives = []\n\n        for key, action in self.__options:\n            if isinstance(action, str):\n                alternatives.append(\"'%s' to %s\" % (key, action))\n            else:\n            
    alternatives.append(\"'%s' to display the differences between the %s version and the %s version\"\n                                    % (key, action[0], action[1]))\n\n        print textwrap.fill(\"Input %s and %s.\" % (\", \".join(alternatives[:-1]), alternatives[-1]))\n        print\n\n    def displayDifferences(self, from_version, to_version):\n        print\n        print \"=\" * 80\n\n        diff = subprocess.Popen([\"diff\", \"-u\", self.__versions[from_version], self.__versions[to_version]])\n        diff.wait()\n\n        print \"=\" * 80\n        print\n\n    def prompt(self):\n        if self.__arguments.headless:\n            # The first choice is typically \"install updated version\" or \"remove\n            # (obsolete) file\" and is appropriate when --headless was used.\n            return self.__options[0][0]\n\n        try:\n            for label, path in self.__versions.items():\n                if not os.path.exists(path):\n                    self.__generateVersion(label, path)\n                    self.__generated.append(path)\n\n            self.printMessage()\n\n            while True:\n                self.printOptions()\n\n                def validResponse(value):\n                    if value not in self.__option_keys:\n                        return \"please answer %s or %s\" % (\", \".join(self.__option_keys[:-1]), self.__option_keys[-1])\n\n                response = installation.input.string(\"What do you want to do?\", check=validResponse)\n                action = self.__option_map[response]\n\n                if isinstance(action, str):\n                    print\n                    return response\n\n                from_version, to_version = action\n\n                self.displayDifferences(from_version, to_version)\n        finally:\n            for path in self.__generated:\n                os.unlink(path)\n\ndef run_git(args, **kwargs):\n    with installation.utils.as_effective_user_from_path(\n            
os.path.join(installation.root_dir, \".git\")):\n        return subprocess.check_output(args, **kwargs)\n\ndef update_from_template(arguments, data, template_path, target_path, message):\n    git = data[\"installation.prereqs.git\"]\n\n    old_commit_sha1 = data[\"sha1\"]\n    new_commit_sha1 = run_git([git, \"rev-parse\", \"HEAD\"],\n                              cwd=installation.root_dir).strip()\n\n    old_template = read_file(git, old_commit_sha1, template_path)\n    new_template = read_file(git, new_commit_sha1, template_path)\n\n    old_source = old_template.decode(\"utf-8\") % data\n    new_source = new_template.decode(\"utf-8\") % data\n\n    with open(target_path) as target_file:\n        current_source = target_file.read().decode(\"utf-8\")\n\n    if current_source == new_source:\n        # The current version is what we would install now.  Nothing to do.\n        return\n    elif old_source == current_source:\n        # The current version is what we installed (or would have installed with\n        # the old template and current settings.)  
Update the target file\n        # without asking.\n        write_target = True\n    else:\n        def generate_version(label, path):\n            if label == \"installed\":\n                source = old_source\n            elif label == \"updated\":\n                source = new_source\n            else:\n                return\n            write_file(path, source.encode(\"utf-8\"))\n\n        versions = \"\"\"\\\n  Installed version: %(installed)s\n  Current version:   %(current)s\n  Updated version:   %(updated)s\"\"\"\n\n        update_query = UpdateModifiedFile(\n            arguments,\n            message=message % { \"versions\": versions },\n            versions={ \"installed\": target_path + \".org\",\n                       \"current\": target_path,\n                       \"updated\": target_path + \".new\" },\n            options=[ (\"i\", \"install the updated version\"),\n                      (\"k\", \"keep the current version\"),\n                      (\"do\", (\"installed\", \"current\")),\n                      (\"dn\", (\"current\", \"updated\")) ],\n            generateVersion=generate_version)\n\n        write_target = update_query.prompt() == \"i\"\n\n    if write_target:\n        print \"Updated file: %s\" % target_path\n\n        if not arguments.dry_run:\n            backup_path = os.path.join(os.path.dirname(target_path),\n                                       \".%s.org\" % os.path.basename(target_path))\n            copy_file(target_path, backup_path)\n            with open(target_path, \"w\") as target_file:\n                target_file.write(new_source.encode(\"utf-8\"))\n            return backup_path\n\ndef write_file(path, source):\n    # Use os.open() with O_EXCL to avoid trampling some existing file.\n    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL)\n    with os.fdopen(fd, \"w\") as target:\n        target.write(source)\n\ndef copy_file(source_path, target_path):\n    with open(source_path) as source:\n        stat = 
os.fstat(source.fileno())\n        # Use os.open() with O_EXCL to avoid trampling some existing file.\n        fd = os.open(target_path, os.O_WRONLY | os.O_CREAT | os.O_EXCL)\n        with os.fdopen(fd, \"w\") as target:\n            target.write(source.read())\n            os.fchmod(target.fileno(), stat.st_mode)\n            os.fchown(target.fileno(), stat.st_uid, stat.st_gid)\n\ndef hash_file(git, path):\n    if os.path.islink(path):\n        value = os.readlink(path)\n    else:\n        with open(path) as file:\n            value = file.read()\n    return hashlib.sha1(\"blob %d\\0%s\" % (len(value), value)).hexdigest()\n\ndef get_entry_sha1(git, commit_sha1, path, entry_type):\n    lstree = run_git([git, \"ls-tree\", commit_sha1, path],\n                     cwd=installation.root_dir).strip()\n\n    if lstree:\n        lstree_mode, lstree_type, lstree_sha1, lstree_path = lstree.split()\n\n        assert lstree_type == entry_type\n        assert lstree_path == path\n\n        return lstree_sha1\n    else:\n        return None\n\ndef get_file_sha1(git, commit_sha1, path):\n    return get_entry_sha1(git, commit_sha1, path, \"blob\")\n\ndef get_tree_sha1(git, commit_sha1, path):\n    return get_entry_sha1(git, commit_sha1, path, \"tree\")\n\ndef read_file(git, commit_sha1, path):\n    file_sha1 = get_file_sha1(git, commit_sha1, path)\n    if file_sha1 is None:\n        return None\n    return run_git([git, \"cat-file\", \"blob\", file_sha1],\n                   cwd=installation.root_dir)\n\ndef get_initial_commit_date(git, path):\n    initial_commit_timestamp = run_git([git, \"log\", \"--oneline\",\n            \"--format=%ct\", \"--\", path], cwd=installation.root_dir).splitlines()[-1]\n    return datetime.datetime.fromtimestamp(int(initial_commit_timestamp))\n\ndef clean_root_pyc_files():\n    print \"Cleaning up .pyc files owned by root ...\"\n    for root, _, files in os.walk(installation.root_dir):\n        for file in files:\n            file = 
os.path.join(root, file)\n            if file.endswith(\".pyc\") and os.stat(file).st_uid == 0:\n                os.unlink(file)\n\n@contextlib.contextmanager\ndef temporary_cwd():\n    saved_cwd = os.getcwd()\n    os.chdir(tempfile.gettempdir())\n    try:\n        yield\n    finally:\n        os.chdir(saved_cwd)\n\n@contextlib.contextmanager\ndef as_critic_system_user():\n    if installation.is_quick_start:\n        yield\n        return\n\n    saved_cwd = os.getcwd()\n    os.chdir(tempfile.gettempdir())\n    os.setegid(installation.system.gid)\n    os.seteuid(installation.system.uid)\n    try:\n        yield\n    finally:\n        os.seteuid(os.getresuid()[0])\n        os.setegid(os.getresgid()[0])\n        os.chdir(saved_cwd)\n\n@contextlib.contextmanager\ndef as_effective_user_from_path(path):\n    stat = os.stat(path)\n    os.setegid(stat.st_gid)\n    os.seteuid(stat.st_uid)\n    try:\n        yield\n    finally:\n        os.seteuid(os.getresuid()[0])\n        os.setegid(os.getresgid()[0])\n\ndef deunicode(v):\n    if type(v) == unicode: return v.encode(\"utf-8\")\n    elif type(v) == list: return map(deunicode, v)\n    elif type(v) == dict: return dict([(deunicode(a), deunicode(b)) for a, b in v.items()])\n    else: return v\n\ndef read_install_data(arguments, fail_softly=False):\n    etc_path = os.path.join(arguments.etc_dir, arguments.identity)\n\n    if not os.path.isdir(etc_path):\n        if fail_softly:\n            return None\n        print \"\"\"\nERROR: %s: no such directory\nHINT: Make sure the --etc-dir[=%s] and --identity[=%s] options\n      correctly define where the installed system's configuration is stored.\"\"\" % (etc_path, arguments.etc_dir, arguments.identity)\n        sys.exit(1)\n\n    sys.path.insert(0, etc_path)\n\n    try:\n        import configuration\n    except ImportError:\n        if fail_softly:\n            return None\n        print \"\"\"\nERROR: Failed to import 'configuration' module.\nHINT: Make sure the --etc-dir[=%s] 
and --identity[=%s] options\n      correctly define where the installed system's configuration is stored.\"\"\" % (arguments.etc_dir, arguments.identity)\n        sys.exit(1)\n\n    install_data_path = os.path.join(configuration.paths.INSTALL_DIR, \".install.data\")\n\n    if not os.path.isfile(install_data_path):\n        if fail_softly:\n            return None\n        print \"\"\"\\\n%s: no such file\n\nThis installation of Critic appears to be incomplete or corrupt.\"\"\" % install_data_path\n        sys.exit(1)\n\n    try:\n        with open(install_data_path, \"r\") as install_data_file:\n            install_data = deunicode(json.load(install_data_file))\n            if not isinstance(install_data, dict): raise ValueError\n    except ValueError:\n        if fail_softly:\n            return None\n        print \"\"\"\\\n%s: failed to parse JSON object to dictionary\n\nThis installation of Critic appears to be incomplete or corrupt.\"\"\" % install_data_path\n        sys.exit(1)\n\n    return install_data\n\ndef write_install_data(arguments, install_data):\n    install_data_path = os.path.join(installation.paths.install_dir, \".install.data\")\n\n    if not getattr(arguments, \"dry_run\", False):\n        with open(install_data_path, \"w\") as install_data_file:\n            json.dump(install_data, install_data_file)\n\n        os.chown(install_data_path, installation.system.uid, installation.system.gid)\n        os.chmod(install_data_path, 0640)\n\ndef start_migration():\n    import sys\n    import argparse\n    import os\n\n    parser = argparse.ArgumentParser()\n    parser.add_argument(\"--uid\", type=int)\n    parser.add_argument(\"--gid\", type=int)\n\n    arguments = parser.parse_args()\n\n    os.setgid(arguments.gid)\n    os.setuid(arguments.uid)\n\nclass DatabaseSchema(object):\n    \"\"\"Database schema updating utility class\n\n       This class is primarily intended for use in migration scripts.\"\"\"\n\n    def __init__(self):\n        import 
configuration\n        import psycopg2\n\n        self.db = psycopg2.connect(**configuration.database.PARAMETERS)\n\n    def table_exists(self, table_name):\n        import psycopg2\n\n        try:\n            self.db.cursor().execute(\"SELECT 1 FROM %s LIMIT 1\" % table_name)\n        except psycopg2.ProgrammingError:\n            self.db.rollback()\n            return False\n        else:\n            # The statement above would have raised psycopg2.ProgrammingError if the\n            # table didn't exist; it didn't, so the table must exist.\n            return True\n\n    def column_exists(self, table_name, column_name):\n        import psycopg2\n\n        try:\n            self.db.cursor().execute(\"SELECT %s FROM %s LIMIT 1\"\n                                     % (column_name, table_name))\n        except psycopg2.ProgrammingError:\n            self.db.rollback()\n            return False\n        else:\n            # The statement above would have raised psycopg2.ProgrammingError if the\n            # column didn't exist; it didn't, so the column must exist.\n            return True\n\n    def create_table(self, statement):\n        import re\n\n        (table_name,) = re.search(\"CREATE TABLE (\\w+)\", statement).groups()\n\n        # Make sure the table doesn't already exist.\n        if not self.table_exists(table_name):\n            self.db.cursor().execute(statement)\n            self.db.commit()\n\n    def create_index(self, statement):\n        import re\n\n        (index_name,) = re.search(\"CREATE INDEX (\\w+)\", statement).groups()\n\n        cursor = self.db.cursor()\n        cursor.execute(\"DROP INDEX IF EXISTS %s\" % index_name)\n        cursor.execute(statement)\n        self.db.commit()\n\n    def create_column(self, table_name, column_name, column_definition):\n        if not self.column_exists(table_name, column_name):\n            self.db.cursor().execute(\n                \"ALTER TABLE %s ADD %s %s\"\n                % (table_name, 
column_name, column_definition))\n            self.db.commit()\n\n    def type_exists(self, type_name):\n        import psycopg2\n\n        try:\n            self.db.cursor().execute(\"SELECT NULL::%s\" % type_name)\n        except psycopg2.ProgrammingError:\n            self.db.rollback()\n            return False\n        else:\n            # The statement above would have raised psycopg2.ProgrammingError if the\n            # type didn't exist; it didn't, so the type must exist.\n            return True\n\n    def create_type(self, statement):\n        import re\n\n        (type_name,) = re.search(\"CREATE TYPE (\\w+)\", statement).groups()\n\n        # Make sure the type doesn't already exist.\n        if not self.type_exists(type_name):\n            self.db.cursor().execute(statement)\n            self.db.commit()\n\n    def update(self, statements):\n        # Remove top-level comments; they interfere with our very simple\n        # statement identification below.  Other comments are fine.\n        lines = [line for line in statements.splitlines()\n                 if not line.startswith(\"--\")]\n        statements = \"\\n\".join(lines)\n\n        for statement in statements.split(\";\"):\n            statement = statement.strip()\n\n            if not statement:\n                continue\n\n            if statement.startswith(\"CREATE TABLE\"):\n                self.create_table(statement)\n            elif statement.startswith(\"CREATE INDEX\"):\n                self.create_index(statement)\n            elif statement.startswith(\"CREATE TYPE\"):\n                self.create_type(statement)\n            else:\n                print >>sys.stderr, \"Unexpected SQL statement: %r\" % statement\n                sys.exit(1)\n\ndef read_lifecycle(git=None, sha1=None):\n    filename = \"installation/lifecycle.json\"\n    if sha1 is None:\n        path = os.path.join(installation.root_dir, filename)\n        with open(path, \"r\") as lifecycle_file:\n            
lifecycle_source = lifecycle_file.read()\n    else:\n        lifecycle_source = read_file(git, sha1, filename)\n        if lifecycle_source is None:\n            # For systems installed before the lifecycle.json file was\n            # introduced, hard-code the file's initial content.\n            return {\n                \"branch\": \"version/1\",\n                \"stable\": True\n            }\n    return deunicode(json.loads(lifecycle_source))\n"
  },
  {
    "path": "pylint.rc",
    "content": "[MASTER]\n\n# Specify a configuration file.\n#rcfile=\n\n# Python code to execute, usually for sys.path manipulation such as\n# pygtk.require().\n#init-hook=\n\n# Profiled execution.\nprofile=no\n\n# Add files or directories to the blacklist. They should be base names, not\n# paths.\nignore=CVS\n\n# Pickle collected data for later comparisons.\npersistent=yes\n\n# List of plugins (as comma separated values of python modules names) to load,\n# usually to register additional checkers.\nload-plugins=\n\n\n[MESSAGES CONTROL]\n\n# Enable the message, report, category or checker with the given id(s). You can\n# either give multiple identifier separated by comma (,) or put this option\n# multiple time.\n#enable=\n\n# Disable the message, report, category or checker with the given id(s). You\n# can either give multiple identifier separated by comma (,) or put this option\n# multiple time (only on the command line, not in the configuration file where\n# it should appear only once).\n\n# C0111 = Missing docstring\n# C0301 = Line too long\n# C0302 = Too many lines in module\n# C0321 = More than one statement on a single line\n# R0911 = Too many return statements\n# R0912 = Too many branches\n# R0913 = Too many arguments\n# R0914 = Too many local variables\n# R0915 = Too many statements\n\ndisable=C0111,C0301,C0302,C0321,R0911,R0912,R0913,R0914,R0915\n\n\n[REPORTS]\n\n# Set the output format. Available formats are text, parseable, colorized, msvs\n# (visual studio) and html\noutput-format=text\n\n# Include message's id in output\ninclude-ids=no\n\n# Put messages in a separate file for each module / package specified on the\n# command line instead of printing them on stdout. Reports (if any) will be\n# written in a file name \"pylint_global.[txt|html]\".\nfiles-output=no\n\n# Tells whether to display a full report or only the messages\nreports=yes\n\n# Python expression which should return a note less than 10 (10 is the highest\n# note). 
You have access to the variables errors warning, statement which\n# respectively contain the number of errors / warnings messages and the total\n# number of statements analyzed. This is used by the global evaluation report\n# (RP0004).\nevaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)\n\n# Add a comment according to your evaluation note. This is used by the global\n# evaluation report (RP0004).\ncomment=no\n\n\n[BASIC]\n\n# Required attributes for module, separated by a comma\nrequired-attributes=\n\n# List of builtins function names that should not be used, separated by a comma\nbad-functions=\n\n# Regular expression which should only match correct module names\nmodule-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$\n\n# Regular expression which should only match correct module level names\nconst-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$\n\n# Regular expression which should only match correct class names\nclass-rgx=[A-Z_][a-zA-Z0-9]+$\n\n# Regular expression which should only match correct function names\nfunction-rgx=[a-z][a-z0-9_]*$|[a-z][a-z0-9]*([A-Z][a-z0-9]*)*$\n\n# Regular expression which should only match correct method names\nmethod-rgx=[a-z_][a-z0-9_]{2,30}$\n\n# Regular expression which should only match correct instance attribute names\nattr-rgx=[a-z_][a-z0-9_]{2,30}$\n\n# Regular expression which should only match correct argument names\nargument-rgx=[a-z_][a-z0-9_]+$\n\n# Regular expression which should only match correct variable names\nvariable-rgx=[a-z_][a-z0-9_]+$\n\n# Regular expression which should only match correct list comprehension /\n# generator expression variable names\ninlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$\n\n# Good variable names which should always be accepted, separated by a comma\ngood-names=i,j,k,ex,Run,_\n\n# Bad variable names which should always be refused, separated by a comma\nbad-names=foo,bar,baz,toto,tutu,tata\n\n# Regular expression which should only match functions or classes name which 
do\n# not require a docstring\nno-docstring-rgx=__.*__\n\n\n[SIMILARITIES]\n\n# Minimum lines number of a similarity.\nmin-similarity-lines=4\n\n# Ignore comments when computing similarities.\nignore-comments=yes\n\n# Ignore docstrings when computing similarities.\nignore-docstrings=yes\n\n\n[MISCELLANEOUS]\n\n# List of note tags to take in consideration, separated by a comma.\nnotes=FIXME,XXX,TODO\n\n\n[FORMAT]\n\n# Maximum number of characters on a single line.\nmax-line-length=80\n\n# Maximum number of lines in a module\nmax-module-lines=1000\n\n# String used as indentation unit. This is usually \" \" (4 spaces) or \"\\t\" (1\n# tab).\nindent-string='    '\n\n\n[VARIABLES]\n\n# Tells whether we should check for unused import in __init__ files.\ninit-import=no\n\n# A regular expression matching the beginning of the name of dummy variables\n# (i.e. not used).\ndummy-variables-rgx=_|dummy\n\n# List of additional names supposed to be defined in builtins. Remember that\n# you should avoid to define new builtins when possible.\nadditional-builtins=\n\n\n[TYPECHECK]\n\n# Tells whether missing members accessed in mixin class should be ignored. A\n# mixin class is detected if its name ends with \"mixin\" (case insensitive).\nignore-mixin-members=yes\n\n# List of classes names for which member attributes should not be checked\n# (useful for classes with attributes dynamically set).\nignored-classes=SQLObject\n\n# When zope mode is activated, add a predefined set of Zope acquired attributes\n# to generated-members.\nzope=no\n\n# List of members which are set dynamically and missed by pylint inference\n# system, and so shouldn't trigger E0201 when accessed. Python regular\n# expressions are accepted.\ngenerated-members=REQUEST,acl_users,aq_parent\n\n\n[IMPORTS]\n\n# Deprecated modules which should not be used, separated by a comma\ndeprecated-modules=regsub,string,TERMIOS,Bastion,rexec\n\n# Create a graph of every (i.e. 
internal and external) dependencies in the\n# given file (report RP0402 must not be disabled)\nimport-graph=\n\n# Create a graph of external dependencies in the given file (report RP0402 must\n# not be disabled)\next-import-graph=\n\n# Create a graph of internal dependencies in the given file (report RP0402 must\n# not be disabled)\nint-import-graph=\n\n\n[CLASSES]\n\n# List of interface methods to ignore, separated by a comma. This is used for\n# instance to not check methods defines in Zope's Interface base class.\nignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by\n\n# List of method names used to declare (i.e. assign) instance attributes.\ndefining-attr-methods=__init__,__new__,setUp\n\n# List of valid names for the first argument in a class method.\nvalid-classmethod-first-arg=cls\n\n\n[DESIGN]\n\n# Maximum number of arguments for function / method\nmax-args=5\n\n# Argument names that match this expression will be ignored. Default to name\n# with leading underscore\nignored-argument-names=_.*\n\n# Maximum number of locals for function / method body\nmax-locals=15\n\n# Maximum number of return / yield for function / method body\nmax-returns=6\n\n# Maximum number of branch for function / method body\nmax-branchs=12\n\n# Maximum number of statements in function / method body\nmax-statements=50\n\n# Maximum number of parents for a class (see R0901).\nmax-parents=7\n\n# Maximum number of attributes for a class (see R0902).\nmax-attributes=7\n\n# Minimum number of public methods for a class (see R0903).\nmin-public-methods=2\n\n# Maximum number of public methods for a class (see R0904).\nmax-public-methods=20\n\n\n[EXCEPTIONS]\n\n# Exceptions that will emit a warning when being caught. Defaults to\n# \"Exception\"\novergeneral-exceptions=Exception\n"
  },
  {
    "path": "pythonversion.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\n\ndef check(message=None):\n    if sys.version_info[0] != 2 or sys.version_info[1] < 7:\n        print \"\"\"\nERROR: Unsupported Python version!  Critic requires Python 2.7.x or\nlater, and does not support Python 3.x.\n\"\"\"\n\n        if message:\n            print message\n\n        sys.exit(2)\n"
  },
  {
    "path": "quickstart.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport argparse\nimport tempfile\nimport shutil\nimport wsgiref.simple_server\nimport subprocess\nimport threading\nimport requests\nimport json\nimport signal\nimport time\nimport py_compile\nimport contextlib\n\nsys.path.insert(0, os.path.dirname(__file__))\n\nadmin_username = os.environ.get(\"LOGNAME\", \"admin\")\n\nparser = argparse.ArgumentParser(\"python quickstart.py\",\n                                 description=\"Critic instance quick-start utility script.\")\nparser.add_argument(\"--quiet\", action=\"store_true\",\n                    help=\"Suppress most output\")\nparser.add_argument(\"--testing\", action=\"store_true\",\n                    help=argparse.SUPPRESS)\n\nparser.add_argument(\"--admin-username\", default=admin_username,\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--admin-fullname\", default=admin_username,\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--admin-email\", default=admin_username + \"@localhost\",\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--admin-password\", default=\"1234\",\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--system-recipient\", action=\"append\",\n                    help=argparse.SUPPRESS)\n\nparser.add_argument(\"--state-dir\", 
\"-s\",\n                    help=\"State directory [default=temporary dir]\")\nparser.add_argument(\"--http-host\", default=\"\",\n                    help=\"Hostname the HTTP server listens at [default=ANY]\")\nparser.add_argument(\"--http-port\", \"-p\", default=8080, type=int,\n                    help=\"Port the HTTP server listens at [default=8080]\")\n\nparser.add_argument(\"--smtp-host\", default=\"localhost\",\n                    help=\"Hostname of SMTP server to use [default=localhost]\")\nparser.add_argument(\"--smtp-port\", default=25, type=int,\n                    help=\"Port of SMTP server to use [default=25]\")\nparser.add_argument(\"--smtp-username\",\n                    help=\"SMTP username [default=none]\")\nparser.add_argument(\"--smtp-password\",\n                    help=\"SMTP password [default=none]\")\n\nparser.add_argument(\"--run\", action=\"store_true\",\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--run-state-dir\",\n                    help=argparse.SUPPRESS)\nparser.add_argument(\"--run-http-port\", type=int,\n                    help=argparse.SUPPRESS)\n\narguments = parser.parse_args()\n\nquiet = arguments.quiet or arguments.testing\n\nif arguments.run:\n    import critic\n\n    def handle_interrupt(signum, frame):\n        pid_filename = os.path.join(arguments.run_state_dir, \"run\",\n                                    \"main\", \"servicemanager.pid\")\n\n        if os.path.isfile(pid_filename):\n            with open(pid_filename) as pid_file:\n                servicemanager_pid = int(pid_file.read().strip())\n\n            os.kill(servicemanager_pid, signal.SIGTERM)\n\n            while os.path.isfile(pid_filename):\n                time.sleep(0.1)\n\n        os._exit(0)\n\n    signal.signal(signal.SIGINT, handle_interrupt)\n\n    subprocess.check_call([sys.executable, \"-m\", \"background.servicemanager\"])\n\n    class CriticWSGIRequestHandler(wsgiref.simple_server.WSGIRequestHandler):\n        def 
log_message(self, *args, **kwargs):\n            if not quiet:\n                wsgiref.simple_server.WSGIRequestHandler.log_message(\n                    self, *args, **kwargs)\n\n    server = wsgiref.simple_server.make_server(\n        host=arguments.http_host,\n        port=arguments.run_http_port,\n        app=critic.main,\n        handler_class=CriticWSGIRequestHandler)\n\n    server_address_path = os.path.join(arguments.run_state_dir, \"server_address\")\n\n    with open(server_address_path, \"w\") as server_address_file:\n        server_address_file.write(\"%s:%d\" % (server.server_name, server.server_port))\n\n    # This call will never return.  This is fine; we just want to block\n    # forever (or until we receive a SIGINT.)\n    server.serve_forever()\n\nfailed_imports = False\n\ntry:\n    import pygments\nexcept ImportError:\n    print \"\"\"\\\nERROR: Failed to import 'pygments'; code will not be syntax highlighted.\nHINT: On Debian/Ubuntu, install the 'python-pygments' package to eliminate\n      this problem.\n\"\"\"\n    failed_imports = True\n\ntry:\n    import passlib\nexcept ImportError:\n    print \"\"\"\\\nERROR: Failed to import 'passlib'; passwords will be encrypted insecurely.\nHINT: On Debian/Ubuntu, install the 'python-passlib' package to eliminate\n      this problem.\n\"\"\"\n    failed_imports = True\n\nif failed_imports:\n    if arguments.testing:\n        print \"FATAL: Won't run test suite with missing imports.\"\n        sys.exit(1)\n    else:\n        print \"\"\"\\\nSome functionality will be missing due to missing Python packages.  
Press\nENTER to go ahead and quick-start Critic anyway, or CTRL-c to abort.\n\"\"\"\n\n        try:\n            sys.stdin.readline()\n        except KeyboardInterrupt:\n            sys.exit(1)\n\nif arguments.testing:\n    os.setsid()\n\nimport installation\nimport installation.qs\n\ninstallation.quiet = True\n\nif arguments.state_dir:\n    state_dir = arguments.state_dir\n    if not os.path.isdir(state_dir):\n        os.makedirs(state_dir)\nelse:\n    state_dir = tempfile.mkdtemp()\n    if arguments.testing:\n        print \"STATE=%s\" % state_dir\n\ndatabase_path = os.path.join(state_dir, \"critic.db\")\ninitialize_database = not os.path.exists(database_path)\nadd_repository = None\n\nclass CompilationFailed(Exception):\n    pass\n\ndef compile_all_sources():\n    success = True\n    for dirname, _, filenames in os.walk(\"src\"):\n        for filename in filenames:\n            if filename[0] == \".\":\n                continue\n            if not filename.endswith(\".py\"):\n                continue\n            path = os.path.join(dirname, filename)\n            try:\n                py_compile.compile(path, doraise=True)\n            except py_compile.PyCompileError as error:\n                if success:\n                    # First error.  
Create some space.\n                    print \"\\n\"\n                print \"ERROR: Failed to compile %s:\\n%s\" % (path, error)\n                success = False\n    if not success:\n        raise CompilationFailed()\n\n@contextlib.contextmanager\ndef activity(what):\n    if quiet:\n        yield\n    else:\n        sys.stdout.write(what + \" ...\")\n        sys.stdout.flush()\n        yield\n        sys.stdout.write(\" done.\\n\")\n\ntry:\n    try:\n        with activity(\"Compiling all sources\"):\n            compile_all_sources()\n    except CompilationFailed:\n        sys.exit(1)\n\n    installation.is_quick_start = True\n\n    if initialize_database:\n        with activity(\"Initializing database\"):\n            installation.qs.sqlite.import_schema(\n                database_path,\n                filenames=installation.database.SCHEMA_FILES,\n                quiet=quiet)\n\n    installation.system.uid = os.getuid()\n    installation.system.gid = os.getgid()\n\n    installation.paths.etc_dir = os.path.join(state_dir, \"etc\")\n    installation.paths.bin_dir = os.path.join(state_dir, \"bin\")\n    installation.paths.install_dir = os.path.join(os.getcwd(), \"src\")\n    installation.paths.data_dir = os.path.join(state_dir, \"data\")\n    installation.paths.cache_dir = os.path.join(state_dir, \"cache\")\n    installation.paths.log_dir = os.path.join(state_dir, \"log\")\n    installation.paths.run_dir = os.path.join(state_dir, \"run\")\n    installation.paths.git_dir = os.path.join(state_dir, \"git\")\n\n    data = installation.qs.data.generate(arguments, database_path)\n\n    with activity(\"Installing the system\"):\n        installation.paths.install(data)\n\n        if not os.path.isfile(os.path.join(installation.paths.bin_dir, \"criticctl\")):\n            installation.criticctl.install(data)\n\n        if not os.path.isfile(os.path.join(installation.paths.etc_dir, \"main\", \"configuration\", \"__init__.py\")):\n            
installation.config.install(data)\n\n        if initialize_database:\n            installation.prefs.install(data)\n\n    config_dir = os.path.join(installation.paths.etc_dir, \"main\")\n    install_dir = installation.paths.install_dir\n    root_dir = installation.root_dir\n\n    sys.path.insert(0, config_dir)\n    sys.path.insert(1, install_dir)\n\n    if initialize_database:\n        import dbutils\n        import auth\n\n        db = dbutils.Database()\n        cursor = db.cursor()\n\n        cursor.execute(\"\"\"INSERT INTO systemidentities (key, name, anonymous_scheme,\n                                                        authenticated_scheme, hostname,\n                                                        description, installed_sha1)\n                               VALUES ('main', 'main', 'http', 'http', 'localhost', 'Main', ?)\"\"\",\n                       (subprocess.check_output(\"git rev-parse HEAD\", shell=True).strip(),))\n\n        db.commit()\n\n        admin = dbutils.User.create(\n            db,\n            name=arguments.admin_username,\n            fullname=arguments.admin_fullname,\n            email=arguments.admin_email,\n            email_verified=None,\n            password=auth.hashPassword(arguments.admin_password))\n\n        if not arguments.testing:\n            if not quiet:\n                print\n\n            print (\"Created administrator user %r with password '1234'\"\n                   % data[\"installation.admin.username\"])\n\n        cursor.execute(\"\"\"INSERT INTO userroles (uid, role)\n                               SELECT %s, name\n                                 FROM roles\"\"\",\n                       (admin.id,))\n\n        db.commit()\n        db.close()\n\n    the_system = None\n    server_address = None\n    server_name = None\n    server_port = str(arguments.http_port)\n\n    def startTheSystem():\n        global the_system, server_name, server_port, server_address\n\n        server_address_path = 
os.path.join(state_dir, \"server_address\")\n        if os.path.isfile(server_address_path):\n            os.unlink(server_address_path)\n\n        the_system = subprocess.Popen(\n            [sys.executable] + sys.argv + [\"--run\",\n                                           \"--run-state-dir\", state_dir,\n                                           \"--run-http-port\", server_port],\n            env={ \"PYTHONPATH\": \":\".join([config_dir, install_dir, root_dir]) })\n\n        while not os.path.isfile(server_address_path):\n            time.sleep(0.1)\n\n            if the_system.poll() is not None:\n                the_system = None\n                return False\n\n        with open(server_address_path) as server_address_file:\n            server_address = server_address_file.read()\n\n        server_name, _, server_port = server_address.partition(\":\")\n        return True\n\n    def stopTheSystem():\n        global the_system\n\n        if the_system:\n            the_system.send_signal(signal.SIGINT)\n            the_system.wait()\n\n        the_system = None\n\n    def restartTheSystem():\n        compile_all_sources()\n        stopTheSystem()\n        startTheSystem()\n\n    def getNewestModificationTime():\n        newest = 0\n        for dirpath, dirnames, filenames in os.walk(\".\"):\n            for filename in filenames:\n                if filename[0] != \".\" and filename.endswith(\".py\"):\n                    path = os.path.join(dirpath, filename)\n                    newest = max(os.stat(path).st_mtime, newest)\n        return newest\n\n    running_mtime = getNewestModificationTime()\n\n    startTheSystem()\n\n    if not arguments.testing:\n        print \"Listening at: http://%s:%s/\" % (server_name, server_port)\n\n        import dbutils\n\n        db = dbutils.Database()\n        db.cursor().execute(\"\"\"UPDATE systemidentities\n                                  SET hostname=?\"\"\",\n                            (\"%s:%s\" % (server_name, 
server_port),))\n        db.commit()\n        db.close()\n    else:\n        print \"HTTP=%s:%s\" % (server_name, server_port)\n\n    if not os.listdir(installation.paths.git_dir) and not arguments.testing:\n        if not quiet:\n            print\n            print \"Creating critic.git repository ...\"\n\n        pid_filename = os.path.join(\n            state_dir, \"run\", \"main\", \"branchtracker.pid\")\n\n        while not os.path.isfile(pid_filename):\n            time.sleep(0.1)\n\n        current_ref = subprocess.check_output(\n            [\"git\", \"rev-parse\", \"--symbolic-full-name\", \"HEAD\"]).strip()\n\n        if current_ref.startswith(\"refs/heads/\"):\n            remote_branch = local_branch = current_ref[len(\"refs/heads/\"):]\n            if local_branch.startswith(\"r/\"):\n                local_branch = local_branch[2:]\n        else:\n            remote_branch = local_branch = \"master\"\n\n        kwargs = {}\n\n        session = requests.Session()\n        response = session.post(\n            \"http://%s/validatelogin\" % server_address,\n            data=json.dumps({ \"fields\": { \"username\": data[\"installation.admin.username\"],\n                                          \"password\": \"1234\" }}))\n        if response.status_code in (401, 404):\n            kwargs[\"auth\"] = (data[\"installation.admin.username\"], \"1234\")\n        response = session.post(\n            \"http://%s/addrepository\" % server_address,\n            data=json.dumps({ \"name\": \"critic\",\n                              \"path\": \"critic\",\n                              \"mirror\": { \"remote_url\": \"file://\" + installation.root_dir,\n                                          \"remote_branch\": remote_branch,\n                                          \"local_branch\": local_branch }}),\n            **kwargs)\n\n    if not quiet:\n        print\n\n    if arguments.testing:\n        print \"STARTED\"\n\n        restart_requested = False\n        
def handle_sigusr1(signum, frame):\n            global restart_requested\n            restart_requested = True\n        signal.signal(signal.SIGUSR1, handle_sigusr1)\n\n        while True:\n            time.sleep(3600)\n            if restart_requested:\n                restart_requested = False\n                restartTheSystem()\n                print \"RESTARTED\"\n    else:\n        while True:\n            current_mtime = getNewestModificationTime()\n            if current_mtime > running_mtime:\n                print\n                try:\n                    with activity(\"Sources changed, restarting the system\"):\n                        restartTheSystem()\n                except CompilationFailed:\n                    pass\n                else:\n                    print\n\n                running_mtime = current_mtime\n            else:\n                time.sleep(1)\nexcept KeyboardInterrupt:\n    pass\nfinally:\n    if not arguments.quiet and not arguments.testing:\n        print \"Shutting down ...\"\n\n    try:\n        stopTheSystem()\n    except NameError:\n        # Failure happened before stopTheSystem() was declared.\n        pass\n\n    if not arguments.state_dir:\n        if not arguments.quiet and not arguments.testing:\n            print \"Cleaning up ...\"\n\n        shutil.rmtree(state_dir)\n"
  },
  {
    "path": "src/api/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n\"\"\"Critic API\"\"\"\n\nfrom apiobject import APIObject\nfrom apierror import (APIError, PermissionDenied, TransactionError,\n                      ResultDelayedError)\n\nimport config\nimport critic\nimport user\nimport review\nimport reviewsummary\nimport repository\nimport filters\nimport branch\nimport commit\nimport commitset\nimport changeset\nimport filechange\nimport filediff\nimport filecontent\nimport log\nimport preference\nimport accesstoken\nimport accesscontrolprofile\nimport labeledaccesscontrolprofile\nimport extension\nimport file\nimport comment\nimport reply\nimport batch\nimport reviewablefilechange\n\nimport transaction\n"
  },
  {
    "path": "src/api/accesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass AccessControlProfileError(api.APIError):\n    \"\"\"Base exception for all errors related to the AccessControlProfile\n       class\"\"\"\n    pass\n\nclass InvalidAccessControlProfileId(AccessControlProfileError):\n    \"\"\"Raised when an invalid access control profile id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidAccessControlProfileId, self).__init__(\n            \"Invalid access control profile id: %d\" % value)\n        self.value = value\n\nclass AccessControlProfile(api.APIObject):\n    \"\"\"Representation of a an access control profile\"\"\"\n\n    RULE_VALUES = frozenset([\"allow\", \"deny\"])\n\n    @property\n    def id(self):\n        \"\"\"The profile's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def title(self):\n        \"\"\"The profile's title, or None\"\"\"\n        return self._impl.title\n\n    @property\n    def access_token(self):\n        \"\"\"The access token that owns this profile, or None\"\"\"\n        return self._impl.getAccessToken(self.critic)\n\n    class Category(object):\n        \"\"\"Representation of an access control category\n\n           Each category is controlled by a rule (\"allow\" or \"deny\") and a list\n           of exceptions (possibly 
empty).  The effective policy is the rule,\n           unless an exception applies, in which case it's the opposite of the\n           rule.\"\"\"\n\n        def __init__(self, rule, exceptions):\n            self.rule = rule\n            self.exceptions = exceptions\n\n    class HTTPException(object):\n        \"\"\"Representation of an exception for the \"http\" category\n\n           The exception consists of the HTTP request method and a regular\n           expression that must match the entire request path.\"\"\"\n\n        REQUEST_METHODS = frozenset([\"GET\", \"HEAD\", \"OPTIONS\",\n                                     \"POST\", \"PUT\", \"DELETE\"])\n\n        def __init__(self, exception_id, request_method, path_pattern):\n            self.id = exception_id\n            self.request_method = request_method\n            self.path_pattern = path_pattern\n\n    @property\n    def http(self):\n        \"\"\"Access control category \"http\"\n\n           This category controls web frontend requests.\n\n           Exceptions are of the type HTTPException.\"\"\"\n        return self._impl.getHTTP(self.critic)\n\n    class RepositoryException(object):\n        \"\"\"Representation of an exception for the \"repositories\" category\n\n           The exception consists of the access type (\"read\" or \"modify\") and the\n           repository.\"\"\"\n\n        ACCESS_TYPES = frozenset([\"read\", \"modify\"])\n\n        def __init__(self, exception_id, access_type, repository):\n            self.id = exception_id\n            self.access_type = access_type\n            self.repository = repository\n\n    @property\n    def repositories(self):\n        \"\"\"Access control category \"repositories\"\n\n           This category controls access to Git repositories, both via the web\n           frontend and the Git hook.  
Note that read-only Git access over SSH\n           is not controlled by access control.\n\n           Exceptions are of the type RepositoryException.\"\"\"\n        return self._impl.getRepositories(self.critic)\n\n    class ExtensionException(object):\n        \"\"\"Representation of an exception for the \"extensions\" category\n\n           The exception consists of the access type (\"install\" or \"execute\")\n           and the extension.\"\"\"\n\n        ACCESS_TYPES = frozenset([\"install\", \"execute\"])\n\n        def __init__(self, exception_id, access_type, extension):\n            self.id = exception_id\n            self.access_type = access_type\n            self.extension = extension\n\n    @property\n    def extensions(self):\n        \"\"\"Access control category \"extensions\"\n\n           This category controls access to any functionality provided by an\n           extension.\n\n           Exceptions are of the type ExtensionException.\"\"\"\n        return self._impl.getExtensions(self.critic)\n\ndef fetch(critic, profile_id):\n    \"\"\"Fetch an AccessControlProfile object with the given profile id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    return api.impl.accesscontrolprofile.fetch(critic, int(profile_id))\n\ndef fetchAll(critic, title=None):\n    \"\"\"Fetch AccessControlProfile objects for all primary profiles in the system\n\n       A profile is primary if it is not the additional restrictions imposed for\n       accesses authenticated with an access token.\n\n       If |title| is not None, fetch only profiles with a matching title.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    if title is not None:\n        title = str(title)\n    return api.impl.accesscontrolprofile.fetchAll(critic, title)\n"
  },
  {
    "path": "src/api/accesstoken.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass AccessTokenError(api.APIError):\n    \"\"\"Base exception for all errors related to the AccessToken class\"\"\"\n    pass\n\nclass InvalidAccessTokenId(AccessTokenError):\n    \"\"\"Raised when an invalid access token id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidAccessTokenId, self).__init__(\n            \"Invalid access token id: %d\" % value)\n        self.value = value\n\nclass AccessToken(api.APIObject):\n    \"\"\"Representation of an access token\"\"\"\n\n    @property\n    def access_type(self):\n        \"\"\"The type of access granted by this access token\"\"\"\n        return self._impl.access_type\n\n    @property\n    def id(self):\n        \"\"\"The access token's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def user(self):\n        \"\"\"The user authenticated by the access token, or None\"\"\"\n        return self._impl.getUser(self.critic)\n\n    @property\n    def part1(self):\n        \"\"\"The first part of the access token\"\"\"\n        return self._impl.part1\n\n    @property\n    def part2(self):\n        \"\"\"The second part of the access token\"\"\"\n        return self._impl.part2\n\n    @property\n    def title(self):\n        \"\"\"The access token's title, or 
None\"\"\"\n        return self._impl.title\n\n    @property\n    def profile(self):\n        \"\"\"The access token's access control profile\"\"\"\n        return self._impl.getProfile(self.critic)\n\ndef fetch(critic, token_id):\n    \"\"\"Fetch an AccessToken object with the given token id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    return api.impl.accesstoken.fetch(critic, int(token_id))\n\ndef fetchAll(critic, user=None):\n    \"\"\"Fetch AccessToken objects for all primary profiles in the system\n\n       A profile is primary if it is not the additional restrictions imposed for\n       accesses authenticated with an access token.\n\n       If |user| is not None, return only access tokens belonging to the\n       specified user.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert user is None or isinstance(user, api.user.User)\n    return api.impl.accesstoken.fetchAll(critic, user)\n"
  },
  {
    "path": "src/api/apierror.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass APIError(Exception):\n    \"\"\"Base exception for all errors caused by incorrect API usage (including\n       invalid input.)\"\"\"\n    pass\n\nclass PermissionDenied(Exception):\n    \"\"\"Exception raised on correct API usage that the current user is not\n       allowed.\"\"\"\n\n    @staticmethod\n    def raiseUnlessAdministrator(critic):\n        if not (critic.actual_user and\n                critic.actual_user.hasRole(\"administrator\")):\n            raise PermissionDenied(\"Must be an administrator\")\n\n    @staticmethod\n    def raiseUnlessUser(critic, required_user):\n        if critic.actual_user != required_user:\n            PermissionDenied.raiseUnlessAdministrator(critic)\n\nclass TransactionError(APIError):\n    \"\"\"Base exception for transaction errors.\"\"\"\n    pass\n\nclass ResultDelayedError(Exception):\n    \"\"\"Base exception for all errors caused by the result being\n       temporarily unavailable\"\"\"\n    pass\n"
  },
  {
    "path": "src/api/apiobject.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass APIObject(object):\n    \"\"\"Base class of all significant API classes\n\n       Exposes the Critic session object as the read-only 'critic' attribute.\n\n       Also holds the reference to the internal implementation object, which\n       should only be used in the implementation of the API.\"\"\"\n\n    def __init__(self, critic, impl):\n        self.__critic = critic\n        self.__impl = impl\n\n    def __int__(self):\n        return self.id\n    def __hash__(self):\n        return hash(int(self))\n    def __eq__(self, other):\n        return int(self) == int(other)\n\n    @property\n    def critic(self):\n        \"\"\"The Critic session object used to create the API object\"\"\"\n        return self.__critic\n\n    @property\n    def _impl(self):\n        \"\"\"Underlying object implementation\n\n           This value should not be used outside the implementation of the\n           API.\"\"\"\n        return self.__impl\n\n    def _set_impl(self, impl):\n        \"\"\"Set the underlying object implementation\n\n           This method should not be called outside the implementation of the\n           API.\"\"\"\n        self.__impl = impl\n"
  },
  {
    "path": "src/api/batch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass BatchError(api.APIError):\n    pass\n\nclass InvalidBatchId(BatchError):\n    \"\"\"Raised when an invalid batch id is used.\"\"\"\n\n    def __init__(self, batch_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidBatchId, self).__init__(\"Invalid batch id: %d\" % batch_id)\n\nclass Batch(api.APIObject):\n    @property\n    def id(self):\n        \"\"\"The batch's unique id, or None for unsubmitted changes\"\"\"\n        return self._impl.id\n\n    @property\n    def is_empty(self):\n        \"\"\"True if the batch contains no changes\"\"\"\n        return self._impl.isEmpty(self.critic)\n\n    @property\n    def review(self):\n        \"\"\"The review to which changes were submitted\"\"\"\n        return self._impl.getReview(self.critic)\n\n    @property\n    def author(self):\n        \"\"\"The author of the changes in the batch\n\n           The author is returned as an api.user.User object.\"\"\"\n        return self._impl.getAuthor(self.critic)\n\n    @property\n    def timestamp(self):\n        \"\"\"The time of submission, or None for unsubmitted changes\n\n           The timestamp is returned as a datetime.datetime object.\"\"\"\n        return self._impl.timestamp\n\n    @property\n    def comment(self):\n        \"\"\"The author's overall comment\n\n     
      The comment is returned as an api.comment.Note object, or None if the\n           author did not provide a comment.\"\"\"\n        return self._impl.getComment(self.critic)\n\n    @property\n    def created_comments(self):\n        \"\"\"Created comments\n\n           The comments are returned as a set of api.comment.Comment objects.\"\"\"\n        return self._impl.getCreatedComments(self.critic)\n\n    @property\n    def written_replies(self):\n        \"\"\"Written replies\n\n           The replies are returned as a set of api.reply.Reply objects.\"\"\"\n        return self._impl.getWrittenReplies(self.critic)\n\n    @property\n    def resolved_issues(self):\n        \"\"\"Resolved issues\n\n           The issues are returned as a set of api.comment.Comment objects. Note\n           that the comment objects represent the current state, and that they\n           may be api.comment.Note objects, and if they are api.comment.Issue\n           objects, that their `state` attribute will not necessarily be\n           \"resolved\".\"\"\"\n        return self._impl.getResolvedIssues(self.critic)\n\n    @property\n    def reopened_issues(self):\n        \"\"\"Reopened issues\n\n           The issues are returned as a set of api.comment.Comment objects. Note\n           that the comment objects represent the current state, and that they\n           may be api.comment.Note objects, and if they are api.comment.Issue\n           objects, that their `state` attribute will not necessarily be\n           \"open\".\"\"\"\n        return self._impl.getReopenedIssues(self.critic)\n\n    @property\n    def morphed_comments(self):\n        \"\"\"Morphed comments (comments whose type was changed)\n\n           The comments are returned as a dictionary mapping api.comment.Comment\n           objects to their new type as a string (\"issue\" or \"note\".) 
Note that\n           the comment object itself represents the current state, and its type\n           will not necessarily match the new type it's mapped to.\"\"\"\n        return self._impl.getMorphedComments(self.critic)\n\n    @property\n    def reviewed_file_changes(self):\n        \"\"\"Reviewed file changes\n\n           The reviewed changes are returned as a set of\n           api.reviewablefilechanges.ReviewableFileChanges objects. Note that\n           the file changes objects represent the current state, and their\n           `reviewed_by` attribute will not necessarily be the author of this\n           batch.\"\"\"\n        return self._impl.getReviewedFileChanges(self.critic)\n\n    @property\n    def unreviewed_file_changes(self):\n        \"\"\"Unreviewed file changes\n\n           The unreviewed changes are returned as a set of\n           api.reviewablefilechanges.ReviewableFileChanges objects. Note that\n           the file changes objects represent the current state, and their\n           `reviewed_by` attribute will not necessarily be None.\"\"\"\n        return self._impl.getUnreviewedFileChanges(self.critic)\n\ndef fetch(critic, batch_id):\n    \"\"\"Fetch the Batch object with the given id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(batch_id, int)\n    return api.impl.batch.fetch(critic, batch_id)\n\ndef fetchAll(critic, review=None, author=None):\n    \"\"\"Fetch all Batch objects\n\n       If |review| is not None, only batches in the specified review are\n       returned.\n\n       If |author| is not None, only batches authored by the specified user are\n       returned.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert review is None or isinstance(review, api.review.Review)\n    assert author is None or isinstance(author, api.user.User)\n    return api.impl.batch.fetchAll(critic, review, author)\n\ndef fetchUnpublished(critic, review):\n    
\"\"\"Fetch a Batch object representing current unpublished changes\n\n       The Batch object's |id| and |timestamp| objects will be None to signal\n       that the object does not represent a real object. If the current user has\n       no unpublished changes, the object's |is_empty| attribute will be true.\n\n       Only the currently authenticated user's unpublished changes are\n       returned.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(review, api.review.Review)\n    return api.impl.batch.fetchUnpublished(critic, review)\n"
  },
  {
    "path": "src/api/branch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass BranchError(api.APIError):\n    \"\"\"Base exception for all errors related to the Branch class.\"\"\"\n    pass\n\nclass InvalidBranchId(BranchError):\n    \"\"\"Raised when an invalid branch id is used.\"\"\"\n\n    def __init__(self, branch_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidBranchId, self).__init__(\n            \"Invalid branch id: %d\" % branch_id)\n\nclass InvalidBranchName(BranchError):\n    \"\"\"Raised when an invalid branch name is used.\"\"\"\n\n    def __init__(self, name):\n        \"\"\"Constructor\"\"\"\n        super(InvalidBranchName, self).__init__(\n            \"Invalid branch name: %r\" % name)\n\nclass Branch(api.APIObject):\n    \"\"\"Representation of a Git branch, according to Critic\n\n       Critic extends Git's branch concept by adding a heuristically determined\n       base branch, and a derived restricted set of commits that belong to the\n       branch by (initially) excluding those reachable from the base branch.\"\"\"\n\n    @property\n    def id(self):\n        \"\"\"The branch's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def name(self):\n        \"\"\"The branch's name excluding the 'refs/heads/' prefix\"\"\"\n        return self._impl.name\n\n    @property\n    def repository(self):\n    
    \"\"\"The repository that contains the branch\n\n           The repository is returned as an api.repository.Repository object.\"\"\"\n        return self._impl.getRepository(self.critic)\n\n    @property\n    def head(self):\n        \"\"\"The branch's head commit\"\"\"\n        return self._impl.getHead(self.critic)\n\n    @property\n    def commits(self):\n        \"\"\"The commits belonging to the branch\n\n           The return value is an api.commitset.CommitSet object.\n\n           Note: This set of commits is the commits that are actually reachable\n                 from the head of the branch.  If the branch is a review branch\n                 that has been rebased, this is not the same as the commits that\n                 are considered part of the review.\"\"\"\n        return self._impl.getCommits(self.critic)\n\ndef fetch(critic, branch_id=None, repository=None, name=None):\n    \"\"\"Fetch a Branch object with the given id or name\n\n       When a name is provided, a repository must also be provided.\"\"\"\n    import api.impl\n    assert (branch_id is None) != (name is None)\n    assert name is None or repository is not None\n    return api.impl.branch.fetch(critic, branch_id, repository, name)\n\ndef fetchAll(critic, repository=None):\n    \"\"\"Fetch Branch objects for all branches\n\n       If a repository is provided, restrict the return value to branches in the\n       specified repository.\"\"\"\n    import api.impl\n    assert (repository is None or\n            isinstance(repository, api.repository.Repository))\n    return api.impl.branch.fetchAll(critic, repository)\n"
  },
  {
    "path": "src/api/changeset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ChangesetError(api.APIError):\n    pass\n\nclass ChangesetBackgroundServiceError(ChangesetError):\n    pass\n\nclass InvalidChangesetId(ChangesetError):\n    pass\n\nclass NotImplementedError(ChangesetError):\n    pass\n\nclass ChangesetDelayed(api.ResultDelayedError):\n    pass\n\nclass AutomaticChangesetEmpty(ChangesetError):\n    \"\"\"Raised when fetching an automatic changeset, and no changes were found\"\"\"\n    pass\n\nclass Changeset(api.APIObject):\n    \"\"\"Representation of a diff\"\"\"\n\n    AUTOMATIC_MODES = frozenset([\n        # All changes in the review.\n        \"everything\",\n        # All changes in the review that are either assigned to the current user\n        # or that match one of the current user's watcher filters.\n        \"relevant\",\n        # All changes in the review that are assigned to the current user.\n        \"reviewable\",\n        # All pending changes in the review that are assigned to the current\n        # user.\n        \"pending\",\n    ])\n\n    def __str__(self):\n        return str(self._impl.id) + \" (\" + str(self._impl.type) + \")\"\n\n    @property\n    def id(self):\n        return self._impl.id\n\n    @property\n    def repository(self):\n        \"\"\"The repository containing the compared commits\"\"\"\n        
return self._impl.repository\n\n    @property\n    def type(self):\n        return self._impl.type\n\n    @property\n    def from_commit(self):\n        return self._impl.getFromCommit()\n\n    @property\n    def to_commit(self):\n        return self._impl.getToCommit()\n\n    @property\n    def files(self):\n        return api.filechange.fetchAll(self.critic, self)\n\n    @property\n    def contributing_commits(self):\n        return self._impl.getContributingCommits(self.critic)\n\ndef fetch(critic, repository, changeset_id=None, from_commit=None,\n          to_commit=None, single_commit=None, review=None, automatic=None):\n    \"\"\"Fetch a single changeset from the given repository\"\"\"\n\n    import api.impl\n    if changeset_id is not None:\n        assert (from_commit is None and to_commit is None and\n                single_commit is None)\n    else:\n        if automatic is not None:\n            assert isinstance(review, api.review.Review)\n            assert automatic in Changeset.AUTOMATIC_MODES\n            assert (from_commit is None and to_commit is None and\n                    single_commit is None)\n        else:\n            assert (from_commit is None) == (to_commit is None)\n            assert (single_commit is None) != (from_commit is None)\n            assert (from_commit is None or\n                    isinstance(from_commit, api.commit.Commit))\n            assert (to_commit is None or\n                    isinstance(to_commit, api.commit.Commit))\n            if single_commit is not None:\n                assert isinstance(single_commit, api.commit.Commit)\n                assert len(single_commit.parents) <= 1\n            if from_commit is not None and to_commit is not None:\n                assert from_commit.id != to_commit.id\n    return api.impl.changeset.fetch(\n        critic, repository, changeset_id, from_commit, to_commit, single_commit,\n        review, automatic)\n"
  },
  {
    "path": "src/api/comment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass CommentError(api.APIError):\n    pass\n\nclass InvalidCommentId(CommentError):\n    \"\"\"Raised when an invalid comment id is used.\"\"\"\n\n    def __init__(self, comment_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidCommentId, self).__init__(\n            \"Invalid comment id: %d\" % comment_id)\n        self.comment_id = comment_id\n\nclass InvalidCommentIds(CommentError):\n    \"\"\"Raised by fetchMany() when invalid comment ids are used.\"\"\"\n\n    def __init__(self, comment_ids):\n        \"\"\"Constructor\"\"\"\n        super(InvalidCommentIds, self).__init__(\n            \"Invalid comment ids: %s\" % \", \".join(map(str, comment_ids)))\n        self.comment_ids = comment_ids\n\nclass InvalidLocation(CommentError):\n    \"\"\"Raised when attempting to specify an invalid comment location\"\"\"\n\n    pass\n\nclass Comment(api.APIObject):\n    TYPE_VALUES = frozenset([\"issue\", \"note\"])\n\n    @property\n    def id(self):\n        \"\"\"The comment's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def type(self):\n        \"\"\"The comment's type\n\n           The type is one of \"issue\" and \"note\".\"\"\"\n        pass\n\n    @property\n    def is_draft(self):\n        \"\"\"True if the comment is not yet published\n\n       
    Unpublished comments are not displayed to other users.\"\"\"\n        return self._impl.is_draft\n\n    @property\n    def review(self):\n        \"\"\"The review to which the comment belongs\n\n           The review is returned as an api.review.Review object.\"\"\"\n        return self._impl.getReview(self.critic)\n\n    @property\n    def author(self):\n        \"\"\"The comment's author\n\n           The author is returned as an api.user.User object.\"\"\"\n        return self._impl.getAuthor(self.critic)\n\n    @property\n    def timestamp(self):\n        \"\"\"The comment's timestamp\n\n           The return value is a datetime.datetime object.\"\"\"\n        return self._impl.timestamp\n\n    @property\n    def location(self):\n        \"\"\"The location of the comment, or None\n\n           If the comment was made against lines in a commit message, the return\n           value is an api.comment.CommitMessageLocation object.  If the comment\n           was made against lines in a file version, the return value is an\n           api.comment.FileVersionLocation object.  
Otherwise, the return value\n           is None.\"\"\"\n        return self._impl.getLocation(self.critic)\n\n    @property\n    def text(self):\n        \"\"\"The comment's text\"\"\"\n        return self._impl.text\n\n    @property\n    def replies(self):\n        \"\"\"The replies to the comment\n\n           The replies are returned as a list of api.reply.Reply objects.\"\"\"\n        return self._impl.getReplies(self.critic)\n\n    class DraftChanges(object):\n        \"\"\"Draft changes to the comment\"\"\"\n\n        def __init__(self, author, is_draft, reply, new_type):\n            self.__author = author\n            self.__is_draft = is_draft\n            self.__reply = reply\n            self.__new_type = new_type\n\n        @property\n        def author(self):\n            \"\"\"The author of these draft changes\n\n               The author is returned as an api.user.User object.\"\"\"\n            return self.__author\n\n        @property\n        def is_draft(self):\n            \"\"\"True if the comment itself is a draft (not published)\"\"\"\n            return self.__is_draft\n\n        @property\n        def reply(self):\n            \"\"\"The current unpublished reply\n\n               The reply is returned as an api.reply.Reply object, or None if\n               there is no current unpublished reply.\"\"\"\n            return self.__reply\n\n        @property\n        def new_type(self):\n            \"\"\"The new type of an unpublished type change\n\n               The type is returned as a string. 
Comment.TYPE_VALUES defines the\n               set of possible return values.\"\"\"\n            return self.__new_type\n\n    @property\n    def draft_changes(self):\n        \"\"\"The comment's current draft changes\n\n           The draft changes are returned as a Comment.DraftChanges object, or\n           None if the current user has no unpublished changes to this comment.\n\n           If the comment is currently an issue, or the current user has an\n           unpublished change of the comment's type to issue, the returned\n           object will be an Issue.DraftChanges instead.\"\"\"\n        return self._impl.getDraftChanges(self.critic)\n\nclass Issue(Comment):\n    STATE_VALUES = frozenset([\"open\", \"addressed\", \"resolved\"])\n\n    @property\n    def type(self):\n        return \"issue\"\n\n    @property\n    def state(self):\n        \"\"\"The issue's state\n\n           The state is one of the strings \"open\", \"addressed\" or \"resolved\".\"\"\"\n        return self._impl.state\n\n    @property\n    def addressed_by(self):\n        \"\"\"The commit that addressed the issue, or None\n\n           The value is an api.commit.Commit object, or None if the issue's\n           state is not \"addressed\".\"\"\"\n        return self._impl.getAddressedBy(self.critic)\n\n    @property\n    def resolved_by(self):\n        \"\"\"The user that resolved the issue, or None\n\n           The value is an api.user.User object, or None if the issue's state is\n           not \"resolved\".\"\"\"\n        return self._impl.getResolvedBy(self.critic)\n\n    class DraftChanges(Comment.DraftChanges):\n        \"\"\"Draft changes to the issue\"\"\"\n\n        def __init__(self, author, is_draft, reply, new_type, new_state,\n                     new_location):\n            super(Issue.DraftChanges, self).__init__(\n                author, is_draft, reply, new_type)\n            self.__new_state = new_state\n            self.__new_location = new_location\n\n        
@property\n        def new_state(self):\n            \"\"\"The issue's new state\n\n               The new state is returned as a string, or None if the current\n               user has not resolved or reopened the issue. Issue.STATE_VALUES\n               defines the set of possible return values.\"\"\"\n            return self.__new_state\n\n        @property\n        def new_location(self):\n            \"\"\"The issue's new location\n\n               The new location is returned as a FileVersionLocation object, or\n               None if the issue has not been reopened, or if it was manually\n               resolved rather than addressed and did not need to be relocated\n               when being reopened.\n\n               Since only issues in file version locations can be addressed,\n               that is the only possible type of new location.\"\"\"\n            return self.__new_location\n\nclass Note(Comment):\n    @property\n    def type(self):\n        return \"note\"\n\nclass Location(api.APIObject):\n    TYPE_VALUES = frozenset([\"general\", \"commit-message\", \"file-version\"])\n\n    def __len__(self):\n        \"\"\"Return the length of the location, in lines\"\"\"\n        return (self.last_line - self.first_line) + 1\n\n    @property\n    def type(self):\n        \"\"\"The location's type\n\n           The type is one of \"commit-message\" and \"file-version\".\"\"\"\n        pass\n\n    @property\n    def first_line(self):\n        \"\"\"The line number of the first commented line\n\n           Note that line numbers are one-based.\"\"\"\n        return self._impl.first_line\n\n    @property\n    def last_line(self):\n        \"\"\"The line number of the last commented line\n\n           Note that line numbers are one-based.\"\"\"\n        return self._impl.last_line\n\nclass CommitMessageLocation(Location):\n    @property\n    def type(self):\n        return \"commit-message\"\n\n    @property\n    def commit(self):\n        \"\"\"The 
commit whose message was commented\"\"\"\n        return self._impl.getCommit(self.critic)\n\n    @staticmethod\n    def make(critic, first_line, last_line, commit):\n        return api.impl.comment.makeCommitMessageLocation(\n            critic, first_line, last_line, commit)\n\nclass FileVersionLocation(Location):\n    @property\n    def type(self):\n        return \"file-version\"\n\n    @property\n    def changeset(self):\n        \"\"\"The changeset containing the comment\n\n           The changeset is returned as an api.changeset.Changeset object.\n\n           If the comment was created while looking at a diff, this will\n           initially be that changeset. As additional commits are added to the\n           review, this changeset may be \"extended\" to contain those added\n           commits.\n\n           This is the ideal changeset to use to display the comment, unless it\n           is an issue that has been addressed, in which case a better changeset\n           would be the diff of the commit returned by Issue.addressed_by.\n\n           If the user did not make the comment while looking at a diff but\n           rather while looking at a single version of the file, then this\n           attribute returns None.\n\n           If this is an object returned by translateTo() called with a\n           changeset argument, then this will be that changeset.\"\"\"\n        return self._impl.getChangeset(self.critic)\n\n    @property\n    def side(self):\n        \"\"\"The commented side (\"old\" or \"new\") of the changeset\n\n           If the user did not make the comment while looking at a changeset\n           (i.e. 
a diff) but rather while looking at a single version of the\n           file, then this attribute returns None.\"\"\"\n        return self._impl.side\n\n    @property\n    def commit(self):\n        \"\"\"The commit whose version of the file this location references\n\n           The commit is returned as an api.commit.Commit object.\n\n           If this is an object returned by translateTo() called with a commit\n           argument, then this is the commit that was given as an argument to\n           it. If this is the primary location of the comment (returned from\n           Comment.location) then this is the commit whose version of the file\n           the comment was originally made against, or None if the comment was\n           made while looking at a diff.\"\"\"\n        return self._impl.getCommit(self.critic)\n\n    @property\n    def file(self):\n        \"\"\"The commented file\"\"\"\n        return self._impl.getFile(self.critic)\n\n    @property\n    def is_translated(self):\n        \"\"\"True if this is a location returned by |translateTo()|\"\"\"\n        return self._impl.is_translated\n\n    def translateTo(self, changeset=None, commit=None):\n        \"\"\"Return a translated file version location, or None\n\n           The location is translated to the version of the file in a certain\n           commit. If |changeset| is not None, that commit is the changeset's\n           |to_commit|, unless the comment is not present there, and otherwise\n           the changeset's |from_commit|. If |commit| is not None, that's the\n           commit.\n\n           If the comment is not present in the commit, None is returned.\n\n           The returned object's |is_translated| will be True.\n\n           If the |changeset| argument is not None, then the returned object's\n           |changeset| will be that changeset, and its |side| will reflect which\n           of its |from_commit| and |to_commit| ended up being used. 
The\n           returned object's |commit| will be None.\n\n           If the |commit| argument is not None, the returned object's |commit|\n           will be that commit, and its |changeset| and |side| will be None.\"\"\"\n        assert changeset is None \\\n                or isinstance(changeset, api.changeset.Changeset)\n        assert commit is None or isinstance(commit, api.commit.Commit)\n        assert (changeset is None) != (commit is None)\n        return self._impl.translateTo(self.critic, changeset, commit)\n\n    @staticmethod\n    def make(critic, first_line, last_line, file, changeset=None, side=None,\n             commit=None):\n        # File is required.\n        assert isinstance(file, api.file.File)\n        # Changeset and side go together.\n        assert (changeset is None) == (side is None)\n        assert (changeset is None) \\\n                or isinstance(changeset, api.changeset.Changeset)\n        # Commit conflicts with changeset, but one is required.\n        assert (commit is None) != (changeset is None)\n        assert (commit is None) or isinstance(commit, api.commit.Commit)\n        return api.impl.comment.makeFileVersionLocation(\n            critic, first_line, last_line, file, changeset, side, commit)\n\ndef fetch(critic, comment_id):\n    \"\"\"Fetch the Comment object with the given id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(comment_id, int)\n    return api.impl.comment.fetch(critic, comment_id)\n\ndef fetchMany(critic, comment_ids):\n    \"\"\"Fetch multiple Comment objects with the given ids\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    comment_ids = list(comment_ids)\n    assert all(isinstance(comment_id, int) for comment_id in comment_ids)\n    return api.impl.comment.fetchMany(critic, comment_ids)\n\ndef fetchAll(critic, review=None, author=None, comment_type=None, state=None,\n             location_type=None, 
changeset=None, commit=None):\n    \"\"\"Fetch all Comment objects\n\n       If |review| is not None, only comments in the specified review are\n       returned.\n\n       If |author| is not None, only comments created by the specified user are\n       returned.\n\n       If |comment_type| is not None, only comments of the specified type are\n       returned.\n\n       If |state| is not None, only issues in the specified state are returned.\n       This implies type=\"issue\".\n\n       If |location_type| is not None, only issues in the specified type of\n       location are returned.\n\n       If |changeset| is not None, only comments against file versions that are\n       referenced by the specified changeset are returned. Must be combined with\n       |review|, and can not be combined with |commit|.\n\n       If |commit| is not None, only comments against the commit's message or\n       file versions referenced by the commit are returned. Must be combined\n       with |review|, and can not be combined with |changeset|.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert review is None or isinstance(review, api.review.Review)\n    assert author is None or isinstance(author, api.user.User)\n    assert comment_type is None or comment_type in Comment.TYPE_VALUES\n    assert state is None or state in Issue.STATE_VALUES\n    assert state is None or comment_type in (None, \"issue\")\n    assert location_type is None or location_type in Location.TYPE_VALUES\n    assert changeset is None or isinstance(changeset, api.changeset.Changeset)\n    assert changeset is None or review is not None\n    assert commit is None or isinstance(commit, api.commit.Commit)\n    assert commit is None or review is not None\n    assert changeset is None or commit is None\n    return api.impl.comment.fetchAll(critic, review, author, comment_type,\n                                     state, location_type, changeset, commit)\n"
  },
  {
    "path": "src/api/commit.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass CommitError(api.APIError):\n    pass\n\nclass InvalidCommitId(CommitError):\n    \"\"\"Raised when an invalid commit id is used\"\"\"\n    def __init__(self, commit_id):\n        super(InvalidCommitId, self).__init__(\n            \"Invalid commit id: %r\" % commit_id)\n\nclass InvalidSHA1(CommitError):\n    \"\"\"Raised when a given SHA-1 is invalid as a commit reference\"\"\"\n    def __init__(self, sha1):\n        super(InvalidSHA1, self).__init__(\"Invalid commit SHA-1: %r\" % sha1)\n        self.sha1 = sha1\n\nclass NotAFile(CommitError):\n    \"\"\"Raised when attempting to access a non-file as a file\"\"\"\n    def __init__(self, path):\n        super(NotAFile, self).__init__(\"Path is not a file: %s\" % path)\n        self.path = path\n\nclass Commit(api.APIObject):\n    \"\"\"Representation of a Git commit\"\"\"\n\n    def __str__(self):\n        return self.sha1\n    def __repr__(self):\n        return \"api.commit.Commit(sha1=%r)\" % self.sha1\n    def __hash__(self):\n        return hash(str(self))\n    def __eq__(self, other):\n        return str(self) == str(other)\n\n    @property\n    def id(self):\n        \"\"\"The commit's unique database id\"\"\"\n        return self._impl.getId(self.critic)\n\n    @property\n    def repository(self):\n        \"\"\"The 
repository containing the commit\"\"\"\n        return self._impl.repository\n\n    @property\n    def sha1(self):\n        \"\"\"The commit's full 40 character SHA-1\"\"\"\n        return self._impl.sha1\n\n    @property\n    def tree(self):\n        \"\"\"The SHA-1 of the tree object referenced by the commit\"\"\"\n        return self._impl.tree\n\n    @property\n    def summary(self):\n        \"\"\"The commit's single-line summary\n\n           This is the first line of the commit message, unless that line starts\n           with 'fixup!' or 'squash!', in which case the returned summary is the\n           first non-empty line after that, with '[fixup] ' or '[squash] '\n           prepended.  If there is no such non-empty line, the returned summary\n           is just '[fixup]' or '[squash]'.\"\"\"\n        return self._impl.getSummary()\n\n    @property\n    def message(self):\n        \"\"\"The commit's full commit message\"\"\"\n        return self._impl.message\n\n    @property\n    def parents(self):\n        \"\"\"The commit's parents\n\n           The return value is a list of api.commit.Commit objects.\"\"\"\n        return self._impl.getParents(self.critic)\n\n    @property\n    def description(self):\n        \"\"\"A string describing the commit in a \"friendly\" way, or None\n\n           This will typically be a tag name or a branch name; in the case of a\n           branch name, with \"tip of\" prepended if this commit is referenced\n           directly by that branch.\"\"\"\n        return self._impl.getDescription(self.critic)\n\n    class UserAndTimestamp(object):\n        \"\"\"Representation of the author or committer meta-data of a commit\"\"\"\n\n        def __init__(self, name, email, timestamp):\n            self.name = name\n            self.email = email\n            self.timestamp = timestamp\n\n    @property\n    def author(self):\n        \"\"\"The commit's \"author\" meta-data\"\"\"\n        return self._impl.getAuthor(self.critic)\n\n    
@property\n    def committer(self):\n        \"\"\"The commit's \"committer\" meta-data\"\"\"\n        return self._impl.getCommitter(self.critic)\n\n    def isAncestorOf(self, commit):\n        \"\"\"Return True if |self| is an ancestor of |commit|\n\n           Also return True if |self| is |commit|, meaning a commit is\n           considered an ancestor of itself.\"\"\"\n        assert isinstance(commit, Commit)\n        return self._impl.isAncestorOf(commit)\n\n    class FileInformation(object):\n        \"\"\"Basic information about a file in a commit\"\"\"\n\n        def __init__(self, file, mode, sha1, size):\n            self.__file = file\n            self.__mode = mode\n            self.__sha1 = sha1\n            self.__size = size\n\n        @property\n        def file(self):\n            \"\"\"The represented file\"\"\"\n            return self.__file\n\n        @property\n        def mode(self):\n            \"\"\"The file's \"UNIX\" mode as an integer\"\"\"\n            # FIXME: Should we use an integer sub-class, like\n            #        gitutils.Tree.Entry.Mode?\n            return self.__mode\n\n        @property\n        def sha1(self):\n            \"\"\"The file content's SHA-1\"\"\"\n            return self.__sha1\n\n        @property\n        def size(self):\n            \"\"\"The file's size in bytes\"\"\"\n            return self.__size\n\n    def getFileInformation(self, file):\n        \"\"\"Look up information about a file in the commit\n\n           The entry is returned as a Commit.FileInformation object, or None if\n           the path was not found in the commit's tree. If the path is found but\n           is not a blob (e.g. 
because it's a directory), NotAFile is raised.\"\"\"\n        assert isinstance(file, api.file.File)\n        return self._impl.getFileInformation(file)\n\n    def getFileContents(self, file):\n        \"\"\"Fetch the blob (contents) of a file in the commit\n\n           The return value is a string, or None if the path was not found in\n           the commit's tree. If the path is found but is not a blob\n           (e.g. because it is a directory), NotAFile is raised.\"\"\"\n        assert isinstance(file, api.file.File)\n        return self._impl.getFileContents(file)\n\n    def getFileLines(self, file):\n        \"\"\"Fetch the lines of a file in the commit\n\n           Much like getFileContents(), but splits the returned string into a\n           list of strings in a consistent way that matches how other parts of\n           Critic treat line breaks, and is thus compatible with stored line\n           numbers.\n\n           Note: commit.getFileContents(...).splitlines() is *not* correct!\"\"\"\n        assert isinstance(file, api.file.File)\n        return self._impl.getFileLines(file)\n\ndef fetch(repository, commit_id=None, sha1=None, ref=None):\n    \"\"\"Fetch a Git commit from the given repository\n\n       The commit can be identified by its unique (internal) database id, its\n       SHA-1 (full 40 character string) or by an arbitrary ref that resolves to\n       a commit object (possibly via tag objects) when given to the\n       'git rev-parse' command.\"\"\"\n    import api.impl\n    assert isinstance(repository, api.repository.Repository)\n    assert (ref is None) != ((commit_id is None) and (sha1 is None))\n    return api.impl.commit.fetch(repository, commit_id, sha1, ref)\n\ndef fetchMany(repository, commit_ids=None, sha1s=None):\n    \"\"\"Fetch multiple Git commits from the given repository\n\n       The commits can be identified by their unique (internal) database ids, or\n       by their SHA-1s (full 40 character strings.)\"\"\"\n    import re\n 
   import api.impl\n    assert isinstance(repository, api.repository.Repository)\n    assert (commit_ids is None) != (sha1s is None)\n    if commit_ids is not None:\n        commit_ids = list(commit_ids)\n        assert all(isinstance(commit_id, int) for commit_id in commit_ids)\n    else:\n        re_sha1 = re.compile(\"^[0-9A-Fa-f]{40}$\")\n        sha1s = list(sha1s)\n        assert all(isinstance(sha1, str) and re_sha1.match(sha1)\n                   for sha1 in sha1s)\n    return api.impl.commit.fetchMany(repository, commit_ids, sha1s)\n"
  },
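The 'fixup!'/'squash!' summary rule documented for `Commit.summary` above can be expressed as standalone logic. The function below is a hypothetical re-implementation for illustration only; Critic's real `getSummary()` lives in the impl layer and is not shown in this file.

```python
# Hypothetical sketch of the summary rule in Commit.summary's docstring;
# not Critic's actual implementation.
def commit_summary(message):
    lines = message.splitlines()
    if not lines:
        return ""
    first = lines[0]
    for prefix, tag in (("fixup!", "[fixup]"), ("squash!", "[squash]")):
        if first.startswith(prefix):
            # Use the first non-empty line after the 'fixup!'/'squash!' line.
            for line in lines[1:]:
                if line.strip():
                    return "%s %s" % (tag, line.strip())
            # No such line: just the tag.
            return tag
    return first
```

For example, `commit_summary("fixup! other subject\n\nActual change\n")` yields `"[fixup] Actual change"` per the docstring's rule.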
  {
    "path": "src/api/commitset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass InvalidCommitRange(Exception):\n    \"\"\"Raised by calculateFromRange() when the range is not simple\n\n       Simple in this context means that the commit that defines the start of\n       the range is an ancestor of the commit that defines the end of the range,\n       and all commits in-between.\"\"\"\n    pass\n\nclass CommitSet(api.APIObject):\n    \"\"\"Representation of a set of Commit objects\"\"\"\n\n    def __iter__(self):\n        return iter(self._impl)\n    def __len__(self):\n        return len(self._impl)\n    def __contains__(self, item):\n        return item in self._impl\n    def __hash__(self):\n        return hash(self._impl)\n    def __eq__(self, other):\n        assert isinstance(other, CommitSet)\n        return self._impl == other._impl\n    def __nonzero__(self):\n        return len(self._impl) != 0\n    def __repr__(self):\n        return \"api.commitset.CommitSet(%r)\" % list(self.topo_ordered)\n\n    @property\n    def date_ordered(self):\n        \"\"\"The commits in the set in (commit) timestamp order\n\n           The return value is a generator producing api.commit.Commit objects.\n           Commits are guaranteed to precede their parents, even if the actual\n           commit timestamp order is the opposite.\"\"\"\n        return 
self._impl.getDateOrdered()\n\n    @property\n    def topo_ordered(self):\n        \"\"\"The commits in the set in \"topological\" order\n\n           The return value is a generator producing api.commit.Commit objects.\n           Commits are guaranteed to precede their parents, and as far as\n           possible immediately precede their parent.\n\n           It is only valid to call this function on commit sets with a single\n           head (those whose 'heads' attribute returns a set of length 1.)\"\"\"\n        assert not self or len(self.heads) == 1\n        return self._impl.getTopoOrdered()\n\n    @property\n    def heads(self):\n        \"\"\"The head commits of the set\n\n           The return value is a frozenset of Commit objects.\n\n           A \"head commit\" is defined as any commit in the set that is\n           not an immediate parent of another commit in the set.\"\"\"\n        return self._impl.heads\n\n    @property\n    def tails(self):\n        \"\"\"The tail commits of the set\n\n           The return value is a frozenset of Commit objects.\n\n           A \"tail commit\" is defined as any commit that is a parent of\n           a commit in the set but isn't itself in the set.\"\"\"\n        return self._impl.tails\n\n    @property\n    def filtered_tails(self):\n        \"\"\"The filtered tail commits of the set\n\n           The return value is a frozenset of Commit objects.\n\n           The returned set will contain each tail commit that isn't an ancestor\n           of another tail commit of the set. 
If the tail commits of the set are\n           all different commits on an upstream branch, then this will only\n           return the latest one.\"\"\"\n\n        return self._impl.getFilteredTails(self.critic)\n\n    def getChildrenOf(self, commit):\n        \"\"\"Return the commits in the set that are children of the commit\n\n           The return value is a set of Commit objects.\"\"\"\n        assert isinstance(commit, api.commit.Commit)\n        return self._impl.getChildrenOf(commit)\n\n    def getParentsOf(self, commit):\n        \"\"\"Return the intersection of the commit's parents and the set\n\n           The return value is a list of Commit objects, in the same\n           order as in \"commit.parents\".\"\"\"\n        assert isinstance(commit, api.commit.Commit)\n        return self._impl.getParentsOf(commit)\n\n    def getDescendantsOf(self, commit, include_self=False):\n        \"\"\"Return the intersection of the commit's descendants and the set\n\n           The return value is another CommitSet object.  If 'include_self' is\n           True, the commit itself is included in the returned set.\n\n           The argument can also be an iterable, in which case the returned set\n           is the union of the sets that would be returned for each commit in\n           the iterable.\"\"\"\n        if isinstance(commit, api.commit.Commit):\n            commits = [commit]\n        else:\n            commits = list(commit)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        assert all(commit in self or commit in self.tails for commit in commits)\n        return self._impl.getDescendantsOf(commits, include_self)\n\n    def getAncestorsOf(self, commit, include_self=False):\n        \"\"\"Return the intersection of the commit's ancestors and the set\n\n           The return value is another CommitSet object.  
If 'include_self' is\n           True, the commit itself is included in the returned set.\n\n           The argument can also be an iterable, in which case the returned set\n           is the union of the sets that would be returned for each commit in\n           the iterable.\"\"\"\n        if isinstance(commit, api.commit.Commit):\n            commits = [commit]\n        else:\n            commits = list(commit)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        return self._impl.getAncestorsOf(commits, include_self)\n\n    def union(self, commits):\n        commits = set(commits)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        return self._impl.union(self.critic, commits)\n    def __or__(self, commits):\n        return self.union(commits)\n\n    def intersection(self, commits):\n        commits = set(commits)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        return self._impl.intersection(self.critic, commits)\n    def __and__(self, commits):\n        return self.intersection(commits)\n\n    def difference(self, commits):\n        commits = set(commits)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        return self._impl.difference(self.critic, commits)\n    def __sub__(self, commits):\n        return self.difference(commits)\n\n    def symmetric_difference(self, commits):\n        commits = set(commits)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n        return self._impl.symmetric_difference(self.critic, commits)\n    def __xor__(self, commits):\n        return self.symmetric_difference(commits)\n\ndef create(critic, commits):\n    \"\"\"Create a CommitSet object from an iterable of Commit objects\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    if not isinstance(commits, CommitSet):\n        commits = list(commits)\n        assert 
all(isinstance(commit, api.commit.Commit) for commit in commits)\n    return api.impl.commitset.create(critic, commits)\n\ndef calculateFromRange(critic, from_commit, to_commit):\n    \"\"\"Calculate a set of commits from a commit range\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(from_commit, api.commit.Commit)\n    assert isinstance(to_commit, api.commit.Commit)\n    assert from_commit.repository == to_commit.repository\n    return api.impl.commitset.calculateFromRange(critic, from_commit, to_commit)\n"
  },
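The "heads" and "tails" definitions in the CommitSet docstrings above can be illustrated with a small standalone sketch, where plain strings and a parent mapping stand in for Commit objects and Critic's internal representation:

```python
# Illustrative sketch of CommitSet.heads / CommitSet.tails, using a dict
# mapping each commit in the set to its list of parents (a hypothetical
# stand-in for Critic's internal representation).
def heads_and_tails(parents_of):
    members = set(parents_of)
    parents = {p for ps in parents_of.values() for p in ps}
    # Heads: members that are not a parent of any other member.
    heads = frozenset(members - parents)
    # Tails: parents of members that are not themselves members.
    tails = frozenset(parents - members)
    return heads, tails
```

For a linear history `a <- b <- c` restricted to the set `{b, c}`, the single head is `c` and the single tail is `a`.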
  {
    "path": "src/api/config.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ConfigurationError(api.APIError):\n    pass\n\nclass InvalidGroup(ConfigurationError):\n    def __init__(self, name):\n        super(InvalidGroup, self).__init__(\n            \"Invalid configuration group: %s\" % name)\n\nclass InvalidKey(ConfigurationError):\n    def __init__(self, group, name):\n        super(InvalidKey, self).__init__(\n            \"Invalid configuration key: %s::%s\" % (group, name))\n\nclass WrongType(ConfigurationError):\n    def __init__(self, group, name, read_as):\n        super(WrongType, self).__init__(\n            \"Wrong type: %s::%s (read as %s)\" % (group, name, read_as))\n\ndef getValue(group, key):\n    import configuration\n    if not hasattr(configuration, group):\n        raise InvalidGroup(group)\n    module = getattr(configuration, group)\n    if not hasattr(module, key):\n        raise InvalidKey(group, key)\n    return getattr(module, key)\n\ndef getBoolean(group, key):\n    value = getValue(group, key)\n    if not isinstance(value, bool):\n        raise WrongType(group, key, \"boolean\")\n    return value\n\ndef getInteger(group, key):\n    value = getValue(group, key)\n    if not isinstance(value, int):\n        raise WrongType(group, key, \"integer\")\n    return value\n\ndef getString(group, key):\n    
value = getValue(group, key)\n    if not isinstance(value, basestring):\n        raise WrongType(group, key, \"string\")\n    return value\n"
  },
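The lookup-then-type-check pattern used by `getValue()` and the typed getters above can be sketched against a plain nested dict. The group and key names below are invented for illustration; the real values come from Critic's `configuration` package.

```python
# Hypothetical stand-in for the 'configuration' package.
CONFIG = {"smtp": {"port": 25, "hostname": "localhost"}}

def get_typed(group, key, expected_type, read_as):
    # Mirrors getValue(): invalid groups and keys raise distinct errors.
    if group not in CONFIG:
        raise KeyError("Invalid configuration group: %s" % group)
    if key not in CONFIG[group]:
        raise KeyError("Invalid configuration key: %s::%s" % (group, key))
    value = CONFIG[group][key]
    # Mirrors getBoolean()/getInteger()/getString(): reject wrong types.
    if not isinstance(value, expected_type):
        raise TypeError("Wrong type: %s::%s (read as %s)" % (group, key, read_as))
    return value
```

Note one subtlety shared with the real `getInteger()`: since `bool` is a subclass of `int` in Python, a boolean value passes an `int` type check.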
  {
    "path": "src/api/critic.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass Critic(object):\n    def __init__(self, impl):\n        self._impl = impl\n\n    @property\n    def effective_user(self):\n        return self._impl.getEffectiveUser(self)\n\n    @property\n    def actual_user(self):\n        return self._impl.actual_user\n\n    @property\n    def access_token(self):\n        \"\"\"Access token used to authenticate\"\"\"\n        return self._impl.access_token\n\n    @property\n    def database(self):\n        return self._impl.database\n\n    def getDatabaseCursor(self):\n        \"\"\"Return a read-only database cursor object\n\n           This cursor object can only be used to execute SELECT queries.\"\"\"\n        return self._impl.database.readonly_cursor()\n\n    def getUpdatingDatabaseCursor(self, *tables):\n        \"\"\"Return a database cursor for updating\n\n           The return value is a \"context manager\", which returns the actual\n           cursor object when entered and either commits or rolls back the\n           current transaction when exited.  
The actual cursor object can only\n           be used to update the tables specified as arguments, using INSERT,\n           UPDATE or DELETE queries.\n\n           The cursor object can also be used to execute SELECT queries (against\n           any tables.)\"\"\"\n        return self._impl.database.updating_cursor(*tables)\n\n    def setActualUser(self, user):\n        assert isinstance(user, api.user.User)\n        assert self._impl.actual_user is None\n        self._impl.actual_user = user\n\n    def setAccessToken(self, access_token):\n        \"\"\"Set the access token used to authenticate\"\"\"\n        assert self._impl.access_token is None\n        self._impl.access_token = access_token\n\ndef startSession(for_user=False, for_system=False, for_testing=False):\n    import api.impl\n    assert sum((for_user, for_system, for_testing)) == 1\n    return api.impl.critic.startSession(for_user, for_system, for_testing)\n"
  },
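The commit-or-rollback contract that `getUpdatingDatabaseCursor()` describes can be sketched with `contextlib`, using an sqlite3 connection as a stand-in for Critic's real database handle (which additionally enforces the per-table update restrictions that this sketch omits):

```python
import contextlib
import sqlite3

@contextlib.contextmanager
def updating_cursor(connection):
    # Sketch of the described contract: enter yields a cursor; a clean
    # exit commits the transaction, an exception rolls it back.
    cursor = connection.cursor()
    try:
        yield cursor                 # caller runs INSERT/UPDATE/DELETE here
    except Exception:
        connection.rollback()        # any exception undoes the transaction
        raise
    else:
        connection.commit()          # a clean exit commits it
```

Usage follows the "context manager" shape the docstring describes: `with updating_cursor(db) as cursor: cursor.execute(...)`.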
  {
    "path": "src/api/extension.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ExtensionError(api.APIError):\n    \"\"\"Base exception for all errors related to the Extension class\"\"\"\n    pass\n\nclass InvalidExtensionId(ExtensionError):\n    \"\"\"Raised when an invalid extension id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidExtensionId, self).__init__(\n            \"Invalid extension id: %r\" % value)\n        self.value = value\n\nclass InvalidExtensionKey(ExtensionError):\n    \"\"\"Raised when an invalid extension key is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidExtensionKey, self).__init__(\n            \"Invalid extension key: %r\" % value)\n        self.value = value\n\nclass Extension(api.APIObject):\n    \"\"\"Representation of a Critic extension\"\"\"\n\n    @property\n    def id(self):\n        \"\"\"The extension's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def name(self):\n        \"\"\"The extension's name\"\"\"\n        return self._impl.name\n\n    @property\n    def key(self):\n        \"\"\"The extension's unique key\n\n           For a system extension, the key is the extension's name.  
For other\n           extensions, the key is the publisher's username followed by a slash\n           followed by the extension's name.\"\"\"\n        return self._impl.getKey(self.critic)\n\n    @property\n    def publisher(self):\n        \"\"\"The extension's publisher\n\n           The user that published the extension.  This may not be the author\n           (who may not be a user of this Critic system.)\n\n           None if this is a system extension.\"\"\"\n        return self._impl.getPublisher(self.critic)\n\n    @property\n    def default_version(self):\n        \"\"\"The default extension version\n\n           This is typically the version whose extension description and other\n           metadata should be presented as the extension's true metadata.\"\"\"\n        return self._impl.getDefaultVersion()\n\ndef fetch(critic, extension_id=None, key=None):\n    \"\"\"Fetch an Extension object with the given extension id or key\n\n       Exactly one of the 'extension_id' and 'key' arguments can be used.\n\n       Exceptions:\n\n         InvalidExtensionId: if 'extension_id' is used and is not a valid\n                             extension id.\n         InvalidExtensionKey: if 'key' is used and is not a valid extension\n                              key.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (extension_id is None) != (key is None)\n    return api.impl.extension.fetch(critic, extension_id, key)\n\ndef fetchAll(critic, publisher=None, installed_by=None):\n    \"\"\"Fetch Extension objects for all extensions in the system\n\n       If 'publisher' is not None, it must be an api.user.User object, and only\n       extensions published by this user are returned.\n\n       If 'installed_by' is not None, it must be an api.user.User object, and\n       only extensions that this user has installed are returned.  This may\n       include extensions that are universally installed (i.e. 
installed for all\n       users, and not by this user directly.)\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert publisher is None or isinstance(publisher, api.user.User)\n    assert installed_by is None or isinstance(installed_by, api.user.User)\n    return api.impl.extension.fetchAll(critic, publisher, installed_by)\n"
  },
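The key rule described for `Extension.key` above is simple enough to state as code. This is a hypothetical helper written for illustration, not part of the API; the extension and user names in the example are invented.

```python
# Hypothetical helper expressing the Extension.key rule: system extensions
# use the bare name; published extensions use "<publisher>/<name>".
def extension_key(name, publisher_username=None):
    if publisher_username is None:
        # System extension: the key is just the extension's name.
        return name
    return "%s/%s" % (publisher_username, name)
```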
  {
    "path": "src/api/file.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass FileError(api.APIError):\n    pass\n\nclass InvalidFileId(FileError):\n    \"\"\"Raised when an invalid file id is used.\"\"\"\n\n    def __init__(self, file_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidFileId, self).__init__(\n            \"Invalid file id: %d\" % file_id)\n\nclass InvalidPath(FileError):\n    \"\"\"Raised when an invalid path is used.\"\"\"\n\n    def __init__(self, path):\n        \"\"\"Constructor\"\"\"\n        super(InvalidPath, self).__init__(\n            \"Invalid path: %s\" % path)\n\nclass File(api.APIObject):\n    def __str__(self):\n        return self.path\n\n    @property\n    def id(self):\n        \"\"\"The path's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def path(self):\n        \"\"\"The path\"\"\"\n        return self._impl.path\n\ndef fetch(critic, file_id=None, path=None, create=False):\n    \"\"\"Fetch a \"file\" (file id / path mapping)\n\n       If a path is used, and |create| is True, a mapping is created if one\n       didn't already exist.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (file_id is None) != (path is None)\n    if file_id is not None:\n        file_id = int(file_id)\n    if path is not None:\n        path = str(path)\n    assert 
isinstance(create, bool)\n    return api.impl.file.fetch(critic, file_id, path, create)\n\ndef fetchMany(critic, file_ids=None, paths=None, create=False):\n    \"\"\"Fetch multiple \"files\" (file id / path mappings)\n\n       If paths are used, and |create| is True, a mapping is created if one\n       didn't already exist.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (file_ids is None) != (paths is None)\n    if file_ids is not None:\n        file_ids = [int(file_id) for file_id in file_ids]\n    if paths is not None:\n        paths = [str(path) for path in paths]\n    assert isinstance(create, bool)\n    return api.impl.file.fetchMany(critic, file_ids, paths, create)\n"
  },
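The id/path mapping semantics of `fetch()` above, including the `create=True` behaviour, can be mimicked with an in-memory table. This is a toy model for illustration; the real mapping is stored in Critic's database.

```python
# Toy in-memory model of the file id / path mapping behind api.file.fetch().
class FileTable(object):
    def __init__(self):
        self._id_by_path = {}
        self._path_by_id = {}

    def fetch(self, path, create=False):
        if path not in self._id_by_path:
            if not create:
                raise KeyError("Invalid path: %s" % path)
            # Assign the next id, as a database sequence would.
            file_id = len(self._id_by_path) + 1
            self._id_by_path[path] = file_id
            self._path_by_id[file_id] = path
        return self._id_by_path[path]
```

Fetching an unknown path without `create=True` fails; with it, the mapping is created once and subsequent lookups return the same id.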
  {
    "path": "src/api/filechange.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass FileChangeError(api.APIError):\n    pass\n\nclass InvalidFileChangeId(FileChangeError):\n    def __init__(self, changeset_id, file_id):\n        super(InvalidFileChangeId, self).__init__(\n            \"Invalid file change id: %d:%d\" % (changeset_id, file_id))\n\nclass FileChange(api.APIObject):\n    \"\"\"Representation of the changes to a file introduced by a changeset\"\"\"\n\n    def __hash__(self):\n        return hash((self.changeset, self.file))\n    def __eq__(self, other):\n        return self.changeset == other.changeset and self.file == other.file\n\n    @property\n    def file(self):\n        return self._impl.getFile(self.critic)\n\n    @property\n    def changeset(self):\n        return self._impl.changeset\n\n    @property\n    def old_sha1(self):\n        return self._impl.old_sha1\n\n    @property\n    def old_mode(self):\n        return self._impl.old_mode\n\n    @property\n    def new_sha1(self):\n        return self._impl.new_sha1\n\n    @property\n    def new_mode(self):\n        return self._impl.new_mode\n\ndef fetch(critic, changeset, file):\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(changeset, api.changeset.Changeset)\n    assert isinstance(file, api.file.File)\n    return api.impl.filechange.fetch(critic, changeset, file)\n\ndef fetchAll(critic, changeset):\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(changeset, api.changeset.Changeset)\n    return api.impl.filechange.fetchAll(critic, changeset)\n"
  },
  {
    "path": "src/api/filecontent.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass FilecontentError(api.APIError):\n    pass\n\nclass Filecontent(api.APIObject):\n    \"\"\"Representation of a file's content\"\"\"\n\n    def getLines(self, first_row=None, last_row=None):\n        assert first_row is None or isinstance(first_row, int)\n        assert last_row is None or isinstance(last_row, int)\n\n        return self._impl.getLines(first_row, last_row)\n\nclass Line(object):\n    \"\"\"Representation of a line from some version of a file\"\"\"\n    def __init__(self, parts, offset):\n        self.__parts = parts\n        self.__offset = offset\n\n    @property\n    def parts(self):\n        return self.__parts\n\n    @property\n    def offset(self):\n        return self.__offset\n\ndef fetch(critic, repository, blob_sha1, file_obj):\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(repository, api.repository.Repository)\n    assert isinstance(blob_sha1, str)\n    assert isinstance(file_obj, api.file.File)\n\n    return api.impl.filecontent.fetch(critic, repository, blob_sha1, file_obj)\n"
  },
  {
    "path": "src/api/filediff.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass FilediffError(api.APIError):\n    pass\n\nclass FilediffParserError(api.APIError):\n    pass\n\nclass FilediffDelayed(api.ResultDelayedError):\n    pass\n\nclass Filediff(api.APIObject):\n    \"\"\"Representation of the source code for a file in a changeset\n\n       A filediff has a list of macro chunks, where each macro chunk represents\n       a partition of a file.\"\"\"\n\n    def __hash__(self):\n        return hash((\"filediff\", self.filechange))\n    def __eq__(self, other):\n        return self.filechange == other.filechange\n\n    @property\n    def filechange(self):\n        return self._impl.filechange\n\n    @property\n    def old_count(self):\n        return self._impl.old_count\n\n    @property\n    def new_count(self):\n        return self._impl.new_count\n\n    def getMacroChunks(self, context_lines, comments=None, ignore_chunks=False):\n        assert isinstance(context_lines, int)\n        if comments is not None:\n            comments = list(comments)\n            assert all(isinstance(comment, api.comment.Comment)\n                       for comment in comments)\n        assert isinstance(ignore_chunks, bool)\n        return self._impl.getMacroChunks(\n            self.critic, context_lines, comments, ignore_chunks)\n\nclass MacroChunk(object):\n    
\"\"\"Representation of a partition of a file\n\n       A macro chunk contains all lines in the range from the first to the last.\n       In other words, if a line is between the first and last line of this\n       macro chunk, it will be included in this macro chunk.\n\n       A macro chunk also contains old and new offsets and counts, which\n       describe where in the file the lines are from, as well as how many are on\n       each side. The two sides represent the old and new version of the file,\n       where the old version is what the file looked like just before the first\n       (earliest) commit of the changeset, and the new version is what the file\n       looked like just after the last (latest) commit of the changeset.\"\"\"\n\n    def __init__(self, impl_macro_chunk):\n        self.__impl = impl_macro_chunk\n\n    @property\n    def chunks(self):\n        return self.__impl.legacy_macro_chunk.chunks\n\n    @property\n    def old_offset(self):\n        return self.__impl.legacy_macro_chunk.old_offset\n\n    @property\n    def new_offset(self):\n        return self.__impl.legacy_macro_chunk.new_offset\n\n    @property\n    def old_count(self):\n        return self.__impl.legacy_macro_chunk.old_count\n\n    @property\n    def new_count(self):\n        return self.__impl.legacy_macro_chunk.new_count\n\n    @property\n    def lines(self):\n        return self.__impl.getLines()\n\nclass Line(object):\n    \"\"\"Representation of a line of a file\n\n       A line represents a change from the old version of a file, to the new\n       version of a file.\n\n       A line has a type, which is one of the following:\n         CONTEXT\n         DELETED\n         MODIFIED\n         REPLACED\n         INSERTED\n         WHITESPACE\n         CONFLICT\n\n       The type of the line describes how the line changed.\n       \"\"\"\n\n    def __init__(self, impl_line):\n        self.__impl = impl_line\n\n    @property\n    def type(self):\n        return 
self.__impl.legacy_line.type\n\n    @property\n    def old_offset(self):\n        return self.__impl.legacy_line.old_offset\n\n    @property\n    def new_offset(self):\n        return self.__impl.legacy_line.new_offset\n\n    @property\n    def content(self):\n        return self.__impl.getContent()\n\n    @property\n    def is_whitespace(self):\n        return self.__impl.is_whitespace\n\n    @property\n    def analysis(self):\n        return self.__impl.analysis\n\n    @property\n    def type_string(self):\n        return self.__impl.type_string()\n\nclass Part(object):\n    \"\"\"Representation of a part of a line of code\n\n       A part has a type, which describes what kind of content it contains.\n       It can also have a state, meaning the part is either something that was\n       removed (in the old version of a file), or added (in the new version of\n       a file).\n\n       A part also has some content, which is typically a word (e.g. for, in, if)\n       or an operator (e.g. =, !=, [, ]).\"\"\"\n\n    def __init__(self, impl_part):\n        self.__impl = impl_part\n\n    @property\n    def type(self):\n        return self.__impl.type\n\n    @property\n    def content(self):\n        return self.__impl.content\n\n    @property\n    def state(self):\n        return self.__impl.state\n\ndef fetch(critic, filechange):\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(filechange, api.filechange.FileChange), filechange\n\n    return api.impl.filediff.fetch(critic, filechange)\n\ndef fetchAll(critic, changeset):\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(changeset, api.changeset.Changeset)\n\n    return api.impl.filediff.fetchAll(critic, changeset)\n"
  },
  {
    "path": "src/api/filters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass FilterError(api.APIError):\n    \"\"\"Base exception for all errors related to the Filter classes.\"\"\"\n    pass\n\nclass Filter(api.APIObject):\n    \"\"\"Base class of RepositoryFilter and ReviewFilter\"\"\"\n\n    @property\n    def subject(self):\n        \"\"\"The filter's subject\n\n           The subject is the user that the filter applies to.\"\"\"\n        return self._impl.getSubject(self.critic)\n\n    @property\n    def type(self):\n        \"\"\"The filter's type\n\n           The type is always one of \"reviewer\", \"watcher\" and \"ignore\".\"\"\"\n        return self._impl.type\n\n    @property\n    def path(self):\n        \"\"\"The filter's path\"\"\"\n        return self._impl.path\n\nclass InvalidRepositoryFilterId(FilterError):\n    \"\"\"Raised when an invalid repository filter id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRepositoryFilterId, self).__init__(\n            \"Invalid repository filter id: %r\" % value)\n        self.value = value\n\nclass RepositoryFilter(Filter):\n    \"\"\"Representation of a repository filter\n\n       A repository filter is a filter that applies to all reviews in a\n       repository.\"\"\"\n\n    @property\n    def id(self):\n        \"\"\"The repository 
filter's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def repository(self):\n        \"\"\"The repository filter's repository\"\"\"\n        return self._impl.getRepository(self.critic)\n\n    @property\n    def delegates(self):\n        \"\"\"The repository filter's delegates, or None\n\n           The delegates are returned as a frozenset of api.user.User objects.\n           If the filter's type is not \"reviewer\", this attribute's value is\n           None.\"\"\"\n        return self._impl.getDelegates(self.critic)\n\ndef fetchRepositoryFilter(critic, filter_id):\n    \"\"\"Fetch a RepositoryFilter object with the given filter id\"\"\"\n    assert isinstance(critic, api.critic.Critic)\n    return api.impl.filters.fetchRepositoryFilter(critic, int(filter_id))\n\nclass ReviewFilter(Filter):\n    \"\"\"Representation of a review filter\n\n       A review filter is a filter that applies to a single review only.\"\"\"\n\n    @property\n    def id(self):\n        \"\"\"The review filter's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def review(self):\n        \"\"\"The review filter's review\"\"\"\n        return self._impl.getReview(self.critic)\n\n    @property\n    def creator(self):\n        \"\"\"The review filter's creator\n\n           This is the user that created the review filter, which can be\n           different from the filter's subject.\"\"\"\n        return self._impl.getCreator(self.critic)\n"
  },
  {
    "path": "src/api/impl/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n\"\"\"Critic API implementation\"\"\"\n\nimport critic\nimport user\nimport repository\nimport filters\nimport branch\nimport commit\nimport commitset\nimport changeset\nimport filechange\nimport filediff\nimport filecontent\nimport review\nimport reviewsummary\nimport log\nimport accesstoken\nimport accesscontrolprofile\nimport labeledaccesscontrolprofile\nimport extension\nimport file\nimport comment\nimport reply\nimport batch\nimport reviewablefilechange\n"
  },
  {
    "path": "src/api/impl/accesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\nimport dbutils\n\npublic_class = api.accesscontrolprofile.AccessControlProfile\n\nHTTPException = public_class.HTTPException\nRepositoryException = public_class.RepositoryException\nExtensionException = public_class.ExtensionException\n\nclass AccessControlProfile(apiobject.APIObject):\n    wrapper_class = api.accesscontrolprofile.AccessControlProfile\n\n    def __init__(self, profile_id, title, token_id, *rules):\n        self.id = profile_id\n        self.title = title\n        self.__token_id = token_id\n        (self.http_rule,\n         self.repositories_rule,\n         self.extensions_rule) = rules\n\n    def getAccessToken(self, critic):\n        if self.__token_id is None:\n            return None\n        return api.accesstoken.fetch(critic, self.__token_id)\n\n    def getHTTP(self, critic):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id, request_method, path_pattern\n                            FROM accesscontrol_http\n                           WHERE profile=%s\n                        ORDER BY id ASC\"\"\",\n                       (self.id,))\n        return public_class.Category(\n            self.http_rule,\n            [HTTPException(exception_id, request_method, path_pattern)\n             for exception_id, 
request_method, path_pattern in cursor])\n\n    def getRepositories(self, critic):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id, access_type, repository\n                            FROM accesscontrol_repositories\n                           WHERE profile=%s\n                        ORDER BY id ASC\"\"\",\n                       (self.id,))\n        return public_class.Category(\n            self.repositories_rule,\n            [RepositoryException(exception_id, access_type,\n                                 api.repository.fetch(critic, repository_id)\n                                 if repository_id is not None else None)\n             for exception_id, access_type, repository_id in cursor])\n\n    def getExtensions(self, critic):\n        import configuration\n        if not configuration.extensions.ENABLED:\n            return public_class.Category(self.extensions_rule, [])\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id, access_type, extension_key\n                            FROM accesscontrol_extensions\n                           WHERE profile=%s\n                        ORDER BY id ASC\"\"\",\n                       (self.id,))\n        return public_class.Category(\n            self.extensions_rule,\n            [ExtensionException(exception_id, access_type,\n                                api.extension.fetch(critic, key=extension_key)\n                                if extension_key is not None else None)\n             for exception_id, access_type, extension_key in cursor])\n\n    @staticmethod\n    def refresh(critic, tables, cached_profiles):\n        if \"accesscontrolprofiles\" not in tables:\n            return\n\n        AccessControlProfile.updateAll(\n            critic,\n            \"\"\"SELECT id, title, access_token, http, repositories, extensions\n                 FROM accesscontrolprofiles\n                WHERE id=ANY (%s)\"\"\",\n            
cached_profiles)\n\n@AccessControlProfile.cached()\ndef fetch(critic, profile_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, title, access_token, http, repositories,\n                             extensions\n                        FROM accesscontrolprofiles\n                       WHERE id=%s\"\"\",\n                   (profile_id,))\n    try:\n        return next(AccessControlProfile.make(critic, cursor))\n    except StopIteration:\n        raise api.accesscontrolprofile.InvalidAccessControlProfileId(profile_id)\n\ndef fetchAll(critic, title):\n    cursor = critic.getDatabaseCursor()\n    if title is None:\n        cursor.execute(\"\"\"SELECT id, title, NULL, http, repositories, extensions\n                            FROM accesscontrolprofiles\n                           WHERE access_token IS NULL\n                        ORDER BY id ASC\"\"\")\n    else:\n        cursor.execute(\"\"\"SELECT id, title, NULL, http, repositories, extensions\n                            FROM accesscontrolprofiles\n                           WHERE access_token IS NULL\n                             AND title=%s\n                        ORDER BY id ASC\"\"\",\n                       (title,))\n    return list(AccessControlProfile.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/accesstoken.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass AccessToken(apiobject.APIObject):\n    wrapper_class = api.accesstoken.AccessToken\n\n    def __init__(self, token_id, access_type, user_id, part1, part2, title):\n        self.id = token_id\n        self.access_type = access_type\n        self.__user_id = user_id\n        self.part1 = part1\n        self.part2 = part2\n        self.title = title\n        self.__profile_id = None\n\n    def getUser(self, critic):\n        if self.__user_id is None:\n            return None\n        return api.user.fetch(critic, self.__user_id)\n\n    def getProfile(self, critic):\n        if self.__profile_id is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT id\n                                FROM accesscontrolprofiles\n                               WHERE access_token=%s\"\"\",\n                           (self.id,))\n            row = cursor.fetchone()\n            if not row:\n                return None\n            self.__profile_id, = row\n        return api.accesscontrolprofile.fetch(critic, self.__profile_id)\n\n    @staticmethod\n    def refresh(critic, tables, cached_tokens):\n        if not tables.intersection((\"accesstokens\", \"accesscontrolprofiles\")):\n            return\n\n        AccessToken.updateAll(\n 
           critic,\n            \"\"\"SELECT id, access_type, uid, part1, part2, title\n                 FROM accesstokens\n                WHERE id=ANY (%s)\"\"\",\n            cached_tokens)\n\n@AccessToken.cached()\ndef fetch(critic, token_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\n        \"\"\"SELECT id, access_type, uid, part1, part2, title\n             FROM accesstokens\n            WHERE id=%s\"\"\",\n        (token_id,))\n    try:\n        return next(AccessToken.make(critic, cursor))\n    except StopIteration:\n        raise api.accesstoken.InvalidAccessTokenId(token_id)\n\ndef fetchAll(critic, user):\n    cursor = critic.getDatabaseCursor()\n    if user is None:\n        cursor.execute(\n            \"\"\"SELECT id, access_type, uid, part1, part2, title\n                 FROM accesstokens\"\"\")\n    else:\n        cursor.execute(\n            \"\"\"SELECT id, access_type, uid, part1, part2, title\n                 FROM accesstokens\n                WHERE uid=%s\"\"\",\n            (user.id,))\n    return list(AccessToken.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/apiobject.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass APIObject(object):\n    def wrap(self, critic):\n        return self.wrapper_class(critic, self)\n\n    @classmethod\n    def create(Implementation, critic, *args):\n        return Implementation(*args).wrap(critic)\n\n    @classmethod\n    def get_cached(Implementation, critic, item_id):\n        return critic._impl.lookup(Implementation, item_id)\n\n    @classmethod\n    def add_cached(Implementation, critic, item_id, item):\n        critic._impl.assign(Implementation, item_id, item)\n\n    @classmethod\n    def make(Implementation, critic, args_list, ignored_errors=(),\n             cache_key=lambda args: args[0]):\n        for args in args_list:\n            item_id = cache_key(args)\n            try:\n                item = critic._impl.lookup(Implementation, item_id)\n            except KeyError:\n                try:\n                    item = Implementation.create(critic, *args)\n                except ignored_errors:\n                    continue\n                Implementation.add_cached(critic, item_id, item)\n\n            yield item\n\n    @classmethod\n    def cached(Implementation, InvalidIdError=None,\n               cache_key=lambda args: args[0]):\n        def wrap(fetch):\n            def wrapper(critic, *args):\n                item_id = cache_key(args)\n            
    if item_id is not None:\n                    try:\n                        return critic._impl.lookup(Implementation, item_id)\n                    except KeyError:\n                        pass\n                result = fetch(critic, *args)\n                if InvalidIdError is None:\n                    return result\n                try:\n                    return next(result)\n                except StopIteration:\n                    raise InvalidIdError(item_id)\n            return wrapper\n        return wrap\n\n    @classmethod\n    def cachedMany(Implementation, InvalidIdsError,\n                   cache_keys=lambda args: args[0]):\n        def wrap(fetchMany):\n            def wrapper(critic, *args):\n                items = {}\n                item_ids = cache_keys(args)\n                try:\n                    cache = critic._impl.lookup(Implementation)\n                except KeyError:\n                    cache = {}\n                uncached_ids = set(item_ids) - set(cache.keys())\n                items = {item_id: cache[item_id]\n                         for item_id in item_ids\n                         if item_id in cache}\n                if uncached_ids:\n                    items.update(\n                        (item.id, item)\n                        for item in fetchMany(critic, list(uncached_ids)))\n                if len(items) < len(set(item_ids)):\n                    invalid_ids = sorted(set(item_ids) - set(items.keys()))\n                    raise InvalidIdsError(invalid_ids)\n                return [items[item_id] for item_id in item_ids]\n            return wrapper\n        return wrap\n\n    @classmethod\n    def allCached(Implementation, critic):\n        \"\"\"Return all cached objects of this type\n\n           The cached objects are returned as a dictionary mapping the object id\n           to the object. This dictionary should not be modified.\"\"\"\n        # Don't catch KeyError here. 
Something is probably wrong if this\n        # function is called when no objects of the type are cached.\n        return critic._impl.lookup(Implementation)\n\n    @staticmethod\n    def refresh(critic, tables, cached_objects):\n        \"\"\"Refresh objects after transaction commit\n\n           The |tables| parameter is a set of database tables that were modified\n           in the transaction. The |cached_objects| parameter is a dictionary\n           mapping object ids to cached objects (wrappers) of this type.\"\"\"\n        pass\n\n    @classmethod\n    def updateAll(Implementation, critic, query, cached_objects):\n        \"\"\"Execute the query and update all cached objects\n\n           The query must take a single parameter, which is a list of object\n           ids. It will be executed with the list of ids of all cached\n           objects. Each returned row must have the id of the object as the\n           first item, and the implementation constructor must take the row\n           as a whole as arguments:\n\n             new_impl = Implementation(*row)\"\"\"\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(query, (cached_objects.keys(),))\n        for row in cursor:\n            cached_objects[row[0]]._set_impl(Implementation(*row))\n"
  },
  {
    "path": "src/api/impl/batch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass ModifiedComment(object):\n    def __init__(self, comment_id, new_type, new_state):\n        self.id = comment_id\n        self.new_type = new_type\n        self.new_state = new_state\n\nclass Batch(apiobject.APIObject):\n    wrapper_class = api.batch.Batch\n\n    def __init__(self, batch_id, review_id, author_id, comment_id, timestamp):\n        self.id = batch_id\n        self.__review_id = review_id\n        self.__author_id = author_id\n        self.__comment_id = comment_id\n        self.timestamp = timestamp\n        self.__created_comment_ids = None\n        self.__written_reply_ids = None\n        self.__modified_comments = None\n        self.__reviewed_file_changes = None\n        self.__unreviewed_file_changes = None\n\n    def isEmpty(self, critic):\n        self.loadCommentChanges(critic)\n        self.loadFileChanges(critic)\n        return not (self.__created_comment_ids or\n                    self.__written_reply_ids or\n                    self.__modified_comments or\n                    self.__reviewed_file_changes or\n                    self.__unreviewed_file_changes)\n\n    def getReview(self, critic):\n        return api.review.fetch(critic, self.__review_id)\n\n    def getAuthor(self, critic):\n        if self.__author_id is None:\n  
          return None\n        return api.user.fetch(critic, self.__author_id)\n\n    def getComment(self, critic):\n        if self.__comment_id is None:\n            return None\n        return api.comment.fetch(critic, self.__comment_id)\n\n    def getCreatedComments(self, critic):\n        if self.__created_comment_ids is None:\n            self.loadCommentChanges(critic)\n        return set(api.comment.fetchMany(critic, self.__created_comment_ids))\n\n    def getWrittenReplies(self, critic):\n        if self.__written_reply_ids is None:\n            self.loadCommentChanges(critic)\n        return set(api.reply.fetchMany(critic, self.__written_reply_ids))\n\n    def getResolvedIssues(self, critic):\n        if self.__modified_comments is None:\n            self.loadCommentChanges(critic)\n        return set(api.comment.fetchMany(\n            critic, (modified_comment.id\n                     for modified_comment in self.__modified_comments\n                     if modified_comment.new_state == \"closed\")))\n\n    def getReopenedIssues(self, critic):\n        if self.__modified_comments is None:\n            self.loadCommentChanges(critic)\n        return set(api.comment.fetchMany(\n            critic, (modified_comment.id\n                     for modified_comment in self.__modified_comments\n                     if modified_comment.new_state == \"open\")))\n\n    def getMorphedComments(self, critic):\n        if self.__modified_comments is None:\n            self.loadCommentChanges(critic)\n        new_type_by_comment_id = {\n            modified_comment.id: modified_comment.new_type\n            for modified_comment in self.__modified_comments\n            if modified_comment.new_type is not None\n        }\n        comments = api.comment.fetchMany(critic, new_type_by_comment_id.keys())\n        return {\n            comment: new_type_by_comment_id[comment.id]\n            for comment in comments\n        }\n\n    def getReviewedFileChanges(self, critic):\n 
       if self.__reviewed_file_changes is None:\n            self.loadFileChanges(critic)\n        return api.reviewablefilechange.fetchMany(\n            critic, self.__reviewed_file_changes)\n\n    def getUnreviewedFileChanges(self, critic):\n        if self.__unreviewed_file_changes is None:\n            self.loadFileChanges(critic)\n        return api.reviewablefilechange.fetchMany(\n            critic, self.__unreviewed_file_changes)\n\n    def __queryCondition(self):\n        if self.id is None:\n            condition = \"state='draft'\"\n            batch_id = ()\n        else:\n            condition = \"batch=%s\"\n            batch_id = (self.id,)\n        return condition, batch_id\n\n    def loadCommentChanges(self, critic):\n        cursor = critic.getDatabaseCursor()\n        condition, batch_id = self.__queryCondition()\n        cursor.execute(\"\"\"SELECT commentchains.id, comments.id,\n                                 commentchains.first_comment=comments.id\n                            FROM commentchains\n                            JOIN comments ON (comments.chain=commentchains.id)\n                           WHERE commentchains.review=%s\n                             AND commentchains.state!='empty'\n                             AND comments.uid=%s\n                             AND comments.state!='deleted'\n                             AND comments.{}\"\"\".format(condition),\n                       (self.__review_id, self.__author_id) + batch_id)\n        self.__created_comment_ids = []\n        self.__written_reply_ids = []\n        for comment_id, reply_id, is_initial in cursor:\n            # Don't include the note that is the batch's comment.\n            if comment_id == self.__comment_id:\n                continue\n            if is_initial:\n                self.__created_comment_ids.append(comment_id)\n            else:\n                self.__written_reply_ids.append(reply_id)\n        cursor.execute(\n            \"\"\"SELECT 
commentchains.id, to_type, to_state\n                 FROM commentchains\n                 JOIN commentchainchanges\n                      ON (commentchainchanges.chain=commentchains.id)\n                WHERE commentchains.review=%s\n                  AND commentchains.state!='empty'\n                  AND commentchainchanges.uid=%s\n                  AND (commentchainchanges.state='performed'\n                       OR commentchainchanges.from_state=commentchains.state\n                       OR commentchainchanges.from_type=commentchains.type)\n                  AND commentchainchanges.{}\"\"\".format(condition),\n            (self.__review_id, self.__author_id) + batch_id)\n        self.__modified_comments = []\n        for comment_id, new_type, new_state in cursor:\n            self.__modified_comments.append(ModifiedComment(\n                comment_id, new_type, new_state))\n\n    def loadFileChanges(self, critic):\n        cursor = critic.getDatabaseCursor()\n        condition, batch_id = self.__queryCondition()\n        cursor.execute(\n            \"\"\"SELECT reviewfiles.id, to_state\n                 FROM reviewfiles\n                 JOIN reviewfilechanges\n                      ON (reviewfilechanges.file=reviewfiles.id)\n                WHERE reviewfiles.review=%s\n                  AND reviewfilechanges.uid=%s\n                  AND (reviewfilechanges.state='performed'\n                       OR reviewfilechanges.to_state!=reviewfiles.state)\n                  AND reviewfilechanges.{}\"\"\".format(condition),\n            (self.__review_id, self.__author_id) + batch_id)\n        rows = cursor.fetchall()\n        self.__reviewed_file_changes = set(\n            filechange_id\n            for filechange_id, to_state in rows\n            if to_state == 'reviewed')\n        self.__unreviewed_file_changes = set(\n            filechange_id\n            for filechange_id, to_state in rows\n            if to_state == 
'pending')\n\n@Batch.cached(api.batch.InvalidBatchId)\ndef fetch(critic, batch_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, review, uid, comment, time\n                        FROM batches\n                       WHERE id=%s\"\"\",\n                   (batch_id,))\n    return Batch.make(critic, cursor)\n\ndef fetchAll(critic, review, author):\n    conditions = [\"TRUE\"]\n    values = []\n    if review:\n        conditions.append(\"review=%s\")\n        values.append(review.id)\n    if author:\n        conditions.append(\"uid=%s\")\n        values.append(author.id)\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, review, uid, comment, time\n                        FROM batches\n                       WHERE {}\"\"\".format(\" AND \".join(conditions)),\n                   values)\n    return list(Batch.make(critic, cursor))\n\ndef fetchUnpublished(critic, review):\n    author_id = critic.effective_user.id\n    return Batch(None, review.id, author_id, None, None).wrap(critic)\n"
  },
  {
    "path": "src/api/impl/branch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass Branch(apiobject.APIObject):\n    wrapper_class = api.branch.Branch\n\n    def __init__(self, branch_id, name, repository_id, head_id):\n        self.id = branch_id\n        self.name = name\n        self.__repository_id = repository_id\n        self.__head_id = head_id\n        self.__head = None\n        self.__commits = None\n\n    def getRepository(self, critic):\n        return api.repository.fetch(critic, repository_id=self.__repository_id)\n\n    def getHead(self, critic):\n        if self.__head is None:\n            self.__head = api.commit.fetch(\n                self.getRepository(critic), commit_id=self.__head_id)\n        return self.__head\n\n    def getCommits(self, critic):\n        if self.__commits is None:\n            repository = self.getRepository(critic)\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT commit\n                                FROM reachable\n                               WHERE branch=%s\"\"\",\n                           (self.id,))\n            self.__commits = api.commitset.create(\n                critic, (api.commit.fetch(repository, commit_id)\n                         for (commit_id,) in cursor))\n        return self.__commits\n\n@Branch.cached()\ndef fetch(critic, 
branch_id, repository, name):\n    cursor = critic.getDatabaseCursor()\n    if branch_id is not None:\n        cursor.execute(\"\"\"SELECT id, name, repository, head\n                            FROM branches\n                           WHERE id=%s\"\"\",\n                       (branch_id,))\n    else:\n        cursor.execute(\"\"\"SELECT id, name, repository, head\n                            FROM branches\n                           WHERE repository=%s\n                             AND name=%s\"\"\",\n                       (repository.id, name,))\n    try:\n        return next(Branch.make(critic, cursor))\n    except StopIteration:\n        if branch_id is not None:\n            raise api.branch.InvalidBranchId(branch_id)\n        else:\n            raise api.branch.InvalidBranchName(name)\n\ndef fetchAll(critic, repository):\n    cursor = critic.getDatabaseCursor()\n    if repository is not None:\n        cursor.execute(\"\"\"SELECT id, name, repository, head\n                            FROM branches\n                           WHERE repository=%s\n                        ORDER BY name\"\"\",\n                       (repository.id,))\n    else:\n        cursor.execute(\"\"\"SELECT id, name, repository, head\n                            FROM branches\n                        ORDER BY name\"\"\")\n    return list(Branch.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/branch_unittest.py",
    "content": "def basic(arguments):\n    import api\n\n    assert arguments.sha1 is not None, \"missing argument: --sha1\"\n    assert arguments.name is not None, \"missing argument: --name\"\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, repository_id=1)\n    branch = api.branch.fetch(\n        critic, repository=repository, name=arguments.name)\n\n    assert isinstance(branch, api.branch.Branch)\n    assert isinstance(branch.id, int)\n    assert isinstance(branch.name, str)\n    assert branch.name == arguments.name\n    assert branch.repository is repository\n    assert isinstance(branch.head, api.commit.Commit)\n    assert branch.head.sha1 == arguments.sha1\n    assert isinstance(branch.commits, api.commitset.CommitSet)\n    assert len(branch.commits) == 5\n    assert len(branch.commits.heads) == 1\n    assert branch.head in branch.commits.heads\n\n    assert api.branch.fetch(critic, branch_id=branch.id) is branch\n\n    branches = api.branch.fetchAll(critic)\n    assert isinstance(branches, list)\n    assert all(isinstance(branch, api.branch.Branch) for branch in branches)\n    assert branch in branches\n\n    branches = api.branch.fetchAll(critic, repository=repository)\n    assert isinstance(branches, list)\n    assert all(isinstance(branch, api.branch.Branch) for branch in branches)\n    assert branch in branches\n\n    try:\n        api.branch.fetch(critic, branch_id=4711)\n    except api.branch.InvalidBranchId:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    try:\n        api.branch.fetch(\n            critic, repository=repository, name=arguments.name + \"-wrong\")\n    except api.branch.InvalidBranchName:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n  
  print(\"basic: ok\")\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--sha1\")\n    parser.add_argument(\"--name\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n"
  },
  {
    "path": "src/api/impl/changeset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom __future__ import absolute_import\n\nimport api\nimport api.impl\nfrom api.impl import apiobject\nimport changeset.client\nfrom gitutils import GitReferenceError\nimport diff\n\nclass Changeset(apiobject.APIObject):\n    wrapper_class = api.changeset.Changeset\n\n    def __init__(self, id, changeset_type, from_commit_id, to_commit_id, files, repository):\n        self.id = id\n        self.type = changeset_type\n        self.__from_commit_id = from_commit_id\n        self.__to_commit_id = to_commit_id\n        self.__filediffs = None\n        self.repository = repository\n\n    def getFromCommit(self):\n        if self.__from_commit_id is None:\n            return None\n        return api.commit.fetch(\n            self.repository, commit_id=self.__from_commit_id)\n\n    def getToCommit(self):\n        if self.__to_commit_id is None:\n            return None\n        return api.commit.fetch(\n            self.repository, commit_id=self.__to_commit_id)\n\n    def getContributingCommits(self, critic):\n        if self.__from_commit_id is None:\n            return None\n        try:\n            return api.commitset.calculateFromRange(\n                critic, self.getFromCommit(), self.getToCommit())\n        except api.commitset.InvalidCommitRange:\n            return None\n\n\ndef 
fetch(critic, repository, changeset_id, from_commit, to_commit,\n          single_commit, review, automatic):\n    if changeset_id is not None:\n        return fetch_by_id(critic, repository, changeset_id)\n\n    if review and automatic:\n        # Handle automatic changesets using legacy code, and by setting the\n        # |from_commit|/|to_commit| or |single_commit| arguments.\n        import dbutils\n        import request\n        import page.showcommit\n\n        legacy_user = dbutils.User.fromAPI(critic.effective_user)\n        legacy_review = dbutils.Review.fromAPI(review)\n\n        try:\n            from_sha1, to_sha1, all_commits, listed_commits = \\\n                page.showcommit.commitRangeFromReview(\n                    critic.database, legacy_user, legacy_review, automatic, [])\n        except request.DisplayMessage:\n            # FIXME: This error message could be better. The legacy code does\n            # report more useful error messages, but does it in a way that's\n            # pretty tied to the old HTML UI. 
Some refactoring is needed.\n            raise api.changeset.ChangesetError(\"Automatic mode failed\")\n        except page.showcommit.NoChangesFound:\n            assert automatic != \"everything\"\n            raise api.changeset.AutomaticChangesetEmpty(\"No %s changes found\"\n                                                        % automatic)\n\n        from_commit = api.commit.fetch(repository, sha1=from_sha1)\n        to_commit = api.commit.fetch(repository, sha1=to_sha1)\n\n        if from_commit == to_commit:\n            single_commit = to_commit\n            from_commit = to_commit = None\n\n    if from_commit and to_commit:\n        changeset_id = get_changeset_id(\n            critic, repository, from_commit, to_commit)\n        if changeset_id is not None:\n            return fetch_by_id(critic, repository, changeset_id)\n        request_changeset_creation(\n            critic, repository.name, \"custom\", from_commit=from_commit,\n            to_commit=to_commit)\n        raise api.changeset.ChangesetDelayed()\n\n    assert single_commit\n\n    if len(single_commit.parents) > 0:\n        from_commit = single_commit.parents[0]\n    else:\n        from_commit = None\n    changeset_id = get_changeset_id(\n        critic, repository, from_commit, single_commit)\n    if changeset_id is not None:\n        return fetch_by_id(critic, repository, changeset_id)\n    request_changeset_creation(\n        critic, repository.name, \"direct\", to_commit=single_commit)\n    raise api.changeset.ChangesetDelayed()\n\n\ndef fetch_by_id(critic, repository, changeset_id):\n    try:\n        return critic._impl.lookup(api.changeset.Changeset,\n                                   (int(repository), changeset_id))\n    except KeyError:\n        pass\n\n    cursor = critic.getDatabaseCursor()\n\n    cursor.execute(\n        \"\"\"SELECT type, parent, child\n             FROM changesets\n            WHERE id=%s\"\"\",\n        (changeset_id,))\n\n    row = cursor.fetchone()\n\n 
   if not row:\n        raise api.changeset.InvalidChangesetId(changeset_id)\n\n    (changeset_type, from_commit_id, to_commit_id) = row\n\n    cursor.execute(\n        \"\"\"SELECT file\n             FROM fileversions\n            WHERE changeset=%s\"\"\",\n        (changeset_id,))\n\n    files = api.file.fetchMany(critic, (file_id for (file_id,) in cursor))\n\n    changeset = Changeset(\n        changeset_id, changeset_type, from_commit_id, to_commit_id,\n        sorted(files, key=lambda file: file.path), repository).wrap(critic)\n\n    critic._impl.assign(\n        api.changeset.Changeset, (int(repository), changeset_id), changeset)\n\n    return changeset\n\n\ndef get_changeset_id(critic, repository, from_commit, to_commit):\n    cursor = critic.getDatabaseCursor()\n    if from_commit:\n        cursor.execute(\n            \"\"\"SELECT id\n                 FROM changesets\n                WHERE parent=%s AND child=%s\"\"\",\n            (from_commit.id, to_commit.id))\n    else:\n        cursor.execute(\n            \"\"\"SELECT id\n                 FROM changesets\n                WHERE parent IS NULL AND child=%s\"\"\",\n            (to_commit.id,))\n    row = cursor.fetchone()\n    if row:\n        return row[0]\n    else:\n        return None\n\n\ndef request_changeset_creation(critic,\n                               repository_name,\n                               changeset_type,\n                               from_commit=None,\n                               to_commit=None):\n    request = { \"changeset_type\": changeset_type,\n                \"repository_name\": repository_name}\n    if changeset_type == \"direct\":\n        request[\"child_sha1\"] = to_commit.sha1\n    elif changeset_type == \"custom\":\n        request[\"parent_sha1\"] = from_commit.sha1\n        request[\"child_sha1\"] = to_commit.sha1\n    elif changeset_type == \"merge\":\n        request[\"child_sha1\"] = to_commit.sha1\n    elif changeset_type == \"conflicts\":\n        request[\"parent_sha1\"] = from_commit.sha1\n        request[\"child_sha1\"] = to_commit.sha1\n    try:\n        changeset.client.requestChangesets([request], async=True)\n    except changeset.client.ChangesetBackgroundServiceError as error:\n        raise api.changeset.ChangesetBackgroundServiceError(error)\n"
  },
  {
    "path": "src/api/impl/changeset_unittest.py",
    "content": "FROM_SHA1 = \"573c5ff15ad95cfbc3e2f2efb0a638a4a78c17a7\"\nFROM_SINGLE_SHA1 = \"aabc2b10c930a9e72fe9587a6e8634087bb3efe1\"\nTO_SHA1 = \"6dc8e9c2d952028286d4b83475947bd0b1410860\"\nROOT_SHA1 = \"ee37c47f6f6a14afa6912c1cc58a9f49d2a29acd\"\n\nCUSTOM_PATHLIST = frozenset([\"src/auth/accesscontrol.py\",\n                             \"src/operation/__init__.py\",\n                             \"src/operation/createreview.py\",\n                             \"src/page/createreview.py\",\n                             \"src/page/utils.py\",\n                             \"testing/__init__.py\",\n                             \"testing/repository.py\",\n                             \"testing/virtualbox.py\"])\n\nSINGLE_PATHLIST = frozenset([\"testing/__init__.py\",\n                             \"testing/repository.py\",\n                             \"testing/virtualbox.py\"])\n\nROOT_PATHLIST = frozenset([\".gitignore\",\n                           \"CONTRIBUTORS\",\n                           \"COPYING\",\n                           \"INSTALL\",\n                           \"MIT-LICENSE.txt\",\n                           \"README\",\n                           \"auth.py\",\n                           \"background/__init__.py\",\n                           \"background/branchtracker.py\",\n                           \"background/branchtrackerhook.py\",\n                           \"background/changeset.py\",\n                           \"background/daemon.py\",\n                           \"background/githook.py\",\n                           \"background/highlight.py\",\n                           \"background/maildelivery.py\",\n                           \"background/servicemanager.py\",\n                           \"background/utils.py\",\n                           \"background/watchdog.py\",\n                           \"base.py\",\n                           \"batchprocessor.py\",\n                           \"changeset/__init__.py\",\n                
           \"changeset/client.py\",\n                           \"changeset/create.py\",\n                           \"changeset/detectmoves.py\",\n                           \"changeset/html.py\",\n                           \"changeset/load.py\",\n                           \"changeset/process.py\",\n                           \"changeset/text.py\",\n                           \"changeset/utils.py\",\n                           \"clexer.py\",\n                           \"cli.py\",\n                           \"comments.pgsql\",\n                           \"config.py.empty\",\n                           \"critic.py\",\n                           \"dbaccess.py\",\n                           \"dbclean.sql\",\n                           \"dbschema.comments.sql\",\n                           \"dbschema.extensions.sql\",\n                           \"dbschema.sql\",\n                           \"dbutils.py\",\n                           \"diff.py\",\n                           \"diff/__init__.py\",\n                           \"diff/analyze.py\",\n                           \"diff/context.py\",\n                           \"diff/html.py\",\n                           \"diff/merge.py\",\n                           \"diff/parse.py\",\n                           \"diffutils.py\",\n                           \"documentation/concepts.txt\",\n                           \"down.py\",\n                           \"extensions.py\",\n                           \"gitutils.py\",\n                           \"hooks/pre-receive\",\n                           \"htmlutils.py\",\n
    \"index.py\",\n                           \"install.py\",\n                           \"installation/__init__.py\",\n                           \"installation/admin.py\",\n                           \"installation/apache.py\",\n                           \"installation/config.py\",\n                           \"installation/criticctl.py\",\n                           \"installation/database.py\",\n                           \"installation/files.py\",\n                           \"installation/git.py\",\n                           \"installation/initd.py\",\n                           \"installation/input.py\",\n                           \"installation/paths.py\",\n                           \"installation/prefs.py\",\n                           \"installation/prereqs.py\",\n                           \"installation/process.py\",\n                           \"installation/system.py\",\n                           \"installation/templates/configuration/__init__.py\",\n                           \"installation/templates/configuration/base.py\",\n                           \"installation/templates/configuration/database.py\",\n                           \"installation/templates/configuration/executables.py\",\n                           \"installation/templates/configuration/extensions.py\",\n                           \"installation/templates/configuration/limits.py\",\n                           \"installation/templates/configuration/mimetypes.py\",\n                           \"installation/templates/configuration/paths.py\",\n                           \"installation/templates/configuration/services.py\",\n                           \"installation/templates/configuration/smtp.py\",\n                           \"installation/templates/criticctl\",\n                           \"installation/templates/initd\",\n                           \"installation/templates/site\",\n                           \"linkify.py\",\n                           \"log/__init__.py\",\n  
                         \"log/commitset.py\",\n                           \"log/html.py\",\n                           \"log/tree.py\",\n                           \"mailutils.py\",\n                           \"maintenance/check-branches.py\",\n                           \"maintenance/check-commits.py\",\n                           \"maintenance/dumppreferences.py\",\n                           \"maintenance/installpreferences.py\",\n                           \"maintenance/progress.py\",\n                           \"operation/__init__.py\",\n                           \"operation/addrepository.py\",\n                           \"operation/autocompletedata.py\",\n                           \"operation/blame.py\",\n                           \"operation/createcomment.py\",\n                           \"operation/createreview.py\",\n                           \"operation/draftchanges.py\",\n                           \"operation/editresource.py\",\n                           \"operation/extensioninstallation.py\",\n                           \"operation/fetchlines.py\",\n                           \"operation/manipulateassignments.py\",\n                           \"operation/manipulatecomment.py\",\n                           \"operation/manipulatefilters.py\",\n                           \"operation/manipulatereview.py\",\n                           \"operation/manipulateuser.py\",\n                           \"operation/markfiles.py\",\n                           \"operation/news.py\",\n                           \"operation/rebasereview.py\",\n                           \"operation/recipientfilter.py\",\n                           \"operation/servicemanager.py\",\n                           \"operation/trackedbranch.py\",\n                           \"page/__init__.py\",\n                           \"page/addrepository.py\",\n                           \"page/basic.py\",\n                           \"page/branches.py\",\n                           
\"page/checkbranch.py\",\n                           \"page/config.py\",\n                           \"page/confirmmerge.py\",\n                           \"page/createreview.py\",\n                           \"page/dashboard.py\",\n                           \"page/editresource.py\",\n                           \"page/filterchanges.py\",\n                           \"page/home.py\",\n                           \"page/manageextensions.py\",\n                           \"page/managereviewers.py\",\n                           \"page/news.py\",\n                           \"page/repositories.py\",\n                           \"page/search.py\",\n                           \"page/services.py\",\n                           \"page/showbatch.py\",\n                           \"page/showbranch.py\",\n                           \"page/showcomment.py\",\n                           \"page/showcommit.py\",\n                           \"page/showfile.py\",\n                           \"page/showreview.py\",\n                           \"page/showreviewlog.py\",\n                           \"page/showtree.py\",\n                           \"page/statistics.py\",\n                           \"page/tutorial.py\",\n                           \"page/utils.py\",\n                           \"path.pgsql\",\n                           \"profiling.py\",\n                           \"request.py\",\n                           \"resources/.gitattributes\",\n                           \"resources/.gitignore\",\n                           \"resources/autocomplete.js\",\n                           \"resources/basic.css\",\n                           \"resources/basic.js\",\n                           \"resources/branches.css\",\n                           \"resources/branches.js\",\n                           \"resources/changeset.css\",\n                           \"resources/changeset.js\",\n                           \"resources/checkbranch.css\",\n                           
\"resources/checkbranch.js\",\n                           \"resources/comment.css\",\n                           \"resources/comment.js\",\n                           \"resources/config.css\",\n                           \"resources/config.js\",\n                           \"resources/confirmmerge.css\",\n                           \"resources/confirmmerge.js\",\n                           \"resources/createreview.css\",\n                           \"resources/createreview.js\",\n                           \"resources/dashboard.css\",\n                           \"resources/dashboard.js\",\n                           \"resources/diff.css\",\n                           \"resources/editresource.css\",\n                           \"resources/editresource.js\",\n                           \"resources/favicon-dev.png\",\n                           \"resources/favicon.png\",\n                           \"resources/filterchanges.css\",\n                           \"resources/filterchanges.js\",\n                           \"resources/home.css\",\n                           \"resources/home.js\",\n                           \"resources/images/ui-bg_flat_75_aaaaaa_40x100.png\",\n                           \"resources/images/ui-bg_glass_100_f5f0e5_1x400.png\",\n                           \"resources/images/ui-bg_glass_25_cb842e_1x400.png\",\n                           \"resources/images/ui-bg_glass_70_ede4d4_1x400.png\",\n                           \"resources/images/ui-bg_highlight-hard_100_f4f0ec_1x100.png\",\n                           \"resources/images/ui-bg_highlight-hard_65_fee4bd_1x100.png\",\n                           \"resources/images/ui-bg_highlight-hard_75_f5f5b5_1x100.png\",\n                           \"resources/images/ui-bg_inset-soft_100_f4f0ec_1x100.png\",\n                           \"resources/images/ui-icons_c47a23_256x240.png\",\n                           \"resources/images/ui-icons_cb672b_256x240.png\",\n                           
\"resources/images/ui-icons_f08000_256x240.png\",\n                           \"resources/images/ui-icons_f35f07_256x240.png\",\n                           \"resources/images/ui-icons_ff7519_256x240.png\",\n                           \"resources/images/ui-icons_ffffff_256x240.png\",\n                           \"resources/jquery-1.7.1.min.js\",\n                           \"resources/jquery-tooltip.css\",\n                           \"resources/jquery-tooltip.js\",\n                           \"resources/jquery-ui-1.8.17.custom.css\",\n                           \"resources/jquery-ui-1.8.17.custom.min.js\",\n                           \"resources/jquery-ui-autocomplete-html.js\",\n                           \"resources/jquery-ui.css\",\n                           \"resources/jquery-ui.js\",\n                           \"resources/jquery.js\",\n                           \"resources/log.css\",\n                           \"resources/log.js\",\n                           \"resources/manageextensions.css\",\n                           \"resources/manageextensions.js\",\n                           \"resources/managereviewers.css\",\n                           \"resources/managereviewers.js\",\n                           \"resources/message.css\",\n                           \"resources/newrepository.css\",\n                           \"resources/newrepository.js\",\n                           \"resources/news.css\",\n                           \"resources/news.js\",\n                           \"resources/repositories.css\",\n                           \"resources/repositories.js\",\n                           \"resources/review.css\",\n                           \"resources/review.js\",\n                           \"resources/seal-of-approval-left.png\",\n                           \"resources/search.css\",\n                           \"resources/search.js\",\n                           \"resources/services.css\",\n                           
\"resources/services.js\",\n                           \"resources/showbatch.css\",\n                           \"resources/showbranch.css\",\n                           \"resources/showcomment.js\",\n                           \"resources/showfile.css\",\n                           \"resources/showfile.js\",\n                           \"resources/showreview.css\",\n                           \"resources/showreview.js\",\n                           \"resources/showreviewlog.css\",\n                           \"resources/showtree.css\",\n                           \"resources/statistics.css\",\n                           \"resources/syntax.css\",\n                           \"resources/tabify.css\",\n                           \"resources/tabify.js\",\n                           \"resources/tutorial.css\",\n                           \"resources/tutorial.js\",\n                           \"resources/whitespace.css\",\n                           \"review/__init__.py\",\n                           \"review/comment/__init__.py\",\n                           \"review/filters.py\",\n                           \"review/html.py\",\n                           \"review/mail.py\",\n                           \"review/report.py\",\n                           \"review/utils.py\",\n                           \"roles.sql\",\n                           \"syntaxhighlight/__init__.py\",\n                           \"syntaxhighlight/clexer.py\",\n                           \"syntaxhighlight/context.py\",\n                           \"syntaxhighlight/cpp.py\",\n                           \"syntaxhighlight/generate.py\",\n                           \"syntaxhighlight/generic.py\",\n                           \"syntaxhighlight/request.py\",\n                           \"textformatting.py\",\n                           \"textutils.py\",\n                           \"tutorials/checkbranch.txt\",\n                           \"tutorials/rebasing.txt\",\n                           
\"tutorials/reconfiguring.txt\",\n                           \"tutorials/repository.txt\",\n                           \"tutorials/requesting.txt\",\n                           \"tutorials/reviewing.txt\",\n                           \"utf8utils.py\",\n                           \"wsgi.py\",\n                           \"wsgistartup.py\"])\n\ndef pre():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    custom_changeset(\"pre\", api, critic, repository)\n    direct_changeset(\"pre\", api, critic, repository)\n    root_changeset(\"pre\", api, critic, repository)\n    bad_changesets(\"pre\", api, critic, repository)\n\n    print(\"pre: ok\")\n\n\ndef post():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    custom_changeset(\"post\", api, critic, repository)\n    direct_changeset(\"post\", api, critic, repository)\n    root_changeset(\"post\", api, critic, repository)\n\n    print(\"post: ok\")\n\n\ndef is_empty_changeset(changeset, changeset_type):\n    return (changeset.id is None and\n            changeset.type == changeset_type and\n            changeset.files is None)\n\ndef check_types(changeset):\n    return (isinstance(changeset.id, (int, long)) and\n            isinstance(changeset.type, str) and\n            isinstance(changeset.files, list))\n\n\ndef custom_changeset(phase, api, critic, repository):\n    if phase == \"pre\":\n        from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n        to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n\n        try:\n            api.changeset.fetch(critic,\n                                repository,\n                                from_commit=from_commit,\n                                to_commit=to_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n        to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        custom_changeset = api.changeset.fetch(critic,\n                                               repository,\n                                               from_commit=from_commit,\n                                               to_commit=to_commit)\n        assert (check_types(custom_changeset) and\n                custom_changeset.type == \"custom\"),\\\n            \"custom_changeset has incorrect types\"\n\n        paths = frozenset([filechange.file.path for filechange in custom_changeset.files])\n\n        assert paths == CUSTOM_PATHLIST,\\\n            \"files in changeset deviate from expected files\"\n\n    else:\n        raise Exception\n\n\ndef direct_changeset(phase, api, critic, repository):\n    if phase == \"pre\":\n        single_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        from_single_commit = api.commit.fetch(repository, sha1=FROM_SINGLE_SHA1)\n        try:\n            api.changeset.fetch(critic,\n                                repository,\n                                single_commit=single_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n        try:\n            api.changeset.fetch(critic,\n                                repository,\n                                from_commit=from_single_commit,\n                                to_commit=single_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n        try:\n            api.changeset.fetch(critic,\n                                repository,\n                                from_commit=single_commit,\n                                to_commit=from_single_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        single_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        from_single_commit = api.commit.fetch(repository, 
sha1=FROM_SINGLE_SHA1)\n        changeset = api.changeset.fetch(critic,\n                                        repository,\n                                        single_commit=single_commit)\n        equiv_changeset = api.changeset.fetch(critic,\n                                              repository,\n                                              from_commit=from_single_commit,\n                                              to_commit=single_commit)\n\n        assert (changeset.id == equiv_changeset.id),\\\n            \"changeset and equiv_changeset have different ids\"\n\n        assert (changeset.type == equiv_changeset.type),\\\n            \"changeset and equiv_changeset have different types\"\n\n        files = [filechange.file for filechange in changeset.files]\n        equiv_files = [filechange.file for filechange in equiv_changeset.files]\n\n        changeset_files = frozenset(\n            (file.id, file.path) for file in files)\n        changeset_paths = frozenset(file.path for file in files)\n        equiv_changeset_files = frozenset(\n            (file.id, file.path) for file in equiv_files)\n\n        assert (changeset_files == equiv_changeset_files),\\\n            \"changeset and equiv_changeset have different files\"\n\n        assert (changeset_paths == SINGLE_PATHLIST),\\\n            \"changeset has other files than expected\"\n\n    else:\n        raise Exception\n\n\ndef root_changeset(phase, api, critic, repository):\n    if phase == \"pre\":\n        root_commit = api.commit.fetch(repository, ref=ROOT_SHA1)\n        try:\n            api.changeset.fetch(critic, repository, single_commit=root_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        root_commit = api.commit.fetch(repository, ref=ROOT_SHA1)\n        root_changeset = api.changeset.fetch(critic, repository, single_commit=root_commit)\n        assert (root_changeset.type == \"direct\"),\\\n            \"root_changeset 
should be direct changeset\"\n        assert (isinstance(root_changeset.id, int)),\\\n            \"root_changeset.id should be integer\"\n        root_paths = frozenset([filechange.file.path for filechange in root_changeset.files])\n\n        assert (root_paths == ROOT_PATHLIST),\\\n            \"root_changeset has other files than expected\"\n\n    else:\n        raise Exception\n\n\ndef bad_changesets(phase, api, critic, repository):\n    if phase == \"pre\":\n        params_list = [(None, None, None, None, AssertionError),\n                       (-5, None, None, None, api.changeset.InvalidChangesetId),\n                       (None, None, None, \"00g0\", api.repository.InvalidRef),\n                       (None, \"00g0\", TO_SHA1, None, api.repository.InvalidRef),\n                       (None, FROM_SHA1, \"00g0\", None, api.repository.InvalidRef),\n                       (1, FROM_SHA1, TO_SHA1, TO_SHA1, AssertionError),\n                       (None, TO_SHA1, TO_SHA1, None, AssertionError)]\n        for (changeset_id, from_commit_ref, to_commit_ref, single_commit_ref,\n             expected_error) in params_list:\n            try:\n                if from_commit_ref is not None:\n                    from_commit = api.commit.fetch(\n                        repository, ref=from_commit_ref)\n                else:\n                    from_commit = None\n                if to_commit_ref is not None:\n                    to_commit = api.commit.fetch(repository, ref=to_commit_ref)\n                else:\n                    to_commit = None\n                if single_commit_ref is not None:\n                    single_commit = api.commit.fetch(\n                        repository, ref=single_commit_ref)\n                else:\n                    single_commit = None\n                changeset = api.changeset.fetch(\n                    critic, repository, changeset_id=changeset_id,\n                    from_commit=from_commit, to_commit=to_commit,\n                
    single_commit=single_commit)\n            except expected_error:\n                pass\n            else:\n                assert False,\\\n                    \"Invalid/missing parameters should raise exception\"\n    else:\n        raise Exception\n"
  },
  {
    "path": "src/api/impl/comment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\nimport dbutils\n\nclass Comment(apiobject.APIObject):\n    wrapper_class = api.comment.Comment\n\n    STATE_MAP = {\n        # \"Is draft\" is a separate attribute, so use the state it would have\n        # once published instead.\n        \"draft\": \"open\",\n\n        # \"Closed\" is only used in the database, really, the UI has always\n        # called it \"Resolve\" (action) / \"Resolved\" (state).\n        \"closed\": \"resolved\"\n    }\n\n    @staticmethod\n    def __translateState(state):\n        return Comment.STATE_MAP.get(state, state)\n\n    def __init__(self, chain_id, review_id, batch_id, author_id, comment_type,\n                 state, side, timestamp, text, file_id, first_commit_id,\n                 last_commit_id, addressed_by_id, resolved_by_id):\n        self.id = chain_id\n        self.is_draft = state == \"draft\"\n        self.state = Comment.__translateState(state)\n        self.__review_id = review_id\n        self.__batch_id = batch_id\n        self.__author_id = author_id\n        self.side = side\n        self.timestamp = timestamp\n        self.text = text\n        self.__file_id = file_id\n        self.__first_commit_id = first_commit_id\n        self.__last_commit_id = last_commit_id\n        self.__addressed_by_id = 
addressed_by_id\n        self.__resolved_by_id = resolved_by_id\n\n        self.__type = comment_type\n        if comment_type == \"issue\":\n            self.wrapper_class = api.comment.Issue\n        else:\n            self.wrapper_class = api.comment.Note\n\n    def getReview(self, critic):\n        return api.review.fetch(critic, self.__review_id)\n\n    def getAuthor(self, critic):\n        return api.user.fetch(critic, self.__author_id)\n\n    def getLocation(self, critic):\n        cursor = critic.getDatabaseCursor()\n        if self.__file_id is not None:\n            repository = self.getReview(critic).repository\n            if self.side == \"old\":\n                commit = api.commit.fetch(repository, self.__first_commit_id)\n            else:\n                commit = api.commit.fetch(repository, self.__last_commit_id)\n            file_sha1 = commit.getFileInformation(\n                api.file.fetch(critic, file_id=self.__file_id)).sha1\n            cursor.execute(\"\"\"SELECT first_line, last_line\n                                FROM commentchainlines\n                               WHERE chain=%s\n                                 AND sha1=%s\n                                 AND (state!='draft' OR uid=%s)\"\"\",\n                           (self.id, file_sha1, critic.effective_user.id))\n            first_line, last_line = cursor.fetchone()\n            location = FileVersionLocation(\n                self, first_line, last_line, repository, self.__file_id,\n                first_commit_id=self.__first_commit_id,\n                last_commit_id=self.__last_commit_id, side=self.side)\n        elif self.__first_commit_id is not None:\n            repository = self.getReview(critic).repository\n            commit = api.commit.fetch(\n                repository, commit_id=self.__first_commit_id)\n            cursor.execute(\"\"\"SELECT first_line, last_line\n                                FROM commentchainlines\n                               WHERE 
chain=%s\n                                 AND sha1=%s\n                                 AND (state!='draft' OR uid=%s)\"\"\",\n                           (self.id, commit.sha1, critic.effective_user.id))\n            first_line, last_line = cursor.fetchone()\n            # FIXME: Make commit message comment line numbers one-based too!\n            first_line += 1\n            last_line += 1\n            # FIXME: ... and then delete the above two lines of code.\n            location = CommitMessageLocation(\n                first_line, last_line, repository, self.__first_commit_id)\n        else:\n            return None\n        return location.wrap(critic)\n\n    def getReplies(self, critic):\n        return api.impl.reply.fetchForComment(critic, self.id)\n\n    def getAddressedBy(self, critic):\n        if self.state != \"addressed\":\n            return None\n        repository = self.getReview(critic).repository\n        return api.commit.fetch(repository, commit_id=self.__addressed_by_id)\n\n    def getResolvedBy(self, critic):\n        if self.state != \"resolved\":\n            return None\n        return api.user.fetch(critic, user_id=self.__resolved_by_id)\n\n    def getDraftChanges(self, critic):\n        if critic.effective_user.is_anonymous:\n            return None\n        if self.is_draft:\n            return api.comment.Comment.DraftChanges(\n                critic.effective_user, True, None, None)\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id\n                            FROM comments\n                           WHERE uid=%s\n                             AND chain=%s\n                             AND state='draft'\"\"\",\n                       (critic.effective_user.id, self.id))\n        row = cursor.fetchone()\n        if row:\n            reply_id, = row\n            reply = api.reply.fetch(critic, reply_id)\n        else:\n            reply = None\n        effective_type = self.__type\n        new_type = 
None\n        new_state = None\n        new_location = None\n        cursor.execute(\"\"\"SELECT from_state, to_state, from_type, to_type,\n                                 from_last_commit, to_last_commit,\n                                 from_addressed_by, to_addressed_by\n                            FROM commentchainchanges\n                           WHERE uid=%s\n                             AND chain=%s\n                             AND state='draft'\"\"\",\n                       (critic.effective_user.id, self.id))\n        row = cursor.fetchone()\n        if not row:\n            if reply is None:\n                return None\n        else:\n            (from_state, to_state, from_type, to_type,\n             from_last_commit, to_last_commit,\n             from_addressed_by, to_addressed_by) = row\n            if to_type is not None and from_type == self.__type:\n                effective_type = new_type = to_type\n            if to_state is not None:\n                if Comment.__translateState(from_state) == self.state:\n                    new_state = Comment.__translateState(to_state)\n            # FIXME: Handle new location.\n        if effective_type == \"note\":\n            return api.comment.Comment.DraftChanges(\n                critic.effective_user, False, reply, new_type)\n        return api.comment.Issue.DraftChanges(\n            critic.effective_user, False, reply, new_type, new_state,\n            new_location)\n\n    @staticmethod\n    def refresh(critic, tables, cached_comments):\n        if not tables.intersection((\"commentchains\", \"comments\")):\n            return\n\n        Comment.updateAll(\n            critic,\n            \"\"\"SELECT commentchains.id, review, commentchains.batch,\n                      commentchains.uid, type, commentchains.state,\n                      origin, commentchains.time, comments.comment, file,\n                      first_commit, last_commit, addressed_by, closed_by\n                 FROM 
commentchains\n                 JOIN comments ON (comments.id=first_comment)\n                WHERE commentchains.id=ANY (%s)\"\"\",\n            cached_comments)\n\n@Comment.cached(api.comment.InvalidCommentId)\ndef fetch(critic, comment_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT commentchains.id, review, commentchains.batch,\n                             commentchains.uid, type, commentchains.state,\n                             origin, commentchains.time, comments.comment, file,\n                             first_commit, last_commit, addressed_by, closed_by\n                        FROM commentchains\n                        JOIN comments ON (comments.id=first_comment)\n                       WHERE commentchains.id=%s\n                         AND commentchains.state!='empty'\"\"\",\n                   (comment_id,))\n    return Comment.make(critic, cursor)\n\n@Comment.cachedMany(api.comment.InvalidCommentIds)\ndef fetchMany(critic, comment_ids):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT commentchains.id, review, commentchains.batch,\n                             commentchains.uid, type, commentchains.state,\n                             origin, commentchains.time, comments.comment, file,\n                             first_commit, last_commit, addressed_by, closed_by\n                        FROM commentchains\n                        JOIN comments ON (comments.id=first_comment)\n                       WHERE commentchains.id=ANY (%s)\n                         AND commentchains.state!='empty'\"\"\",\n                   (comment_ids,))\n    return Comment.make(critic, cursor)\n\ndef fetchAll(critic, review, author, comment_type, state, location_type,\n             changeset, commit):\n    joins = [\"JOIN comments ON (comments.id=first_comment)\"]\n    conditions = [\"(commentchains.state!='draft' OR commentchains.uid=%s)\",\n                  \"commentchains.state!='empty'\"]\n    values = 
[critic.effective_user.id]\n    if review:\n        conditions.append(\"commentchains.review=%s\")\n        values.append(review.id)\n    if author:\n        conditions.append(\"commentchains.uid=%s\")\n        values.append(author.id)\n    if comment_type:\n        conditions.append(\"commentchains.type=%s\")\n        values.append(comment_type)\n    if state:\n        if state == \"resolved\":\n            state = \"closed\"\n        conditions.append(\"commentchains.state=%s\")\n        values.append(state)\n    if location_type:\n        if location_type == \"commit-message\":\n            conditions.extend([\"commentchains.file IS NULL\",\n                               \"commentchains.first_commit IS NOT NULL\"])\n        else:\n            conditions.extend([\"commentchains.file IS NOT NULL\"])\n    if changeset is not None:\n        joins.extend([\n            \"JOIN commentchainlines\"\n            \" ON (commentchainlines.chain=commentchains.id)\",\n            \"JOIN fileversions\"\n            \" ON (fileversions.file=commentchains.file AND\"\n            \"     commentchainlines.sha1 IN (fileversions.old_sha1,\"\n            \"                                fileversions.new_sha1))\"\n        ])\n        conditions.append(\"fileversions.changeset=%s\")\n        values.append(changeset.id)\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\n        \"\"\"SELECT DISTINCT commentchains.id, commentchains.review,\n                           commentchains.batch, commentchains.uid,\n                           commentchains.type, commentchains.state,\n                           commentchains.origin, commentchains.time,\n                           comments.comment, commentchains.file,\n                           commentchains.first_commit,\n                           commentchains.last_commit,\n                           commentchains.addressed_by, commentchains.closed_by\n             FROM commentchains\n  LEFT OUTER JOIN batches ON 
(batches.comment=commentchains.id)\n                  {}\n            WHERE batches.id IS NULL\n              AND {}\n         ORDER BY commentchains.id\"\"\".format(\n             \" \".join(joins),\n             \" AND \".join(conditions)),\n        values)\n    comments = list(Comment.make(critic, cursor))\n    if commit is not None:\n        comments_by_id = { comment.id: comment for comment in comments }\n        cursor.execute(\n            \"\"\"SELECT chain, sha1\n                 FROM commentchainlines\n                WHERE chain=ANY (%s)\"\"\",\n            (comments_by_id.keys(),))\n        comments_by_sha1 = dict()\n        for comment_id, sha1 in cursor:\n            comments_by_sha1.setdefault(sha1, set()).add(\n                comments_by_id[comment_id])\n        file_versions_cache = {}\n        filtered_comments = []\n        for comment in comments:\n            if not comment.location:\n                continue\n            if comment.location.type == \"commit-message\":\n                if comment.location.commit == commit:\n                    filtered_comments.append(comment)\n                continue\n            file_id = comment.location.file.id\n            if file_id not in file_versions_cache:\n                try:\n                    file_information = \\\n                        commit.getFileInformation(comment.location.file)\n                except api.commit.NotAFile:\n                    file_information = None\n                file_versions_cache[file_id] = file_information\n            else:\n                file_information = file_versions_cache[file_id]\n            if file_information is not None:\n                if comment in comments_by_sha1.get(file_information.sha1, ()):\n                    filtered_comments.append(comment)\n        return filtered_comments\n    return comments\n\nclass Location(apiobject.APIObject):\n    def __init__(self, first_line, last_line):\n        self.first_line = first_line\n        
self.last_line = last_line\n\nclass CommitMessageLocation(Location):\n    wrapper_class = api.comment.CommitMessageLocation\n\n    def __init__(self, first_line, last_line, repository, commit_id):\n        super(CommitMessageLocation, self).__init__(first_line, last_line)\n        self.repository = repository\n        self.__commit_id = commit_id\n\n    def getCommit(self, critic):\n        return api.commit.fetch(self.repository, self.__commit_id)\n\ndef makeCommitMessageLocation(critic, first_line, last_line, commit):\n    max_line = len(commit.message.splitlines())\n\n    if last_line < first_line:\n        raise api.comment.InvalidLocation(\n            \"first_line must be equal to or less than last_line\")\n    if last_line > max_line:\n        raise api.comment.InvalidLocation(\n            \"last_line must be less than or equal to the number of lines in \"\n            \"the commit message\")\n\n    return CommitMessageLocation(first_line, last_line, commit.repository,\n                                 commit.id).wrap(critic)\n\nclass FileVersionLocation(Location):\n    wrapper_class = api.comment.FileVersionLocation\n\n    def __init__(self, comment, first_line, last_line, repository, file_id,\n                 changeset=None, first_commit_id=None, last_commit_id=None,\n                 side=None, commit=None, commit_id=None, is_translated=False):\n        super(FileVersionLocation, self).__init__(first_line, last_line)\n        self.comment = comment\n        if first_commit_id is not None and first_commit_id == last_commit_id:\n            commit_id = last_commit_id\n            first_commit_id = last_commit_id = side = None\n        self.repository = repository\n        self.__file_id = file_id\n        self.__changeset = changeset\n        self.__first_commit_id = first_commit_id\n        self.__last_commit_id = last_commit_id\n        self.side = side\n        self.__commit = commit\n        self.__commit_id = commit_id\n        self.is_translated = 
is_translated\n\n    def getChangeset(self, critic):\n        if self.__changeset:\n            return self.__changeset\n        if self.side is None:\n            # Comment was made while looking at a single version of the file,\n            # not while looking at a diff where the file was modified.\n            return None\n        from_commit = api.commit.fetch(\n            self.repository, commit_id=self.__first_commit_id)\n        to_commit = api.commit.fetch(\n            self.repository, commit_id=self.__last_commit_id)\n        return api.changeset.fetch(critic, self.repository,\n                                   from_commit=from_commit, to_commit=to_commit)\n\n    def getCommit(self, critic):\n        if self.__commit:\n            return self.__commit\n        if self.__commit_id is None:\n            return None\n        return api.commit.fetch(self.repository, commit_id=self.__commit_id)\n\n    def getFile(self, critic):\n        return api.file.fetch(critic, file_id=self.__file_id)\n\n    def translateTo(self, critic, changeset, commit):\n        cursor = critic.getDatabaseCursor()\n\n        def translateToCommit(target_commit, side):\n            try:\n                file_information = target_commit.getFileInformation(\n                    self.getFile(critic))\n            except api.commit.NotAFile:\n                raise KeyError\n            if not file_information:\n                raise KeyError\n            cursor.execute(\"\"\"SELECT first_line, last_line\n                                FROM commentchainlines\n                               WHERE chain=%s\n                                 AND sha1=%s\"\"\",\n                           (self.comment.id, file_information.sha1,))\n            row = cursor.fetchone()\n            if row is None:\n                raise KeyError\n            first_line, last_line = row\n            return FileVersionLocation(\n                self.comment, first_line, last_line, self.repository,\n               
 self.__file_id, changeset=changeset, side=side, commit=commit,\n                is_translated=True).wrap(critic)\n\n        if changeset:\n            try:\n                return translateToCommit(changeset.to_commit, \"new\")\n            except KeyError:\n                pass\n            if changeset.from_commit:\n                try:\n                    return translateToCommit(changeset.from_commit, \"old\")\n                except KeyError:\n                    pass\n        else:\n            try:\n                return translateToCommit(commit, None)\n            except KeyError:\n                pass\n\n        return None\n\ndef makeFileVersionLocation(critic, first_line, last_line, file, changeset,\n                            side, commit):\n    if changeset is not None:\n        repository = changeset.repository\n        if side == \"old\":\n            check_commit = changeset.from_commit\n        else:\n            check_commit = changeset.to_commit\n    else:\n        repository = commit.repository\n        check_commit = commit\n\n    max_line = len(check_commit.getFileLines(file))\n\n    if last_line < first_line:\n        raise api.comment.InvalidLocation(\n            \"first_line must be equal to or less than last_line\")\n    if last_line > max_line:\n        raise api.comment.InvalidLocation(\n            \"last_line must be less than or equal to the number of lines in \"\n            \"the file version\")\n\n    return FileVersionLocation(\n        None, first_line, last_line, repository, file.id, changeset=changeset,\n        side=side, commit=commit).wrap(critic)\n"
  },
  {
    "path": "src/api/impl/comment_unittest.py",
    "content": "import sys\nimport datetime\n\ndef basic(arguments):\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n    branch = api.branch.fetch(\n        critic, repository=repository, name=arguments.review)\n    review = api.review.fetch(critic, branch=branch)\n    alice = api.user.fetch(critic, name=\"alice\")\n    bob = api.user.fetch(critic, name=\"bob\")\n    dave = api.user.fetch(critic, name=\"dave\")\n    erin = api.user.fetch(critic, name=\"erin\")\n\n    all_comments = api.comment.fetchAll(critic)\n    assert isinstance(all_comments, list)\n\n    EXPECTED = {\n        0: { \"text\": \"This is a general issue.\",\n             \"location\": None,\n             \"type\": \"issue\",\n             \"state\": \"open\" },\n        1: { \"text\": \"This is a general note.\",\n             \"location\": None,\n             \"type\": \"note\" },\n        2: { \"text\": \"This is a commit issue.\",\n             \"location\": (\"commit-message\", 1, 3),\n             \"type\": \"issue\",\n             \"state\": \"resolved\",\n             \"resolved_by\": dave },\n        3: { \"text\": \"This is a commit note.\",\n             \"location\": (\"commit-message\", 5, 5),\n             \"type\": \"note\" },\n        4: { \"text\": \"This is a file issue.\",\n             \"location\": (\"file-version\", 1, 3),\n             \"type\": \"issue\",\n             \"state\": \"open\" },\n        5: { \"text\": \"This is a file note.\",\n             \"location\": (\"file-version\", 9, 9),\n             \"type\": \"note\" }\n    }\n\n    def check_comment(comment):\n        assert isinstance(comment, api.comment.Comment)\n        assert isinstance(comment.id, int)\n        assert api.comment.fetch(critic, comment_id=comment.id) is comment\n\n        expected = EXPECTED[comment_id_map[comment.id]]\n\n        assert isinstance(comment.type, str)\n  
      assert comment.type == expected[\"type\"]\n        assert isinstance(comment.is_draft, bool)\n        assert not comment.is_draft\n        assert comment.review is review\n        assert comment.author is alice\n        assert isinstance(comment.timestamp, datetime.datetime)\n        assert isinstance(comment.text, str)\n        assert comment.text == expected[\"text\"]\n\n        if comment.type == \"note\":\n            assert isinstance(comment, api.comment.Note)\n            return\n\n        assert isinstance(comment, api.comment.Issue)\n        assert isinstance(comment.state, str)\n        assert comment.state == expected[\"state\"]\n        if comment.state == \"resolved\":\n            assert comment.resolved_by is expected[\"resolved_by\"]\n        else:\n            assert comment.resolved_by is None\n        if comment.state == \"addressed\":\n            assert comment.addressed_by is expected[\"addressed_by\"]\n        else:\n            assert comment.addressed_by is None\n\n        if expected[\"location\"] is None:\n            assert comment.location is None\n        else:\n            location_type, first_line, last_line = expected[\"location\"]\n            if location_type == \"file-version\":\n                # FileVersionLocation is not yet supported.\n                return\n            assert comment.location.type == location_type\n            assert comment.location.first_line == first_line\n            assert comment.location.last_line == last_line, (comment.location.last_line, last_line)\n            assert isinstance(comment.location, api.comment.Location)\n            if location_type == \"commit-message\":\n                assert isinstance(\n                    comment.location, api.comment.CommitMessageLocation)\n            else:\n                assert isinstance(\n                    comment.location, api.comment.FileVersionLocation)\n\n    comments = api.comment.fetchAll(critic, review=review)\n    assert 
isinstance(comments, list)\n    assert len(comments) == 6\n\n    comment_id_map = {\n        comment.id: index\n        for index, comment in enumerate(comments)\n    }\n\n    for comment in comments:\n        check_comment(comment)\n\n    some_comments = api.comment.fetchMany(critic, [3, 2, 1])\n\n    assert len(some_comments) == 3\n    assert some_comments[0].id == 3\n    assert some_comments[0] is api.comment.fetch(critic, 3)\n    assert some_comments[1].id == 2\n    assert some_comments[1] is api.comment.fetch(critic, 2)\n    assert some_comments[2].id == 1\n    assert some_comments[2] is api.comment.fetch(critic, 1)\n\n    print \"basic: ok\"\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--review\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n"
  },
  {
    "path": "src/api/impl/commit.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport calendar\nimport datetime\nimport re\n\nimport api\nimport apiobject\nimport api.impl\n\nimport diff.parse\nimport gitutils\n\nRE_FOLLOWUP = re.compile(\"(fixup|squash)!.*(?:\\n[ \\t]*)+(.*)\")\n\nclass Commit(apiobject.APIObject):\n    wrapper_class = api.commit.Commit\n\n    def __init__(self, repository, internal):\n        self.repository = repository\n        self.internal = internal\n        self.sha1 = internal.sha1\n        self.tree = internal.tree\n        self.message = internal.message\n\n    def getId(self, critic):\n        return self.internal.getId(critic.database)\n\n    def getSummary(self):\n        match = RE_FOLLOWUP.match(self.message)\n        if match:\n            followup_type, summary = match.groups()\n            return \"[%s] %s\" % (followup_type, summary)\n        return self.message.split(\"\\n\", 1)[0]\n\n    def getParents(self, critic):\n        return [fetch(self.repository, None, sha1, None)\n                for sha1 in self.internal.parents]\n\n    def getDescription(self, critic):\n        return self.internal.repository.describe(critic.database, self.sha1)\n\n    def getAuthor(self, critic):\n        return api.commit.Commit.UserAndTimestamp(\n            self.internal.author.name,\n            self.internal.author.email,\n            
datetime.datetime.fromtimestamp(\n                calendar.timegm(self.internal.author.time)))\n\n    def getCommitter(self, critic):\n        return api.commit.Commit.UserAndTimestamp(\n            self.internal.committer.name,\n            self.internal.committer.email,\n            datetime.datetime.fromtimestamp(\n                calendar.timegm(self.internal.committer.time)))\n\n    def isAncestorOf(self, commit):\n        return self.internal.isAncestorOf(commit._impl.internal)\n\n    def getFileInformation(self, file):\n        import stat\n        internal = self.internal.getFileEntry(file.path)\n        if internal is None:\n            return None\n        if internal.type != \"blob\" or not stat.S_ISREG(internal.mode):\n            raise api.commit.NotAFile(file.path)\n        return api.commit.Commit.FileInformation(\n            file, int(internal.mode), internal.sha1, internal.size)\n\n    def getFileContents(self, file):\n        information = self.getFileInformation(file)\n        if information is None:\n            return None\n        return self.internal.repository.fetch(information.sha1).data\n\n    def getFileLines(self, file):\n        contents = self.getFileContents(file)\n        if contents is None:\n            return None\n        return diff.parse.splitlines(contents)\n\n    @staticmethod\n    def create(critic, repository, commit_id, sha1):\n        try:\n            internal = gitutils.Commit.fromSHA1(\n                db=critic.database,\n                repository=repository._impl.getInternal(critic),\n                sha1=sha1,\n                commit_id=commit_id)\n        except gitutils.GitReferenceError:\n            raise api.commit.InvalidSHA1(sha1)\n        return Commit(repository, internal).wrap(critic)\n\ndef fetch(repository, commit_id, sha1, ref):\n    critic = repository.critic\n\n    def commit_id_from_sha1():\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id\n                        
    FROM commits\n                           WHERE sha1=%s\"\"\",\n                       (sha1,))\n        row = cursor.fetchone()\n        if not row:\n            raise api.commit.InvalidSHA1(sha1)\n        (commit_id,) = row\n        return commit_id\n\n    def sha1_from_commit_id():\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT sha1\n                            FROM commits\n                           WHERE id=%s\"\"\",\n                       (commit_id,))\n        row = cursor.fetchone()\n        if not row:\n            raise api.commit.InvalidCommitId(commit_id)\n        (sha1,) = row\n        return sha1\n\n    if ref is not None:\n        sha1 = repository.resolveRef(ref, expect=\"commit\")\n\n    if commit_id is not None:\n        try:\n            return Commit.get_cached(critic, (int(repository), commit_id))\n        except KeyError:\n            pass\n\n        if sha1 is None:\n            sha1 = sha1_from_commit_id()\n    else:\n        try:\n            return Commit.get_cached(critic, (int(repository), sha1))\n        except KeyError:\n            pass\n\n        commit_id = commit_id_from_sha1()\n\n    commit = Commit.create(critic, repository, commit_id, sha1)\n\n    Commit.add_cached(critic, (int(repository), commit_id), commit)\n    Commit.add_cached(critic, (int(repository), sha1), commit)\n\n    return commit\n\ndef fetchMany(repository, commit_ids, sha1s):\n    critic = repository.critic\n    cursor = critic.getDatabaseCursor()\n    if commit_ids:\n        cursor.execute(\n            \"\"\"SELECT id, sha1\n                 FROM commits\n                WHERE id=ANY (%s)\"\"\",\n            (commit_ids,))\n\n        rows = cursor.fetchall()\n        if len(rows) != len(set(commit_ids)):\n            found = set(commit_id for commit_id, _ in rows)\n            for commit_id in commit_ids:\n                if commit_id not in found:\n                    raise api.commit.InvalidCommitId(commit_id)\n\n       
 commits = {\n            commit_id: sha1\n            for commit_id, sha1 in rows\n        }\n\n        return [fetch(repository, commit_id, commits[commit_id], None)\n                for commit_id in commit_ids]\n    else:\n        cursor.execute(\n            \"\"\"SELECT id, sha1\n                 FROM commits\n                WHERE sha1=ANY (%s)\"\"\",\n            (sha1s,))\n\n        rows = cursor.fetchall()\n        if len(rows) != len(set(sha1s)):\n            found = set(sha1 for _, sha1 in rows)\n            for sha1 in sha1s:\n                if sha1 not in found:\n                    raise api.commit.InvalidSHA1(sha1)\n\n        commits = {\n            sha1: commit_id\n            for commit_id, sha1 in rows\n        }\n\n        return [fetch(repository, commits[sha1], sha1, None)\n                for sha1 in sha1s]\n"
  },
  {
    "path": "src/api/impl/commit_unittest.py",
    "content": "import datetime\n\n# This is the commit that added the testing framework:\nCOMMIT_SHA1 = \"78d7849db854f3544d7291cce96a0a4fa6d6843d\"\n\n# This is its tree object:\nCOMMIT_TREE = \"c102e63ed1d612e48d3372c223559192fcf500ce\"\n\n# This is its commit message:\nCOMMIT_MESSAGE = \"\"\"\\\nHigh-level testing framework\n\nFramework for automated installation and \"black-box\" testing of Critic\nrunning in a VirtualBox instance.\n\"\"\"\n\n# This is its \"summary\":\nCOMMIT_SUMMARY = \"High-level testing framework\"\n\n# This is the SHA-1 of its parent:\nCOMMIT_PARENT_SHA1 = \"cf0ecdeafb682bd03fba9a5bbc94e125101a5a0f\"\n\n# This is its author name, email address and timestamp:\nCOMMIT_AUTHOR_NAME = \"Jens Lindstrom\"\nCOMMIT_AUTHOR_EMAIL = \"jl@opera.com\"\nCOMMIT_AUTHOR_TS = datetime.datetime.fromtimestamp(1364402400)\n\n# This is its committer name, email address and timestamp:\nCOMMIT_COMMITTER_NAME = \"Jens Lindstrom\"\nCOMMIT_COMMITTER_EMAIL = \"jl@opera.com\"\nCOMMIT_COMMITTER_TS = datetime.datetime.fromtimestamp(1365369848)\n\ndef basic():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    try:\n        api.commit.fetch(critic, sha1=COMMIT_SHA1)\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.commit.fetch(repository)\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.commit.fetch(repository, sha1=COMMIT_SHA1, ref=\"something\")\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.commit.fetch(repository, commit_id=0, ref=\"something\")\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    commit = api.commit.fetch(repository, sha1=COMMIT_SHA1)\n\n    assert str(commit) == COMMIT_SHA1\n    assert repr(commit) == \"api.commit.Commit(sha1=%r)\" % COMMIT_SHA1\n    assert hash(commit) == 
hash(COMMIT_SHA1)\n    assert commit == COMMIT_SHA1\n    assert COMMIT_SHA1 == commit\n\n    assert isinstance(commit.id, int), type(commit.id)\n    assert isinstance(commit.sha1, str), type(commit.sha1)\n    assert commit.sha1 == COMMIT_SHA1, commit.sha1\n    assert isinstance(commit.tree, str), type(commit.tree)\n    assert commit.tree == COMMIT_TREE, commit.tree\n\n    assert isinstance(commit.summary, str), type(commit.summary)\n    assert commit.summary == COMMIT_SUMMARY, commit.summary\n    assert isinstance(commit.message, str), type(commit.message)\n    assert commit.message == COMMIT_MESSAGE, commit.message\n\n    assert isinstance(commit.parents, list), type(commit.parents)\n    assert len(commit.parents) == 1, len(commit.parents)\n    assert isinstance(commit.parents[0], api.commit.Commit), \\\n        type(commit.parents[0])\n    assert commit.parents[0].sha1 == COMMIT_PARENT_SHA1, commit.parents[0].sha1\n\n    assert isinstance(commit.description, str), type(commit.description)\n    assert commit.description == \"master\", commit.description\n\n    assert isinstance(commit.author, api.commit.Commit.UserAndTimestamp), \\\n        type(commit.author)\n    assert isinstance(commit.author.name, str), type(commit.author.name)\n    assert commit.author.name == COMMIT_AUTHOR_NAME, commit.author.name\n    assert isinstance(commit.author.email, str), type(commit.author.email)\n    assert commit.author.email == COMMIT_AUTHOR_EMAIL, commit.author.email\n    assert isinstance(commit.author.timestamp, datetime.datetime), \\\n        type(commit.author.timestamp)\n    assert commit.author.timestamp == COMMIT_AUTHOR_TS, commit.author.timestamp\n\n    assert isinstance(commit.committer, api.commit.Commit.UserAndTimestamp), \\\n        type(commit.committer)\n    assert isinstance(commit.committer.name, str), \\\n        type(commit.committer.name)\n    assert commit.committer.name == COMMIT_COMMITTER_NAME, \\\n        commit.committer.name\n    assert 
isinstance(commit.committer.email, str), type(commit.committer.email)\n    assert commit.committer.email == COMMIT_COMMITTER_EMAIL, \\\n        commit.committer.email\n    assert isinstance(commit.committer.timestamp, datetime.datetime), \\\n        type(commit.committer.timestamp)\n    assert commit.committer.timestamp == COMMIT_COMMITTER_TS, \\\n        commit.committer.timestamp\n\n    try:\n        api.commit.fetch(repository, commit_id=47114711)\n    except api.commit.InvalidCommitId:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    try:\n        api.commit.fetch(repository, sha1=\"\".join(reversed(COMMIT_SHA1)))\n    except api.commit.InvalidSHA1:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/api/impl/commitset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass CommitSet(apiobject.APIObject):\n    wrapper_class = api.commitset.CommitSet\n\n    def __init__(self, commits):\n        self.commits = frozenset(commits)\n        self.__children = {}\n\n        parents = set()\n        for commit in self.commits:\n            parents.update(commit.parents)\n            for parent in commit.parents:\n                self.__children.setdefault(parent, set()).add(commit)\n\n        self.heads = frozenset(self.commits - parents)\n        self.tails = frozenset(parents - self.commits)\n\n    def __iter__(self):\n        return iter(self.commits)\n    def __len__(self):\n        return len(self.commits)\n    def __contains__(self, item):\n        return str(item) in self.commits\n    def __hash__(self):\n        return hash(self.commits)\n    def __eq__(self, other):\n        return self.commits == other.commits\n\n    def getFilteredTails(self, critic):\n        if not self.commits:\n            return frozenset()\n\n        legacy_repository = next(iter(self.commits)).repository._impl.getInternal(critic)\n        candidates = set(self.tails)\n        result = set()\n\n        while candidates:\n            tail = candidates.pop()\n\n            eliminated = set()\n            for other in candidates:\n                base 
= legacy_repository.mergebase([tail, other])\n                if base == tail:\n                    # Tail is an ancestor of other: tail should not be included\n                    # in the returned set.\n                    break\n                elif base == other:\n                    # Other is an ancestor of tail: other should not be included\n                    # in the returned set.\n                    eliminated.add(other)\n            else:\n                result.add(tail)\n            candidates -= eliminated\n\n        return frozenset(result)\n\n    def getDateOrdered(self):\n        queue = sorted(self.commits,\n                       key=lambda commit: commit.committer.timestamp,\n                       reverse=True)\n        included = set()\n\n        while queue:\n            commit = queue.pop(0)\n            if commit in included:\n                continue\n            if commit in self.__children:\n                remaining_children = self.__children[commit] - included\n                if remaining_children:\n                    # Some descendants of this commit have not yet been emitted;\n                    # we have to delay this commit.  Insert the commit after the\n                    # earliest remaining descendant in the list.  
This means\n                    # that as soon as we've processed all descendants, we retry\n                    # the commit.\n                    queue.insert(max(queue.index(child)\n                                     for child in remaining_children) + 1,\n                                 commit)\n                    continue\n            yield commit\n            included.add(commit)\n\n    def getTopoOrdered(self):\n        if not self:\n            return\n\n        head = set(self.heads).pop()\n\n        queue = [head]\n        included = set()\n\n        while queue:\n            commit = queue.pop(0)\n            if commit in included:\n                continue\n            if commit in self.__children and self.__children[commit] - included:\n                # Some descendants of this commit have not yet been emitted; we\n                # have to delay this commit.  We can only delay this commit if\n                # the queue is non-empty, so assert that it isn't.\n                assert queue\n                queue.append(commit)\n                continue\n            yield commit\n            included.add(commit)\n            parents = sorted((parent for parent in commit.parents\n                              if parent in self and parent not in included),\n                             key=lambda commit: commit.committer.timestamp,\n                             reverse=False)\n            queue[:0] = parents\n\n    def getChildrenOf(self, commit):\n        return set(self.__children.get(commit, []))\n\n    def getParentsOf(self, commit):\n        return [parent for parent in commit.parents\n                if parent in self]\n\n    def getDescendantsOf(self, commits, include_self):\n        descendants = set()\n        if include_self:\n            descendants.update(commits)\n        queue = set()\n        for commit in commits:\n            queue.update(self.getChildrenOf(commit))\n        while queue:\n            descendant = queue.pop()\n          
  descendants.add(descendant)\n            children = self.__children.get(descendant)\n            if children:\n                queue.update(children - descendants)\n        return create(commits[0].critic, descendants)\n\n    def getAncestorsOf(self, commits, include_self):\n        ancestors = set()\n        if include_self:\n            ancestors.update(commits)\n        queue = set()\n        for commit in commits:\n            queue.update(self.getParentsOf(commit))\n        while queue:\n            ancestor = queue.pop()\n            ancestors.add(ancestor)\n            queue.update(set(self.getParentsOf(ancestor)) - ancestors)\n        return create(commits[0].critic, ancestors)\n\n    def union(self, critic, commits):\n        return create(critic, self.commits.union(commits))\n    def intersection(self, critic, commits):\n        return create(critic, self.commits.intersection(commits))\n    def difference(self, critic, commits):\n        return create(critic, self.commits.difference(commits))\n    def symmetric_difference(self, critic, commits):\n        return create(critic, self.commits.symmetric_difference(commits))\n\ndef create(critic, commits):\n    if isinstance(commits, api.commitset.CommitSet):\n        assert isinstance(commits._impl, CommitSet)\n        impl = commits._impl\n    else:\n        impl = CommitSet(commits)\n\n    return api.commitset.CommitSet(critic, impl)\n\ndef calculateFromRange(critic, from_commit, to_commit):\n    repository = to_commit.repository\n\n    if from_commit in to_commit.parents:\n        return create(critic, [to_commit])\n\n    if not from_commit.isAncestorOf(to_commit):\n        raise api.commitset.InvalidCommitRange(\n            \"Start-of-range commit is not an ancestor of end-of-range commit\")\n\n    commitset = create(critic, repository.listCommits(include=to_commit,\n                                                      exclude=from_commit))\n\n    if len(commitset.tails) > 1:\n        raise 
api.commitset.InvalidCommitRange(\n            \"Start-of-range commit is not an ancestor of all included commits\")\n\n    return commitset\n"
  },
  {
    "path": "src/api/impl/commitset_unittest.py",
    "content": "def basic(arguments):\n    import api\n\n    assert arguments.prefix is not None, \"missing argument: --prefix\"\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    # This set of commits should exist in the repository; each referenced by a\n    # branch named prefix + letter.\n    #\n    #  (X)\n    #   |\n    #   A\n    #  / \\\n    # C   B  (Y)\n    # |   |\\ /\n    # D   | G\n    #  \\ /  |\n    #   E   H\n    #   |   |\\\n    #   F   K \\\n    #    \\ /   I\n    #     M    |\n    #     |    J\n    #     N\n    #     |\n    #     L\n\n    ALL_LETTERS = \"ABCDEFGHIJKLMN\"\n\n    commits = { letter: api.commit.fetch(repository,\n                                         ref=arguments.prefix + letter)\n                for letter in ALL_LETTERS + \"XY\" }\n\n    def make(letters, fn=lambda commits: api.commitset.create(critic, commits)):\n        return fn(commits[letter] for letter in letters)\n    def tostring(commits):\n        return \"\".join(commit.summary for commit in commits)\n\n    commitset = make(ALL_LETTERS)\n\n    assert isinstance(commitset, api.commitset.CommitSet)\n    assert len(commitset) == 14\n    assert len(commitset.heads) == 2\n    assert commits[\"J\"] in commitset.heads\n    assert commits[\"L\"] in commitset.heads\n    assert len(commitset.tails) == 2\n    assert commits[\"X\"] in commitset.tails\n    assert commits[\"Y\"] in commitset.tails\n\n    for commit in make(ALL_LETTERS, list):\n        assert commit in commitset\n\n    from_L = commitset.getAncestorsOf(commits[\"L\"], include_self=True)\n    from_J = commitset.getAncestorsOf(commits[\"J\"], include_self=True)\n\n    assert len(from_L.heads) == 1\n    assert commits[\"L\"] in from_L.heads\n\n    assert len(from_J.heads) == 1\n    assert commits[\"J\"] in from_J.heads\n\n    assert tostring(from_L.topo_ordered) == \"LNMFEDCKHGBA\"\n    assert tostring(from_J.topo_ordered) == \"JIHGBA\"\n\n   
 assert tostring(from_L.date_ordered) == \"LNMKHGFEDCBA\"\n    assert tostring(from_J.date_ordered) == \"JIHGBA\"\n\n    assert commitset.getChildrenOf(commits[\"A\"]) == make(\"BC\", set)\n    assert commitset.getChildrenOf(commits[\"B\"]) == make(\"EG\", set)\n    assert commitset.getChildrenOf(commits[\"C\"]) == make(\"D\", set)\n    assert commitset.getChildrenOf(commits[\"D\"]) == make(\"E\", set)\n    assert commitset.getChildrenOf(commits[\"E\"]) == make(\"F\", set)\n    assert commitset.getChildrenOf(commits[\"F\"]) == make(\"M\", set)\n    assert commitset.getChildrenOf(commits[\"G\"]) == make(\"H\", set)\n    assert commitset.getChildrenOf(commits[\"H\"]) == make(\"KI\", set)\n    assert commitset.getChildrenOf(commits[\"I\"]) == make(\"J\", set)\n    assert commitset.getChildrenOf(commits[\"J\"]) == make(\"\", set)\n    assert commitset.getChildrenOf(commits[\"K\"]) == make(\"M\", set)\n    assert commitset.getChildrenOf(commits[\"L\"]) == make(\"\", set)\n    assert commitset.getChildrenOf(commits[\"M\"]) == make(\"N\", set)\n    assert commitset.getChildrenOf(commits[\"N\"]) == make(\"L\", set)\n    assert commitset.getChildrenOf(commits[\"X\"]) == make(\"A\", set)\n    assert commitset.getChildrenOf(commits[\"Y\"]) == make(\"G\", set)\n\n    assert commitset.getParentsOf(commits[\"A\"]) == make(\"\", list)\n    assert commitset.getParentsOf(commits[\"B\"]) == make(\"A\", list)\n    assert commitset.getParentsOf(commits[\"C\"]) == make(\"A\", list)\n    assert commitset.getParentsOf(commits[\"D\"]) == make(\"C\", list)\n    assert commitset.getParentsOf(commits[\"E\"]) == make(\"DB\", list)\n    assert commitset.getParentsOf(commits[\"F\"]) == make(\"E\", list)\n    assert commitset.getParentsOf(commits[\"G\"]) == make(\"B\", list)\n    assert commitset.getParentsOf(commits[\"H\"]) == make(\"G\", list)\n    assert commitset.getParentsOf(commits[\"I\"]) == make(\"H\", list)\n    assert commitset.getParentsOf(commits[\"J\"]) == make(\"I\", list)\n    
assert commitset.getParentsOf(commits[\"K\"]) == make(\"H\", list)\n    assert commitset.getParentsOf(commits[\"L\"]) == make(\"N\", list)\n    assert commitset.getParentsOf(commits[\"M\"]) == make(\"FK\", list)\n    assert commitset.getParentsOf(commits[\"N\"]) == make(\"M\", list)\n\n    assert commitset.getDescendantsOf(commits[\"A\"]) == make(\"BCDEFGHIJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"B\"]) == make(\"EFGHIJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"C\"]) == make(\"DEFLMN\")\n    assert commitset.getDescendantsOf(commits[\"D\"]) == make(\"EFLMN\")\n    assert commitset.getDescendantsOf(commits[\"E\"]) == make(\"FLMN\")\n    assert commitset.getDescendantsOf(commits[\"F\"]) == make(\"LMN\")\n    assert commitset.getDescendantsOf(commits[\"G\"]) == make(\"HIJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"H\"]) == make(\"IJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"I\"]) == make(\"J\")\n    assert commitset.getDescendantsOf(commits[\"J\"]) == make(\"\")\n    assert commitset.getDescendantsOf(commits[\"K\"]) == make(\"LMN\")\n    assert commitset.getDescendantsOf(commits[\"L\"]) == make(\"\")\n    assert commitset.getDescendantsOf(commits[\"M\"]) == make(\"NL\")\n    assert commitset.getDescendantsOf(commits[\"N\"]) == make(\"L\")\n    assert commitset.getDescendantsOf(commits[\"X\"]) == make(\"ABCDEFGHIJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"Y\"]) == make(\"GHIJKLMN\")\n    assert commitset.getDescendantsOf(commits[\"L\"], True) == make(\"L\")\n    assert commitset.getDescendantsOf(\n        commit for commit in [commits[\"I\"], commits[\"N\"]]) == make(\"JL\")\n\n    assert commitset.getAncestorsOf(commits[\"A\"]) == make(\"\")\n    assert commitset.getAncestorsOf(commits[\"B\"]) == make(\"A\")\n    assert commitset.getAncestorsOf(commits[\"C\"]) == make(\"A\")\n    assert commitset.getAncestorsOf(commits[\"D\"]) == make(\"AC\")\n    assert commitset.getAncestorsOf(commits[\"E\"]) == 
make(\"ABCD\")\n    assert commitset.getAncestorsOf(commits[\"F\"]) == make(\"ABCDE\")\n    assert commitset.getAncestorsOf(commits[\"G\"]) == make(\"AB\")\n    assert commitset.getAncestorsOf(commits[\"H\"]) == make(\"ABG\")\n    assert commitset.getAncestorsOf(commits[\"I\"]) == make(\"ABGH\")\n    assert commitset.getAncestorsOf(commits[\"J\"]) == make(\"ABGHI\")\n    assert commitset.getAncestorsOf(commits[\"K\"]) == make(\"ABGH\")\n    assert commitset.getAncestorsOf(commits[\"L\"]) == make(\"ABCDEFGHKMN\")\n    assert commitset.getAncestorsOf(commits[\"M\"]) == make(\"ABCDEFGHK\")\n    assert commitset.getAncestorsOf(commits[\"N\"]) == make(\"ABCDEFGHKM\")\n    assert commitset.getAncestorsOf(commits[\"A\"], True) == make(\"A\")\n    assert commitset.getAncestorsOf(\n        commit for commit in [commits[\"D\"], commits[\"G\"]]) == make(\"ABC\")\n\n    assert (make(\"ABC\") | make(\"BCD\")) == make(\"ABCD\")\n    assert (make(\"ABC\") & make(\"BCD\")) == make(\"BC\")\n    assert (make(\"ABC\") - make(\"BCD\")) == make(\"A\")\n    assert (make(\"ABC\") ^ make(\"BCD\")) == make(\"AD\")\n\n    print \"basic: ok\"\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--prefix\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n"
  },
  {
    "path": "src/api/impl/config_unittest.py",
    "content": "def basic():\n    import api\n\n    assert api.config.getBoolean(\"debug\", \"IS_TESTING\") is True\n    assert api.config.getBoolean(\"smtp\", \"USE_SSL\") is False\n    assert api.config.getInteger(\"smtp\", \"MAX_ATTEMPTS\") == 10\n    assert api.config.getString(\"base\", \"SYSTEM_IDENTITY\") == \"main\"\n    assert api.config.getValue(\"base\", \"REPOSITORY_URL_TYPES\") == [\"http\"]\n\n    try:\n        api.config.getValue(\"invalid\", \"IRRELEVANT\")\n    except api.config.InvalidGroup as error:\n        assert error.message == \"Invalid configuration group: invalid\"\n    else:\n        assert False\n\n    try:\n        api.config.getValue(\"base\", \"INVALID\")\n    except api.config.InvalidKey as error:\n        assert error.message == \"Invalid configuration key: base::INVALID\"\n    else:\n        assert False\n\n    try:\n        api.config.getBoolean(\"base\", \"SYSTEM_USER_NAME\")\n    except api.config.WrongType as error:\n        assert error.message == (\"Wrong type: base::SYSTEM_USER_NAME \"\n                                 \"(read as boolean)\")\n    else:\n        assert False\n\n    try:\n        api.config.getInteger(\"base\", \"SYSTEM_USER_NAME\")\n    except api.config.WrongType as error:\n        assert error.message == (\"Wrong type: base::SYSTEM_USER_NAME \"\n                                 \"(read as integer)\")\n    else:\n        assert False\n\n    try:\n        api.config.getString(\"debug\", \"IS_TESTING\")\n    except api.config.WrongType as error:\n        assert error.message == (\"Wrong type: debug::IS_TESTING \"\n                                 \"(read as string)\")\n    else:\n        assert False\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/api/impl/critic.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport dbutils\n\nclass NoKey(object):\n    pass\n\nclass Critic(object):\n    def __init__(self):\n        self.database = None\n        self.actual_user = None\n        self.access_token = None\n        self.__cache = {}\n\n    def setDatabase(self, database):\n        self.database = database\n\n    def getEffectiveUser(self, critic):\n        if self.actual_user:\n            return self.actual_user\n        return api.user.anonymous(critic)\n\n    def lookup(self, cls, key=NoKey):\n        objects = self.__cache[cls]\n        if key is NoKey:\n            return objects\n        return objects[key]\n\n    def assign(self, cls, key, value):\n        self.__cache.setdefault(cls, {})[key] = value\n\n    @staticmethod\n    def transactionEnded(critic, tables):\n        for Implementation, cached_objects in critic._impl.__cache.items():\n            if hasattr(Implementation, \"refresh\"):\n                Implementation.refresh(critic, tables, cached_objects)\n        return True\n\ndef startSession(for_user, for_system, for_testing):\n    critic = api.critic.Critic(Critic())\n\n    if for_user:\n        database = dbutils.Database.forUser(critic)\n    elif for_system:\n        database = dbutils.Database.forSystem(critic)\n    else:\n        database = 
dbutils.Database.forTesting(critic)\n\n    critic._impl.setDatabase(database)\n    return critic\n"
  },
  {
    "path": "src/api/impl/extension.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nimport configuration\n\nclass Extension(apiobject.APIObject):\n    wrapper_class = api.extension.Extension\n\n    def __init__(self, extension_id, name, publisher_id):\n        self.id = extension_id\n        self.name = name\n        self.__publisher_id = publisher_id\n\n    def getKey(self, critic):\n        publisher = self.getPublisher(critic)\n        if publisher is None:\n            return self.name\n        return \"%s/%s\" % (publisher.name, self.name)\n\n    def getPublisher(self, critic):\n        if self.__publisher_id is None:\n            return None\n        return api.user.fetch(critic, self.__publisher_id)\n\n@Extension.cached()\ndef fetch(critic, extension_id, key):\n    if not configuration.extensions.ENABLED:\n        raise api.extension.ExtensionError(\"Extension support not enabled\")\n    cursor = critic.getDatabaseCursor()\n    if extension_id is not None:\n        cursor.execute(\n            \"\"\"SELECT id, name, author\n                 FROM extensions\n                WHERE id=%s\"\"\",\n            (extension_id,))\n    else:\n        publisher_name, _, extension_name = key.partition(\"/\")\n        if extension_name is None:\n            extension_name = publisher_name\n            cursor.execute(\n                \"\"\"SELECT 
id, name, author\n                     FROM extensions\n                    WHERE author IS NULL\n                      AND name=%s\"\"\",\n                (extension_name,))\n        else:\n            cursor.execute(\n                \"\"\"SELECT extensions.id, extensions.name, author\n                     FROM extensions\n                     JOIN users ON (users.id=author)\n                    WHERE extensions.name=%s\n                      AND users.name=%s\"\"\",\n                (extension_name, publisher_name))\n    try:\n        return next(Extension.make(critic, cursor))\n    except StopIteration:\n        if extension_id is not None:\n            raise api.extension.InvalidExtensionId(extension_id)\n        else:\n            raise api.extension.InvalidExtensionKey(key)\n\ndef fetchAll(critic, publisher, installed_by):\n    if not configuration.extensions.ENABLED:\n        raise api.extension.ExtensionError(\"Extension support not enabled\")\n    cursor = critic.getDatabaseCursor()\n    if installed_by:\n        if publisher:\n            cursor.execute(\n                \"\"\"SELECT extensions.id, name, author\n                     FROM extensions\n                     JOIN extensioninstalls ON (extension=extensions.id)\n                    WHERE author=%s\n                      AND uid=%s\"\"\",\n                (publisher.id, installed_by.id))\n        else:\n            cursor.execute(\n                \"\"\"SELECT extensions.id, name, author\n                     FROM extensions\n                     JOIN extensioninstalls ON (extension=extensions.id)\n                    WHERE uid=%s\"\"\",\n                (installed_by.id,))\n    elif publisher:\n        cursor.execute(\n            \"\"\"SELECT id, name, author\n                 FROM extensions\n                WHERE author=%s\"\"\",\n            (publisher.id,))\n    else:\n        cursor.execute(\n            \"\"\"SELECT id, name, author\n                 FROM extensions\"\"\")\n    return 
list(Extension.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/file.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\nimport dbutils\n\nclass File(apiobject.APIObject):\n    wrapper_class = api.file.File\n\n    def __init__(self, file_id, path):\n        self.id = file_id\n        self.path = path\n\ndef _fetch_by_ids(critic, file_ids):\n    # FIXME: Optimize this to do a single database query. Currently, there will\n    # be one (fast) query per file.\n    try:\n        for file_id in file_ids:\n            internal = dbutils.File.fromId(critic.database, file_id)\n            yield (int(internal), str(internal))\n    except dbutils.InvalidFileId:\n        raise api.file.InvalidFileId(file_id)\n\ndef _fetch_by_paths(critic, paths, create):\n    # FIXME: Optimize this to do a single database query. 
Currently, there will\n    # be one (fast) query per file.\n    try:\n        for path in paths:\n            internal = dbutils.File.fromPath(\n                critic.database, path, insert=create)\n            yield (int(internal), str(internal))\n    except dbutils.InvalidPath:\n        raise api.file.InvalidPath(path)\n\n@File.cached()\ndef fetch(critic, file_id, path, create):\n    if file_id is not None:\n        items = _fetch_by_ids(critic, [file_id])\n    else:\n        items = _fetch_by_paths(critic, [path], create)\n    return next(File.make(critic, items))\n\ndef fetchMany(critic, file_ids, paths, create):\n    if file_ids is not None:\n        items = _fetch_by_ids(critic, file_ids)\n    else:\n        items = _fetch_by_paths(critic, paths, create)\n    return list(File.make(critic, items))\n"
  },
  {
    "path": "src/api/impl/filechange.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport api.impl\nimport apiobject\n\nclass FileChange(apiobject.APIObject):\n    wrapper_class = api.filechange.FileChange\n\n    def __init__(self, changeset,\n                 file_id, old_sha1, old_mode, new_sha1, new_mode):\n        self.changeset = changeset\n        self.__file_id = file_id\n        self.old_sha1 = old_sha1 if old_sha1 != \"0\" * 40 else None\n        self.old_mode = old_mode\n        self.new_sha1 = new_sha1 if new_sha1 != \"0\" * 40 else None\n        self.new_mode = new_mode\n\n    def getFile(self, critic):\n        return api.file.fetch(critic, self.__file_id)\n\n@FileChange.cached(api.filechange.InvalidFileChangeId,\n                   cache_key=lambda (changeset, file): (changeset.id, file.id))\ndef fetch(critic, changeset, file):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\n        \"\"\"SELECT file, old_sha1, old_mode, new_sha1, new_mode\n             FROM fileversions\n            WHERE changeset=%s\n              AND file=%s\"\"\",\n        (changeset.id, file.id,))\n    def cache_key(args):\n        return args[0].id, args[1]\n    return FileChange.make(critic, ((changeset,) + row for row in cursor),\n                           cache_key=cache_key)\n\ndef fetchAll(critic, changeset):\n    cursor = critic.getDatabaseCursor()\n    
cursor.execute(\n        \"\"\"SELECT file, old_sha1, old_mode, new_sha1, new_mode\n             FROM fileversions\n             JOIN files ON (files.id=file)\n            WHERE changeset=%s\n         ORDER BY files.path\"\"\",\n        (changeset.id,))\n    def cache_key(args):\n        return args[0].id, args[1]\n    return list(FileChange.make(critic, ((changeset,) + row for row in cursor),\n                                cache_key=cache_key))\n"
  },
  {
    "path": "src/api/impl/filechange_unittest.py",
    "content": "from api.impl.changeset_unittest import ROOT_PATHLIST, ROOT_SHA1, FROM_SHA1,\\\nTO_SHA1, CUSTOM_PATHLIST\n\ndef pre():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    root_filechange(\"pre\", api, critic, repository)\n    custom_filechange(\"pre\", api, critic, repository)\n    single_filechange(\"pre\", api, critic, repository)\n\n    print(\"pre: ok\")\n\n\ndef post():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    root_filechange(\"post\", api, critic, repository)\n    custom_filechange(\"post\", api, critic, repository)\n    single_filechange(\"post\", api, critic, repository)\n\n    print(\"post: ok\")\n\n\n# requires list of filechange objects\ndef assert_valid_filechanges(filechanges):\n    for filechange in filechanges:\n        assert (filechange.old_sha1 != filechange.new_sha1),\\\n            \"Expected filechange.new_sha1 to be different from filechange.old_sha1\"\n        if filechange.old_sha1 is not None:\n            assert (isinstance(filechange.old_sha1, str)),\\\n                \"Expected filechange.old_sha1 to be a string\"\n            assert (len(filechange.old_sha1) == 40),\\\n                \"Expected filechange.old_sha1 to be 40 characters long\"\n        if filechange.new_sha1 is not None:\n            assert (filechange.new_sha1 is None or isinstance(filechange.new_sha1, str)),\\\n                \"Expected filechange.new_sha1 to be a string\"\n            assert (len(filechange.new_sha1) == 40),\\\n                \"Expected filechange.new_sha1 to be 40 characters long\"\n\n\ndef root_filechange(phase, api, critic, repository):\n    if phase == \"pre\":\n        root_commit = api.commit.fetch(repository, sha1=ROOT_SHA1)\n        try:\n            api.changeset.fetch(\n                critic, repository, 
single_commit=root_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        root_commit = api.commit.fetch(repository, sha1=ROOT_SHA1)\n        root_changeset = api.changeset.fetch(\n            critic, repository, single_commit=root_commit)\n        root_filechanges = api.filechange.fetchAll(critic, root_changeset)\n\n        root_files = frozenset(\n            [filechange.file.path for filechange in root_filechanges])\n        assert (root_files == ROOT_PATHLIST),\\\n            \"files in filechanges differ from the expected\"\n        assert_valid_filechanges(root_filechanges)\n\n    else:\n        raise Exception\n\n\ndef custom_filechange(phase, api, critic, repository):\n    if phase == \"pre\":\n        from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n        to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        try:\n            api.changeset.fetch(\n                critic, repository, from_commit=from_commit, to_commit=to_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n        to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        custom_changeset = api.changeset.fetch(\n            critic, repository, from_commit=from_commit, to_commit=to_commit)\n        for filechange in custom_changeset.files:\n            assert_valid_filechanges([filechange])\n\n    else:\n        raise Exception\n\n\ndef single_filechange(phase, api, critic, repository):\n    if phase == \"pre\":\n        single_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        try:\n            api.changeset.fetch(\n                critic, repository, single_commit=single_commit)\n        except api.changeset.ChangesetDelayed:\n            pass\n\n    elif phase == \"post\":\n        single_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n        single_changeset = 
api.changeset.fetch(\n            critic, repository, single_commit=single_commit)\n        all_filechanges = api.filechange.fetchAll(critic, single_changeset)\n        assert_valid_filechanges(all_filechanges)\n        all_filechange_ids = [filechange.file.id for filechange in all_filechanges]\n\n        equiv_filechange_ids = [\n            filechange.file.id for filechange in single_changeset.files\n        ]\n\n        assert (frozenset(all_filechange_ids) == frozenset(equiv_filechange_ids)),\\\n            \"filechanges from fetchAll should be equal to list of filechanges fetched by file_id\"\n\n    else:\n        raise Exception\n"
  },
  {
    "path": "src/api/impl/filecontent.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom __future__ import absolute_import\n\nimport api\nimport api.impl\nfrom api.impl import apiobject\nimport diff\n\nclass Filecontent(apiobject.APIObject):\n    wrapper_class = api.filecontent.Filecontent\n\n    def __init__(self, critic, repository, blob_sha1, file_obj):\n        diffFile = diff.File(\n            repository=repository._impl.getInternal(critic), path=file_obj.path,\n            new_sha1=blob_sha1)\n        diffFile.loadNewLines(\n            highlighted=True, request_highlight=True, highlight_mode=\"json\")\n        self.__filecontents = diffFile.newLines(highlighted=True)\n\n    def getLines(self, first_row, last_row):\n        num_lines = len(self.__filecontents)\n\n        if first_row is None:\n            actual_first_row = 1\n        else:\n            actual_first_row = min(first_row, num_lines)\n\n        if last_row is None:\n            actual_last_row = num_lines\n        else:\n            actual_last_row = min(max(last_row, actual_first_row), num_lines)\n\n        lines = []\n        for offset in range(actual_first_row-1, actual_last_row):\n            parts = api.impl.filediff.parts_from_html(self.__filecontents[offset])\n            lines.append(api.filecontent.Line(parts, offset+1))\n\n        return lines\n\ndef fetch(critic, repository, blob_sha1, file_obj):\n    return 
Filecontent(critic, repository, blob_sha1, file_obj).wrap(critic)\n"
  },
  {
    "path": "src/api/impl/filediff.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom __future__ import absolute_import\n\nimport api\nimport api.impl\nfrom api.impl import apiobject\nimport diff\nimport diff.context\n\nimport json\n\nclass Filediff(apiobject.APIObject):\n    wrapper_class = api.filediff.Filediff\n\n    def __init__(self, filechange):\n        self.filechange = filechange\n        self.old_count = None\n        self.new_count = None\n\n        self.__chunks = None\n        self.__macro_chunks = None\n        self.__repository = filechange.changeset.repository\n\n        diff_file = self.__getLegacyFile(filechange.critic)\n\n        self.__highlight_delayed = not diff_file.ensureHighlight(\"json\")\n\n    @staticmethod\n    def cache_key(filechange):\n        return (filechange.changeset.id, filechange.file.id)\n\n    def __getChunks(self, critic):\n        if self.__chunks is None:\n            cached_objects = Filediff.allCached(critic)\n            assert Filediff.cache_key(self.filechange) in cached_objects\n\n            cached_by_changeset = {}\n            for (changeset_id, file_id), filediff in cached_objects.items():\n                if filediff._impl.__chunks is None:\n                    filediff._impl.__chunks = []\n                    cached_by_changeset.setdefault(changeset_id, []) \\\n                        .append(file_id)\n\n            
cursor = critic.getDatabaseCursor()\n            for changeset_id, file_ids in cached_by_changeset.items():\n                cursor.execute(\n                    \"\"\"SELECT file,\n                              deleteOffset, deleteCount,\n                              insertOffset, insertCount,\n                              analysis, whitespace\n                         FROM chunks\n                        WHERE changeset=%s\n                          AND file=ANY (%s)\n                     ORDER BY file, deleteOffset, insertOffset\"\"\",\n                    (changeset_id, file_ids))\n\n                for (file_id,\n                     delete_offset, delete_count,\n                     insert_offset, insert_count,\n                     analysis, is_whitespace) in cursor:\n                    cached_objects[(changeset_id, file_id)]._impl.__chunks \\\n                        .append(diff.Chunk(delete_offset, delete_count,\n                                           insert_offset, insert_count,\n                                           analysis=analysis,\n                                           is_whitespace=is_whitespace))\n\n        return self.__chunks\n\n    def __getLegacyFile(self, critic):\n        return diff.File(\n            self.filechange.file.id, self.filechange.file.path,\n            self.filechange.old_sha1, self.filechange.new_sha1,\n            self.__repository._impl.getInternal(critic),\n            old_mode=self.filechange.old_mode,\n            new_mode=self.filechange.new_mode)\n\n    def getMacroChunks(self, critic, context_lines, comments, ignore_chunks):\n        def create_line_filter(location, context_lines):\n            def line_filter(line):\n                first_context_line = location.first_line - context_lines\n                last_context_line = location.last_line + context_lines\n\n                if location.side == \"old\":\n                    line_number = line.old_offset\n                    if line_number == 
first_context_line and line.type == diff.Line.INSERTED:\n                        return False\n\n                else:\n                    line_number = line.new_offset\n                    if line_number == first_context_line and line.type == diff.Line.DELETED:\n                        return False\n\n                return first_context_line <= line_number <= last_context_line\n            return line_filter\n\n        if self.__macro_chunks is None:\n            if self.__highlight_delayed:\n                raise api.filediff.FilediffDelayed()\n\n            diff_file = self.__getLegacyFile(critic)\n\n            diff_file.loadOldLines(True, highlight_mode=\"json\")\n            diff_file.loadNewLines(True, highlight_mode=\"json\")\n\n            self.old_count = diff_file.oldCount()\n            self.new_count = diff_file.newCount()\n\n            diff_chunks = self.__getChunks(critic)\n\n            if comments is not None:\n                translated_comments = []\n                skinny_comment_chains = []\n\n                for comment in comments:\n                    if not isinstance(\n                            comment.location, api.comment.FileVersionLocation):\n                        continue\n                    if comment.location.file != self.filechange.file:\n                        continue\n\n                    location = comment.location.translateTo(\n                        self.filechange.changeset)\n\n                    if not location:\n                        continue\n\n                    translated_comments.append((comment, location))\n                    skinny_comment_chains.append((\n                        SkinnyCommentChain(critic, location),\n                        location.side == \"old\"))\n            else:\n                translated_comments = None\n                skinny_comment_chains = None\n\n            if ignore_chunks and translated_comments:\n                line_filter = create_line_filter(\n                    
translated_comments[0][1], context_lines)\n            else:\n                line_filter = None\n\n            diff_context_lines = diff.context.ContextLines(\n                diff_file, diff_chunks, skinny_comment_chains)\n\n            legacy_macro_chunks = diff_context_lines.getMacroChunks(\n                context_lines, skip_interline_diff=True,\n                lineFilter=line_filter)\n\n            self.__macro_chunks = [\n                api.filediff.MacroChunk(MacroChunk(critic, legacy_macro_chunk))\n                for legacy_macro_chunk\n                in legacy_macro_chunks\n            ]\n        return self.__macro_chunks\n\nclass MacroChunk(object):\n    def __init__(self, critic, legacy_macro_chunk):\n        self.legacy_macro_chunk = legacy_macro_chunk\n        self.old_offset = legacy_macro_chunk.old_offset\n        self.new_offset = legacy_macro_chunk.new_offset\n        self.old_count = legacy_macro_chunk.old_count\n        self.new_count = legacy_macro_chunk.new_count\n        self.__lines = None\n\n    def getLines(self):\n        if self.__lines is None:\n            self.__lines = [\n                api.filediff.Line(Line.from_legacy_line(line))\n                for line\n                in self.legacy_macro_chunk.lines\n            ]\n        return self.__lines\n\nclass Line(object):\n    CONTEXT    = 1\n    DELETED    = 2\n    MODIFIED   = 3\n    REPLACED   = 4\n    INSERTED   = 5\n    WHITESPACE = 6\n    CONFLICT   = 7\n\n    @classmethod\n    def from_legacy_line(cls, legacy_line):\n        line = cls()\n        line.legacy_line = legacy_line\n        line.__content = None\n        line.is_whitespace = legacy_line.is_whitespace\n        line.analysis = legacy_line.analysis\n        return line\n\n    @classmethod\n    def from_changed_line(cls, original_line, old_content_replacement,\n                          new_content_replacement):\n        line = cls()\n        line.legacy_line = original_line.legacy_line\n        
line.__old_content = old_content_replacement\n        line.__new_content = new_content_replacement\n        line.is_whitespace = original_line.is_whitespace\n        return line\n\n    def type_string(self):\n        if self.legacy_line.type == Line.CONTEXT: return \"CONTEXT\"\n        elif self.legacy_line.type == Line.DELETED: return \"DELETED\"\n        elif self.legacy_line.type == Line.MODIFIED: return \"MODIFIED\"\n        elif self.legacy_line.type == Line.REPLACED: return \"REPLACED\"\n        elif self.legacy_line.type == Line.INSERTED: return \"INSERTED\"\n        elif self.legacy_line.type == Line.WHITESPACE: return \"WHITESPACE\"\n        elif self.legacy_line.type == Line.CONFLICT: return \"CONFLICT\"\n\n    def getContent(self):\n        if self.__content is None:\n            old_content = parts_from_html(self.legacy_line.old_value)\n\n            if self.legacy_line.type == Line.CONTEXT:\n                content = old_content\n            else:\n                new_content = parts_from_html(self.legacy_line.new_value)\n\n                if self.legacy_line.analysis:\n                    content = perform_detailed_operations(\n                        self.legacy_line.analysis, old_content, new_content)\n                else:\n                    content = perform_basic_operations(\n                        self.legacy_line.type, old_content, new_content)\n\n            self.__content = [api.filediff.Part(part)\n                              for part in content]\n        return self.__content\n\nclass Part(object):\n    def __init__(self, part_type, content, state=None):\n        self.type = part_type\n        self.content = content\n        self.state = state\n\n    def copy(self):\n        return Part(self.type, self.content, self.state)\n\n    def with_state(self, state):\n        self.state = state\n        return self\n\nclass 
SkinnyCommentChain(object):\n    def __init__(self, critic, location):\n        filechange = api.filechange.fetch(\n            critic, location.changeset, location.file)\n\n        if location.side == \"old\":\n            key = filechange.old_sha1\n        else:\n            key = filechange.new_sha1\n\n        self.lines_by_sha1 = {}\n        self.lines_by_sha1[key] = (\n            location.first_line,\n            location.last_line - location.first_line + 1\n        )\n\n        self.comments = True\n\ndef fetch(critic, filechange):\n    cache_key = Filediff.cache_key(filechange)\n    try:\n        return Filediff.get_cached(critic, cache_key)\n    except KeyError:\n        pass\n    filediff = Filediff(filechange).wrap(critic)\n    Filediff.add_cached(critic, cache_key, filediff)\n    return filediff\n\ndef fetchAll(critic, changeset):\n    return [\n        fetch(critic, filechange)\n        for filechange\n        in changeset.files\n    ]\n\ndef parts_from_html(content):\n    if content is None:\n        return None\n\n    return (Part(part_json[0], part_json[1].encode(\"utf-8\"))\n            for part_json in json.loads(content))\n\nclass Parts(object):\n    def __init__(self, parts):\n        self.parts = list(parts)\n        self.offset = 0\n\n    def extract(self, length):\n        self.offset += length\n        while self.parts and len(self.parts[0].content) <= length:\n            part = self.parts.pop(0)\n            length -= len(part.content)\n            yield part\n        if length:\n            tail_part = self.parts[0]\n            head_part = tail_part.copy()\n            head_part.content = head_part.content[:length]\n            tail_part.content = tail_part.content[length:]\n            yield head_part\n\n    def skip(self, length):\n        for part in self.extract(length):\n            pass\n\ndef perform_detailed_operations(operations, old_content, new_content):\n    processed_content = []\n\n    old_parts = Parts(old_content)\n    
new_parts = Parts(new_content)\n\n    for operation in operations:\n        if operation[0] == \"r\":\n            old_range, _, new_range = operation[1:].partition(\"=\")\n        elif operation[0] == \"d\":\n            old_range = operation[1:]\n            new_range = None\n        else:\n            old_range = None\n            new_range = operation[1:]\n\n        if old_range:\n            old_begin, old_end = map(int, old_range.split(\"-\"))\n\n        if new_range:\n            new_begin, new_end = map(int, new_range.split(\"-\"))\n\n        if old_range:\n            context_length = old_begin - old_parts.offset\n            if context_length:\n                processed_content.extend(old_parts.extract(context_length))\n                new_parts.skip(context_length)\n\n            deleted_length = old_end - old_begin\n            processed_content.extend(\n                part.with_state(\"d\")\n                for part in old_parts.extract(deleted_length))\n\n        if new_range:\n            if not old_range:\n                context_length = new_begin - new_parts.offset\n                if context_length:\n                    processed_content.extend(old_parts.extract(context_length))\n                    new_parts.skip(context_length)\n\n            inserted_length = new_end - new_begin\n            processed_content.extend(\n                part.with_state(\"i\")\n                for part in new_parts.extract(inserted_length))\n\n    processed_content.extend(old_parts.parts)\n\n    return processed_content\n\ndef perform_basic_operations(line_type, old_content, new_content):\n    if old_content is not None and new_content is not None:\n        return ([part.with_state(\"d\") for part in old_content or []] +\n                [part.with_state(\"i\") for part in new_content or []])\n    elif old_content is not None:\n        return old_content\n    return new_content\n"
  },
  {
    "path": "src/api/impl/filediff_unittest.py",
    "content": "from api.impl.changeset_unittest import FROM_SHA1, TO_SHA1\n\ndef pre1():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n    to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n\n    try:\n        api.changeset.fetch(\n            critic, repository, from_commit=from_commit, to_commit=to_commit)\n    except api.changeset.ChangesetDelayed:\n        pass\n\n    print \"pre1: ok\"\n\ndef pre2():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n    to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n\n    changeset = api.changeset.fetch(\n        critic, repository, from_commit=from_commit, to_commit=to_commit)\n    file = api.file.fetch(critic, path=\"src/operation/createreview.py\")\n    filechange = api.filechange.fetch(critic, changeset, file)\n\n    try:\n        api.filediff.fetch(critic, filechange).getMacroChunks(context_lines=3)\n    except api.filediff.FilediffDelayed:\n        pass\n    else:\n        assert False, \"filediff not delayed as expected\"\n\n    print \"pre2: ok\"\n\ndef post():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    from_commit = api.commit.fetch(repository, sha1=FROM_SHA1)\n    to_commit = api.commit.fetch(repository, sha1=TO_SHA1)\n\n    changeset = api.changeset.fetch(\n        critic, repository, from_commit=from_commit, to_commit=to_commit)\n    file = api.file.fetch(critic, path=\"src/operation/createreview.py\")\n    filechange = api.filechange.fetch(critic, changeset, file)\n    filediff = api.filediff.fetch(critic, filechange)\n\n    chunks = filediff.getMacroChunks(context_lines=3)\n\n    assert 
isinstance(chunks, list)\n    assert len(chunks) == 2\n\n    assert chunks[0].old_offset == 88\n    assert chunks[0].old_count == 6\n    assert chunks[0].new_offset == 88\n    assert chunks[0].new_count == 9\n\n    assert len(chunks[0].lines) == 9\n    assert chunks[0].lines[0].type_string == \"CONTEXT\"\n    assert chunks[0].lines[1].type_string == \"CONTEXT\"\n    assert chunks[0].lines[2].type_string == \"CONTEXT\"\n    assert chunks[0].lines[3].type_string == \"INSERTED\"\n    assert chunks[0].lines[4].type_string == \"INSERTED\"\n    assert chunks[0].lines[5].type_string == \"INSERTED\"\n    assert chunks[0].lines[6].type_string == \"CONTEXT\"\n    assert chunks[0].lines[7].type_string == \"CONTEXT\"\n    assert chunks[0].lines[8].type_string == \"CONTEXT\"\n\n    assert chunks[1].old_offset == 199\n    assert chunks[1].old_count == 14\n    assert chunks[1].new_offset == 202\n    assert chunks[1].new_count == 15\n\n    assert len(chunks[1].lines) == 15\n    assert chunks[1].lines[0].type_string == \"CONTEXT\"\n    assert chunks[1].lines[1].type_string == \"CONTEXT\"\n    assert chunks[1].lines[2].type_string == \"CONTEXT\"\n    assert chunks[1].lines[3].type_string == \"MODIFIED\"\n    assert chunks[1].lines[4].type_string == \"CONTEXT\"\n    assert chunks[1].lines[5].type_string == \"CONTEXT\"\n    assert chunks[1].lines[6].type_string == \"CONTEXT\"\n    assert chunks[1].lines[7].type_string == \"CONTEXT\"\n    assert chunks[1].lines[8].type_string == \"CONTEXT\"\n    assert chunks[1].lines[9].type_string == \"MODIFIED\"\n    assert chunks[1].lines[10].type_string == \"REPLACED\"\n    assert chunks[1].lines[11].type_string == \"INSERTED\"\n    assert chunks[1].lines[12].type_string == \"CONTEXT\"\n    assert chunks[1].lines[13].type_string == \"CONTEXT\"\n    assert chunks[1].lines[14].type_string == \"CONTEXT\"\n\n    print \"post: ok\"\n"
  },
  {
    "path": "src/api/impl/filters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass RepositoryFilter(apiobject.APIObject):\n    wrapper_class = api.filters.RepositoryFilter\n\n    def __init__(self, filter_id, subject_id, filter_type, path, repository_id,\n                 delegate_string, repository=None):\n        self.id = filter_id\n        self.__subject_id = subject_id\n        self.__subject = None\n        self.type = filter_type\n        self.path = path\n        self.__repository_id = repository_id\n        self.__repository = repository\n        self.__delegate_string = delegate_string\n        self.__delegates = None\n\n    def getSubject(self, critic):\n        if self.__subject is None:\n            self.__subject = api.user.fetch(critic, user_id=self.__subject_id)\n        return self.__subject\n\n    def getRepository(self, critic):\n        if self.__repository is None:\n            self.__repository = api.repository.fetch(\n                critic, repository_id=self.__repository_id)\n        return self.__repository\n\n    def getDelegates(self, critic):\n        if self.__delegates is None:\n            self.__delegates = frozenset(\n                api.user.fetch(critic, name=name.strip())\n                for name in filter(None, self.__delegate_string.split(\",\")))\n        return self.__delegates\n\n    
@staticmethod\n    def refresh(critic, tables, cached_filters):\n        if \"filters\" not in tables:\n            return\n\n        RepositoryFilter.updateAll(\n            critic,\n            \"\"\"SELECT id, uid, type, path, repository, delegate\n                 FROM filters\n                WHERE id=ANY (%s)\"\"\",\n            cached_filters)\n\ndef fetchRepositoryFilter(critic, filter_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, uid, type, path, repository, delegate\n                        FROM filters\n                       WHERE id=%s\"\"\",\n                   (filter_id,))\n    try:\n        return next(RepositoryFilter.make(critic, cursor))\n    except StopIteration:\n        raise api.filters.InvalidRepositoryFilterId(filter_id)\n\nclass ReviewFilter(object):\n    wrapper_class = api.filters.ReviewFilter\n\n    def __init__(self, subject_id, filter_type, path, filter_id, review_id,\n                 creator_id):\n        self.__subject_id = subject_id\n        self.__subject = None\n        self.type = filter_type\n        self.path = path\n        self.id = filter_id\n        self.__review_id = review_id\n        self.__review = None\n        self.__creator_id = creator_id\n        self.__creator = None\n\n    def getSubject(self, critic):\n        if self.__subject is None:\n            self.__subject = api.user.fetch(critic, user_id=self.__subject_id)\n        return self.__subject\n\n    def getReview(self, critic):\n        if self.__review is None:\n            self.__review = api.review.fetch(critic, review_id=self.__review_id)\n        return self.__review\n\n    def getCreator(self, critic):\n        if self.__creator is None:\n            self.__creator = api.user.fetch(critic, user_id=self.__creator_id)\n        return self.__creator\n"
  },
  {
    "path": "src/api/impl/labeledaccesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\nimport dbutils\n\npublic_module = api.labeledaccesscontrolprofile\npublic_class = public_module.LabeledAccessControlProfile\n\nclass LabeledAccessControlProfile(apiobject.APIObject):\n    wrapper_class = public_class\n\n    def __init__(self, labels, profile_id):\n        self.labels = tuple(labels.split(\"|\"))\n        self.__profile_id = profile_id\n\n    def getAccessControlProfile(self, critic):\n        return api.accesscontrolprofile.fetch(critic, self.__profile_id)\n\n@LabeledAccessControlProfile.cached()\ndef fetch(critic, labels):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT labels, profile\n                        FROM labeledaccesscontrolprofiles\n                       WHERE labels=%s\"\"\",\n                   (\"|\".join(labels),))\n    try:\n        return next(LabeledAccessControlProfile.make(critic, cursor))\n    except StopIteration:\n        raise public_module.InvalidAccessControlProfileLabels(labels)\n\ndef fetchAll(critic, profile):\n    cursor = critic.getDatabaseCursor()\n    if profile is None:\n        cursor.execute(\"\"\"SELECT labels, profile\n                            FROM labeledaccesscontrolprofiles\n                        ORDER BY labels ASC\"\"\")\n    else:\n        cursor.execute(\"\"\"SELECT 
labels, profile\n                            FROM labeledaccesscontrolprofiles\n                           WHERE profile=%s\n                        ORDER BY labels ASC\"\"\",\n                       (profile.id,))\n    return list(LabeledAccessControlProfile.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/log/__init__.py",
    "content": "import rebase\nimport partition\n"
  },
  {
    "path": "src/api/impl/log/partition.py",
    "content": "import api\n\nclass Partition(object):\n    def __init__(self, commits):\n        assert not commits or len(commits.heads) == 1\n\n        self.commits = commits\n        self.preceding = None\n        self.following = None\n\n    def wrap(self, critic):\n        return api.log.partition.Partition(critic, self)\n\ndef create(critic, commits, rebases):\n    commits = api.commitset.create(critic, commits)\n    partitions = []\n\n    def add(rebase, partition):\n        if partitions:\n            previous_rebase, previous_partition = partitions[-1]\n            previous_partition._impl.preceding = \\\n                api.log.partition.Partition.Edge(previous_rebase, partition)\n            partition._impl.following = \\\n                api.log.partition.Partition.Edge(previous_rebase,\n                                                 previous_partition)\n        partitions.append((rebase, partition))\n\n    rebase = None\n\n    for rebase in reversed(rebases):\n        partition_commits = commits.getAncestorsOf(\n            rebase.old_head, rebase.old_head in commits)\n        commits = commits - partition_commits\n        add(rebase, Partition(partition_commits).wrap(critic))\n\n    if len(commits.heads) > 1:\n        raise api.log.partition.PartitionError(\n            \"Incompatible commits/rebases arguments\")\n\n    add(None, Partition(commits).wrap(critic))\n\n    return partitions[-1][1]\n"
  },
  {
    "path": "src/api/impl/log/partition_unittest.py",
    "content": "def basic():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    def fetch_review(number):\n        branch = api.branch.fetch(critic, repository=repository,\n                                  name=\"r/020-reviewrebase/%d\" % number)\n        return api.review.fetch(critic, branch=branch)\n\n    def check_partition(partition):\n        assert isinstance(partition, api.log.partition.Partition)\n\n    def check_commits(commits, summaries):\n        assert isinstance(commits, api.commitset.CommitSet)\n        assert len(commits) == len(summaries)\n        for commit, summary in zip(commits.topo_ordered, summaries):\n            assert commit.summary == summary\n\n    def check_following(partition, rebase_class):\n        edge = partition.following\n        assert isinstance(edge, api.log.partition.Partition.Edge)\n        assert isinstance(edge.rebase, rebase_class)\n        assert isinstance(edge.partition, api.log.partition.Partition)\n        assert edge.partition is not partition\n        mirror = edge.partition.preceding\n        assert mirror.partition is partition\n        assert mirror.rebase is edge.rebase\n        return edge.partition\n\n    #\n    # 020-reviewrebase, test 1\n    #\n\n    partition = fetch_review(1).first_partition\n    assert partition.preceding is None\n    check_partition(partition)\n    check_commits(partition.commits, [\"Test #1, commit 6\",\n                                      \"Test #1, commit 5\",\n                                      \"Test #1, commit 4\"])\n    partition = check_following(partition, api.log.rebase.HistoryRewrite)\n    check_partition(partition)\n    check_commits(partition.commits, [\"Test #1, commit 3\",\n                                      \"Test #1, commit 2\",\n                                      \"Test #1, commit 1\"])\n    assert partition.following is None\n\n    #\n    # 020-reviewrebase, test 
2\n    #\n\n    partition = fetch_review(2).first_partition\n\n    assert partition.preceding is None\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.HistoryRewrite)\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, [\"Test #2, commit 3\",\n                                      \"Test #2, commit 2\",\n                                      \"Test #2, commit 1\"])\n    assert partition.following is None\n\n    #\n    # 020-reviewrebase, test 3\n    #\n\n    partition = fetch_review(3).first_partition\n    assert partition.preceding is None\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.HistoryRewrite)\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, [\"Test #3, commit 3\",\n                                      \"Test #3, commit 2\",\n                                      \"Test #3, commit 1\"])\n    assert partition.following is None\n\n    #\n    # 020-reviewrebase, test 4\n    #\n\n    partition = fetch_review(4).first_partition\n    assert partition.preceding is None\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, 
[(\"Merge branch '020-reviewrebase-4-1' \"\n                                       \"into 020-reviewrebase-4-2\"),\n                                      \"Test #4, commit 3\",\n                                      \"Test #4, commit 2\",\n                                      \"Test #4, commit 1\"])\n    assert partition.following is None\n\n    #\n    # 020-reviewrebase, test 5\n    #\n\n    partition = fetch_review(5).first_partition\n    assert partition.preceding is None\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.MoveRebase)\n    check_partition(partition)\n    check_commits(partition.commits, [])\n    partition = check_following(partition, api.log.rebase.HistoryRewrite)\n    check_partition(partition)\n    check_commits(partition.commits, [\"Test #5, commit 3\",\n                                      \"Test #5, commit 2\",\n                                      \"Test #5, commit 1\"])\n    assert partition.following is None\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/api/impl/log/rebase.py",
    "content": "import api\nfrom .. import apiobject\n\nclass Rebase(apiobject.APIObject):\n    wrapper_class = api.log.rebase.Rebase\n\n    def __init__(self, rebase_id, review_id, creator_id,\n                 old_head_id, new_head_id, old_upstream_id, new_upstream_id,\n                 equivalent_merge_id, replayed_rebase_id):\n        self.id = rebase_id\n        self.review_id = review_id\n        self.old_head_id = old_head_id\n        self.new_head_id = new_head_id\n        self.old_upstream_id = old_upstream_id\n        self.new_upstream_id = new_upstream_id\n        self.equivalent_merge_id = equivalent_merge_id\n        self.replayed_rebase_id = replayed_rebase_id\n        self.creator_id = creator_id\n\n        if self.new_upstream_id is None:\n            self.wrapper_class = api.log.rebase.HistoryRewrite\n        else:\n            self.wrapper_class = api.log.rebase.MoveRebase\n\n    def getReview(self, critic):\n        return api.review.fetch(critic, review_id=self.review_id)\n\n    def getRepository(self, critic):\n        return self.getReview(critic).branch.repository\n\n    def getOldHead(self, critic):\n        return api.commit.fetch(self.getRepository(critic),\n                                commit_id=self.old_head_id)\n\n    def getNewHead(self, critic):\n        if self.new_head_id is not None:\n            return api.commit.fetch(self.getRepository(critic),\n                                    commit_id=self.new_head_id)\n        else:\n            return None\n\n    def getOldUpstream(self, critic):\n        return api.commit.fetch(self.getRepository(critic),\n                                commit_id=self.old_upstream_id)\n\n    def getNewUpstream(self, critic):\n        return api.commit.fetch(self.getRepository(critic),\n                                commit_id=self.new_upstream_id)\n\n    def getEquivalentMerge(self, critic):\n        assert self.new_upstream_id is not None\n        if self.equivalent_merge_id is None:\n            
return None\n        return api.commit.fetch(self.getRepository(critic),\n                                commit_id=self.equivalent_merge_id)\n\n    def getReplayedRebase(self, critic):\n        assert self.new_upstream_id is not None\n        if self.replayed_rebase_id is None:\n            return None\n        return api.commit.fetch(self.getRepository(critic),\n                                commit_id=self.replayed_rebase_id)\n\n    def getCreator(self, critic):\n        return api.user.fetch(critic, user_id=self.creator_id)\n\n@Rebase.cached()\ndef fetch(critic, rebase_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\n        \"\"\"SELECT id, review, uid,\n                  old_head, new_head, old_upstream, new_upstream,\n                  equivalent_merge, replayed_rebase\n             FROM reviewrebases\n            WHERE id=%s\"\"\",\n        (rebase_id,))\n    try:\n        return next(Rebase.make(critic, cursor))\n    except StopIteration:\n        raise api.log.rebase.InvalidRebaseId(rebase_id)\n\ndef fetchAll(critic, review, pending):\n    cursor = critic.getDatabaseCursor()\n    new_head = \"new_head IS NULL\" if pending else \"new_head IS NOT NULL\"\n    if review is not None:\n        cursor.execute(\n            \"\"\"SELECT id, review, uid,\n                      old_head, new_head, old_upstream, new_upstream,\n                      equivalent_merge, replayed_rebase\n                 FROM reviewrebases\n                WHERE review=%s\n                  AND \"\"\" + new_head + \"\"\"\n             ORDER BY id DESC\"\"\",\n            (review.id,))\n    else:\n        cursor.execute(\n            \"\"\"SELECT id, review, uid,\n                      old_head, new_head, old_upstream, new_upstream,\n                      equivalent_merge, replayed_rebase\n                 FROM reviewrebases\n                WHERE \"\"\" + new_head + \"\"\"\n             ORDER BY id DESC\"\"\")\n    return list(Rebase.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/log/rebase_unittest.py",
    "content": "def basic():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n    alice = api.user.fetch(critic, name=\"alice\")\n\n    def fetch_review(number=None):\n        if number is None:\n            name = \"r/012-replayrebase\"\n        else:\n            name = \"r/020-reviewrebase/%d\" % number\n        branch = api.branch.fetch(critic, repository=repository, name=name)\n        return api.review.fetch(critic, branch=branch)\n\n    def check_history_rewrite(rebase, old_head_summary, new_head_summary):\n        assert isinstance(rebase, api.log.rebase.HistoryRewrite)\n        assert api.log.rebase.fetch(critic, rebase_id=rebase.id) is rebase\n        assert rebase.old_head.summary == old_head_summary, (rebase.old_head.summary, old_head_summary)\n        assert rebase.new_head.summary == new_head_summary, (rebase.new_head.summary, new_head_summary)\n\n    def check_move_rebase(rebase, old_head_summary, new_head_summary,\n                          old_upstream_summary, new_upstream_summary,\n                          expect=None):\n        assert isinstance(rebase, api.log.rebase.MoveRebase)\n        assert api.log.rebase.fetch(critic, rebase_id=rebase.id) is rebase\n        assert all(isinstance(commit, api.commit.Commit)\n                   for commit in (rebase.old_head, rebase.new_head,\n                                  rebase.old_upstream, rebase.new_upstream))\n        assert rebase.old_head.summary == old_head_summary, (rebase.old_head.summary, old_head_summary)\n        assert rebase.new_head.summary == new_head_summary, (rebase.new_head.summary, new_head_summary)\n        if old_upstream_summary is not None:\n            assert rebase.old_upstream.summary == old_upstream_summary\n        if new_upstream_summary is not None:\n            assert rebase.new_upstream.summary == new_upstream_summary\n        if expect == \"equivalent_merge\":\n            assert 
isinstance(rebase.equivalent_merge, api.commit.Commit)\n        else:\n            assert rebase.equivalent_merge is None\n        if expect == \"replayed_rebase\":\n            assert isinstance(rebase.replayed_rebase, api.commit.Commit)\n        else:\n            assert rebase.replayed_rebase is None\n        assert rebase.creator is alice\n\n    #\n    # 012-replayrebase\n    #\n\n    rebases = fetch_review().rebases\n\n    assert len(rebases) == 1\n    check_move_rebase(rebases[0],\n                      old_head_summary=\"Use temporary clones for relaying instead of temporary remotes\",\n                      new_head_summary=\"Use temporary clones for relaying instead of temporary remotes\",\n                      old_upstream_summary=\"Add test for diffs including the initial commit\",\n                      new_upstream_summary=\"Add utility function for creating a user\",\n                      expect=\"replayed_rebase\")\n\n    #\n    # 020-reviewrebase, test 1\n    #\n\n    rebases = fetch_review(1).rebases\n\n    assert len(rebases) == 1\n    check_history_rewrite(rebases[0],\n                          old_head_summary=\"Test #1, commit 3\",\n                          new_head_summary=\"Test #1, commit 1\")\n\n    #\n    # 020-reviewrebase, test 2\n    #\n\n    rebases = fetch_review(2).rebases\n\n    assert len(rebases) == 3\n    check_move_rebase(rebases[0],\n                      old_head_summary=\"Test #2, commit 7\",\n                      new_head_summary=\"Test #2, commit 8\",\n                      old_upstream_summary=\"Test #2 base, commit 2\",\n                      new_upstream_summary=\"Test #2 base, commit 1\",\n                      expect=\"replayed_rebase\")\n    check_history_rewrite(rebases[1],\n                          old_head_summary=\"Test #2, commit 6\",\n                          new_head_summary=\"Test #2, commit 7\")\n    check_move_rebase(rebases[2],\n                      old_head_summary=\"Test #2, commit 3\",\n           
           new_head_summary=\"Test #2, commit 6\",\n                      old_upstream_summary=\"Test #2 base, commit 1\",\n                      new_upstream_summary=\"Test #2 base, commit 2\",\n                      expect=\"equivalent_merge\")\n\n    #\n    # 020-reviewrebase, test 3\n    #\n\n    rebases = fetch_review(3).rebases\n\n    assert len(rebases) == 3\n    check_move_rebase(rebases[0],\n                      old_head_summary=\"Test #3, commit 7\",\n                      new_head_summary=\"Test #3, commit 8\",\n                      old_upstream_summary=\"Test #3 base, commit 2\",\n                      new_upstream_summary=\"Test #3 base, commit 1\",\n                      expect=\"replayed_rebase\")\n    check_history_rewrite(rebases[1],\n                          old_head_summary=\"Test #3, commit 6\",\n                          new_head_summary=\"Test #3, commit 7\")\n    check_move_rebase(rebases[2],\n                      old_head_summary=\"Test #3, commit 3\",\n                      new_head_summary=\"Test #3, commit 6\",\n                      old_upstream_summary=\"Test #3 base, commit 1\",\n                      new_upstream_summary=\"Test #3 base, commit 2\",\n                      expect=\"equivalent_merge\")\n\n    #\n    # 020-reviewrebase, test 4\n    #\n\n    rebases = fetch_review(4).rebases\n\n    assert len(rebases) == 1\n    check_move_rebase(rebases[0],\n                      old_head_summary=(\"Merge branch '020-reviewrebase-4-1' \"\n                                        \"into 020-reviewrebase-4-2\"),\n                      new_head_summary=\"Test #4, commit 6\",\n                      old_upstream_summary=None,\n                      new_upstream_summary=None,\n                      expect=\"equivalent_merge\")\n\n    #\n    # 020-reviewrebase, test 5\n    #\n\n    rebases = fetch_review(5).rebases\n\n    assert len(rebases) == 2\n    check_move_rebase(rebases[0],\n                      old_head_summary=\"Test #5, commit 
1\",\n                      new_head_summary=\"Test #5, commit 4\",\n                      old_upstream_summary=\"Test #5 base, commit 2\",\n                      new_upstream_summary=\"Test #5 base, commit 1\",\n                      expect=\"replayed_rebase\")\n    check_history_rewrite(rebases[1],\n                          old_head_summary=\"Test #5, commit 3\",\n                          new_head_summary=\"Test #5, commit 1\")\n\n    try:\n        api.log.rebase.fetch(critic, rebase_id=10000)\n    except api.log.rebase.InvalidRebaseId:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/api/impl/reply.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass Reply(apiobject.APIObject):\n    wrapper_class = api.reply.Reply\n\n    def __init__(self, reply_id, state, comment_id, batch_id, author_id,\n                 timestamp, text):\n        self.id = reply_id\n        self.is_draft = state == \"draft\"\n        self.__comment_id = comment_id\n        self.__batch_id = batch_id\n        self.__author_id = author_id\n        self.timestamp = timestamp\n        self.text = text\n\n    def __cmp__(self, other):\n        return cmp(self.__batch_id, other.__batch_id)\n\n    def getComment(self, critic):\n        return api.comment.fetch(critic, self.__comment_id)\n\n    def getAuthor(self, critic):\n        return api.user.fetch(critic, self.__author_id)\n\n    @staticmethod\n    def refresh(critic, tables, cached_replies):\n        if \"comments\" not in tables:\n            return\n\n        Reply.updateAll(\n            critic,\n            \"\"\"SELECT id, state, chain, batch, uid, time, comment\n                 FROM comments\n                WHERE id=ANY (%s)\"\"\",\n            cached_replies)\n\n@Reply.cached(api.reply.InvalidReplyId)\ndef fetch(critic, reply_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT comments.id, comments.state, chain, comments.batch,\n                   
          comments.uid, comments.time, comment\n                        FROM comments\n                        JOIN commentchains ON (commentchains.id=comments.chain)\n                       WHERE comments.id=%s\n                         AND comments.state!='deleted'\n                         AND commentchains.first_comment!=comments.id\"\"\",\n                   (reply_id,))\n    return Reply.make(critic, cursor)\n\n@Reply.cachedMany(api.reply.InvalidReplyIds)\ndef fetchMany(critic, reply_ids):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT comments.id, comments.state, chain, comments.batch,\n                             comments.uid, comments.time, comment\n                        FROM comments\n                        JOIN commentchains ON (commentchains.id=comments.chain)\n                       WHERE comments.id=ANY (%s)\n                         AND comments.state!='deleted'\n                         AND commentchains.first_comment!=comments.id\"\"\",\n                   (reply_ids,))\n    return Reply.make(critic, cursor)\n\ndef fetchForComment(critic, chain_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT comments.id, comments.state, chain, comments.batch,\n                             comments.uid, comments.time, comment\n                        FROM comments\n                        JOIN commentchains ON (commentchains.id=comments.chain)\n                       WHERE comments.state='current'\n                         AND commentchains.id=%s\n                         AND commentchains.first_comment!=comments.id\n                    ORDER BY comments.batch ASC\"\"\",\n                   (chain_id,))\n    return list(Reply.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/reply_unittest.py",
    "content": "import sys\nimport datetime\n\ndef basic(arguments):\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, name=\"critic\")\n    branch = api.branch.fetch(\n        critic, repository=repository, name=arguments.review)\n    review = api.review.fetch(critic, branch=branch)\n    alice = api.user.fetch(critic, name=\"alice\")\n    bob = api.user.fetch(critic, name=\"bob\")\n    dave = api.user.fetch(critic, name=\"dave\")\n    erin = api.user.fetch(critic, name=\"erin\")\n\n    EXPECTED = {\n        0: [dave, bob, erin, bob, alice],\n        1: [alice, bob],\n        2: [bob, erin, alice],\n        3: [],\n        4: [bob],\n        5: []\n    }\n\n    def check_replies(comment):\n        assert isinstance(comment, api.comment.Comment)\n        assert isinstance(comment.replies, list)\n\n        expected = EXPECTED[comment_id_map[comment.id]]\n\n        assert len(comment.replies) == len(expected)\n\n        for index, (reply, author) in enumerate(zip(comment.replies, expected)):\n            assert isinstance(reply, api.reply.Reply)\n            assert isinstance(reply.id, int)\n            assert isinstance(reply.is_draft, bool)\n            assert not reply.is_draft\n            assert reply.comment is comment\n            assert reply.author is author, (comment.id, index, reply.author.name)\n            assert isinstance(reply.timestamp, datetime.datetime)\n            assert isinstance(reply.text, str)\n            assert reply.text == (\"This is a reply from %s.\"\n                                  % author.name.capitalize())\n\n            assert api.reply.fetch(critic, reply_id=reply.id) is reply\n\n    comments = api.comment.fetchAll(critic, review=review)\n    assert isinstance(comments, list)\n    assert len(comments) == 6\n\n    comment_id_map = {\n        comment.id: index\n        for index, comment in enumerate(comments)\n    }\n\n    for comment in comments:\n        
check_replies(comment)\n\n    reply_ids = [\n        reply.id\n        for reply in reversed(comments[0].replies[:3])\n    ]\n\n    some_replies = api.reply.fetchMany(critic, reply_ids)\n\n    assert len(some_replies) == 3\n    assert some_replies[0].id == reply_ids[0]\n    assert some_replies[0] is api.reply.fetch(critic, reply_ids[0])\n    assert some_replies[1].id == reply_ids[1]\n    assert some_replies[1] is api.reply.fetch(critic, reply_ids[1])\n    assert some_replies[2].id == reply_ids[2]\n    assert some_replies[2] is api.reply.fetch(critic, reply_ids[2])\n\n    print \"basic: ok\"\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--review\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n"
  },
  {
    "path": "src/api/impl/repository.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport subprocess\n\nimport api\nimport apiobject\n\nimport auth\nimport configuration\nimport dbutils\nimport gitutils\n\nclass Repository(apiobject.APIObject):\n    wrapper_class = api.repository.Repository\n\n    def __init__(self, repository_id, name, path):\n        self.id = repository_id\n        self.name = name\n        self.path = path\n        self.relative_path = os.path.relpath(path, configuration.paths.GIT_DIR)\n        self.__internal = None\n\n    def getInternal(self, critic):\n        if not self.__internal:\n            self.__internal = gitutils.Repository.fromId(\n                db=critic.database, repository_id=self.id)\n        return self.__internal\n\n    def getURL(self, critic):\n        return gitutils.Repository.constructURL(\n            critic.database, critic.effective_user.internal, self.path)\n\n    def run(self, *args):\n        argv = [configuration.executables.GIT] + list(args)\n        process = subprocess.Popen(\n            argv,\n            stdout=subprocess.PIPE,\n            stderr=subprocess.PIPE,\n            cwd=self.path)\n        stdout, stderr = process.communicate()\n        if process.returncode != 0:\n            raise api.repository.GitCommandError(\n                argv, process.returncode, stdout, stderr)\n        return 
stdout\n\n    def resolveRef(self, ref, expect, short):\n        command_line = [\"rev-parse\", \"--verify\", \"--quiet\"]\n        if short:\n            if isinstance(short, int):\n                command_line.append(\"--short=%d\" % short)\n            else:\n                command_line.append(\"--short\")\n        if expect is not None:\n            ref += \"^{%s}\" % expect\n        command_line.append(ref)\n        try:\n            return self.run(*command_line).strip()\n        except api.repository.GitCommandError:\n            raise api.repository.InvalidRef(ref)\n\n    def listCommits(self, repository, include, exclude, args, paths):\n        args = ['rev-list'] + args\n        args.extend(commit.sha1 for commit in include)\n        args.extend(\"^\" + commit.sha1 for commit in exclude)\n        if paths:\n            args.append(\"--\")\n            args.extend(paths)\n        return api.commit.fetchMany(repository, sha1s=self.run(*args).split())\n\n    @classmethod\n    def create(Repository, critic, repository_id, name, path):\n        import auth\n        repository = Repository(repository_id, name, path).wrap(critic)\n        auth.AccessControl.accessRepository(critic.database, \"read\", repository)\n        return repository\n\n@Repository.cached()\ndef fetch(critic, repository_id, name, path):\n    cursor = critic.getDatabaseCursor()\n    if repository_id is not None:\n        cursor.execute(\"\"\"SELECT id, name, path\n                            FROM repositories\n                           WHERE id=%s\"\"\",\n                       (repository_id,))\n    elif name is not None:\n        cursor.execute(\"\"\"SELECT id, name, path\n                            FROM repositories\n                           WHERE name=%s\"\"\",\n                       (name,))\n    else:\n        cursor.execute(\"\"\"SELECT id, name, path\n                            FROM repositories\n                           WHERE path=%s\"\"\",\n                       
(path,))\n    try:\n        return next(Repository.make(critic, cursor))\n    except StopIteration:\n        if repository_id is not None:\n            raise api.repository.InvalidRepositoryId(repository_id)\n        elif name is not None:\n            raise api.repository.InvalidRepositoryName(name)\n        else:\n            raise api.repository.InvalidRepositoryPath(path)\n\ndef fetchAll(critic):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, name, path\n                        FROM repositories\n                    ORDER BY name\"\"\")\n    return list(Repository.make(\n        critic, cursor, ignored_errors=(auth.AccessDenied,)))\n\ndef fetchHighlighted(critic, user):\n    highlighted = set()\n\n    cursor = critic.getDatabaseCursor()\n\n    cursor.execute(\"\"\"SELECT DISTINCT repository\n                        FROM filters\n                       WHERE uid=%s\"\"\",\n                   (user.id,))\n    highlighted.update(repository_id for (repository_id,) in cursor)\n\n    cursor.execute(\"\"\"SELECT DISTINCT repository\n                        FROM branches\n                        JOIN reviews ON (reviews.branch=branches.id)\n                        JOIN reviewusers ON (reviewusers.review=reviews.id)\n                       WHERE reviewusers.uid=%s\n                         AND reviewusers.owner\"\"\",\n                   (user.id,))\n    highlighted.update(repository_id for (repository_id,) in cursor)\n\n    cursor.execute(\"\"\"SELECT id, name, path\n                        FROM repositories\n                       WHERE id=ANY (%s)\n                    ORDER BY name\"\"\",\n                   (list(highlighted),))\n    return list(Repository.make(\n        critic, cursor, ignored_errors=(auth.AccessDenied,)))\n"
  },
  {
    "path": "src/api/impl/repository_unittest.py",
    "content": "def basic(arguments):\n    import api\n\n    assert arguments.sha1 is not None\n    assert len(arguments.sha1) == 40\n\n    assert arguments.head is not None\n    assert len(arguments.head) == 40\n\n    assert arguments.path is not None\n\n    critic = api.critic.startSession(for_testing=True)\n    repository = api.repository.fetch(critic, repository_id=1)\n    alice = api.user.fetch(critic, name=\"alice\")\n\n    assert isinstance(repository, api.repository.Repository)\n    assert isinstance(repository.id, int)\n    assert repository.id == 1\n    assert isinstance(repository.name, str)\n    assert repository.name == \"critic\"\n    assert isinstance(repository.path, str)\n    assert repository.path == arguments.path\n    assert isinstance(repository.relative_path, str)\n    assert arguments.path.endswith(repository.relative_path)\n\n    # FIXME: repository.url is currently broken.\n\n    assert api.repository.fetch(critic, name=\"critic\") is repository\n    assert api.repository.fetch(critic, path=arguments.path) is repository\n\n    all_repositories = api.repository.fetchAll(critic)\n    assert len(all_repositories) == 2\n    assert all_repositories[0] is repository\n\n    highlighted_repositories = api.repository.fetchHighlighted(critic, alice)\n    assert len(highlighted_repositories) == 1\n    assert highlighted_repositories[0] is repository\n\n    head = api.commit.fetch(repository, sha1=arguments.head)\n\n    assert arguments.head == repository.resolveRef(\"HEAD\")\n    assert arguments.head == repository.resolveRef(\"HEAD\", expect=\"commit\")\n    assert arguments.head.startswith(repository.resolveRef(\"HEAD\", short=True))\n    assert arguments.head.startswith(repository.resolveRef(\"HEAD\", short=8))\n    assert len(repository.resolveRef(\"HEAD\", short=8)) == 8\n    assert head.tree == repository.resolveRef(\"HEAD\", expect=\"tree\")\n\n    simple_tag = repository.resolveRef(\"007-repository/simple-tag\")\n    assert simple_tag == 
head.sha1\n\n    annotated_tag = repository.resolveRef(\"007-repository/annotated-tag\")\n    assert annotated_tag != head.sha1\n    annotated_tag = repository.resolveRef(\"007-repository/annotated-tag\",\n                                          expect=\"commit\")\n    assert annotated_tag == head.sha1\n\n    commit0 = api.commit.fetch(repository, sha1=arguments.sha1)\n    commit1 = commit0.parents[0]\n    commit2 = commit1.parents[0]\n    commit3 = commit2.parents[0]\n    commit4 = commit3.parents[0]\n    commit5 = commit4.parents[0]\n\n    commits = repository.listCommits(commit0, commit5)\n    assert len(commits) == 5\n    assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n    assert commits[0] == commit0\n    assert commits[1] == commit1\n    assert commits[2] == commit2\n    assert commits[3] == commit3\n    assert commits[4] == commit4\n\n    commits = repository.listCommits((commit for commit in [commit0]),\n                                     (commit for commit in [commit5]),\n                                     args=[\"--merges\"])\n    assert len(commits) == 0\n\n    commits = repository.listCommits(args=[\"--reverse\",\n                                           \"%s..%s\" % (commit5, commit0)])\n    assert len(commits) == 5\n    assert commits[0] == commit4\n    assert commits[1] == commit3\n    assert commits[2] == commit2\n    assert commits[3] == commit1\n    assert commits[4] == commit0\n\n    try:\n        api.repository.fetch(critic, repository_id=4711)\n    except api.repository.InvalidRepositoryId:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    try:\n        api.repository.fetch(critic, name=\"wrong\")\n    except api.repository.InvalidRepositoryName:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no 
exception raised\"\n\n    try:\n        api.repository.fetch(critic, path=\"/var/git/wrong.git\")\n    except api.repository.InvalidRepositoryPath:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    print \"basic: ok\"\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--sha1\")\n    parser.add_argument(\"--head\")\n    parser.add_argument(\"--path\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n"
  },
  {
    "path": "src/api/impl/review.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\nimport api.impl.filters\n\nimport auth\n\nclass Review(apiobject.APIObject):\n    wrapper_class = api.review.Review\n\n    def __init__(self, review_id, repository_id, branch_id, state, summary,\n                 description):\n        self.id = review_id\n        self.__repository_id = repository_id\n        self.__branch_id = branch_id\n        self.state = state\n        self.summary = summary\n        self.description = description\n        self.__owners_ids = None\n        self.__assigned_reviewers_ids = None\n        self.__active_reviewers_ids = None\n        self.__watchers_ids = None\n        self.__filters = None\n        self.__commits = None\n        self.__rebases = None\n        self.__issues = None\n        self.__notes = None\n        self.__open_issues = None\n        self.__total_progress = None\n        self.__progress_per_commit = None\n\n    def getRepository(self, critic):\n        return api.repository.fetch(critic, repository_id=self.__repository_id)\n\n    def getBranch(self, critic):\n        return api.branch.fetch(critic, branch_id=self.__branch_id)\n\n    def __fetchOwners(self, critic):\n        if self.__owners_ids is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT uid\n                
                FROM reviewusers\n                               WHERE review=%s\n                                 AND owner\"\"\",\n                           (self.id,))\n            self.__owners_ids = frozenset(user_id for (user_id,) in cursor)\n\n    def getOwners(self, critic):\n        self.__fetchOwners(critic)\n        return frozenset(api.user.fetch(critic, user_id=user_id)\n                         for user_id in self.__owners_ids)\n\n    def __fetchAssignedReviewers(self, critic):\n        if self.__assigned_reviewers_ids is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\n                \"\"\"SELECT DISTINCT uid\n                     FROM reviewuserfiles\n                     JOIN reviewfiles ON (reviewfiles.id=reviewuserfiles.file)\n                    WHERE reviewfiles.review=%s\"\"\",\n                (self.id,))\n            self.__assigned_reviewers_ids = frozenset(\n                user_id for (user_id,) in cursor)\n\n    def getAssignedReviewers(self, critic):\n        self.__fetchAssignedReviewers(critic)\n        return frozenset(api.user.fetchMany(\n            critic, user_ids=self.__assigned_reviewers_ids))\n\n    def __fetchActiveReviewers(self, critic):\n        if self.__active_reviewers_ids is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\n                \"\"\"SELECT DISTINCT uid\n                     FROM reviewfilechanges\n                     JOIN reviewfiles ON (reviewfiles.id=reviewfilechanges.file)\n                    WHERE reviewfiles.review=%s\"\"\",\n                (self.id,))\n            self.__active_reviewers_ids = frozenset(\n                user_id for (user_id,) in cursor)\n\n    def getActiveReviewers(self, critic):\n        self.__fetchActiveReviewers(critic)\n        return frozenset(api.user.fetchMany(\n            critic, user_ids=self.__active_reviewers_ids))\n\n    def __fetchWatchers(self, critic):\n        if self.__watchers_ids is 
None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT uid\n                                FROM reviewusers\n                               WHERE review=%s\"\"\",\n                           (self.id,))\n            associated_users = frozenset(user_id for (user_id,) in cursor)\n            self.__fetchOwners(critic)\n            self.__fetchAssignedReviewers(critic)\n            self.__fetchActiveReviewers(critic)\n            non_watchers = self.__owners_ids | self.__assigned_reviewers_ids | \\\n                           self.__active_reviewers_ids\n            self.__watchers_ids = associated_users - non_watchers\n\n    def getWatchers(self, critic):\n        self.__fetchWatchers(critic)\n        return frozenset(api.user.fetch(critic, user_id=user_id)\n                         for user_id in self.__watchers_ids)\n\n    def getFilters(self, critic):\n        if self.__filters is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT uid, type, path, id, review, creator\n                                FROM reviewfilters\n                               WHERE review=%s\"\"\",\n                           (self.id,))\n            impls = [api.impl.filters.ReviewFilter(*row) for row in cursor]\n            self.__filters = [api.filters.ReviewFilter(critic, impl)\n                              for impl in impls]\n        return self.__filters\n\n    def getCommits(self, critic):\n        if self.__commits is None:\n            cursor = critic.getDatabaseCursor()\n            # Direct changesets: no merges, no rebase changes.\n            cursor.execute(\n                \"\"\"SELECT DISTINCT commits.id, commits.sha1\n                     FROM commits\n                     JOIN changesets ON (changesets.child=commits.id)\n                     JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                    WHERE reviewchangesets.review=%s\n                      
AND changesets.type='direct'\"\"\",\n                (self.id,))\n            commit_ids_sha1s = set(cursor)\n            # Merge changesets, excluding those added by move rebases.\n            cursor.execute(\n                \"\"\"SELECT DISTINCT commits.id, commits.sha1\n                     FROM commits\n                     JOIN changesets ON (changesets.child=commits.id)\n                     JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n          LEFT OUTER JOIN reviewrebases ON (reviewrebases.review=%s\n                                        AND reviewrebases.equivalent_merge=commits.id)\n                    WHERE reviewchangesets.review=%s\n                      AND changesets.type='merge'\n                      AND reviewrebases.id IS NULL\"\"\",\n                (self.id, self.id))\n            commit_ids_sha1s.update(cursor)\n            repository = self.getRepository(critic)\n            commits = [api.commit.fetch(repository, commit_id, sha1)\n                       for commit_id, sha1 in commit_ids_sha1s]\n            self.__commits = api.commitset.create(critic, commits)\n        return self.__commits\n\n    def getRebases(self, wrapper):\n        return api.log.rebase.fetchAll(wrapper.critic, wrapper)\n\n    def getPendingRebase(self, wrapper):\n        rebases = api.log.rebase.fetchAll(wrapper.critic, wrapper, pending=True)\n        if len(rebases) == 1:\n            return rebases[0]\n        else:\n            return None\n\n    def getIssues(self, wrapper):\n        if self.__issues is None:\n            self.__issues = api.comment.fetchAll(\n                wrapper.critic, review=wrapper, comment_type=\"issue\")\n        return self.__issues\n\n    def getOpenIssues(self, wrapper):\n        if self.__open_issues is None:\n            self.__open_issues = [issue\n                                  for issue\n                                  in self.getIssues(wrapper)\n                                  if issue.state == 
\"open\"]\n        return self.__open_issues\n\n    def getNotes(self, wrapper):\n        if self.__notes is None:\n            self.__notes = api.comment.fetchAll(\n                wrapper.critic, review=wrapper, comment_type=\"note\")\n        return self.__notes\n\n    def isReviewableCommit(self, critic, commit):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\n            \"\"\"SELECT 1\n                 FROM reviewchangesets\n                 JOIN changesets ON (changesets.id=reviewchangesets.changeset)\n                WHERE reviewchangesets.review=%s\n                  AND changesets.child=%s\"\"\",\n            (self.id, commit.id))\n        return bool(cursor.fetchone())\n\n    def getTotalProgress(self, critic):\n        if self.__total_progress is None:\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\n                \"\"\"SELECT state, sum(inserted+deleted)\n                     FROM reviewfiles\n                    WHERE review=%s\n                 GROUP BY state\"\"\",\n                (self.id,))\n\n            reviewed = 0\n            pending = 0\n            for state, modifications in cursor:\n                if modifications == 0: # binary file change\n                    actual_modifications = 1\n                else:\n                    actual_modifications = modifications\n                if state == \"reviewed\":\n                    reviewed = actual_modifications\n                elif state == \"pending\":\n                    pending = actual_modifications\n\n            total = reviewed + pending\n            if reviewed == 0:\n                self.__total_progress = 0\n            elif pending == 0:\n                self.__total_progress = 1\n            else:\n                self.__total_progress = reviewed / float(total)\n        return self.__total_progress\n\n    def getProgressPerCommit(self, critic):\n        if self.__progress_per_commit is None:\n            cursor = 
critic.getDatabaseCursor()\n            cursor.execute(\n                \"\"\"SELECT changesets.child, SUM(deleted + inserted)\n                     FROM reviewfiles\n                     JOIN changesets ON changesets.id=reviewfiles.changeset\n                    WHERE reviewfiles.review=%s\n                 GROUP BY changesets.child\"\"\",\n                (self.id,))\n\n            total_changes_dict = {}\n            for commit_id, changes in cursor:\n                total_changes_dict[commit_id] = changes\n\n            cursor.execute(\n                \"\"\"SELECT changesets.child, SUM(deleted + inserted)\n                     FROM reviewfiles\n                     JOIN changesets ON changesets.id=reviewfiles.changeset\n                    WHERE reviewfiles.review=%s AND state='reviewed'\n                 GROUP BY changesets.child\"\"\",\n                (self.id,))\n\n            reviewed_changes_dict = {}\n            for commit_id, changes in cursor:\n                reviewed_changes_dict[commit_id] = changes\n\n            commit_change_counts = []\n            for commit_id, total_changes in total_changes_dict.iteritems():\n                reviewed_changes = reviewed_changes_dict.get(commit_id, 0)\n\n                commit_change_counts.append(api.review.CommitChangeCount(\n                    commit_id, total_changes, reviewed_changes))\n\n            self.__progress_per_commit = commit_change_counts\n        return self.__progress_per_commit\n\n    @classmethod\n    def create(Review, critic, *args):\n        review = Review(*args).wrap(critic)\n        # Access the repository object to trigger an access control check.\n        review.repository\n        return review\n\n@Review.cached()\ndef fetch(critic, review_id, branch):\n    cursor = critic.getDatabaseCursor()\n    if review_id is not None:\n        cursor.execute(\"\"\"SELECT reviews.id, branches.repository, branches.id,\n                                 state, summary, description\n             
               FROM reviews\n                            JOIN branches ON (branches.id=reviews.branch)\n                           WHERE reviews.id=%s\"\"\",\n                       (review_id,))\n    else:\n        cursor.execute(\"\"\"SELECT reviews.id, branches.repository, branches.id,\n                                 state, summary, description\n                            FROM reviews\n                            JOIN branches ON (branches.id=reviews.branch)\n                           WHERE branches.id=%s\"\"\",\n                       (int(branch),))\n    try:\n        return next(Review.make(critic, cursor))\n    except StopIteration:\n        if review_id is not None:\n            raise api.review.InvalidReviewId(review_id)\n        else:\n            raise api.review.InvalidReviewBranch(branch)\n\ndef fetchMany(critic, review_ids):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT reviews.id, branches.repository, branches.id,\n                             state, summary, description\n                        FROM reviews\n                        JOIN branches ON (branches.id=reviews.branch)\n                       WHERE reviews.id=ANY (%s)\"\"\",\n                   (review_ids,))\n\n    reviews_by_id = {review.id: review for review in Review.make(critic, cursor)}\n\n    return [reviews_by_id[review_id] for review_id in review_ids]\n\ndef fetchAll(critic, repository, state):\n    cursor = critic.getDatabaseCursor()\n    conditions = [\"TRUE\"]\n    values = []\n    if repository is not None:\n        conditions.append(\"branches.repository=%s\")\n        values.append(repository.id)\n    if state is not None:\n        conditions.append(\"reviews.state IN (%s)\"\n                          % \", \".join([\"%s\"] * len(state)))\n        values.extend(state)\n    cursor.execute(\"\"\"SELECT reviews.id, branches.repository, branches.id,\n                             state, summary, description\n                        FROM reviews\n     
                   JOIN branches ON (branches.id=reviews.branch)\n                       WHERE \"\"\" + \" AND \".join(conditions) + \"\"\"\n                    ORDER BY reviews.id\"\"\",\n                   values)\n    return list(Review.make(\n        critic, cursor, ignored_errors=(auth.AccessDenied,)))\n"
  },
  {
    "path": "src/api/impl/review_unittest.py",
    "content": "def basic():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    review = api.review.fetch(critic, review_id=1)\n\n    assert isinstance(review, api.review.Review)\n\n    assert isinstance(review.id, int)\n    assert review.id == 1\n    assert int(review) == 1\n\n    assert api.review.fetch(critic, branch=review.branch) is review\n\n    assert review.state == \"open\"\n\n    assert isinstance(review.summary, basestring)\n    assert review.summary == \"Minor /dashboard query optimizations\"\n\n    assert review.description is None\n\n    assert isinstance(review.repository, api.repository.Repository)\n    assert review.repository.name == \"critic\"\n\n    assert isinstance(review.branch, api.branch.Branch)\n    assert review.branch.name == \"r/004-createreview\"\n    assert review.branch.repository is review.repository\n\n    assert isinstance(review.owners, frozenset)\n    assert all(isinstance(owner, api.user.User) for owner in review.owners)\n    assert list(review.owners)[0].name == \"alice\"\n\n    assert isinstance(review.filters, list)\n    assert all(isinstance(review_filter, api.filters.ReviewFilter)\n               for review_filter in review.filters)\n    assert len(review.filters) == 3\n\n    assert isinstance(review.filters[0].subject, api.user.User)\n    assert review.filters[0].subject.name == \"bob\"\n    assert review.filters[0].type == \"reviewer\"\n    assert review.filters[0].path == \"/\"\n    assert isinstance(review.filters[0].id, int)\n    assert review.filters[0].review is review\n    assert isinstance(review.filters[0].creator, api.user.User)\n    assert review.filters[0].creator.name == \"alice\"\n\n    assert isinstance(review.filters[1].subject, api.user.User)\n    assert review.filters[1].subject.name == \"dave\"\n    assert review.filters[1].type == \"watcher\"\n    assert review.filters[1].path == \"/\"\n    assert isinstance(review.filters[1].id, int)\n    assert review.filters[1].id != 
review.filters[0].id\n    assert review.filters[1].review is review\n    assert isinstance(review.filters[1].creator, api.user.User)\n    assert review.filters[1].creator.name == \"alice\"\n\n    assert isinstance(review.filters[2].subject, api.user.User)\n    assert review.filters[2].subject.name == \"erin\"\n    assert review.filters[2].type == \"watcher\"\n    assert review.filters[2].path == \"/\"\n    assert isinstance(review.filters[2].id, int)\n    assert review.filters[2].id not in (review.filters[0].id,\n                                        review.filters[1].id)\n    assert review.filters[2].review is review\n    assert isinstance(review.filters[2].creator, api.user.User)\n    assert review.filters[2].creator.name == \"alice\"\n\n    assert isinstance(review.commits, api.commitset.CommitSet)\n    assert all(isinstance(commit, api.commit.Commit)\n               for commit in review.commits)\n    assert len(review.commits) == 2\n    assert len(review.commits.heads) == 1\n    assert len(review.commits.tails) == 1\n    topo_ordered = review.commits.topo_ordered\n    assert topo_ordered.next().summary == \"Add missing import\"\n    assert topo_ordered.next().summary == \"Minor /dashboard query optimizations\"\n    assert review.commits == review.branch.commits\n\n    assert isinstance(review.rebases, list)\n    assert len(review.rebases) == 0\n\n    try:\n        api.review.fetch(critic, review_id=10000)\n    except api.review.InvalidReviewId:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no exception raised\"\n\n    master = api.branch.fetch(\n        critic, repository=review.branch.repository, name=\"master\")\n\n    try:\n        api.review.fetch(critic, branch=master)\n    except api.review.InvalidReviewBranch:\n        pass\n    except Exception as error:\n        assert False, \"wrong exception raised: %s\" % error\n    else:\n        assert False, \"no 
exception raised\"\n\n    all_reviews = api.review.fetchAll(critic)\n\n    assert isinstance(all_reviews, list)\n    assert len(all_reviews) >= 1\n    assert review in all_reviews\n\n    critic_reviews = api.review.fetchAll(critic, repository=review.repository)\n\n    assert isinstance(critic_reviews, list)\n    assert review in critic_reviews\n\n    open_reviews = api.review.fetchAll(critic, state=\"open\")\n\n    assert isinstance(open_reviews, list)\n    assert review in open_reviews\n    assert all(review.state == \"open\" for review in open_reviews)\n\n    closed_reviews = api.review.fetchAll(critic, state=\"closed\")\n\n    assert isinstance(closed_reviews, list)\n    assert review not in closed_reviews\n    assert all(review.state == \"closed\" for review in closed_reviews)\n\n    dropped_reviews = api.review.fetchAll(critic, state=\"dropped\")\n\n    assert isinstance(dropped_reviews, list)\n    assert review not in dropped_reviews\n    assert all(review.state == \"dropped\" for review in dropped_reviews)\n\n    any_reviews = api.review.fetchAll(\n        critic, state=(state for state in [\"open\", \"closed\", \"dropped\"]))\n\n    assert any_reviews == all_reviews\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/api/impl/reviewablefilechange.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nclass ReviewableFileChange(apiobject.APIObject):\n    wrapper_class = api.reviewablefilechange.ReviewableFileChange\n\n    def __init__(self, filechange_id, review_id, changeset_id, file_id,\n                 deleted_lines, inserted_lines, reviewed_by_id):\n        self.id = filechange_id\n        self.__review_id = review_id\n        self.__changeset_id = changeset_id\n        self.__file_id = file_id\n        self.inserted_lines = inserted_lines\n        self.deleted_lines = deleted_lines\n        self.is_reviewed = reviewed_by_id is not None\n        self.__reviewed_by_id = reviewed_by_id\n        self.__assigned_reviewers = None\n        self.__draft_changes = None\n        self.__draft_changes_fetched = False\n\n    def getReview(self, critic):\n        return api.review.fetch(critic, self.__review_id)\n\n    def getChangeset(self, critic):\n        review = self.getReview(critic)\n        return api.changeset.fetch(\n            critic, review.repository, self.__changeset_id)\n\n    def getFile(self, critic):\n        return api.file.fetch(critic, self.__file_id)\n\n    def getReviewedBy(self, critic):\n        if self.__reviewed_by_id is None:\n            return None\n        return api.user.fetch(critic, self.__reviewed_by_id)\n\n    def 
getAssignedReviewers(self, critic):\n        if self.__assigned_reviewers is None:\n            cached_objects = ReviewableFileChange.allCached(critic)\n            assert self.id in cached_objects\n\n            # Find the cached objects (including this one) whose assigned\n            # reviewers haven't been fetched yet.\n            need_fetch = set()\n            for filechange in cached_objects.values():\n                if filechange._impl.__assigned_reviewers is None:\n                    filechange._impl.__assigned_reviewers = set()\n                    need_fetch.add(filechange.id)\n            assert self.id in need_fetch\n\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\"\"\"SELECT file, uid\n                                FROM reviewuserfiles\n                               WHERE file=ANY (%s)\"\"\",\n                           (list(need_fetch),))\n            for filechange_id, reviewer_id in cursor:\n                filechange = cached_objects[filechange_id]\n                filechange._impl.__assigned_reviewers.add(\n                    api.user.fetch(critic, reviewer_id))\n\n            for filechange_id in need_fetch:\n                filechange = cached_objects[filechange_id]\n                filechange._impl.__assigned_reviewers = frozenset(\n                    filechange._impl.__assigned_reviewers)\n\n        return self.__assigned_reviewers\n\n    def getDraftChanges(self, critic):\n        if not self.__draft_changes_fetched:\n            cached_objects = ReviewableFileChange.allCached(critic)\n            assert self.id in cached_objects\n\n            # Find the cached objects (including this one) whose draft\n            # changes haven't been fetched yet.\n            need_fetch = set()\n            for filechange in cached_objects.values():\n                if not filechange._impl.__draft_changes_fetched:\n                    need_fetch.add(filechange.id)\n            assert self.id in 
need_fetch\n\n            cursor = critic.getDatabaseCursor()\n            cursor.execute(\n                \"\"\"SELECT file, from_state='reviewed', to_state='reviewed'\n                     FROM reviewfilechanges\n                    WHERE uid=%s\n                      AND state='draft'\n                      AND file=ANY (%s)\"\"\",\n                (critic.effective_user.id, list(need_fetch)))\n\n            draft_changes = {\n                filechange_id: (from_is_reviewed, to_is_reviewed)\n                for filechange_id, from_is_reviewed, to_is_reviewed in cursor\n            }\n\n            for filechange_id in need_fetch:\n                filechange = cached_objects[filechange_id]\n                filechange._impl.__draft_changes_fetched = True\n                if filechange_id not in draft_changes:\n                    # No unpublished changes in the database.\n                    continue\n                from_is_reviewed, to_is_reviewed = draft_changes[filechange_id]\n                if filechange.is_reviewed != from_is_reviewed:\n                    # The unpublished change has been made redundant, typically\n                    # by another user making the same change.\n                    continue\n                new_reviewed_by = (critic.actual_user\n                                   if to_is_reviewed else None)\n                filechange._impl.__draft_changes = \\\n                    api.reviewablefilechange.ReviewableFileChange.DraftChanges(\n                        critic.actual_user, new_reviewed_by)\n\n        return self.__draft_changes\n\n    @staticmethod\n    def refresh(critic, tables, cached_filechanges):\n        if not tables.intersection((\"reviewfiles\", \"reviewfilechanges\",\n                                    \"reviewuserfiles\")):\n            return\n\n        ReviewableFileChange.updateAll(\n            critic,\n            \"\"\"SELECT id, review, changeset, file, deleted, inserted,\n                      reviewer\n    
             FROM reviewfiles\n                WHERE id=ANY (%s)\"\"\",\n            cached_filechanges)\n\n@ReviewableFileChange.cached(\n    api.reviewablefilechange.InvalidReviewableFileChangeId)\ndef fetch(critic, filechange_id):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, review, changeset, file, deleted, inserted,\n                             reviewer\n                        FROM reviewfiles\n                       WHERE id=%s\"\"\",\n                   (filechange_id,))\n    return ReviewableFileChange.make(critic, cursor)\n\n@ReviewableFileChange.cachedMany(\n    api.reviewablefilechange.InvalidReviewableFileChangeIds)\ndef fetchMany(critic, filechange_ids):\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT id, review, changeset, file, deleted, inserted,\n                             reviewer\n                        FROM reviewfiles\n                       WHERE id=ANY (%s)\"\"\",\n                   (filechange_ids,))\n    return ReviewableFileChange.make(critic, cursor)\n\ndef fetchAll(critic, review, changeset, file, assignee, is_reviewed):\n    cursor = critic.getDatabaseCursor()\n    tables = [\"reviewfiles\"]\n    conditions = [\"reviewfiles.review=%s\"]\n    values = [review.id]\n    if changeset:\n        # Check if the changeset is a \"squash\" of the changes in multiple\n        # commits. 
If so, return the reviewable file changes from each of the\n        # commits.\n        contributing_commits = changeset.contributing_commits\n        if contributing_commits is None:\n            raise api.reviewablefilechange.InvalidChangeset(changeset)\n        if len(contributing_commits) > 1:\n            result = []\n            try:\n                for commit in contributing_commits:\n                    # Note: Checking that it is a reviewable commit here is sort\n                    # of redundant; we could just call fetchAll() recursively,\n                    # and it would raise an InvalidChangeset if it is not. The\n                    # problem is that if it is not a reviewable commit, there\n                    # may not be a changeset prepared for it, which would make\n                    # this asynchronous. As long as we only deal with reviewable\n                    # commits, the api.changeset.fetch() call is guaranteed to\n                    # succeed synchronously.\n                    if not review.isReviewableCommit(commit):\n                        raise api.reviewablefilechange.InvalidChangeset(\n                            changeset)\n                    commit_changeset = api.changeset.fetch(\n                        critic, review.repository, single_commit=commit)\n                    result.extend(fetchAll(critic, review, commit_changeset,\n                                           file, assignee, is_reviewed))\n            except api.reviewablefilechange.InvalidChangeset:\n                raise api.reviewablefilechange.InvalidChangeset(changeset)\n            return sorted(result, key=lambda change: change.id)\n        elif not review.isReviewableCommit(next(iter(contributing_commits))):\n            raise api.reviewablefilechange.InvalidChangeset(changeset)\n        conditions.append(\"reviewfiles.changeset=%s\")\n        values.append(changeset.id)\n    if file:\n        conditions.append(\"reviewfiles.file=%s\")\n        
values.append(file.id)\n    if assignee:\n        tables.append(\"JOIN reviewuserfiles \"\n                      \" ON (reviewuserfiles.file=reviewfiles.id)\")\n        conditions.append(\"reviewuserfiles.uid=%s\")\n        values.append(assignee.id)\n    if is_reviewed is not None:\n        if assignee:\n            # If the specified assignee has a draft change to the state, use\n            # that changed state instead of the actual state when filtering.\n            tables.append(\"LEFT OUTER JOIN reviewfilechanges \"\n                          \" ON (reviewfilechanges.file=reviewfiles.id\"\n                          \" AND reviewfilechanges.uid=reviewuserfiles.uid\"\n                          \" AND reviewfilechanges.state='draft')\")\n            conditions.append(\"COALESCE(reviewfilechanges.to_state,\"\n                              \"         reviewfiles.state)=%s\")\n        else:\n            conditions.append(\"reviewfiles.state=%s\")\n        values.append(\"reviewed\" if is_reviewed else \"pending\")\n    cursor.execute(\"\"\"SELECT reviewfiles.id, reviewfiles.review,\n                             reviewfiles.changeset, reviewfiles.file,\n                             reviewfiles.deleted, reviewfiles.inserted,\n                             reviewfiles.reviewer\n                        FROM {}\n                       WHERE {}\n                    ORDER BY id\"\"\".format(\n                        \" \".join(tables),\n                        \" AND \".join(conditions)),\n                   values)\n    return list(ReviewableFileChange.make(critic, cursor))\n"
  },
  {
    "path": "src/api/impl/reviewsummary.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operator import itemgetter\nimport calendar\nfrom datetime import datetime\n\nimport api\nimport api.impl\nimport apiobject\n\nclass ReviewSummaryContainer(apiobject.APIObject):\n    wrapper_class = api.reviewsummary.ReviewSummaryContainer\n    def __init__(self, review_summaries, more):\n        self.reviews = review_summaries\n        self.more = more\n\nclass ReviewSummary(apiobject.APIObject):\n    wrapper_class = api.reviewsummary.ReviewSummary\n    def __init__(self, review, latest_change):\n        self.review = review\n        self.latest_change = latest_change\n\ndef fetchMany(critic, search_type, user, count, offset):\n    cursor = critic.getDatabaseCursor()\n    if count is None:\n        count = 10\n    if offset is None:\n        offset = 0\n\n    if search_type == \"own\" or search_type == \"other\":\n        cursor.execute(\n            \"\"\"SELECT DISTINCT reviews.id\n                 FROM reviews\n                 JOIN reviewusers ON (reviewusers.review=reviews.id)\n                WHERE reviews.state='open'\n                  AND reviewusers.uid=%s\n                  AND reviewusers.owner=%s\"\"\",\n            (user.id, search_type==\"own\"))\n    else:\n        cursor.execute(\n            \"\"\"SELECT DISTINCT reviews.id\n                 FROM reviews\n              
  WHERE reviews.state='open'\"\"\")\n    rows = cursor.fetchall()\n    review_ids = [row[0] for row in rows]\n\n    cursor.execute(\n        \"\"\"SELECT reviewchangesets.review,\n                  MAX(commits.commit_time) AS latest_change\n             FROM commits\n             JOIN changesets ON (changesets.child=commits.id)\n             JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n            WHERE reviewchangesets.review=ANY (%s)\n         GROUP BY reviewchangesets.review\"\"\",\n        (review_ids,))\n\n    latest_commits = {}\n    for review_id, latest_timestamp in cursor:\n        if isinstance(latest_timestamp, str): # sqlite3 returns a string\n            latest_commits[review_id] = calendar.timegm(datetime.strptime(\n                latest_timestamp, \"%Y-%m-%d %H:%M:%S\").timetuple())\n        else:\n            latest_commits[review_id] = calendar.timegm(latest_timestamp.timetuple())\n\n    cursor.execute(\n        \"\"\"SELECT commentchains.review, MAX(comments.time) AS latest_change\n             FROM comments\n             JOIN commentchains ON (commentchains.id=comments.chain)\n            WHERE commentchains.review=ANY (%s)\n              AND (comments.state='current' OR comments.state='edited')\n         GROUP BY commentchains.review\"\"\",\n        (review_ids,))\n\n    latest_comments = {}\n    for review_id, latest_timestamp in cursor:\n        if isinstance(latest_timestamp, datetime):\n            latest_comments[review_id] = calendar.timegm(\n                latest_timestamp.timetuple())\n        else:\n            latest_comments[review_id] = latest_timestamp\n\n    latest_changes = []\n    for review_id in review_ids:\n        latest_change = max(latest_commits.get(review_id),\n                            latest_comments.get(review_id))\n        if latest_change is not None:\n            latest_changes.append((latest_change, review_id))\n\n    latest_sorted_changes = sorted(\n        latest_changes, 
reverse=True)[offset:offset+count]\n\n    sorted_reviews = [\n        review_id\n        for _, review_id\n        in latest_sorted_changes\n    ]\n\n    review_objects = api.review.fetchMany(critic, sorted_reviews)\n\n    review_summaries = [\n        ReviewSummary(review, latest_change[0]).wrap(critic)\n        for review, latest_change\n        in zip(review_objects, latest_sorted_changes)\n    ]\n\n    has_more = len(latest_changes) > len(review_summaries) + offset\n\n    return ReviewSummaryContainer(review_summaries, has_more).wrap(critic)\n"
  },
  {
    "path": "src/api/impl/user.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport apiobject\n\nimport dbutils\n\nclass User(apiobject.APIObject):\n    wrapper_class = api.user.User\n\n    def __init__(self, user_id, name, fullname, status, email):\n        self.id = user_id\n        self.name = name\n        self.fullname = fullname\n        self.status = status\n        self.email = email\n\n        # Things that are fetched on demand.\n        self.__internal = None\n\n    def isAnonymous(self):\n        return self.id is None\n\n    def getInternal(self, critic):\n        if not self.__internal:\n            if self.isAnonymous():\n                self.__internal = dbutils.User.makeAnonymous()\n            else:\n                self.__internal = dbutils.User.fromId(critic.database, self.id)\n        return self.__internal\n\n    def getPrimaryEmails(self, critic):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT useremails.email,\n                                 useremails.id=users.email,\n                                 useremails.verified\n                            FROM useremails\n                            JOIN users ON (users.id=useremails.uid)\n                           WHERE useremails.uid=%s\n                        ORDER BY useremails.id ASC\"\"\",\n                       (self.id,))\n        return 
[api.user.User.PrimaryEmail(address, bool(selected),\n                                           dbutils.boolean(verified))\n                for address, selected, verified in cursor]\n\n    def getGitEmails(self, critic):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT email\n                            FROM usergitemails\n                           WHERE uid=%s\"\"\",\n                       (self.id,))\n        return set(email for (email,) in cursor)\n\n    def getRepositoryFilters(self, critic):\n        from api.impl.filters import RepositoryFilter\n\n        all_repositories = {}\n        filters = {}\n\n        def processRepository(repository_id):\n            if repository_id not in all_repositories:\n                all_repositories[repository_id] = api.repository.fetch(\n                    critic, repository_id=repository_id)\n            return all_repositories[repository_id]\n\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT id, uid, type, path, repository, delegate\n                            FROM filters\n                           WHERE uid=%s\n                        ORDER BY id ASC\"\"\",\n                       (self.id,))\n\n        for repository_filter in RepositoryFilter.make(critic, cursor):\n            filters.setdefault(repository_filter.repository, []).append(\n                repository_filter)\n\n        return filters\n\n    def hasRole(self, critic, role):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"\"\"SELECT uid\n                            FROM roles\n                 LEFT OUTER JOIN userroles ON (userroles.role=roles.name\n                                           AND userroles.uid=%s)\n                           WHERE name=%s\"\"\",\n                       (self.id, role))\n        row = cursor.fetchone()\n        if row:\n            return row[0] is not None\n        raise api.user.InvalidRole(role)\n\n    def getPreference(self, 
critic, item, user, repository):\n        cursor = critic.getDatabaseCursor()\n        cursor.execute(\"SELECT type FROM preferences WHERE item=%s\", (item,))\n        row = cursor.fetchone()\n        if not row:\n            raise api.preference.InvalidPreferenceItem(item)\n        preference_type = row[0]\n\n        arguments = [item]\n        where = [\"item=%s\"]\n\n        if preference_type in (\"boolean\", \"integer\"):\n            column = \"integer\"\n        else:\n            column = \"string\"\n\n        if not self.isAnonymous():\n            arguments.append(user.id)\n            where.append(\"uid=%s OR uid IS NULL\")\n        else:\n            where.append(\"uid IS NULL\")\n\n        if repository is not None:\n            arguments.append(repository.id)\n            where.append(\"repository=%s OR repository IS NULL\")\n        else:\n            where.append(\"repository IS NULL\")\n\n        where = \" AND \".join(\"(%s)\" % condition for condition in where)\n\n        query = (\"\"\"SELECT %(column)s, uid, repository\n                      FROM userpreferences\n                     WHERE %(where)s\"\"\"\n                 % { \"column\": column, \"where\": where })\n\n        cursor.execute(query, arguments)\n\n        row = sorted(cursor, key=lambda row: row[1:])[-1]\n        value, user_id, repository_id = row\n\n        if preference_type == \"boolean\":\n            value = bool(value)\n\n        if user_id is None:\n            user = None\n        if repository_id is None:\n            repository = None\n\n        return api.preference.Preference(item, value, user, repository)\n\n    @staticmethod\n    def refresh(critic, tables, cached_users):\n        if not tables.intersection((\"users\", \"useremails\")):\n            return\n\n        User.updateAll(\n            critic,\n            \"\"\"SELECT users.id, name, fullname, status, useremails.email\n                 FROM users\n      LEFT OUTER JOIN useremails ON 
(useremails.id=users.email\n                                 AND (useremails.verified IS NULL\n                                   OR useremails.verified))\n                WHERE users.id=ANY (%s)\"\"\",\n            cached_users)\n\n@User.cached()\ndef fetch(critic, user_id, name):\n    try:\n        return fetchMany(critic,\n                         user_ids=None if user_id is None else [user_id],\n                         names=None if name is None else [name])[0]\n    except api.user.InvalidUserIds as error:\n        raise api.user.InvalidUserId(error.values[0])\n    except api.user.InvalidUserNames as error:\n        raise api.user.InvalidUserName(error.values[0])\n\ndef fetchMany(critic, user_ids, names):\n    return_type = list\n\n    if user_ids is not None:\n        if isinstance(user_ids, set):\n            return_type = set\n        where_column = \"users.id\"\n        values = [int(user_id) for user_id in user_ids]\n        column_index = 0\n    else:\n        if isinstance(names, set):\n            return_type = set\n        where_column = \"name\"\n        values = [str(name) for name in names]\n        column_index = 1\n\n    cursor = critic.getDatabaseCursor()\n    cursor.execute(\"\"\"SELECT users.id, name, fullname, status, useremails.email\n                        FROM users\n             LEFT OUTER JOIN useremails ON (useremails.id=users.email\n                                        AND (useremails.verified IS NULL\n                                          OR useremails.verified))\n                       WHERE \"\"\" + where_column + \"\"\"=ANY (%s)\"\"\",\n                   (values,))\n    rows = cursor.fetchall()\n\n    if len(rows) < len(values):\n        found = set(row[column_index] for row in rows)\n        if user_ids is not None:\n            exception_type = api.user.InvalidUserIds\n        else:\n            exception_type = api.user.InvalidUserNames\n        values = [value for value in values if value not in found]\n        raise 
exception_type(values)\n\n    rows = dict((row[column_index], row) for row in rows)\n    return return_type(User.make(critic, (rows[key] for key in values)))\n\ndef fetchAll(critic, status):\n    cursor = critic.getDatabaseCursor()\n    if status is None:\n        condition = \"\"\n        values = ()\n    else:\n        condition = \" WHERE status IN (%s)\" % \", \".join([\"%s\"] * len(status))\n        values = tuple(status)\n    cursor.execute(\"\"\"SELECT users.id, name, fullname, status, useremails.email\n                        FROM users\n             LEFT OUTER JOIN useremails ON (useremails.id=users.email\n                                        AND (useremails.verified IS NULL\n                                          OR useremails.verified))\n                       \"\"\" + condition + \"\"\"\n                    ORDER BY users.id\"\"\",\n                   values)\n    return list(User.make(critic, cursor))\n\ndef anonymous(critic):\n    return next(User.make(critic, [(None, None, None, \"anonymous\", None)]))\n"
  },
  {
    "path": "src/api/impl/user_unittest.py",
    "content": "def basic(arguments):\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n\n    alice = api.user.fetch(critic, name=\"alice\")\n    bob = api.user.fetch(critic, name=\"bob\")\n    carol = api.user.fetch(critic, name=\"carol\")\n    dave = api.user.fetch(critic, name=\"dave\")\n    erin = api.user.fetch(critic, name=\"erin\")\n    felix = api.user.fetch(critic, name=\"felix\")\n    howard = api.user.fetch(critic, name=\"howard\")\n    gina = api.user.fetch(critic, name=\"gina\")\n    iris = api.user.fetch(critic, name=\"iris\")\n    admin = api.user.fetch(critic, name=\"admin\")\n    extra = api.user.fetch(critic, name=\"extra\")\n\n    all_users = [admin, alice, bob, dave, erin, howard,\n                 carol, felix, gina, iris, extra]\n\n    assert isinstance(alice, api.user.User)\n    assert isinstance(alice.id, int)\n    assert int(alice) == alice.id\n    assert hash(alice) == hash(alice.id)\n    assert alice == alice.id\n    assert alice.id == alice\n\n    assert alice.name == \"alice\"\n    assert alice.fullname == \"Alice von Testing\"\n    assert alice.status == \"current\"\n    assert alice.email == \"alice@example.org\"\n    assert alice.is_anonymous is False\n\n    assert isinstance(alice.primary_emails, list)\n    assert len(alice.primary_emails) == 1\n    assert isinstance(alice.primary_emails[0], api.user.User.PrimaryEmail)\n    assert alice.primary_emails[0].address == \"alice@example.org\"\n    assert alice.primary_emails[0].selected is True\n    assert alice.primary_emails[0].verified is None\n\n    assert isinstance(alice.git_emails, set)\n    if len(alice.git_emails) == 0:\n        assert arguments.unreliable_git_emails\n    else:\n        assert len(alice.git_emails) == 2\n        assert \"alice@example.org\" in alice.git_emails\n        assert \"common@example.org\" in alice.git_emails\n\n    assert isinstance(alice.repository_filters, dict)\n    assert len(alice.repository_filters) == 1\n    repository, 
filters = alice.repository_filters.items()[0]\n    assert isinstance(repository, api.repository.Repository)\n    assert repository.name == \"critic\"\n    assert len(filters) == 1\n    assert isinstance(filters[0], api.filters.RepositoryFilter)\n    assert filters[0].subject is alice\n    assert filters[0].type == \"reviewer\"\n    assert filters[0].path == \"028-gitemails/\"\n    assert isinstance(filters[0].id, int)\n    assert filters[0].repository is repository\n    assert isinstance(filters[0].delegates, frozenset)\n    assert all(isinstance(delegate, api.user.User)\n               for delegate in filters[0].delegates)\n    assert erin in filters[0].delegates\n\n    assert not (alice == bob)\n    assert alice != bob\n\n    try:\n        api.user.fetch(alice, user_id=alice.id)\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetch(critic)\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetch(critic, user_id=alice.id, name=alice.name)\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetch(critic, user_id=\"foo\")\n    except ValueError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetch(critic, user_id=4711)\n    except api.user.InvalidUserId as error:\n        assert error.message == \"Invalid user id: %r\" % 4711\n        assert error.value == 4711\n    else:\n        assert False\n\n    try:\n        api.user.fetch(critic, name=\"nobody\")\n    except api.user.InvalidUserName as error:\n        assert error.message == \"Invalid user name: %r\" % \"nobody\"\n        assert error.value == \"nobody\"\n    else:\n        assert False\n\n    try:\n        api.user.fetchMany(alice, user_ids=[alice.id])\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetchMany(critic,\n                           
user_ids=[alice.id],\n                           names=[alice.name])\n    except AssertionError:\n        pass\n    else:\n        assert False\n\n    try:\n        api.user.fetchMany(critic, user_ids=[4711, 4712])\n    except api.user.InvalidUserIds as error:\n        assert error.message == \"Invalid user ids: %r\" % [4711, 4712], error.message\n        assert error.values == [4711, 4712], repr(error.values)\n    else:\n        assert False\n\n    try:\n        api.user.fetchMany(critic, names=[\"nobody\", \"anybody\"])\n    except api.user.InvalidUserNames as error:\n        assert error.message == \"Invalid user names: %r\" % [\"nobody\", \"anybody\"], error.message\n        assert error.values == [\"nobody\", \"anybody\"], repr(error.values)\n    else:\n        assert False\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, user_ids=[alice.id, bob.id, dave.id])\n\n    assert isinstance(alice_bob_and_dave, list), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == [alice, bob, dave], repr(alice_bob_and_dave)\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, names=[alice.name, bob.name, dave.name])\n\n    assert isinstance(alice_bob_and_dave, list), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == [alice, bob, dave], repr(alice_bob_and_dave)\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, user_ids=set([alice.id, bob.id, dave.id]))\n\n    assert isinstance(alice_bob_and_dave, set), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == set([alice, bob, dave]), repr(alice_bob_and_dave)\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, names=set([alice.name, bob.name, dave.name]))\n\n    assert isinstance(alice_bob_and_dave, set), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == set([alice, bob, dave]), repr(alice_bob_and_dave)\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, user_ids=(user.id for user in [alice, bob, dave]))\n\n    assert 
isinstance(alice_bob_and_dave, list), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == [alice, bob, dave], repr(alice_bob_and_dave)\n\n    alice_bob_and_dave = api.user.fetchMany(\n        critic, names=(user.name for user in [alice, bob, dave]))\n\n    assert isinstance(alice_bob_and_dave, list), type(alice_bob_and_dave)\n    assert alice_bob_and_dave == [alice, bob, dave], repr(alice_bob_and_dave)\n\n    users = api.user.fetchAll(critic)\n    assert isinstance(users, list)\n    assert users == sorted(all_users, key=lambda user: user.id)\n\n    users = api.user.fetchAll(critic, status=\"current\")\n    assert isinstance(users, list)\n    assert users == sorted([user for user in all_users\n                            if user.status == \"current\"],\n                           key=lambda user: user.id)\n\n    users = api.user.fetchAll(critic, status=[\"current\", \"absent\"])\n    assert isinstance(users, list)\n    assert users == sorted([user for user in all_users\n                            if user.status in (\"current\", \"absent\")],\n                           key=lambda user: user.id)\n\n    users = api.user.fetchAll(critic, status=[\"retired\", \"absent\"])\n    assert isinstance(users, list)\n    assert users == sorted([user for user in all_users\n                            if user.status in (\"retired\", \"absent\")],\n                           key=lambda user: user.id)\n\n    users = api.user.fetchAll(critic, status=[\"absent\"])\n    assert isinstance(users, list)\n    assert users == []\n\n    users = api.user.fetchAll(\n        critic, status=(status for status in [\"current\", \"absent\"]))\n    assert isinstance(users, list)\n    assert users == sorted([user for user in all_users\n                            if user.status in (\"current\", \"absent\")],\n                           key=lambda user: user.id)\n\n    assert alice.hasRole(\"administrator\") is False\n    assert alice.hasRole(\"repositories\") is False\n    assert 
alice.hasRole(\"newswriter\") is False\n    assert alice.hasRole(\"developer\") is False\n\n    assert admin.hasRole(\"administrator\") is True\n    assert admin.hasRole(\"repositories\") is True\n    if arguments.unreliable_admin_newswriter:\n        assert isinstance(admin.hasRole(\"newswriter\"), bool)\n    else:\n        assert admin.hasRole(\"newswriter\") is True\n    assert admin.hasRole(\"developer\") is True\n\n    try:\n        alice.hasRole(\"crazy-cat-lady\")\n    except api.user.InvalidRole as error:\n        assert error.message == \"Invalid role: %r\" % \"crazy-cat-lady\", error.message\n        assert error.role == \"crazy-cat-lady\", error.role\n    else:\n        assert False\n\n    anonymous = api.user.anonymous(critic)\n\n    assert isinstance(anonymous, api.user.User)\n    assert anonymous.id is None\n    assert anonymous.name is None\n    assert anonymous.fullname is None\n    assert anonymous.is_anonymous is True\n    assert anonymous.email is None\n    assert anonymous.primary_emails == []\n    assert anonymous.git_emails == set([])\n    assert anonymous.repository_filters == {}\n\n    print \"basic: ok\"\n\ndef preferences():\n    import api\n\n    critic = api.critic.startSession(for_testing=True)\n    alice = api.user.fetch(critic, name=\"alice\")\n    repository = api.repository.fetch(critic, name=\"critic\")\n\n    compactMode = alice.getPreference(\"commit.diff.compactMode\")\n    assert isinstance(compactMode.value, bool)\n    assert compactMode.item == \"commit.diff.compactMode\"\n    assert compactMode.value is True\n    assert compactMode.user is None\n    assert compactMode.repository is None\n\n    rulerColumn = alice.getPreference(\"commit.diff.rulerColumn\")\n    assert isinstance(rulerColumn.value, int)\n    assert rulerColumn.item == \"commit.diff.rulerColumn\"\n    assert rulerColumn.value == 0\n    assert rulerColumn.user is None\n    assert rulerColumn.repository is None\n\n    defaultGroups = 
alice.getPreference(\"dashboard.defaultGroups\")\n    assert isinstance(defaultGroups.value, str)\n    assert defaultGroups.item == \"dashboard.defaultGroups\"\n    assert defaultGroups.value == \"owned,draft,active,watched\"\n    assert defaultGroups.user is None\n    assert defaultGroups.repository is None\n\n    # Read per-repository, not overridden.\n    compactMode = alice.getPreference(\"commit.diff.compactMode\",\n                                      repository=repository)\n    assert compactMode.item == \"commit.diff.compactMode\"\n    assert compactMode.value is True\n    assert compactMode.user is None\n    assert compactMode.repository is None\n\n    # Read per-user, overridden per user.\n    visualTabs = alice.getPreference(\"commit.diff.visualTabs\")\n    assert visualTabs.value is True\n    assert visualTabs.user is alice\n    assert visualTabs.repository is None\n\n    # Read per-repository, overridden per user.\n    visualTabs = alice.getPreference(\"commit.diff.visualTabs\",\n                                     repository=repository)\n    assert visualTabs.value is True\n    assert visualTabs.user is alice\n    assert visualTabs.repository is None\n\n    # Read per-user, overridden per repository.\n    expandAllFiles = alice.getPreference(\"commit.expandAllFiles\")\n    assert expandAllFiles.value is False\n    assert expandAllFiles.user is None\n    assert expandAllFiles.repository is None\n\n    # Read per-repository, overridden per repository.\n    expandAllFiles = alice.getPreference(\"commit.expandAllFiles\",\n                                         repository=repository)\n    assert expandAllFiles.value is True\n    assert expandAllFiles.user is alice\n    assert expandAllFiles.repository is repository\n\n    print \"preferences: ok\"\n\ndef main(argv):\n    import argparse\n\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"--unreliable-git-emails\", action=\"store_true\")\n    
parser.add_argument(\"--unreliable-admin-newswriter\", action=\"store_true\")\n    parser.add_argument(\"tests\", nargs=argparse.REMAINDER)\n\n    arguments = parser.parse_args(argv)\n\n    for test in arguments.tests:\n        if test == \"basic\":\n            basic(arguments)\n        elif test == \"preferences\":\n            preferences()\n"
  },
  {
    "path": "src/api/labeledaccesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass LabeledAccessControlProfileError(api.APIError):\n    \"\"\"Base exception for all errors related to the LabeledAccessControlProfile\n       class\"\"\"\n    pass\n\nclass InvalidAccessControlProfileLabels(LabeledAccessControlProfileError):\n    \"\"\"Raised when an invalid label set is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidAccessControlProfileLabels, self).__init__(\n            \"Invalid labels: %s\" % \"|\".join(value))\n        self.value = value\n\nclass LabeledAccessControlProfile(api.APIObject):\n    \"\"\"Representation of a labeled access control profile selector\"\"\"\n\n    RULE_VALUES = frozenset([\"allow\", \"deny\"])\n\n    def __str__(self):\n        return \"|\".join(self.labels)\n    def __hash__(self):\n        return hash(str(self))\n    def __eq__(self, other):\n        return str(self) == str(other)\n\n    @property\n    def labels(self):\n        \"\"\"The labels for which the access control profile is selected\"\"\"\n        return self._impl.labels\n\n    @property\n    def profile(self):\n        \"\"\"The access control profile that is selected\"\"\"\n        return self._impl.getAccessControlProfile(self.critic)\n\ndef fetch(critic, labels):\n    \"\"\"Fetch a 
LabeledAccessControlProfile object for the given labels\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    labels = tuple(sorted(str(label) for label in labels))\n    return api.impl.labeledaccesscontrolprofile.fetch(critic, labels)\n\ndef fetchAll(critic, profile=None):\n    \"\"\"Fetch LabeledAccessControlProfile objects for all labeled profile\n       selectors in the system\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert profile is None \\\n        or isinstance(profile, api.accesscontrolprofile.AccessControlProfile)\n    return api.impl.labeledaccesscontrolprofile.fetchAll(critic, profile)\n"
  },
  {
    "path": "src/api/log/__init__.py",
    "content": "import rebase\nimport partition\n"
  },
  {
    "path": "src/api/log/partition.py",
    "content": "import api\n\nclass PartitionError(api.APIError):\n    \"\"\"Raised for incompatible commits/rebases arguments to create()\"\"\"\n    pass\n\nclass Partition(api.APIObject):\n    \"\"\"Representation of a part of a disjoint commit log\n\n       The history of a branch (such as a review branch) that has potentially\n       been rebased one or more times during its existence is represented as a\n       linked list of \"partitions\" where each partition represents a connected\n       set of commits and each \"link\" or \"edge\" between them represents the\n       branch being rebased.\"\"\"\n\n    class Edge(object):\n        \"\"\"The edge (in one direction) between two partitions\"\"\"\n\n        def __init__(self, rebase, partition):\n            self.__rebase = rebase\n            self.__partition = partition\n\n        @property\n        def rebase(self):\n            \"\"\"The rebase between the partitions\"\"\"\n            return self.__rebase\n\n        @property\n        def partition(self):\n            \"\"\"The other partition\"\"\"\n            return self.__partition\n\n    @property\n    def preceding(self):\n        \"\"\"The edge leading to the preceding (newer) partition\"\"\"\n        return self._impl.preceding\n\n    @property\n    def following(self):\n        \"\"\"The edge leading to the following (older) partition\"\"\"\n        return self._impl.following\n\n    @property\n    def commits(self):\n        \"\"\"The set of commits in the partition\n\n           The return value is an api.commitset.CommitSet object.\"\"\"\n        return self._impl.commits\n\ndef create(critic, commits, rebases=[]):\n    \"\"\"Divide a set of commits into partitions and return the first\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    if not isinstance(commits, api.commitset.CommitSet):\n        commits = list(commits)\n        assert all(isinstance(commit, api.commit.Commit) for commit in commits)\n    rebases = 
list(rebases)\n    assert all(isinstance(rebase, api.log.rebase.Rebase) for rebase in rebases)\n    return api.impl.log.partition.create(critic, commits, rebases)\n"
  },
  {
    "path": "src/api/log/rebase.py",
    "content": "import api\n\nclass RebaseError(api.APIError):\n    \"\"\"Base exception for all errors related to the Rebase class\"\"\"\n    pass\n\nclass InvalidRebaseId(RebaseError):\n    \"\"\"Raised when an invalid rebase id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRebaseId, self).__init__(\"Invalid rebase id: %r\" % value)\n        self.value = value\n\nclass Rebase(api.APIObject):\n    \"\"\"Representation of a rebase of a review branch\"\"\"\n\n    @property\n    def id(self):\n        return self._impl.id\n\n    @property\n    def review(self):\n        return self._impl.getReview(self.critic)\n\n    @property\n    def old_head(self):\n        return self._impl.getOldHead(self.critic)\n\n    @property\n    def new_head(self):\n        return self._impl.getNewHead(self.critic)\n\n    @property\n    def creator(self):\n        return self._impl.getCreator(self.critic)\n\nclass HistoryRewrite(Rebase):\n    \"\"\"Representation of a history rewrite rebase\n\n       The review branch after a history rewrite rebase is always based on the\n       same upstream commit as before it and makes the exact same changes\n       relative to it, but contains a different set of actual commits.\"\"\"\n\n    pass\n\nclass MoveRebase(Rebase):\n    \"\"\"Representation of a \"move\" rebase\n\n       A move rebase moves the changes in the review onto a different upstream\n       commit.\"\"\"\n\n    @property\n    def old_upstream(self):\n        return self._impl.getOldUpstream(self.critic)\n\n    @property\n    def new_upstream(self):\n        return self._impl.getNewUpstream(self.critic)\n\n    @property\n    def equivalent_merge(self):\n        return self._impl.getEquivalentMerge(self.critic)\n\n    @property\n    def replayed_rebase(self):\n        return self._impl.getReplayedRebase(self.critic)\n\ndef fetch(critic, rebase_id):\n    \"\"\"Fetch a Rebase object with the given id\"\"\"\n    import api.impl\n    
assert isinstance(critic, api.critic.Critic)\n    return api.impl.log.rebase.fetch(critic, rebase_id)\n\ndef fetchAll(critic, review=None, pending=False):\n    \"\"\"Fetch Rebase objects for all rebases\n\n       If a review is provided, restrict the return value to rebases of the\n       specified review. If pending is True, fetch only pending rebases,\n       otherwise fetch only performed (completed) rebases.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert review is None or isinstance(review, api.review.Review)\n    assert isinstance(pending, bool)\n    return api.impl.log.rebase.fetchAll(critic, review, pending)\n"
  },
  {
    "path": "src/api/preference.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass InvalidPreferenceItem(api.APIError):\n    \"\"\"Raised if an invalid preference item is used.\"\"\"\n\n    def __init__(self, item):\n        \"\"\"Constructor\"\"\"\n        super(InvalidPreferenceItem, self).__init__(\n            \"Invalid preference item: %r\" % item)\n\nclass Preference(object):\n    def __init__(self, item, value, user, repository):\n        self.__item = item\n        self.__value = value\n        self.__user = user\n        self.__repository = repository\n\n    def __bool__(self):\n        return bool(self.__value)\n    def __int__(self):\n        return self.__value\n    def __str__(self):\n        return self.__value\n\n    @property\n    def item(self):\n        return self.__item\n\n    @property\n    def value(self):\n        return self.__value\n\n    @property\n    def user(self):\n        return self.__user\n\n    @property\n    def repository(self):\n        return self.__repository\n"
  },
  {
    "path": "src/api/reply.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ReplyError(api.APIError):\n    pass\n\nclass InvalidReplyId(ReplyError):\n    \"\"\"Raised when an invalid reply id is used.\"\"\"\n\n    def __init__(self, reply_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidReplyId, self).__init__(\n            \"Invalid reply id: %d\" % reply_id)\n        self.reply_id = reply_id\n\nclass InvalidReplyIds(ReplyError):\n    \"\"\"Raised by fetchMany() when invalid reply ids are used.\"\"\"\n\n    def __init__(self, reply_ids):\n        \"\"\"Constructor\"\"\"\n        super(InvalidReplyIds, self).__init__(\n            \"Invalid reply ids: %s\" % \", \".join(map(str, reply_ids)))\n        self.reply_ids = reply_ids\n\nclass Reply(api.APIObject):\n    @property\n    def id(self):\n        \"\"\"The reply's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def is_draft(self):\n        \"\"\"True if the reply is not yet published\n\n           Unpublished replies are not displayed to other users.\"\"\"\n        return self._impl.is_draft\n\n    @property\n    def comment(self):\n        \"\"\"The comment this reply is a reply to\n\n           The comment is returned as an api.comment.Comment object.\"\"\"\n        return self._impl.getComment(self.critic)\n\n    @property\n    def author(self):\n        
\"\"\"The reply's author\n\n           The author is returned as an api.user.User object.\"\"\"\n        return self._impl.getAuthor(self.critic)\n\n    @property\n    def timestamp(self):\n        \"\"\"The reply's timestamp\n\n           The return value is a datetime.datetime object.\"\"\"\n        return self._impl.timestamp\n\n    @property\n    def text(self):\n        \"\"\"The reply's text\"\"\"\n        return self._impl.text\n\ndef fetch(critic, reply_id):\n    \"\"\"Fetch the Reply object with the given id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(reply_id, int)\n    return api.impl.reply.fetch(critic, reply_id)\n\ndef fetchMany(critic, reply_ids):\n    \"\"\"Fetch multiple Reply objects with the given ids\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    reply_ids = list(reply_ids)\n    assert all(isinstance(reply_id, int) for reply_id in reply_ids)\n    return api.impl.reply.fetchMany(critic, reply_ids)\n"
  },
  {
    "path": "src/api/repository.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass RepositoryError(api.APIError):\n    \"\"\"Base exception for all errors related to the Repository class\"\"\"\n    pass\n\nclass InvalidRepositoryId(RepositoryError):\n    \"\"\"Raised when an invalid repository id is used\"\"\"\n\n    def __init__(self, repository_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRepositoryId, self).__init__(\n            \"Invalid repository id: %r\" % repository_id)\n\nclass InvalidRepositoryName(RepositoryError):\n    \"\"\"Raised when an invalid repository name is used\"\"\"\n\n    def __init__(self, name):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRepositoryName, self).__init__(\n            \"Invalid repository name: %r\" % name)\n\nclass InvalidRepositoryPath(RepositoryError):\n    \"\"\"Raised when an invalid repository path is used\"\"\"\n\n    def __init__(self, path):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRepositoryPath, self).__init__(\n            \"Invalid repository path: %r\" % path)\n\nclass InvalidRef(RepositoryError):\n    \"\"\"Raised by Repository.resolveRef() for invalid refs\"\"\"\n\n    def __init__(self, ref):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRef, self).__init__(\"Invalid ref: %r\" % ref)\n        self.ref = ref\n\nclass 
GitCommandError(RepositoryError):\n    \"\"\"Raised by Repository methods when 'git' fails unexpectedly\"\"\"\n\n    def __init__(self, argv, returncode, stdout, stderr):\n        super(GitCommandError, self).__init__(\n            \"'git' exited with status %d\" % returncode)\n        self.argv = argv\n        self.returncode = returncode\n        self.stdout = stdout\n        self.stderr = stderr\n\nclass Repository(api.APIObject):\n    \"\"\"Representation of one of Critic's repositories\"\"\"\n\n    @property\n    def id(self):\n        \"\"\"The repository's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def name(self):\n        \"\"\"The repository's short name\"\"\"\n        return self._impl.name\n\n    @property\n    def path(self):\n        \"\"\"The repository's (absolute) file-system path\"\"\"\n        return self._impl.path\n\n    @property\n    def relative_path(self):\n        \"\"\"The repository's (relative) file-system path\n\n           The path is relative to the directory in which all repositories are\n           stored on the system (`configuration.paths.GIT_DIR`).\"\"\"\n        return self._impl.relative_path\n\n    @property\n    def url(self):\n        \"\"\"The repository's URL\n\n           The URL type depends on the effective user's 'repository.urlType'\n           setting.\"\"\"\n        return self._impl.getURL(self.critic)\n\n    def resolveRef(self, ref, expect=None, short=False):\n        \"\"\"Resolve the given ref to a SHA-1 using 'git rev-parse'\n\n           If 'expect' is not None, it should be a string containing a Git\n           object type, such as \"commit\", \"tag\", \"tree\" or \"blob\".  When given,\n           it is passed on to 'git rev-parse' using the \"<ref>^{<expect>}\"\n           syntax.\n\n           If 'short' is True, 'git rev-parse' is given the '--short' argument,\n           which causes it to return a shortened SHA-1.  
If 'short' is an int,\n           it is given as the argument value: '--short=N'.\n\n           If the ref can't be resolved, an InvalidRef exception is raised.\"\"\"\n        assert expect is None or expect in (\"blob\", \"commit\", \"tag\", \"tree\")\n        return self._impl.resolveRef(str(ref), expect, short)\n\n    def listCommits(self, include=None, exclude=None, args=None, paths=None):\n        \"\"\"List commits using 'git rev-list'\n\n           Call 'git rev-list' to list commits reachable from the commits in\n           'include' but not reachable from the commits in 'exclude'.  Extra\n           arguments to 'git rev-list' can be added through 'args' or 'paths'.\n\n           The return value is a list of api.commit.Commit objects.\"\"\"\n        if include is None:\n            include = []\n        elif isinstance(include, api.commit.Commit):\n            include = [include]\n        else:\n            include = list(include)\n        if exclude is None:\n            exclude = []\n        elif isinstance(exclude, api.commit.Commit):\n            exclude = [exclude]\n        else:\n            exclude = list(exclude)\n        args = [] if args is None else list(args)\n        paths = [] if paths is None else list(paths)\n        assert all(isinstance(commit, api.commit.Commit) for commit in include)\n        assert all(isinstance(commit, api.commit.Commit) for commit in exclude)\n        assert all(isinstance(arg, basestring) for arg in args)\n        assert all(isinstance(path, basestring) for path in paths)\n        return self._impl.listCommits(self, include, exclude, args, paths)\n\ndef fetch(critic, repository_id=None, name=None, path=None):\n    \"\"\"Fetch a Repository object with the given id, name or path\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert sum((repository_id is None, name is None, path is None)) == 2\n    return api.impl.repository.fetch(critic, repository_id, name, path)\n\ndef 
fetchAll(critic):\n    \"\"\"Fetch Repository objects for all repositories\n\n       The return value is a list ordered by the repositories' names.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    return api.impl.repository.fetchAll(critic)\n\ndef fetchHighlighted(critic, user):\n    \"\"\"Fetch Repository objects for repositories that are extra relevant\n\n       The return value is a list ordered by the repositories' names.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(user, api.user.User)\n    return api.impl.repository.fetchHighlighted(critic, user)\n"
  },
  {
    "path": "src/api/review.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ReviewError(api.APIError):\n    \"\"\"Base exception for all errors related to the Review class.\"\"\"\n    pass\n\nclass InvalidReviewId(ReviewError):\n    \"\"\"Raised when an invalid review id is used.\"\"\"\n\n    def __init__(self, review_id):\n        \"\"\"Constructor\"\"\"\n        super(InvalidReviewId, self).__init__(\n            \"Invalid review id: %d\" % review_id)\n\nclass InvalidReviewBranch(ReviewError):\n    \"\"\"Raised when an invalid review branch is used.\"\"\"\n\n    def __init__(self, branch):\n        \"\"\"Constructor\"\"\"\n        super(InvalidReviewBranch, self).__init__(\n            \"Invalid review branch: %r\" % str(branch))\n\nclass Review(api.APIObject):\n    \"\"\"Representation of a Critic review\"\"\"\n\n    STATE_VALUES = frozenset([\"open\", \"closed\", \"dropped\"])\n\n    @property\n    def id(self):\n        \"\"\"The review's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def state(self):\n        \"\"\"The review's state\"\"\"\n        return self._impl.state\n\n    @property\n    def summary(self):\n        \"\"\"The review's summary\"\"\"\n        return self._impl.summary\n\n    @property\n    def description(self):\n        \"\"\"The review's description, or None\"\"\"\n        return 
self._impl.description\n\n    @property\n    def repository(self):\n        \"\"\"The review's repository\n\n           The repository is returned as an api.repository.Repository object.\"\"\"\n        return self._impl.getRepository(self.critic)\n\n    @property\n    def branch(self):\n        \"\"\"The review's branch\n\n           The branch is returned as an api.branch.Branch object.\"\"\"\n        return self._impl.getBranch(self.critic)\n\n    @property\n    def owners(self):\n        \"\"\"The review's owners\n\n           The owners are returned as a set of api.user.User objects.\"\"\"\n        return self._impl.getOwners(self.critic)\n\n    @property\n    def assigned_reviewers(self):\n        \"\"\"The review's assigned reviewers\n\n           The reviewers are returned as a set of api.user.User objects.\n\n           Assigned reviewers are users that have been (manually or\n           automatically) assigned as such. An assigned reviewer may or may not\n           also be an active reviewer (a reviewer that has reviewed changes).\"\"\"\n        return self._impl.getAssignedReviewers(self.critic)\n\n    @property\n    def active_reviewers(self):\n        \"\"\"The review's active reviewers\n\n           The reviewers are returned as a set of api.user.User objects.\n\n           Active reviewers are users that have reviewed changes. 
An active\n           reviewer may or may not also be an assigned reviewer (see above).\"\"\"\n        return self._impl.getActiveReviewers(self.critic)\n\n    @property\n    def watchers(self):\n        \"\"\"The review's watchers\n\n           The watchers are returned as a set of api.user.User objects.\n\n           A user is a watcher if he/she is on the list of users that receive\n           emails about the review, and is neither an owner nor a reviewer.\"\"\"\n        return self._impl.getWatchers(self.critic)\n\n    @property\n    def filters(self):\n        \"\"\"The review's local filters\n\n           The filters are returned as a list of api.filters.ReviewFilter\n           objects.\"\"\"\n        return self._impl.getFilters(self.critic)\n\n    @property\n    def commits(self):\n        \"\"\"The set of commits that are part of the review\n\n           Note: This set never changes when the review branch is rebased, and\n                 commits are never removed from it.  For the set of commits that\n                 are actually reachable from the review branch, consult the\n                 'commits' attribute on the api.branch.Branch object that is\n                 returned by the 'branch' attribute.\"\"\"\n        return self._impl.getCommits(self.critic)\n\n    @property\n    def rebases(self):\n        \"\"\"The rebases of the review branch\n\n           The rebases are returned as a list of api.log.rebase.Rebase objects,\n           ordered chronologically with the most recent rebase first.\"\"\"\n        return self._impl.getRebases(self)\n\n    @property\n    def pending_rebase(self):\n        \"\"\"The pending rebase of the review branch\n\n           The rebase, if it exists, is returned as an api.log.rebase.Rebase\n           object. 
If there isn't a pending rebase, this will be None.\"\"\"\n        return self._impl.getPendingRebase(self)\n\n    @property\n    def issues(self):\n        \"\"\"The issues in the review\n\n           The issues are returned as a list of api.comment.Issue objects.\"\"\"\n        return self._impl.getIssues(self)\n\n    @property\n    def open_issues(self):\n        \"\"\"The open issues in the review\n\n           The issues are returned as a list of api.comment.Issue objects.\"\"\"\n        return self._impl.getOpenIssues(self)\n\n    @property\n    def notes(self):\n        \"\"\"The notes in the review\n\n           The notes are returned as a list of api.comment.Note objects.\"\"\"\n        return self._impl.getNotes(self)\n\n    @property\n    def first_partition(self):\n        return api.log.partition.create(\n            self.critic, self.commits, self.rebases)\n\n    def isReviewableCommit(self, commit):\n        \"\"\"Return true if the commit is a primary commit in this review\n\n           A primary commit is one that is included in one of the log\n           partitions, and not just part of the \"actual log\" after a rebase of\n           the review branch.\"\"\"\n        assert isinstance(commit, api.commit.Commit)\n        return self._impl.isReviewableCommit(self.critic, commit)\n\n    @property\n    def total_progress(self):\n        \"\"\"Total progress made on a review\n\n           Total progress is expressed as a number between 0 and 1, 1 being\n           fully reviewed and 0 being fully pending.\"\"\"\n        return self._impl.getTotalProgress(self.critic)\n\n    @property\n    def progress_per_commit(self):\n        \"\"\"Progress made on a review, grouped by commit\n\n           Returned as a list of CommitChangeCount, where each has the number of\n           total changed lines, and the number of reviewed changed lines\"\"\"\n        return self._impl.getProgressPerCommit(self.critic)\n\nclass CommitChangeCount:\n    def __init__(self, 
commit_id, total_changes, reviewed_changes):\n        self.commit_id = commit_id\n        self.total_changes = total_changes\n        self.reviewed_changes = reviewed_changes\n\ndef fetch(critic, review_id=None, branch=None):\n    \"\"\"Fetch a Review object with the given id or branch\"\"\"\n\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (review_id is None) != (branch is None)\n    assert branch is None or isinstance(branch, api.branch.Branch)\n    return api.impl.review.fetch(critic, review_id, branch)\n\ndef fetchMany(critic, review_ids):\n    \"\"\"Fetch many Review objects with the given ids, and return them in the same\n       order\"\"\"\n\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    review_ids = list(review_ids)\n    assert all(isinstance(review_id, int) for review_id in review_ids)\n\n    return api.impl.review.fetchMany(critic, review_ids)\n\ndef fetchAll(critic, repository=None, state=None):\n    \"\"\"Fetch all Review objects in repository with the given state\"\"\"\n\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (repository is None or\n            isinstance(repository, api.repository.Repository))\n    if state is not None:\n        if isinstance(state, basestring):\n            state = set([state])\n        else:\n            state = set(state)\n        assert not (state - Review.STATE_VALUES)\n\n    return api.impl.review.fetchAll(critic, repository, state)\n"
  },
  {
    "path": "src/api/reviewablefilechange.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ReviewableFileChangeError(api.APIError):\n    pass\n\nclass InvalidReviewableFileChangeId(ReviewableFileChangeError):\n    \"\"\"Raised when an invalid reviewable file change id is used\"\"\"\n    def __init__(self, filechange_id):\n        super(InvalidReviewableFileChangeId, self).__init__(\n            \"Invalid reviewable file change id: %d\" % filechange_id)\n        self.filechange_id = filechange_id\n\nclass InvalidReviewableFileChangeIds(ReviewableFileChangeError):\n    \"\"\"Raised when invalid reviewable file change ids are used\"\"\"\n    def __init__(self, filechange_ids):\n        super(InvalidReviewableFileChangeIds, self).__init__(\n            \"Invalid reviewable file change ids: %s\"\n            % \", \".join(map(str, filechange_ids)))\n        self.filechange_ids = filechange_ids\n\nclass InvalidChangeset(ReviewableFileChangeError):\n    \"\"\"Raised when fetchAll() is called with an invalid changeset\"\"\"\n    def __init__(self, changeset):\n        super(InvalidChangeset, self).__init__(\n            \"Changeset has no reviewable changes: %d\" % changeset.id)\n        self.changeset = changeset\n\nclass ReviewableFileChange(api.APIObject):\n    \"\"\"Representation of changes to a file, to be reviewed\"\"\"\n\n    @property\n    def id(self):\n       
 return self._impl.id\n\n    @property\n    def review(self):\n        return self._impl.getReview(self.critic)\n\n    @property\n    def changeset(self):\n        \"\"\"The changeset that the change is part of\n\n           The changeset is returned as an api.changeset.Changeset object. Note\n           that this changeset is always of a single commit, and that this\n           commit will be included in a partition in the review (meaning it will\n           not be part of a rebased version of the review branch.)\"\"\"\n        return self._impl.getChangeset(self.critic)\n\n    @property\n    def file(self):\n        \"\"\"The file that was changed\n\n           The file is returned as an api.file.File object.\"\"\"\n        return self._impl.getFile(self.critic)\n\n    @property\n    def deleted_lines(self):\n        \"\"\"Number of deleted or modified lines\n\n           In other words, number of lines in the old version of the file that\n           are not present in the new version of the file.\"\"\"\n        return self._impl.deleted_lines\n\n    @property\n    def inserted_lines(self):\n        \"\"\"Number of modified or inserted lines\n\n           In other words, number of lines in the new version of the file that\n           were not present in the old version of the file.\"\"\"\n        return self._impl.inserted_lines\n\n    @property\n    def is_reviewed(self):\n        \"\"\"True if the file change has been marked as reviewed.\"\"\"\n        return self._impl.is_reviewed\n\n    @property\n    def reviewed_by(self):\n        \"\"\"The user that reviewed the changes\n\n           The user is returned as an api.user.User object, or None if the\n           change has not been reviewed yet.\"\"\"\n        return self._impl.getReviewedBy(self.critic)\n\n    @property\n    def assigned_reviewers(self):\n        \"\"\"The users that are assigned to review the changes\n\n           The reviewers are returned as a set of api.user.User objects.\"\"\"\n        
return self._impl.getAssignedReviewers(self.critic)\n\n    class DraftChanges(object):\n        \"\"\"Draft changes to file change state\"\"\"\n\n        def __init__(self, author, new_reviewed_by):\n            self.__author = author\n            self.__new_is_reviewed = new_reviewed_by is not None\n            self.__new_reviewed_by = new_reviewed_by\n\n        @property\n        def author(self):\n            \"\"\"The author of these draft changes\n\n               The author is returned as an api.user.User object.\"\"\"\n            return self.__author\n\n        @property\n        def new_is_reviewed(self):\n            \"\"\"New value for the |is_reviewed| attribute\"\"\"\n            return self.__new_is_reviewed\n\n        @property\n        def new_reviewed_by(self):\n            \"\"\"New value for the |reviewed_by| attribute\"\"\"\n            return self.__new_reviewed_by\n\n    @property\n    def draft_changes(self):\n        \"\"\"The file change's current draft changes\n\n           The draft changes are returned as a ReviewableFileChange.DraftChanges\n           object, or None if the current user has no unpublished changes to\n           this file change.\"\"\"\n        return self._impl.getDraftChanges(self.critic)\n\ndef fetch(critic, filechange_id):\n    \"\"\"Fetch a single reviewable file change by its unique id\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(filechange_id, int)\n    return api.impl.reviewablefilechange.fetch(critic, filechange_id)\n\ndef fetchMany(critic, filechange_ids):\n    \"\"\"Fetch multiple reviewable file changes by their unique ids\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    filechange_ids = list(filechange_ids)\n    assert all(isinstance(filechange_id, int)\n               for filechange_id in filechange_ids)\n    return api.impl.reviewablefilechange.fetchMany(critic, filechange_ids)\n\ndef fetchAll(critic, review, changeset=None, file=None, assignee=None,\n           
  is_reviewed=None):\n    \"\"\"Fetch all reviewable file changes in a review\n\n       If a |changeset| is specified, fetch only file changes that are part of\n       that changeset.\n\n       If a |file| is specified, fetch only file changes in that file.\n\n       If an |assignee| is specified, fetch only file changes that the specified\n       user is assigned to review.\n\n       If |is_reviewed| is specified (not |None|), fetch only file changes that\n       are marked as reviewed (when |is_reviewed| is |True|) or only file\n       changes that are not (when |is_reviewed| is |False|).\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert isinstance(review, api.review.Review)\n    assert changeset is None or isinstance(changeset, api.changeset.Changeset)\n    assert file is None or isinstance(file, api.file.File)\n    assert assignee is None or isinstance(assignee, api.user.User)\n    assert is_reviewed is None or isinstance(is_reviewed, bool)\n    return api.impl.reviewablefilechange.fetchAll(\n        critic, review, changeset, file, assignee, is_reviewed)\n"
  },
  {
    "path": "src/api/reviewsummary.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom types import NoneType\nimport api\n\nclass ReviewSummaryError(api.APIError):\n    pass\n\nclass ReviewSummaryContainer(api.APIObject):\n    \"\"\"Container object for review summaries\"\"\"\n\n    @property\n    def reviews(self):\n        return self._impl.reviews\n\n    @property\n    def more(self):\n        return self._impl.more\n\nclass ReviewSummary(api.APIObject):\n    \"\"\"Representation of a review summary\"\"\"\n\n    TYPE_VALUES = frozenset([\"all\", \"own\", \"other\"])\n\n    @property\n    def review(self):\n        return self._impl.review\n\n    @property\n    def latest_change(self):\n        return self._impl.latest_change\n\ndef fetchMany(critic, search_type, user, count, offset):\n    \"\"\"Fetch the dashboard for user\"\"\"\n\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert search_type is not None\n    if user is None:\n        assert search_type == \"all\"\n    assert isinstance(search_type, str)\n    assert isinstance(user, api.user.User) or user is None\n    assert search_type in ReviewSummary.TYPE_VALUES\n    assert isinstance(count, int) or isinstance(count, NoneType)\n    assert isinstance(offset, int) or isinstance(offset, NoneType)\n    if count is not None:\n        assert count > 0\n    if offset is not None:\n        assert offset >= 0\n    return 
api.impl.reviewsummary.fetchMany(critic, search_type, user, count, offset)\n"
  },
  {
    "path": "src/api/transaction/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass Transaction(object):\n    def __init__(self, critic):\n        self.critic = critic\n        self.tables = set()\n        self.items = Queries()\n        self.callbacks = []\n\n    def modifyUser(self, subject):\n        from user import ModifyUser\n        assert isinstance(subject, api.user.User)\n        api.PermissionDenied.raiseUnlessUser(self.critic, subject)\n        return ModifyUser(self, subject)\n\n    def modifyAccessToken(self, access_token):\n        from accesstoken import ModifyAccessToken, CreatedAccessToken\n        assert isinstance(access_token, (api.accesstoken.AccessToken,\n                                         CreatedAccessToken))\n        api.PermissionDenied.raiseUnlessAdministrator(self.critic)\n        return ModifyAccessToken(self, access_token)\n\n    def createAccessControlProfile(self, callback=None):\n        from accesscontrolprofile import ModifyAccessControlProfile\n        api.PermissionDenied.raiseUnlessAdministrator(self.critic)\n        return ModifyAccessControlProfile.create(self, callback)\n\n    def modifyAccessControlProfile(self, profile):\n        from accesscontrolprofile import ModifyAccessControlProfile\n        assert isinstance(\n            profile, api.accesscontrolprofile.AccessControlProfile)\n        
api.PermissionDenied.raiseUnlessAdministrator(self.critic)\n        return ModifyAccessControlProfile(self, profile)\n\n    def createLabeledAccessControlProfile(self, labels, profile, callback=None):\n        from labeledaccesscontrolprofile \\\n            import ModifyLabeledAccessControlProfile\n        api.PermissionDenied.raiseUnlessAdministrator(self.critic)\n        assert isinstance(\n            profile, api.accesscontrolprofile.AccessControlProfile)\n        return ModifyLabeledAccessControlProfile.create(\n            self, labels, profile, callback)\n\n    def modifyLabeledAccessControlProfile(self, labeled_profile):\n        from labeledaccesscontrolprofile \\\n            import ModifyLabeledAccessControlProfile\n        api.PermissionDenied.raiseUnlessAdministrator(self.critic)\n        assert isinstance(\n            labeled_profile,\n            api.labeledaccesscontrolprofile.LabeledAccessControlProfile)\n        return ModifyLabeledAccessControlProfile(self, labeled_profile)\n\n    def modifyReview(self, review):\n        from review import ModifyReview\n        assert isinstance(review, api.review.Review)\n        return ModifyReview(self, review)\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, exc_type, exc_val, exc_tb):\n        if exc_type is None and exc_val is None and exc_tb is None:\n            self.__commit()\n        return False\n\n    def __commit(self):\n        if not self.items:\n            return\n        try:\n            with self.critic.getUpdatingDatabaseCursor(*self.tables) as cursor:\n                for item in self.items:\n                    item(self.critic, cursor)\n                for callback in self.callbacks:\n                    callback()\n        finally:\n            self.critic._impl.transactionEnded(self.critic, self.tables)\n\nclass Query(object):\n    def __init__(self, statement, *values, **kwargs):\n        self.statement = statement\n        self.__values = list(values)\n     
   self.collector = kwargs.get(\"collector\")\n\n    def merge(self, query):\n        if self.statement == query.statement \\\n                and not self.collector \\\n                and not query.collector:\n            self.__values.extend(query.__values)\n            return True\n        return False\n\n    @property\n    def values(self):\n        def evaluate(value):\n            if isinstance(value, LazyValue):\n                return value.evaluate()\n            elif isinstance(value, (set, list, tuple)):\n                return [evaluate(element) for element in value]\n            else:\n                return value\n\n        for values in self.__values:\n            yield evaluate(values)\n\n    def __call__(self, critic, cursor):\n        if self.collector:\n            for values in self.values:\n                cursor.execute(self.statement, values)\n                for row in cursor:\n                    self.collector(*row)\n        else:\n            cursor.executemany(self.statement, self.values)\n\nclass Queries(list):\n    def append(self, query):\n        if self and self[-1].merge(query):\n            return\n        super(Queries, self).append(query)\n\n    def extend(self, queries):\n        raise Exception(\"Append queries one at a time!\")\n\nclass LazyValue(object):\n    def evaluate(self):\n        raise Exception(\"LazyValue.evaluate() must be implemented!\")\n\nclass LazyInt(LazyValue):\n    def __init__(self, source):\n        self.source = source\n    def evaluate(self):\n        return self.source()\n\nclass LazyStr(LazyValue):\n    def __init__(self, source):\n        self.source = source\n    def evaluate(self):\n        return self.source()\n\nclass LazyObject(LazyValue):\n    def __init__(self, callback=None):\n        self.object_id = None\n        self.callback = callback\n    def __call__(self, object_id):\n        self.object_id = object_id\n        if self.callback:\n            self.callback(self)\n    @property\n    
def id(self):\n        return LazyInt(self.evaluate)\n    def evaluate(self):\n        assert self.object_id is not None\n        return self.object_id\n\nclass LazyAPIObject(LazyObject):\n    def __init__(self, critic, fetch, callback=None):\n        super(LazyAPIObject, self).__init__(\n            callback=self.callback_wrapper if callback else None)\n        self.critic = critic\n        self.__fetch = fetch\n        self.__callback = callback\n    def fetch(self):\n        return self.__fetch(self.critic, self.evaluate())\n    @staticmethod\n    def callback_wrapper(self):\n        self.__callback(self.fetch())\n"
  },
  {
    "path": "src/api/transaction/accesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass InvalidRuleValue(api.TransactionError):\n    pass\n\nclass InvalidRequestMethod(api.TransactionError):\n    pass\n\nclass InvalidPathPattern(api.TransactionError):\n    pass\n\nclass InvalidRepositoryAccessType(api.TransactionError):\n    pass\n\nclass InvalidExtensionAccessType(api.TransactionError):\n    pass\n\nclass ModifyExceptions(object):\n    def __init__(self, transaction, profile):\n        self.transaction = transaction\n        self.profile = profile\n\n    def __addTable(self):\n        self.transaction.tables.add(\"accesscontrol_\" + self.table_name)\n\n    def delete(self, exception_id):\n        self.__addTable()\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM accesscontrol_{}\n                    WHERE id=%s\n                      AND profile=%s\"\"\".format(self.table_name),\n                (exception_id, self.profile.id)))\n\n    def deleteAll(self):\n        self.__addTable()\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM accesscontrol_{}\n                    WHERE profile=%s\"\"\".format(self.table_name),\n                (self.profile.id,)))\n\n    def add(self, *values):\n        
self.__addTable()\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT INTO accesscontrol_{} (profile, {})\n                        VALUES (%s, {})\"\"\".format(\n                            self.table_name,\n                            \", \".join(self.column_names),\n                            \", \".join([\"%s\"] * len(self.column_names))),\n                (self.profile.id,) + values))\n\nclass ModifyHTTPExceptions(ModifyExceptions):\n    table_name = \"http\"\n    column_names = (\"request_method\", \"path_pattern\")\n\n    def add(self, request_method, path_pattern):\n        REQUEST_METHODS = api.accesscontrolprofile \\\n                .AccessControlProfile.HTTPException.REQUEST_METHODS\n\n        if request_method is not None:\n            if request_method not in REQUEST_METHODS:\n                raise InvalidRequestMethod(request_method)\n\n        if path_pattern is not None:\n            import re\n            try:\n                re.compile(path_pattern)\n            except re.error as error:\n                raise InvalidPathPattern(\n                    \"%r: %s\" % (path_pattern, error.message))\n\n        super(ModifyHTTPExceptions, self).add(request_method, path_pattern)\n\nclass ModifyRepositoriesExceptions(ModifyExceptions):\n    table_name = \"repositories\"\n    column_names = (\"access_type\", \"repository\")\n\n    def add(self, access_type, repository):\n        assert (repository is None or\n                isinstance(repository, api.repository.Repository))\n\n        if access_type not in (None, \"read\", \"modify\"):\n            raise InvalidRepositoryAccessType(access_type)\n\n        repository_id = repository.id if repository else None\n\n        super(ModifyRepositoriesExceptions, self).add(\n            access_type, repository_id)\n\nclass ModifyExtensionsExceptions(ModifyExceptions):\n    table_name = \"extensions\"\n    column_names = (\"access_type\", \"extension_key\")\n\n 
   def add(self, access_type, extension):\n        assert (extension is None or\n                isinstance(extension, api.extension.Extension))\n\n        if access_type not in (None, \"install\", \"execute\"):\n            raise InvalidExtensionAccessType(access_type)\n\n        extension_key = extension.key if extension else None\n\n        super(ModifyExtensionsExceptions, self).add(\n            access_type, extension_key)\n\nclass ModifyAccessControlProfile(object):\n    def __init__(self, transaction, profile):\n        self.transaction = transaction\n        self.profile = profile\n\n    def setTitle(self, value):\n        self.transaction.tables.add(\"accesscontrolprofiles\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE accesscontrolprofiles\n                      SET title=%s\n                    WHERE id=%s\"\"\",\n                (value, self.profile.id)))\n\n    def setRule(self, category, value):\n        assert category in (\"http\", \"repositories\", \"extensions\")\n        if value not in (\"allow\", \"deny\"):\n            raise InvalidRuleValue(value)\n\n        self.transaction.tables.add(\"accesscontrolprofiles\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE accesscontrolprofiles\n                      SET {}=%s\n                    WHERE id=%s\"\"\".format(category),\n                (value, self.profile.id)))\n\n    def modifyExceptions(self, category):\n        assert category in (\"http\", \"repositories\", \"extensions\")\n        if category == \"http\":\n            return ModifyHTTPExceptions(self.transaction, self.profile)\n        if category == \"repositories\":\n            return ModifyRepositoriesExceptions(self.transaction, self.profile)\n        # category == \"extensions\"\n        return ModifyExtensionsExceptions(self.transaction, self.profile)\n\n    def delete(self):\n        
self.transaction.tables.add(\"accesscontrolprofiles\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM accesscontrolprofiles\n                    WHERE id=%s\"\"\",\n                (self.profile.id,)))\n\n    @staticmethod\n    def create(transaction, callback=None):\n        critic = transaction.critic\n\n        profile = CreatedAccessControlProfile(critic, None, callback)\n\n        transaction.tables.add(\"accesscontrolprofiles\")\n        transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO accesscontrolprofiles\n                  DEFAULT VALUES\n                RETURNING id\"\"\",\n                (),\n                collector=profile))\n\n        return ModifyAccessControlProfile(transaction, profile)\n\nclass CreatedAccessControlProfile(api.transaction.LazyAPIObject):\n    def __init__(self, critic, access_token, callback=None):\n        super(CreatedAccessControlProfile, self).__init__(\n            critic, api.accesscontrolprofile.fetch, callback)\n        self.access_token = access_token\n"
  },
  {
    "path": "src/api/transaction/accesstoken.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ModifyAccessToken(object):\n    def __init__(self, transaction, access_token):\n        self.transaction = transaction\n        self.access_token = access_token\n\n    def setTitle(self, value):\n        self.transaction.tables.add(\"accesstokens\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE accesstokens\n                      SET title=%s\n                    WHERE id=%s\"\"\",\n                (value, self.access_token.id)))\n\n    def delete(self):\n        self.transaction.tables.add(\"accesstokens\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM accesstokens\n                    WHERE id=%s\"\"\",\n                (self.access_token.id,)))\n\n    def modifyProfile(self):\n        from accesscontrolprofile import ModifyAccessControlProfile\n        assert self.access_token.profile\n        return ModifyAccessControlProfile(\n            self.transaction, self.access_token.profile)\n\nclass CreatedAccessToken(api.transaction.LazyAPIObject):\n    def __init__(self, critic, user, callback=None):\n        from accesscontrolprofile import CreatedAccessControlProfile\n        super(CreatedAccessToken, self).__init__(\n          
  critic, api.accesstoken.fetch, callback)\n        self.user = user\n        self.profile = CreatedAccessControlProfile(critic, self)\n"
  },
  {
    "path": "src/api/transaction/comment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ModifyComment(object):\n    def __init__(self, transaction, comment):\n        self.transaction = transaction\n        self.comment = comment\n\n    def __raiseUnlessDraft(self, action):\n        if not self.comment.is_draft:\n            raise api.comment.CommentError(\n                \"Published comments cannot be \" + action)\n\n    def setText(self, text):\n        self.__raiseUnlessDraft(\"edited\")\n\n        self.transaction.tables.add(\"comments\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE comments\n                      SET comment=%s\n                    WHERE id in (SELECT first_comment\n                                   FROM commentchains\n                                  WHERE id=%s)\"\"\",\n                (text, self.comment.id)))\n\n    def addReply(self, author, text, callback=None):\n        assert isinstance(author, api.user.User)\n        assert isinstance(text, str)\n\n        if self.comment.is_draft:\n            raise api.comment.CommentError(\n                \"Draft comments cannot be replied to\")\n\n        if self.comment.draft_changes and self.comment.draft_changes.reply:\n            raise api.comment.CommentError(\n                \"Comment already has a draft reply\")\n\n    
    critic = self.transaction.critic\n\n        # Users are not (generally) allowed to create comments as other users.\n        api.PermissionDenied.raiseUnlessUser(critic, author)\n\n        reply = CreatedReply(critic, self.comment, callback)\n\n        self.transaction.tables.add(\"comments\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO comments (chain, uid, state, comment)\n                   VALUES (%s, %s, 'draft', %s)\n                RETURNING id\"\"\",\n                (self.comment.id, author.id, text),\n                collector=reply))\n\n        return reply\n\n    def modifyReply(self, reply):\n        from reply import ModifyReply\n        assert isinstance(reply, api.reply.Reply)\n        assert reply.comment == self.comment\n\n        api.PermissionDenied.raiseUnlessUser(self.transaction.critic,\n                                             reply.author)\n\n        return ModifyReply(self.transaction, reply)\n\n    def resolveIssue(self):\n        critic = self.transaction.critic\n\n        if isinstance(self.comment, api.comment.Note):\n            raise api.comment.CommentError(\n                \"Only issues can be resolved\")\n\n        if self.comment.is_draft:\n            raise api.comment.CommentError(\n                \"Unpublished issues cannot be resolved\")\n\n        self.transaction.tables.add(\"commentchainchanges\")\n\n        if self.comment.draft_changes:\n            if ((self.comment.draft_changes.new_state\n                 and self.comment.draft_changes.new_state != \"open\")\n                or self.comment.draft_changes.new_type):\n                raise api.comment.CommentError(\n                    \"Issue has unpublished conflicting modifications\")\n\n            if self.comment.draft_changes.new_state == \"open\":\n                self.transaction.items.append(\n                    api.transaction.Query(\n                        
\"\"\"DELETE\n                             FROM commentchainchanges\n                            WHERE uid=%s\n                              AND chain=%s\n                              AND to_state='open'\"\"\",\n                        (critic.actual_user.id, self.comment.id)))\n                return\n\n        if self.comment.state != \"open\":\n            raise api.comment.CommentError(\n                \"Only open issues can be resolved\")\n\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO commentchainchanges (uid, chain, from_state,\n                                               to_state)\n                   VALUES (%s, %s, %s, %s)\"\"\",\n                (critic.actual_user.id, self.comment.id, \"open\", \"closed\")))\n\n    def reopenIssue(self):\n        critic = self.transaction.critic\n\n        if isinstance(self.comment, api.comment.Note):\n            raise api.comment.CommentError(\n                \"Only issues can be reopened\")\n\n        self.transaction.tables.add(\"commentchainchanges\")\n\n        if self.comment.draft_changes:\n            if ((self.comment.draft_changes.new_state\n                 and self.comment.draft_changes.new_state != \"resolved\")\n                or self.comment.draft_changes.new_type):\n                raise api.comment.CommentError(\n                    \"Issue has unpublished conflicting modifications\")\n\n            if self.comment.draft_changes.new_state == \"resolved\":\n                self.transaction.items.append(\n                    api.transaction.Query(\n                        \"\"\"DELETE\n                             FROM commentchainchanges\n                            WHERE uid=%s\n                              AND chain=%s\n                              AND to_state='closed'\"\"\",\n                        (critic.actual_user.id, self.comment.id)))\n                return\n\n        if self.comment.state != 
\"resolved\":\n            raise api.comment.CommentError(\n                \"Only resolved issues can be reopened\")\n\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO commentchainchanges (uid, chain, from_state,\n                                               to_state)\n                   VALUES (%s, %s, %s, %s)\"\"\",\n                (critic.actual_user.id, self.comment.id, \"closed\", \"open\")))\n\n    def delete(self):\n        critic = self.transaction.critic\n\n        api.PermissionDenied.raiseUnlessUser(critic, self.comment.author)\n\n        self.__raiseUnlessDraft(\"deleted\")\n\n        self.transaction.tables.add(\"commentchains\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM commentchains\n                    WHERE id=%s\"\"\",\n                (self.comment.id,)))\n\nclass CreatedReply(api.transaction.LazyAPIObject):\n    def __init__(self, critic, comment, callback=None):\n        super(CreatedReply, self).__init__(\n            critic, api.reply.fetch, callback)\n        self.comment = comment\n"
  },
  {
    "path": "src/api/transaction/filters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api.transaction\n\nclass ModifyRepositoryFilter(object):\n    def __init__(self, transaction, repository_filter):\n        self.transaction = transaction\n        self.repository_filter = repository_filter\n\n    def setDelegates(self, value):\n        assert all(isinstance(delegate, api.user.User) for delegate in value)\n        self.transaction.tables.add(\"filters\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE filters\n                      SET delegate=%s\n                    WHERE id=%s\"\"\",\n                (\",\".join(delegate.name for delegate in value),\n                 self.repository_filter.id)))\n\n    def delete(self):\n        self.transaction.tables.add(\"filters\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM filters\n                    WHERE id=%s\"\"\",\n                (self.repository_filter.id,)))\n"
  },
  {
    "path": "src/api/transaction/labeledaccesscontrolprofile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ModifyLabeledAccessControlProfile(object):\n    def __init__(self, transaction, labeled_profile):\n        self.transaction = transaction\n        self.labeled_profile = labeled_profile\n\n    def delete(self):\n        self.transaction.tables.add(\"labeledaccesscontrolprofiles\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM labeledaccesscontrolprofiles\n                    WHERE labels=%s\"\"\",\n                (str(self.labeled_profile),)))\n\n    @staticmethod\n    def create(transaction, labels, profile, callback=None):\n        critic = transaction.critic\n\n        labeled_profile = CreatedLabeledAccessControlProfile(critic, callback)\n\n        transaction.tables.add(\"labeledaccesscontrolprofiles\")\n        transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO labeledaccesscontrolprofiles (labels, profile)\n                   VALUES (%s, %s)\n                RETURNING labels\"\"\",\n                (\"|\".join(sorted(labels)), profile.id),\n                collector=labeled_profile))\n\n        return ModifyLabeledAccessControlProfile(transaction, labeled_profile)\n\nclass 
CreatedLabeledAccessControlProfile(api.transaction.LazyAPIObject):\n    def __init__(self, critic, callback=None):\n        def fetch(critic, labels):\n            return api.labeledaccesscontrolprofile.fetch(\n                critic, labels.split(\"|\"))\n        super(CreatedLabeledAccessControlProfile, self).__init__(\n            critic, fetch, callback)\n"
  },
  {
    "path": "src/api/transaction/reply.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ModifyReply(object):\n    def __init__(self, transaction, reply):\n        self.transaction = transaction\n        self.reply = reply\n\n    def __raiseUnlessDraft(self, action):\n        if not self.reply.is_draft:\n            raise api.reply.ReplyError(\n                \"Published replies cannot be \" + action)\n\n    def setText(self, text):\n        self.__raiseUnlessDraft(\"edited\")\n\n        if not text.strip():\n            raise api.reply.ReplyError(\"Empty reply\")\n\n        self.transaction.tables.add(\"comments\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE comments\n                      SET comment=%s\n                    WHERE id=%s\"\"\",\n                (text, self.reply.id)))\n\n    def delete(self):\n        self.__raiseUnlessDraft(\"deleted\")\n\n        self.transaction.tables.add(\"comments\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM comments\n                    WHERE id=%s\"\"\",\n                (self.reply.id,)))\n"
  },
  {
    "path": "src/api/transaction/review.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nimport dbutils\nimport gitutils\n\nfrom reviewing.comment.propagate import Propagation\n\nclass ModifyReview(object):\n    def __init__(self, transaction, review):\n        self.transaction = transaction\n        self.review = review\n\n    def createComment(self, comment_type, author, text, location=None,\n                      callback=None):\n        assert comment_type in api.comment.Comment.TYPE_VALUES\n        assert isinstance(author, api.user.User)\n        assert isinstance(text, str)\n\n        critic = self.transaction.critic\n\n        # Users are not (generally) allowed to create comments as other users.\n        api.PermissionDenied.raiseUnlessUser(critic, author)\n\n        side = file_id = first_commit_id = last_commit_id = lines = None\n\n        if isinstance(location, api.comment.CommitMessageLocation):\n            first_commit_id = last_commit_id = location.commit.id\n            # FIXME: Make commit message comment line numbers one-based too!\n            lines = [(location.commit.sha1, (location.first_line - 1,\n                                             location.last_line - 1))]\n            # FIXME: ... 
and then delete the \" - 1\" from the above two lines.\n        elif isinstance(location, api.comment.FileVersionLocation):\n            # Propagate the comment using \"legacy\" comment propagation helper.\n\n            if location.changeset:\n                if location.side == \"old\":\n                    commit = location.changeset.from_commit\n                else:\n                    commit = location.changeset.to_commit\n                side = location.side\n                first_commit_id = location.changeset.from_commit.id\n                last_commit_id = location.changeset.to_commit.id\n            else:\n                commit = location.commit\n                first_commit_id = last_commit_id = location.commit.id\n\n            legacy_review = dbutils.Review.fromAPI(self.review)\n            legacy_commit = gitutils.Commit.fromAPI(commit)\n\n            propagation = Propagation(critic.database)\n            propagation.setCustom(\n                legacy_review, legacy_commit, location.file.id,\n                location.first_line, location.last_line)\n            propagation.calculateInitialLines()\n\n            file_id = location.file.id\n            lines = propagation.all_lines.items()\n\n        comment = CreatedComment(critic, self.review)\n        # The LazyObject is itself the collector: calling it records the id.\n        initial_comment = api.transaction.LazyObject()\n\n        self.transaction.tables.update((\"commentchains\", \"comments\"))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO commentchains (review, uid, type, origin, file,\n                                         first_commit, last_commit)\n                   VALUES (%s, %s, %s, %s, %s, %s, %s)\n                RETURNING id\"\"\",\n                (self.review.id, author.id, comment_type, side, file_id,\n                 first_commit_id, last_commit_id),\n              
  collector=comment))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO comments (chain, uid, state, comment)\n                   VALUES (%s, %s, 'draft', %s)\n                RETURNING id\"\"\",\n                (comment.id, author.id, text),\n                collector=initial_comment))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchains\n                      SET first_comment=%s\n                    WHERE id=%s\"\"\",\n                (initial_comment.id, comment.id)))\n\n        if lines:\n            self.transaction.tables.add(\"commentchainlines\")\n            self.transaction.items.append(\n                api.transaction.Query(\n                    \"\"\"INSERT\n                         INTO commentchainlines (chain, uid, sha1,\n                                                 first_line, last_line)\n                       VALUES (%s, %s, %s, %s, %s)\"\"\",\n                    *((comment.id, author.id, sha1, first_line, last_line)\n                      for sha1, (first_line, last_line) in lines)))\n\n        if callback:\n            self.transaction.callbacks.append(\n                lambda: callback(comment.fetch()))\n\n        return comment\n\n    def modifyComment(self, comment):\n        from comment import ModifyComment\n        assert comment.review == self.review\n\n        # Users are not (generally) allowed to modify other users' draft\n        # comments.\n        if comment.is_draft:\n            api.PermissionDenied.raiseUnlessUser(self.transaction.critic,\n                                                 comment.author)\n\n        return ModifyComment(self.transaction, comment)\n\n    def prepareRebase(self, user, new_upstream=None, history_rewrite=None, branch=None, callback=None):\n        assert isinstance(user, api.user.User)\n        assert new_upstream is None or 
isinstance(new_upstream, str)\n        assert history_rewrite is None or isinstance(history_rewrite, bool)\n        assert (new_upstream is None) != (history_rewrite is None)\n        assert callback is None or callable(callback)\n\n        pending = self.review.pending_rebase\n        if pending is not None:\n            creator = pending.creator\n            raise api.log.rebase.RebaseError(\n                \"The review is already being rebased by %s <%s>.\" %\n                (creator.fullname, creator.email if creator.email is not None\n                 else \"email missing\"))\n\n        commitset = self.review.branch.commits\n        tails = commitset.filtered_tails\n        heads = commitset.heads\n\n        assert len(heads) == 1\n        head = next(iter(heads))\n\n        old_upstream_id = None\n        new_upstream_id = None\n\n        if new_upstream is not None:\n            if len(tails) > 1:\n                raise api.log.rebase.RebaseError(\n                    \"Rebase of branch with multiple tails, to new upstream \"\n                    \"commit, is not supported.\")\n\n            tail = next(iter(tails))\n            old_upstream_id = tail.id\n\n            if new_upstream == \"0\" * 40:\n                new_upstream_id = None\n            else:\n                if not gitutils.re_sha1.match(new_upstream):\n                    cursor = self.transaction.critic.getDatabaseCursor()\n                    cursor.execute(\"SELECT sha1 FROM tags WHERE repository=%s AND name=%s\",\n                                   (self.review.repository.id, new_upstream))\n                    row = cursor.fetchone()\n                    if row:\n                        # Resolve the tag name to its SHA-1 for the fetch below.\n                        new_upstream = row[0]\n                    else:\n                        raise api.log.rebase.RebaseError(\n                            \"Specified new_upstream is invalid.\")\n                try:\n                    new_upstream_commit = api.commit.fetch(\n                        
self.review.repository, ref=new_upstream)\n                except Exception:\n                    raise api.log.rebase.RebaseError(\n                        \"The specified new upstream commit does not exist \"\n                        \"in Critic's repository\")\n                new_upstream_id = new_upstream_commit.id\n\n        rebase = CreatedRebase(self.transaction.critic, self.review)\n\n        self.transaction.tables.add(\"reviewrebases\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO reviewrebases (review, old_head, new_head, old_upstream, new_upstream, uid, branch)\n                   VALUES (%s, %s, NULL, %s, %s, %s, %s)\n                RETURNING id\"\"\",\n                (self.review.id, head.id, old_upstream_id, new_upstream_id, user.id, branch),\n                collector=rebase))\n\n        if callback:\n            self.transaction.callbacks.append(\n                lambda: callback(rebase.fetch()))\n\n    def cancelRebase(self, rebase):\n        self.transaction.tables.add(\"reviewrebases\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM reviewrebases\n                    WHERE review=%s\n                      AND new_head IS NULL\n                      AND id=%s\"\"\",\n                (self.review.id, rebase.id)))\n\n    def submitChanges(self, batch_comment, callback):\n        critic = self.transaction.critic\n\n        unpublished_changes = api.batch.fetchUnpublished(critic, self.review)\n\n        if unpublished_changes.is_empty:\n            raise api.batch.BatchError(\"No unpublished changes to submit\")\n\n        created_comments = []\n        empty_comments = []\n\n        if batch_comment:\n            created_comments.append(batch_comment)\n\n        for comment in unpublished_changes.created_comments:\n            if comment.text.strip():\n                
created_comments.append(comment)\n            else:\n                empty_comments.append(comment)\n\n        batch = CreatedBatch(critic, self.review)\n\n        self.transaction.tables.add(\"batches\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO batches (review, uid, comment)\n                   VALUES (%s, %s, %s)\n                RETURNING id\"\"\",\n                (self.review.id, critic.actual_user.id,\n                 batch_comment.id if batch_comment else None),\n                collector=batch))\n\n        def ids(api_objects):\n            return [api_object.id for api_object in api_objects]\n\n        self.transaction.tables.add(\"commentchains\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchains\n                      SET state='open',\n                          batch=%s\n                    WHERE id=ANY (%s)\"\"\",\n                (batch.id, ids(created_comments))))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"DELETE\n                     FROM commentchains\n                    WHERE id=ANY (%s)\"\"\",\n                (ids(empty_comments),)))\n\n        self.transaction.tables.add(\"comments\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE comments\n                      SET state='current',\n                          batch=%s\n                    WHERE id IN (SELECT first_comment\n                                   FROM commentchains\n                                  WHERE id=ANY (%s))\"\"\",\n                (batch.id, ids(created_comments))))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE comments\n                      SET state='current',\n                          batch=%s\n                    WHERE id=ANY 
(%s)\"\"\",\n                (batch.id, ids(unpublished_changes.written_replies))))\n\n        self.transaction.tables.add(\"commentchainlines\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchainlines\n                      SET state='current'\n                    WHERE chain=ANY (%s)\"\"\",\n                (ids(created_comments),)))\n\n        # Lock all rows in |commentchains| that we may want to update.\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"SELECT 1\n                     FROM commentchains\n                    WHERE id=ANY (%s)\n                      FOR UPDATE\"\"\",\n                (ids(unpublished_changes.resolved_issues) +\n                 ids(unpublished_changes.reopened_issues) +\n                 ids(unpublished_changes.morphed_comments.keys()),)))\n\n        # Mark valid comment state changes as performed.\n        self.transaction.tables.add(\"commentchainchanges\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchainchanges\n                      SET batch=%s,\n                          state='performed'\n                    WHERE uid=%s\n                      AND state='draft'\n                      AND chain IN (SELECT id\n                                      FROM commentchains\n                                     WHERE id=ANY (%s)\n                                       AND type='issue'\n                                       AND state=%s)\"\"\",\n                (batch.id, critic.actual_user.id,\n                 ids(unpublished_changes.resolved_issues), \"open\"),\n                (batch.id, critic.actual_user.id,\n                 ids(unpublished_changes.reopened_issues), \"closed\"))) # FIXME: handle |state='addressed'|\n\n        # Mark valid comment type changes as performed.\n        morphed_to_issue = []\n        morphed_to_note = []\n        
for comment, new_type in unpublished_changes.morphed_comments.items():\n            if new_type == \"issue\":\n                morphed_to_issue.append(comment)\n            else:\n                morphed_to_note.append(comment)\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchainchanges\n                      SET batch=%s,\n                          state='performed'\n                    WHERE uid=%s\n                      AND state='draft'\n                      AND chain IN (SELECT id\n                                      FROM commentchains\n                                     WHERE id=ANY (%s)\n                                       AND type=%s)\"\"\",\n                (batch.id, critic.actual_user.id, ids(morphed_to_issue),\n                 \"note\"),\n                (batch.id, critic.actual_user.id, ids(morphed_to_note),\n                 \"issue\")))\n\n        # Actually perform state changes marked as valid above.\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchains\n                      SET state=%s,\n                          closed_by=%s\n                    WHERE id IN (SELECT chain\n                                   FROM commentchainchanges\n                                  WHERE batch=%s\n                                    AND state='performed'\n                                    AND to_state=%s)\"\"\",\n                (\"closed\", critic.actual_user.id, batch.id, \"closed\"),\n                (\"open\", None, batch.id, \"open\")))\n\n        # Actually perform type changes marked as valid above.\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE commentchains\n                      SET type=%s\n                    WHERE id IN (SELECT chain\n                                   FROM commentchainchanges\n                                  WHERE batch=%s\n       
                             AND state='performed'\n                                    AND to_type=%s)\"\"\",\n                ('issue', batch.id, 'issue'),\n                ('note', batch.id, 'note')))\n\n        # Lock all rows in |reviewfiles| that we may want to update.\n        self.transaction.tables.add(\"reviewfilechanges\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"SELECT 1\n                     FROM reviewfiles\n                    WHERE id=ANY (%s)\n                      FOR UPDATE\"\"\",\n                (ids(unpublished_changes.reviewed_file_changes) +\n                 ids(unpublished_changes.unreviewed_file_changes),)))\n\n        # Mark valid draft changes as \"performed\".\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE reviewfilechanges\n                      SET batch=%s,\n                          state='performed'\n                    WHERE uid=%s\n                      AND state='draft'\n                      AND file IN (SELECT id\n                                     FROM reviewfiles\n                                    WHERE id=ANY (%s)\n                                      AND state=%s)\"\"\",\n                (batch.id, critic.actual_user.id,\n                 ids(unpublished_changes.reviewed_file_changes),\n                 \"pending\"),\n                (batch.id, critic.actual_user.id,\n                 ids(unpublished_changes.unreviewed_file_changes),\n                 \"reviewed\")))\n\n        # Actually perform all the changes we previously marked as performed.\n        self.transaction.tables.add(\"reviewfiles\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE reviewfiles\n                      SET state=%s,\n                          reviewer=%s\n                    WHERE id IN (SELECT file\n                                   FROM reviewfilechanges\n         
                         WHERE batch=%s\n                                    AND state='performed'\n                                    AND to_state=%s)\"\"\",\n                ('reviewed', critic.actual_user.id, batch.id, 'reviewed'),\n                ('pending', None, batch.id, 'pending')))\n\n        if callback:\n            self.transaction.callbacks.append(\n                lambda: callback(batch.fetch()))\n\n        return batch\n\n    def markChangeAsReviewed(self, filechange):\n        assert isinstance(filechange,\n                          api.reviewablefilechange.ReviewableFileChange)\n\n        critic = self.transaction.critic\n\n        if filechange.draft_changes:\n            current_state = filechange.draft_changes.new_is_reviewed\n        else:\n            current_state = filechange.is_reviewed\n        if current_state:\n            raise api.reviewablefilechange.ReviewableFileChangeError(\n                \"Specified file change is already marked as reviewed\")\n\n        if critic.actual_user not in filechange.assigned_reviewers:\n            raise api.reviewablefilechange.ReviewableFileChangeError(\n                \"Specified file change is not assigned to current user\")\n\n        self.transaction.tables.add(\"reviewfilechanges\")\n\n        if filechange.draft_changes:\n            self.transaction.items.append(\n                api.transaction.Query(\n                    \"\"\"DELETE\n                         FROM reviewfilechanges\n                        WHERE file=%s\n                          AND uid=%s\n                          AND to_state='pending'\"\"\",\n                    (filechange.id, critic.actual_user.id)))\n\n        if not filechange.is_reviewed:\n            self.transaction.items.append(\n                api.transaction.Query(\n                    \"\"\"INSERT\n                         INTO reviewfilechanges (file, uid, from_state,\n                                                 to_state)\n                       
VALUES (%s, %s, 'pending', 'reviewed')\"\"\",\n                    (filechange.id, critic.actual_user.id)))\n\n    def markChangeAsPending(self, filechange):\n        assert isinstance(filechange,\n                          api.reviewablefilechange.ReviewableFileChange)\n\n        critic = self.transaction.critic\n\n        if filechange.draft_changes:\n            current_state = filechange.draft_changes.new_is_reviewed\n        else:\n            current_state = filechange.is_reviewed\n        if not current_state:\n            raise api.reviewablefilechange.ReviewableFileChangeError(\n                \"Specified file change is already marked as pending\")\n\n        if critic.actual_user not in filechange.assigned_reviewers:\n            raise api.reviewablefilechange.ReviewableFileChangeError(\n                \"Specified file change is not assigned to current user\")\n\n        self.transaction.tables.add(\"reviewfilechanges\")\n\n        if filechange.draft_changes:\n            self.transaction.items.append(\n                api.transaction.Query(\n                    \"\"\"DELETE\n                         FROM reviewfilechanges\n                        WHERE file=%s\n                          AND uid=%s\n                          AND to_state='reviewed'\"\"\",\n                    (filechange.id, critic.actual_user.id)))\n\n        if filechange.is_reviewed:\n            self.transaction.items.append(\n                api.transaction.Query(\n                    \"\"\"INSERT\n                         INTO reviewfilechanges (file, uid, from_state,\n                                                 to_state)\n                       VALUES (%s, %s, 'reviewed', 'pending')\"\"\",\n                    (filechange.id, critic.actual_user.id)))\n\nclass CreatedComment(api.transaction.LazyAPIObject):\n    def __init__(self, critic, review, callback=None):\n        super(CreatedComment, self).__init__(\n            critic, api.comment.fetch, callback)\n        
self.review = review\n\nclass CreatedRebase(api.transaction.LazyAPIObject):\n    def __init__(self, critic, review, callback=None):\n        super(CreatedRebase, self).__init__(\n            critic, api.log.rebase.fetch, callback)\n        self.review = review\n\nclass CreatedBatch(api.transaction.LazyAPIObject):\n    def __init__(self, critic, review):\n        super(CreatedBatch, self).__init__(critic, api.batch.fetch)\n        self.review = review\n"
  },
  {
    "path": "src/api/transaction/user.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass ModifyUser(object):\n    def __init__(self, transaction, user):\n        self.transaction = transaction\n        self.user = user\n\n    def setFullname(self, value):\n        self.transaction.tables.add(\"users\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"UPDATE users\n                      SET fullname=%s\n                    WHERE id=%s\"\"\",\n                (value, self.user.id)))\n\n    # Repository filters\n    # ==================\n\n    def createFilter(self, filter_type, repository, path, delegates,\n                     callback=None):\n        assert filter_type in (\"reviewer\", \"watcher\", \"ignore\")\n        assert isinstance(repository, api.repository.Repository)\n        assert all(isinstance(delegate, api.user.User)\n                   for delegate in delegates)\n\n        def collectCreatedFilter(filter_id):\n            if callback:\n                callback(api.filters.fetchRepositoryFilter(\n                    self.transaction.critic, filter_id))\n\n        self.transaction.tables.add(\"filters\")\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO filters (uid, repository, path, type, delegate)\n                
   VALUES (%s, %s, %s, %s, %s)\n                RETURNING id\"\"\",\n                (self.user.id, repository.id, path, filter_type,\n                 \",\".join(delegate.name for delegate in delegates)),\n                collector=collectCreatedFilter))\n\n    def modifyFilter(self, repository_filter):\n        from filters import ModifyRepositoryFilter\n        assert repository_filter.subject == self.user\n        return ModifyRepositoryFilter(self.transaction, repository_filter)\n\n    # Access tokens\n    # =============\n\n    def createAccessToken(self, access_type, title, callback=None):\n        import auth\n        import base64\n\n        from accesstoken import CreatedAccessToken\n\n        critic = self.transaction.critic\n\n        if access_type != \"user\":\n            api.PermissionDenied.raiseUnlessAdministrator(critic)\n\n        user_id = self.user.id if access_type == \"user\" else None\n\n        part1 = auth.getToken(encode=base64.b64encode, length=12)\n        part2 = auth.getToken(encode=base64.b64encode, length=21)\n\n        access_token = CreatedAccessToken(\n            critic, self.user if access_type == \"user\" else None, callback)\n\n        self.transaction.tables.update((\"accesstokens\",\n                                        \"accesscontrolprofiles\"))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO accesstokens (access_type, uid, part1, part2, title)\n                   VALUES (%s, %s, %s, %s, %s)\n                RETURNING id\"\"\",\n                (access_type, user_id, part1, part2, title),\n                collector=access_token))\n        self.transaction.items.append(\n            api.transaction.Query(\n                \"\"\"INSERT\n                     INTO accesscontrolprofiles (access_token)\n                   VALUES (%s)\n                RETURNING id\"\"\",\n                (access_token,),\n                
collector=access_token.profile))\n\n        return access_token\n\n    def modifyAccessToken(self, access_token):\n        from accesstoken import ModifyAccessToken\n        assert access_token.user == self.user\n\n        critic = self.transaction.critic\n\n        if critic.access_token and critic.access_token == access_token:\n            # Don't allow any modifications of the access token used to\n            # authenticate.  This could for instance be used to remove the\n            # access restrictions of the token, which would obviously be bad.\n            raise api.PermissionDenied(\"Access token used to authenticate\")\n\n        return ModifyAccessToken(self.transaction, access_token)\n"
  },
  {
    "path": "src/api/user.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\n\nclass UserError(api.APIError):\n    \"\"\"Base exception for all errors related to the User class\"\"\"\n    pass\n\nclass InvalidUserIds(UserError):\n    \"\"\"Raised when one or more invalid user ids are used\"\"\"\n\n    def __init__(self, values):\n        \"\"\"Constructor\"\"\"\n        super(InvalidUserIds, self).__init__(\"Invalid user ids: %r\" % values)\n        self.values = values\n\nclass InvalidUserId(InvalidUserIds):\n    \"\"\"Raised when a single invalid user id is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidUserId, self).__init__([value])\n        self.message = \"Invalid user id: %r\" % value\n        self.value = value\n\nclass InvalidUserNames(UserError):\n    \"\"\"Raised when one or more invalid user names are used\"\"\"\n\n    def __init__(self, values):\n        \"\"\"Constructor\"\"\"\n        super(InvalidUserNames, self).__init__(\n            \"Invalid user names: %r\" % values)\n        self.values = values\n\nclass InvalidUserName(InvalidUserNames):\n    \"\"\"Raised when a single invalid user name is used\"\"\"\n\n    def __init__(self, value):\n        \"\"\"Constructor\"\"\"\n        super(InvalidUserName, self).__init__([value])\n        self.message = \"Invalid user name: %r\" % value\n        
self.value = value\n\nclass InvalidRole(UserError):\n    \"\"\"Raised when an invalid role is used\"\"\"\n\n    def __init__(self, role):\n        \"\"\"Constructor\"\"\"\n        super(InvalidRole, self).__init__(\"Invalid role: %r\" % role)\n        self.role = role\n\nclass User(api.APIObject):\n    \"\"\"Representation of a Critic user\"\"\"\n\n    STATUS_VALUES = frozenset([\"current\", \"absent\", \"retired\"])\n\n    @property\n    def id(self):\n        \"\"\"The user's unique id\"\"\"\n        return self._impl.id\n\n    @property\n    def name(self):\n        \"\"\"The user's unique username\"\"\"\n        return self._impl.name\n\n    @property\n    def fullname(self):\n        \"\"\"The user's full name\"\"\"\n        return self._impl.fullname\n\n    @property\n    def status(self):\n        \"\"\"The user's status\n\n           For regular users, the value is one of the strings in the\n           User.STATUS_VALUES set.\n\n           For the anonymous user, the value is \"anonymous\".\n           For the Critic system user, the value is \"system\".\"\"\"\n        return self._impl.status\n\n    @property\n    def is_anonymous(self):\n        \"\"\"True if this object represents an anonymous user\"\"\"\n        return self.id is None\n\n    @property\n    def email(self):\n        \"\"\"The user's selected primary email address\n\n           If the user has no primary email address or if the selected primary\n           email address is unverified, this attribute's value is None.\"\"\"\n        return self._impl.email\n\n    class PrimaryEmail(object):\n        \"\"\"Primary email address\n\n           The 'address' attribute is the email address as a string.\n\n           The 'selected' attribute is True if this is the user's currently\n           selected primary email address.\n\n           The 'verified' attribute is False if the address is unverified and\n           shouldn't be used until it has been verified, True if the address has\n           
been verified by us, and None if it hasn't been verified but can be\n           used anyway.\"\"\"\n\n        def __init__(self, address, selected, verified):\n            self.address = address\n            self.selected = selected\n            self.verified = verified\n\n    @property\n    def primary_emails(self):\n        \"\"\"The user's primary email addresses\n\n           The value is a list of PrimaryEmail objects.\"\"\"\n        return self._impl.getPrimaryEmails(self.critic)\n\n    @property\n    def git_emails(self):\n        \"\"\"The user's \"git\" email addresses\n\n           The value is a set of strings.\n\n           These addresses are used to identify the user as author or committer\n           of Git commits by matching the email address in the commit's meta\n           data.\"\"\"\n        return self._impl.getGitEmails(self.critic)\n\n    @property\n    def repository_filters(self):\n        \"\"\"The user's repository filters\n\n           The value is a dictionary mapping api.repository.Repository objects\n           to lists of api.filters.RepositoryFilter objects.\"\"\"\n        return self._impl.getRepositoryFilters(self.critic)\n\n    @property\n    def internal(self):\n        \"\"\"The corresponding internal dbutils.User object\n\n           Should only be used when interfacing with legacy code.\"\"\"\n        return self._impl.getInternal(self.critic)\n\n    def hasRole(self, role):\n        \"\"\"Return True if the user has the named role\n\n           If the argument is not a valid role name, an InvalidRole exception is\n           raised.\"\"\"\n        return self._impl.hasRole(self.critic, role)\n\n    def getPreference(self, item, repository=None):\n        \"\"\"Fetch the user's preference setting for 'item'\n\n           The setting is returned as an api.preference.Preference object, whose\n           'user' and 'repository' attributes can be used to determine whether\n           there was a per-user and/or per-repository 
override, or if a system\n           default value was used.\n\n           If 'repository' is not None, fetch a per-repository override if there\n           is one.\"\"\"\n        assert (repository is None or\n                isinstance(repository, api.repository.Repository))\n        return self._impl.getPreference(self.critic, item, self, repository)\n\ndef fetch(critic, user_id=None, name=None):\n    \"\"\"Fetch a User object with the given user id or name\n\n       Exactly one of the 'user_id' and 'name' arguments can be used.\n\n       Exceptions:\n\n         InvalidUserIds: if 'user_id' is used and is not a valid user id.\n         InvalidUserNames: if 'name' is used and is not a valid user name.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (user_id is None) != (name is None)\n    return api.impl.user.fetch(critic, user_id, name)\n\ndef fetchMany(critic, user_ids=None, names=None):\n    \"\"\"Fetch many User objects with given user ids or names\n\n       Exactly one of the 'user_ids' and 'names' arguments can be used.\n\n       If the value of the provided 'user_ids' or 'names' argument is a set, the\n       return value is also a set of User objects, otherwise it is a list of\n       User objects, in the same order as the argument sequence.\n\n       Exceptions:\n\n         InvalidUserIds: if 'user_ids' is used and any element in it is not a\n                         valid user id.\n         InvalidUserNames: if 'names' is used and any element in it is not a\n                           valid user name.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    assert (user_ids is None) != (names is None)\n    return api.impl.user.fetchMany(critic, user_ids, names)\n\ndef fetchAll(critic, status=None):\n    \"\"\"Fetch User objects for all users of the system\n\n       If |status| is not None, it must be one of the user statuses \"current\",\n       \"absent\" or \"retired\", or an iterable 
containing one or more of those\n       strings.\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    if status is not None:\n        if isinstance(status, basestring):\n            status = set([status])\n        else:\n            status = set(status)\n        assert not (status - User.STATUS_VALUES)\n    return api.impl.user.fetchAll(critic, status)\n\ndef anonymous(critic):\n    \"\"\"Fetch a User object representing an anonymous user\"\"\"\n    import api.impl\n    assert isinstance(critic, api.critic.Critic)\n    return api.impl.user.anonymous(critic)\n"
  },
  {
    "path": "src/auth/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport base64\nimport re\n\nimport configuration\nimport dbutils\n\nclass CheckFailed(Exception): pass\nclass NoSuchUser(CheckFailed): pass\nclass WrongPassword(CheckFailed): pass\n\ndef createCryptContext():\n    try:\n        from passlib.context import CryptContext\n    except ImportError:\n        if not configuration.debug.IS_QUICKSTART:\n            raise\n\n        # Support quick-starting without 'passlib' installed by falling back to\n        # completely bogus unsalted SHA-256-based hashing.\n\n        import hashlib\n\n        class CryptContext:\n            def __init__(self, **kwargs):\n                pass\n            def encrypt(self, password):\n                return hashlib.sha256(password).hexdigest()\n            def verify_and_update(self, password, hashed):\n                return self.encrypt(password) == hashed, None\n\n    kwargs = {}\n\n    for scheme, min_rounds in configuration.auth.MINIMUM_ROUNDS.items():\n        kwargs[\"%s__min_rounds\" % scheme] = min_rounds\n\n    all_schemes = configuration.auth.PASSWORD_HASH_SCHEMES\n    default_scheme = configuration.auth.DEFAULT_PASSWORD_HASH_SCHEME\n\n    return CryptContext(\n        schemes=all_schemes, default=default_scheme,\n        deprecated=filter(lambda scheme: scheme != default_scheme, 
all_schemes),\n        **kwargs)\n\ndef checkPassword(db, username, password):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT id, password FROM users WHERE name=%s\", (username,))\n\n    row = cursor.fetchone()\n    if not row:\n        raise NoSuchUser\n    user_id, hashed = row\n\n    if hashed is None:\n        # No password set => there is no \"right\" password.\n        raise WrongPassword\n\n    ok, new_hashed = createCryptContext().verify_and_update(password, hashed)\n\n    if not ok:\n        raise WrongPassword\n\n    if new_hashed:\n        with db.updating_cursor(\"users\") as cursor:\n            cursor.execute(\"UPDATE users SET password=%s WHERE id=%s\",\n                           (new_hashed, user_id))\n\n    return dbutils.User.fromId(db, user_id)\n\ndef hashPassword(password):\n    return createCryptContext().encrypt(password)\n\ndef getToken(encode=base64.b64encode, length=20):\n    return encode(os.urandom(length))\n\nclass InvalidUserName(Exception): pass\n\ndef validateUserName(name):\n    if not name:\n        raise InvalidUserName(\"Empty user name is not allowed.\")\n    elif not re.sub(r\"\\s\", \"\", name, flags=re.UNICODE):\n        raise InvalidUserName(\n            \"A user name containing only white-space is not allowed.\")\n    elif configuration.base.USER_NAME_PATTERN is not None:\n        if not re.match(configuration.base.USER_NAME_PATTERN, name):\n            raise InvalidUserName(\n                configuration.base.USER_NAME_PATTERN_DESCRIPTION)\n\ndef isValidUserName(name):\n    try:\n        validateUserName(name)\n    except InvalidUserName:\n        return False\n    return True\n\nclass InvalidRequest(Exception):\n    pass\n\nclass Failure(Exception):\n    pass\n\nfrom session import createSessionId, deleteSessionId, checkSession\n\nfrom accesscontrol import (AccessDenied, AccessControlError,\n                           AccessControlProfile, AccessControl)\nfrom database import AuthenticationError, AuthenticationFailed, 
Database\n\nDATABASE = None\n\nimport databases\n\nfrom provider import Provider\nfrom oauth import OAuthProvider\n\nPROVIDERS = {}\n\nimport providers\n"
  },
  {
    "path": "src/auth/accesscontrol.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport base\nimport auth\nimport configuration\n\nclass AccessDenied(Exception):\n    \"\"\"Raised by AccessControl checks on failure\"\"\"\n    pass\n\nclass AccessControlError(Exception):\n    \"\"\"Raised in case of system configuration errors\"\"\"\n    pass\n\nclass HTTPException(object):\n    def __init__(self, request_method, path_pattern):\n        self.request_method = request_method\n        self.path_regexp = (re.compile(\"^\" + path_pattern + \"$\")\n                            if path_pattern is not None else None)\n\n    def applies(self, req):\n        if self.request_method is not None \\\n                and self.request_method != req.method:\n            return False\n        if self.path_regexp is not None \\\n                and not self.path_regexp.match(req.path):\n            return False\n        return True\n\nclass RepositoryException(object):\n    def __init__(self, access_type, repository_id):\n        self.access_type = access_type\n        self.repository_id = repository_id\n\n    def applies(self, access_type, repository_id):\n        if self.access_type is not None \\\n                and self.access_type != access_type:\n            return False\n        if self.repository_id is not None \\\n                and self.repository_id != 
repository_id:\n            return False\n        return True\n\nclass ExtensionException(object):\n    def __init__(self, access_type, extension_key):\n        self.access_type = access_type\n        self.extension_key = extension_key\n\n    def applies(self, access_type, extension):\n        if self.access_type is not None \\\n                and self.access_type != access_type:\n            return False\n        if self.extension_key is not None \\\n                and self.extension_key != extension.getKey():\n            return False\n        return True\n\ndef _isAllowed(rule, exceptions, *args):\n    # If an exception applies, then allow if the rule is \"deny\" ...\n    if any(exception.applies(*args) for exception in exceptions):\n        return rule == \"deny\"\n    # ... otherwise allow if the rule is \"allow\".\n    return rule == \"allow\"\n\nclass AccessControlProfile(object):\n    def __init__(self, *rules):\n        if len(rules) == 1:\n            rules = rules * 3\n\n        (self.http_rule,\n         self.repositories_rule,\n         self.extensions_rule) = rules\n\n        self.http_exceptions = []\n        self.repositories_exceptions = []\n        self.extensions_exceptions = []\n\n    @staticmethod\n    def isAllowedHTTP(profiles, req):\n        return all(_isAllowed(profile.http_rule, profile.http_exceptions, req)\n                   for profile in profiles)\n\n    @staticmethod\n    def isAllowedRepository(profiles, access_type, repository_id):\n        return all(_isAllowed(profile.repositories_rule,\n                              profile.repositories_exceptions,\n                              access_type, repository_id)\n                   for profile in profiles)\n\n    @staticmethod\n    def isAllowedExtension(profiles, access_type, extension):\n        return all(_isAllowed(profile.extensions_rule,\n                              profile.extensions_exceptions,\n                              access_type, extension)\n                   for 
profile in profiles)\n\n    @staticmethod\n    def forUser(db, user, authentication_labels=()):\n        cursor = db.readonly_cursor()\n        if user.isSystem():\n            # The system user can always do everything.\n            return AccessControlProfile(\"allow\")\n        if user.isAnonymous():\n            if not configuration.base.ALLOW_ANONYMOUS_USER:\n                profile = AccessControlProfile(\"deny\")\n                if configuration.base.SESSION_TYPE == \"cookie\":\n                    # Hard-coded exceptions to allow access to things that must\n                    # be accessed in order for the user to load the login page\n                    # and successfully sign in.\n                    profile.http_exceptions.extend([\n                        HTTPException(\"GET\", \"login\"),\n                        HTTPException(\"POST\", \"validatelogin\")\n                    ])\n                return profile\n            cursor.execute(\"\"\"SELECT profile\n                                FROM useraccesscontrolprofiles\n                               WHERE access_type='anonymous'\"\"\")\n            row = cursor.fetchone()\n        else:\n            cursor.execute(\"\"\"SELECT profile\n                                FROM useraccesscontrolprofiles\n                               WHERE access_type='user'\n                                 AND uid=%s\"\"\",\n                           (user.id,))\n            row = cursor.fetchone()\n            if not row and authentication_labels:\n                cursor.execute(\"\"\"SELECT profile\n                                    FROM labeledaccesscontrolprofiles\n                                   WHERE labels=%s\"\"\",\n                               (\"|\".join(sorted(authentication_labels)),))\n                row = cursor.fetchone()\n        if not row:\n            cursor.execute(\"\"\"SELECT profile\n                                FROM useraccesscontrolprofiles\n                               WHERE 
access_type='user'\n                                 AND uid IS NULL\"\"\")\n            row = cursor.fetchone()\n        if not row:\n            # By default, allow everything.\n            return AccessControlProfile(\"allow\")\n        profile_id, = row\n        return AccessControlProfile.fromId(db, profile_id)\n\n    @staticmethod\n    def fromId(db, profile_id):\n        cursor = db.readonly_cursor()\n        cursor.execute(\n            \"\"\"SELECT http, repositories, extensions\n                 FROM accesscontrolprofiles\n                WHERE id=%s\"\"\",\n            (profile_id,))\n\n        profile = AccessControlProfile(*cursor.fetchone())\n\n        cursor.execute(\"\"\"SELECT request_method, path_pattern\n                            FROM accesscontrol_http\n                           WHERE profile=%s\"\"\",\n                       (profile_id,))\n        profile.http_exceptions.extend(\n            HTTPException(request_method, path_pattern)\n            for request_method, path_pattern in cursor)\n\n        cursor.execute(\"\"\"SELECT access_type, repository\n                            FROM accesscontrol_repositories\n                           WHERE profile=%s\"\"\",\n                       (profile_id,))\n        profile.repositories_exceptions.extend(\n            RepositoryException(access_type, repository_id)\n            for access_type, repository_id in cursor)\n\n        if configuration.extensions.ENABLED:\n            cursor.execute(\"\"\"SELECT access_type, extension_key\n                                FROM accesscontrol_extensions\n                               WHERE profile=%s\"\"\",\n                           (profile_id,))\n            profile.extensions_exceptions.extend(\n                ExtensionException(access_type, extension_key)\n                for access_type, extension_key in cursor)\n\n        return profile\n\nclass AccessControl(object):\n    @staticmethod\n    def forRequest(db, req):\n        # Check the session 
status of the request.  This raises exceptions in\n        # various situations.  If no exception is raised, req.user will have\n        # been set, possibly to the anonymous user (or the system user.)\n        auth.checkSession(db, req)\n\n        assert db.user\n        assert db.profiles\n\n    @staticmethod\n    def accessHTTP(db, req):\n        if not AccessControlProfile.isAllowedHTTP(db.profiles, req):\n            raise AccessDenied(\"Access denied: %s /%s\" % (req.method, req.path))\n\n    class Repository(object):\n        def __init__(self, repository_id, path):\n            self.id = repository_id\n            self.path = path\n\n    @staticmethod\n    def accessRepository(db, access_type, repository):\n        if not AccessControlProfile.isAllowedRepository(\n                db.profiles, access_type, repository.id):\n            raise AccessDenied(\"Repository access denied: %s %s\"\n                               % (access_type, repository.path))\n\n    @staticmethod\n    def accessExtension(db, access_type, extension):\n        if not AccessControlProfile.isAllowedExtension(\n                db.profiles, access_type, extension):\n            raise AccessDenied(\"Access denied to extension: %s %s\"\n                               % (access_type, extension.getKey()))\n"
  },
  {
    "path": "src/auth/database.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\n\nclass AuthenticationError(Exception):\n    \"\"\"Raised by Database.authenticate() on error\n\n       \"Error\" here means something the system administrator should be informed\n       about, rather than the user.\n\n       The user trying to sign in will not see this exception's message, but\n       will instead be shown a generic error message saying that something went\n       wrong.\"\"\"\n    pass\n\nclass AuthenticationFailed(Exception):\n    \"\"\"Raised by Database.authenticate() on failure\n\n       \"Failure\" here means the identification and/or password provided by the\n       user was incorrect.\n\n       The user trying to sign in will be presented with this exception's\n       message as the reason for the failure.  
Note: this value is taken as HTML\n       source text, meaning it can contain tags, but also that '<' and '&'\n       characters must be replaced with &lt; and &amp; entity references.\"\"\"\n    pass\n\nclass Database(object):\n    def __init__(self, name):\n        self.name = name\n        self.configuration = configuration.auth.DATABASES.get(name, {})\n\n    def getFields(self):\n        \"\"\"The fields in the sign-in form\n\n           The return value should be a sequence of tuples, with elements as\n           follows:\n\n           The first element should be a boolean.  True means the value should\n           be hidden, e.g. that it's a password or similar.\n\n           The second element should be an (in this context unique) identifer,\n           for internal use.\n\n           The third element should be the field's label in the login form,\n           e.g. \"Username\" or \"Email (@example.com)\".\n\n           The fourth element is optional; if present it should be a longer\n           description of the field to be used in a help popup or similar.\n           Note: this value is taken as HTML source text, meaning it can contain\n           tags, but also that '<' and '&' characters must be replaced with &lt;\n           and &amp; entity references.\"\"\"\n        raise base.NotReached()\n\n    def authenticate(self, db, fields):\n        \"\"\"Authenticate user based on values input\n\n           The |fields| argument is a dictionary mapping identifiers (the second\n           element in the tuples returned by getFields()) to the values the user\n           entered.\n\n           On success, a dbutils.User object must be returned.  No other type of\n           return value is acceptable.\n\n           On failure/error, either AuthenticationError or AuthenticationFailed\n           should be raised.  
Any other exception raised will be treated similar\n           to an AuthenticationError exception.\"\"\"\n        raise base.NotReached()\n\n    def getAuthenticationLabels(self, user):\n        return ()\n\n    def supportsHTTPAuthentication(self):\n        \"\"\"Returns true if HTTP authentication is supported\n\n           By default it is if the database declares two fields, where the first\n           is not hidden (not a password) and the second is hidden.  The first\n           field will receive the HTTP username and the second field the HTTP\n           password.\"\"\"\n        fields = self.getFields()\n        return (len(fields) == 2 and\n                not fields[0][0] and\n                fields[1][0])\n\n    def performHTTPAuthentication(self, db, username, password):\n        if not self.supportsHTTPAuthentication():\n            raise AuthenticationFailed(\"HTTP authentication not supported\")\n\n        fields = self.getFields()\n        self.authenticate(db, { fields[0][1]: username,\n                                fields[1][1]: password })\n\n    def supportsPasswordChange(self):\n        \"\"\"Returns true if password changing is supported\"\"\"\n        return False\n\n    def changePassword(self, db, user, current_pw, new_pw):\n        \"\"\"Change the user's password\n\n           Raises auth.WrongPassword if |current_pw| is incorrect.\"\"\"\n        pass\n"
  },
  {
    "path": "src/auth/databases/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport internaldb\nimport ldapdb\n\n# This must be last, since it wraps the other enabled database.\nimport accesstokensdb\n"
  },
  {
    "path": "src/auth/databases/accesstokensdb.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport auth\nimport configuration\nimport dbutils\n\nclass AccessTokens(auth.Database):\n    def __init__(self, authdb):\n        super(AccessTokens, self).__init__(\"accesstokens\")\n        self.authdb = authdb\n\n    def getFields(self):\n        return self.authdb.getFields()\n\n    def authenticate(self, db, fields):\n        self.authdb.authenticate(db, fields)\n\n    def getAuthenticationLabels(self, user):\n        return self.authdb.getAuthenticationLabels(user)\n\n    def supportsHTTPAuthentication(self):\n        # HTTP authentication is the primary use-case.\n        return True\n\n    def performHTTPAuthentication(self, db, username, password):\n        cursor = db.readonly_cursor()\n        cursor.execute(\"\"\"SELECT id, access_type, uid\n                            FROM accesstokens\n                           WHERE part1=%s\n                             AND part2=%s\"\"\",\n                       (username, password))\n        row = cursor.fetchone()\n\n        if row:\n            token_id, access_type, user_id = row\n\n            if access_type == \"anonymous\":\n                db.setUser(dbutils.User.makeAnonymous())\n            elif access_type == \"system\":\n                db.setUser(dbutils.User.makeSystem())\n            else:\n                user = 
dbutils.User.fromId(db, user_id)\n                authentication_labels = self.getAuthenticationLabels(user)\n                db.setUser(user, authentication_labels)\n\n            import api\n            db.critic.setAccessToken(api.accesstoken.fetch(db.critic, token_id))\n\n            cursor.execute(\"\"\"SELECT id\n                                FROM accesscontrolprofiles\n                               WHERE access_token=%s\"\"\",\n                           (token_id,))\n            row = cursor.fetchone()\n\n            if row:\n                profile_id, = row\n                db.addProfile(auth.AccessControlProfile.fromId(db, profile_id))\n\n            return\n\n        return self.authdb.performHTTPAuthentication(db, username, password)\n\n    def supportsPasswordChange(self):\n        return self.authdb.supportsPasswordChange()\n\n    def changePassword(self, db, user, current_pw, new_pw):\n        if db.critic.access_token is not None:\n            raise auth.AccessDenied\n        return self.authdb.changePassword(db, user, current_pw, new_pw)\n\nif configuration.auth.ENABLE_ACCESS_TOKENS:\n    auth.DATABASE = AccessTokens(auth.DATABASE)\n"
  },
  {
    "path": "src/auth/databases/internaldb.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport auth\nimport configuration\n\nclass Internal(auth.Database):\n    def __init__(self):\n        super(Internal, self).__init__(\"internal\")\n\n    def getFields(self):\n        return [(False, \"username\", \"Username:\"),\n                (True, \"password\", \"Password:\")]\n\n    def authenticate(self, db, values):\n        username = values[\"username\"].strip()\n        if not username:\n            raise auth.database.AuthenticationFailed(\"Empty username\")\n        password = values[\"password\"]\n        if not password:\n            raise auth.database.AuthenticationFailed(\"Empty password\")\n\n        try:\n            db.setUser(auth.checkPassword(db, username, password))\n        except auth.NoSuchUser:\n            raise auth.AuthenticationFailed(\"Invalid username\")\n        except auth.WrongPassword:\n            raise auth.AuthenticationFailed(\"Wrong password\")\n\n    def supportsPasswordChange(self):\n        return True\n\n    def changePassword(self, db, user, current_pw, new_pw):\n        # If |current_pw| is True, then this is an administrator changing\n        # another user's password. 
The usual rules do not apply.\n        if current_pw is not True:\n            cursor = db.readonly_cursor()\n            cursor.execute(\"SELECT password FROM users WHERE id=%s\", (user.id,))\n\n            hashed_pw, = cursor.fetchone()\n\n            if current_pw is not None:\n                auth.checkPassword(db, user.name, current_pw)\n            elif hashed_pw is not None:\n                # This is mostly a sanity check; the only way to trigger this is\n                # if the user has no password when he loads /home, sets a\n                # password in another tab or using another browser, and then\n                # tries to set (rather than change) the password using the old\n                # stale /home.\n                raise auth.WrongPassword\n\n        with db.updating_cursor(\"users\") as cursor:\n            cursor.execute(\"UPDATE users SET password=%s WHERE id=%s\",\n                           (auth.hashPassword(new_pw), user.id))\n\nif configuration.auth.DATABASE == \"internal\":\n    auth.DATABASE = Internal()\n"
  },
  {
    "path": "src/auth/databases/ldapdb.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport hashlib\nimport threading\nimport time\n\nimport auth\nimport dbutils\nimport configuration\n\ndef escaped(fields, fn):\n    return { identifier: fn(value)\n             for identifier, value in fields.items() }\n\nclass LDAPCache(object):\n    def __init__(self, max_age):\n        self.lock = threading.Lock()\n        # sha256(repr(fields)) => (user_id, timestamp)\n        self.cache = {}\n        self.max_age = max_age\n\n    @staticmethod\n    def __key(fields):\n        return hashlib.sha256(repr(sorted(fields.items()))).hexdigest()\n\n    def get(self, fields):\n        # A max age of zero (or less) means the cache is disabled.\n        if self.max_age <= 0:\n            return None\n        with self.lock:\n            key = LDAPCache.__key(fields)\n            value, timestamp = self.cache.get(key, (None, None))\n            if value is None:\n                return None\n            if time.time() - timestamp > self.max_age:\n                del self.cache[key]\n                return None\n            return value\n\n    def set(self, fields, value):\n        # A max age of zero (or less) means the cache is disabled.\n        if self.max_age <= 0:\n            return\n        with self.lock:\n            # Note: We might overwrite an existing (presumably identical) entry\n     
       # here, due to two threads racing to authenticate the same user.\n            key = LDAPCache.__key(fields)\n            self.cache[key] = (value, time.time())\n\nclass LDAP(auth.Database):\n    def __init__(self):\n        super(LDAP, self).__init__(\"ldap\")\n        self.cache = LDAPCache(self.configuration[\"cache_max_age\"])\n\n    def __startConnection(self):\n        import ldap\n        connection = ldap.initialize(self.configuration[\"url\"])\n        if self.configuration[\"use_tls\"]:\n            connection.start_tls_s()\n        return connection\n\n    def __isMemberOfGroup(self, connection, group, fields):\n        import ldap\n        result = connection.search_s(\n            group[\"dn\"], ldap.SCOPE_BASE,\n            attrlist=[group[\"members_attribute\"]])\n        if len(result) != 1:\n            raise auth.AuthenticationError(\n                \"Required group '%s' not found\" % group[\"dn\"])\n        group_dn, group_attributes = result[0]\n        if group[\"members_attribute\"] not in group_attributes:\n            raise auth.AuthenticationError(\n                \"Required group '%s' has no attribute '%s'\"\n                % (group[\"dn\"], group[\"members_attribute\"]))\n        members = group_attributes[group[\"members_attribute\"]]\n        member_value = (group[\"member_value\"]\n                        % escaped(fields, ldap.dn.escape_dn_chars))\n        return member_value in members\n\n    def getFields(self):\n        return self.configuration[\"fields\"]\n\n    def authenticate(self, db, fields):\n        import ldap\n        import ldap.filter\n\n        cached_data = self.cache.get(fields)\n        if cached_data is not None:\n            cached_user_id, cached_authentication_labels = cached_data\n            try:\n                user = dbutils.User.fromId(db, cached_user_id)\n            except dbutils.InvalidUserId:\n                pass\n            else:\n                db.setUser(user, 
cached_authentication_labels)\n                return\n\n        connection = self.__startConnection()\n\n        search_base = (self.configuration[\"search_base\"]\n                       % escaped(fields, ldap.dn.escape_dn_chars))\n        search_filter = (self.configuration[\"search_filter\"]\n                         % escaped(fields, ldap.filter.escape_filter_chars))\n\n        attributes = [self.configuration[\"username_attribute\"]]\n\n        if self.configuration[\"create_user\"]:\n            attributes.extend([self.configuration[\"fullname_attribute\"],\n                               self.configuration[\"email_attribute\"]])\n\n        result = connection.search_s(\n            search_base, ldap.SCOPE_SUBTREE, search_filter, attributes)\n\n        if not result:\n            raise auth.AuthenticationFailed(\"LDAP search found no matches\")\n\n        if len(result) > 1:\n            raise auth.AuthenticationFailed(\"LDAP search found multiple matches\")\n\n        dn, attributes = result[0]\n\n        # fields_with_dn = fields.copy()\n        # fields_with_dn[\"dn\"] = dn\n\n        try:\n            connection.simple_bind_s(\n                dn, fields[self.configuration[\"credentials\"]])\n        except ldap.INVALID_CREDENTIALS:\n            raise auth.AuthenticationFailed(\"Invalid credentials\")\n        except ldap.UNWILLING_TO_PERFORM:\n            # Might be raised for e.g. 
empty password.\n            raise auth.AuthenticationFailed(\"Invalid credentials\")\n\n        authentication_labels = set()\n\n        if \"require_groups\" in self.configuration:\n            for group in self.configuration[\"require_groups\"]:\n                if not self.__isMemberOfGroup(connection, group, fields):\n                    raise auth.AuthenticationFailed(\n                        \"Not member of required LDAP groups\")\n                if \"label\" in group:\n                    authentication_labels.add(group[\"label\"])\n\n        if \"additional_groups\" in self.configuration:\n            for group in self.configuration[\"additional_groups\"]:\n                if self.__isMemberOfGroup(connection, group, fields):\n                    authentication_labels.add(group[\"label\"])\n\n        connection.unbind_s()\n\n        def getAttribute(name, defvalue=None):\n            # Fall back to the default if the attribute is missing or empty.\n            value_list = attributes.get(name)\n            if not value_list or not value_list[0]:\n                return defvalue\n            return value_list[0]\n\n        username = getAttribute(self.configuration[\"username_attribute\"])\n\n        if username is None:\n            raise auth.AuthenticationError(\n                \"Configured username LDAP attribute missing\")\n\n        try:\n            user = dbutils.User.fromName(db, username)\n        except dbutils.NoSuchUser:\n            if not self.configuration[\"create_user\"]:\n                raise auth.AuthenticationFailed(\"No matching Critic user found\")\n\n            fullname = getAttribute(self.configuration[\"fullname_attribute\"],\n                                    username)\n            email = getAttribute(self.configuration[\"email_attribute\"])\n\n            user = dbutils.User.create(\n                db, username, fullname, email, email_verified=None)\n\n        db.setUser(user, authentication_labels)\n\n        self.cache.set(fields, (user.id, authentication_labels))\n\n    def getAuthenticationLabels(self, user):\n        connection = self.__startConnection()\n\n        fields = self.configuration[\"fields_from_user\"](user)\n        authentication_labels = set()\n\n        if \"require_groups\" in self.configuration:\n            for group in self.configuration[\"require_groups\"]:\n                if \"label\" in group \\\n                        and self.__isMemberOfGroup(connection, group, fields):\n                    authentication_labels.add(group[\"label\"])\n\n        if \"additional_groups\" in self.configuration:\n            for group in self.configuration[\"additional_groups\"]:\n                if self.__isMemberOfGroup(connection, group, fields):\n                    authentication_labels.add(group[\"label\"])\n\n        return authentication_labels\n\nif configuration.auth.DATABASE == \"ldap\":\n    auth.DATABASE = LDAP()\n"
  },
  {
    "path": "src/auth/oauth.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport urllib\n\nimport dbutils\nimport auth\nimport textutils\nimport request\n\nclass OAuthProvider(auth.Provider):\n    def start(self, db, req, target_url=None):\n        state = auth.getToken()\n\n        authorize_url = self.getAuthorizeURL(state)\n\n        if authorize_url is None:\n            return None\n\n        if target_url is None:\n            target_url = req.getParameter(\"target\", None)\n\n        with db.updating_cursor(\"oauthstates\") as cursor:\n            cursor.execute(\"\"\"INSERT INTO oauthstates (state, url)\n                                   VALUES (%s, %s)\"\"\",\n                           (state, target_url))\n\n        return authorize_url\n\n    def finish(self, db, req):\n        if req.method != \"GET\":\n            raise auth.InvalidRequest\n\n        code = req.getParameter(\"code\", default=None)\n        state = req.getParameter(\"state\", default=None)\n\n        if code is None or state is None:\n            raise auth.InvalidRequest(\"Missing parameter(s)\")\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT url\n                            FROM oauthstates\n                           WHERE state=%s\"\"\",\n                       (state,))\n\n        row = cursor.fetchone()\n\n        if not row:\n            raise 
auth.InvalidRequest(\"Invalid OAuth state: %s\" % state)\n\n        (target_url,) = row\n\n        access_token = self.getAccessToken(code)\n\n        if access_token is None:\n            raise auth.Failure(\"failed to get access token\")\n\n        user_data = self.getUserData(access_token)\n\n        if user_data is None:\n            raise auth.Failure(\"failed to get user data\")\n\n        account = textutils.encode(user_data[\"account\"])\n        username = textutils.encode(user_data[\"username\"])\n        email = user_data[\"email\"]\n        email = textutils.encode(email) if email else None\n        fullname = textutils.encode(user_data.get(\"fullname\", username))\n\n        cursor.execute(\"\"\"SELECT id, uid\n                            FROM externalusers\n                           WHERE provider=%s\n                             AND account=%s\"\"\",\n                       (self.name, account))\n\n        row = cursor.fetchone()\n\n        if row:\n            external_user_id, user_id = row\n        else:\n            with db.updating_cursor(\"externalusers\") as updating_cursor:\n                updating_cursor.execute(\n                    \"\"\"INSERT INTO externalusers (provider, account, email)\n                            VALUES (%s, %s, %s)\n                         RETURNING id\"\"\",\n                    (self.name, account, email))\n                external_user_id, = updating_cursor.fetchone()\n                user_id = None\n\n        user = None\n\n        if user_id is not None:\n            user = dbutils.User.fromId(db, user_id)\n        else:\n            if auth.isValidUserName(username) \\\n                    and self.configuration.get(\"bypass_createuser\"):\n                try:\n                    dbutils.User.fromName(db, username)\n                except dbutils.NoSuchUser:\n                    user = dbutils.User.create(\n                        db, username, fullname, email, email_verified=None,\n                        
external_user_id=external_user_id)\n                    user.sendUserCreatedMail(\"wsgi[oauth/%s]\" % self.name,\n                                             { \"provider\": self.name,\n                                               \"account\": account })\n\n        if user is None:\n            token = auth.getToken()\n\n            with db.updating_cursor(\"externalusers\") as updating_cursor:\n                updating_cursor.execute(\n                    \"\"\"UPDATE externalusers\n                          SET token=%s\n                        WHERE id=%s\"\"\",\n                    (token, external_user_id))\n\n            data = { \"provider\": self.name,\n                     \"account\": account,\n                     \"token\": token }\n\n            if target_url:\n                data[\"target\"] = target_url\n            if username:\n                data[\"username\"] = username\n            if email:\n                data[\"email\"] = email\n            if fullname:\n                data[\"fullname\"] = fullname\n\n            target_url = \"/createuser?%s\" % urllib.urlencode(data)\n\n        if user is not None:\n            auth.createSessionId(db, req, user)\n\n        raise request.Found(target_url or \"/\")\n\n    def validateToken(self, db, account, token):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT token\n                            FROM externalusers\n                           WHERE provider=%s\n                             AND account=%s\"\"\",\n                       (self.name, account))\n        row = cursor.fetchone()\n        return row and token == row[0]\n"
  },
  {
    "path": "src/auth/provider.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\n\nclass Provider(object):\n    def __init__(self, name):\n        self.name = name\n        self.configuration = configuration.auth.PROVIDERS[name]\n\n    def getTitle(self):\n        \"\"\"Title, suitable as X in 'Sign in using your X'\"\"\"\n        pass\n\n    def getAccountIdDescription(self):\n        \"\"\"Description of the value used as the account identifier\"\"\"\n        pass\n\n    def start(self, db, req, target_url=None):\n        pass\n\n    def finish(self, db, req):\n        pass\n"
  },
  {
    "path": "src/auth/providers/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport github\nimport google\nimport dummy\n"
  },
  {
    "path": "src/auth/providers/dummy.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport hashlib\nimport urllib\n\nimport configuration\nimport auth\n\nclass DummyOAuthProvider(auth.OAuthProvider):\n    \"\"\"Dummy OAuth authentication provider used by automatic tests\"\"\"\n\n    def __init__(self, name):\n        super(DummyOAuthProvider, self).__init__(name)\n        self.access_token = None\n\n    def getTitle(self):\n        \"\"\"Title, suitable as X in 'Sign in using your X'\"\"\"\n        return self.name.capitalize() + \" account\"\n\n    def getAccountIdDescription(self):\n        return self.name.capitalize() + \" username\"\n\n    def getAccountURL(self, name):\n        return \"https://example.com/user/%s\" % name\n\n    def getAuthorizeURL(self, state):\n        query = urllib.urlencode({ \"state\": state })\n        return \"https://example.com/authorize?%s\" % query\n\n    def getAccessToken(self, code):\n        if code == \"incorrect\":\n            raise auth.Failure(\"Incorrect code\")\n        self.access_token = hashlib.sha1(code).hexdigest()\n        return self.access_token\n\n    def getUserData(self, access_token):\n        if access_token != self.access_token:\n            raise auth.Failure(\"Invalid access token\")\n        return { \"account\": \"account-\" + self.name,\n                 \"username\": self.name,\n                 \"email\": 
self.name + \"@example.org\",\n                 \"fullname\": self.name.capitalize() + \" von Testing\" }\n\nif configuration.debug.IS_TESTING:\n    def createProvider(name, allow_user_registration, verify_email_addresses,\n                       bypass_createuser):\n        configuration.auth.PROVIDERS[name] = {\n            \"enabled\": True,\n            \"allow_user_registration\": allow_user_registration,\n            \"verify_email_addresses\": verify_email_addresses,\n            \"client_id\": \"DummyClientId\",\n            \"client_secret\": \"DummyClientSecret\",\n            \"bypass_createuser\": bypass_createuser\n        }\n        auth.PROVIDERS[name] = DummyOAuthProvider(name)\n\n    createProvider(\"alice\", False, False, False)\n    createProvider(\"carol\", True, False, False)\n    createProvider(\"felix\", True, False, True)\n    createProvider(\"gina\", True, True, False)\n"
  },
  {
    "path": "src/auth/providers/github.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport urllib\n\nimport urlutils\nimport configuration\nimport auth\n\nclass GitHubAuthentication(auth.OAuthProvider):\n    def __init__(self):\n        super(GitHubAuthentication, self).__init__(\"github\")\n\n        self.client_id = self.configuration[\"client_id\"]\n        self.client_secret = self.configuration[\"client_secret\"]\n        self.redirect_uri = self.configuration[\"redirect_uri\"]\n\n    def getTitle(self):\n        \"\"\"Title, suitable as X in 'Sign in using your X'\"\"\"\n        return \"GitHub account\"\n\n    def getAccountIdDescription(self):\n        return \"GitHub username\"\n\n    def getAccountURL(self, name):\n        return \"https://github.com/%s\" % name\n\n    def getAuthorizeURL(self, state):\n        query = urllib.urlencode({ \"client_id\": self.client_id,\n                                   \"redirect_uri\": self.redirect_uri,\n                                   \"state\": state })\n\n        return \"https://github.com/login/oauth/authorize?%s\" % query\n\n    def getAccessToken(self, code):\n        response = urlutils.post(\n            \"https://github.com/login/oauth/access_token\",\n            data={ \"client_id\": self.client_id,\n                   \"client_secret\": self.client_secret,\n                   \"code\": code },\n            
headers={ \"Accept\": \"application/json\" })\n\n        if response.status_code != 200:\n            return None\n\n        data = response.json()\n\n        if data is None:\n            return None\n        elif \"error\" in data:\n            raise auth.Failure(data[\"error\"])\n        elif \"access_token\" not in data:\n            return None\n\n        return data[\"access_token\"]\n\n    def getUserData(self, access_token):\n        response = urlutils.get(\n            \"https://api.github.com/user?access_token=%s\" % access_token)\n\n        if response.status_code != 200:\n            return None\n\n        data = response.json()\n\n        if data is None or \"login\" not in data:\n            return None\n\n        return { \"account\": data[\"login\"],\n                 \"username\": data[\"login\"],\n                 \"email\": data.get(\"email\"),\n                 \"fullname\": data.get(\"name\") }\n\nif \"github\" in configuration.auth.PROVIDERS:\n    if configuration.auth.PROVIDERS[\"github\"][\"enabled\"]:\n        auth.PROVIDERS[\"github\"] = GitHubAuthentication()\n"
  },
  {
    "path": "src/auth/providers/google.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport urllib\n\nimport urlutils\nimport configuration\nimport auth\n\nclass GoogleAuthentication(auth.OAuthProvider):\n    def __init__(self):\n        super(GoogleAuthentication, self).__init__(\"google\")\n\n        self.client_id = self.configuration[\"client_id\"]\n        self.client_secret = self.configuration[\"client_secret\"]\n        self.redirect_uri = self.configuration[\"redirect_uri\"]\n\n    def getTitle(self):\n        \"\"\"Title, suitable as X in 'Sign in using your X'\"\"\"\n        return \"Google account\"\n\n    def getAccountIdDescription(self):\n        return \"Google (e.g. 
GMail) email address\"\n\n    def getAccountURL(self, name):\n        return None\n\n    def getAuthorizeURL(self, state):\n        query = urllib.urlencode({ \"client_id\": self.client_id,\n                                   \"response_type\": \"code\",\n                                   \"scope\": \"openid email\",\n                                   \"redirect_uri\": self.redirect_uri,\n                                   \"state\": state })\n\n        return \"https://accounts.google.com/o/oauth2/auth?\" + query\n\n    def getAccessToken(self, code):\n        response = urlutils.post(\n            \"https://accounts.google.com/o/oauth2/token\",\n            data={ \"code\": code,\n                   \"client_id\": self.client_id,\n                   \"client_secret\": self.client_secret,\n                   \"redirect_uri\": self.redirect_uri,\n                   \"grant_type\": \"authorization_code\" },\n            headers={ \"Accept\": \"application/json\" },\n            verify=False)\n\n        if response.status_code != 200:\n            return None\n\n        data = response.json()\n\n        if data is None:\n            return None\n        elif \"error\" in data:\n            raise auth.Failure(data[\"error\"])\n        elif \"access_token\" not in data:\n            return None\n\n        return data[\"access_token\"]\n\n    def getUserData(self, access_token):\n        response = urlutils.get(\n            \"https://www.googleapis.com/oauth2/v3/userinfo\",\n            params={ \"access_token\": access_token },\n            verify=False)\n\n        if response.status_code != 200:\n            return None\n\n        data = response.json()\n\n        if data is None or \"email\" not in data:\n            return None\n\n        email = data[\"email\"]\n        username = email.partition(\"@\")[0]\n\n        return { \"account\": email,\n                 \"username\": username,\n                 \"email\": email,\n                 \"fullname\": 
data.get(\"name\", username) }\n\nif \"google\" in configuration.auth.PROVIDERS:\n    if configuration.auth.PROVIDERS[\"google\"][\"enabled\"]:\n        auth.PROVIDERS[\"google\"] = GoogleAuthentication()\n"
  },
  {
    "path": "src/auth/session.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport auth\nimport dbutils\nimport configuration\nimport request\n\ndef isInsecurePath(req):\n    \"\"\"Check if the request is for an insecure path\n\n       \"Insecure\" here means that unauthenticated/anonymous access should be\n       allowed even if the system doesn't normally allow anonymous access.\"\"\"\n    if configuration.base.SESSION_TYPE == \"cookie\":\n        # Login machinery must be accessible before being logged in.\n        if req.path in (\"login\", \"validatelogin\"):\n            return True\n    if configuration.base.ALLOW_USER_REGISTRATION:\n        # User creation machinery also, if user creation is enabled.\n        if req.path in (\"createuser\", \"registeruser\"):\n            return True\n    # Allow unauthenticated access to all static resources.\n    if req.path.startswith(\"static-resource/\"):\n        return True\n    return False\n\ntry:\n    from customization.email import getUserEmailAddress\nexcept ImportError:\n    def getUserEmailAddress(_username):\n        return None\n\ndef createSessionId(db, req, user, authentication_labels=None):\n    sid = auth.getToken()\n    if authentication_labels:\n        labels = \"|\".join(sorted(authentication_labels))\n    else:\n        labels = \"\"\n\n    with db.updating_cursor(\"usersessions\") as cursor:\n        
cursor.execute(\"\"\"INSERT INTO usersessions (key, uid, labels)\n                               VALUES (%s, %s, %s)\"\"\",\n                       (sid, user.id, labels))\n\n    req.setCookie(\"sid\", sid, secure=True)\n    req.setCookie(\"has_sid\", \"1\")\n\ndef deleteSessionId(db, req, user):\n    sid = req.cookies.get(\"sid\", None)\n\n    if sid is None:\n        return False\n\n    req.deleteCookie(\"sid\")\n    req.setCookie(\"has_sid\", \"0\")\n\n    cursor = db.readonly_cursor()\n    cursor.execute(\"\"\"SELECT 1\n                        FROM usersessions\n                       WHERE key=%s\n                         AND uid=%s\"\"\",\n                   (sid, user.id))\n\n    if not cursor.fetchone():\n        # Not a valid session cookie..?\n        return False\n\n    with db.updating_cursor(\"usersessions\") as cursor:\n        cursor.execute(\"\"\"DELETE\n                            FROM usersessions\n                           WHERE key=%s\n                             AND uid=%s\"\"\",\n                       (sid, user.id))\n\n    return True\n\ndef checkSession(db, req):\n    \"\"\"Check if the request is part of a session and if so set req.user\n\n       Raises an request.HTTPResponse exception if immediate action is required,\n       otherwise sets req.user to non-None (but possibly to the anonymous user)\n       and returns.\"\"\"\n\n    # Step 1: If the host web server is supposed to authenticate users, use the\n    #         $REMOTE_USER environment variable.\n    if configuration.base.AUTHENTICATION_MODE == \"host\":\n        # Strip white-space, since Apache is known to do this internally when\n        # authenticating, but then passing on the original unstripped string to\n        # us on success.\n        username = req.getEnvironment().get(\"REMOTE_USER\", \"\").strip()\n        if not username:\n            # No REMOTE_USER variable.  
If we support anonymous users, this is\n            # fine, otherwise it indicates a configuration error.\n            if configuration.base.ALLOW_ANONYMOUS_USER:\n                db.setUser(dbutils.User.makeAnonymous())\n                return\n            raise request.MissingWSGIRemoteUser()\n\n        # We have a username.  Fetch the (or create a) matching user record.\n        try:\n            db.setUser(dbutils.User.fromName(db, username))\n        except dbutils.NoSuchUser:\n            email = getUserEmailAddress(username)\n            db.setUser(dbutils.User.create(\n                db, username, username, email, email_verified=None))\n        return\n\n    # Step 2: If cookie based sessions are used, check if there is a valid\n    #         session cookie.\n    if configuration.base.SESSION_TYPE == \"cookie\":\n        sid = req.cookies.get(\"sid\")\n        if sid:\n            cursor = db.cursor()\n            cursor.execute(\n                \"\"\"SELECT uid, labels, EXTRACT('epoch' FROM NOW() - atime) AS age\n                     FROM usersessions\n                    WHERE key=%s\"\"\",\n                (sid,))\n\n            row = cursor.fetchone()\n            if row:\n                user_id, labels, session_age = row\n\n                if configuration.base.SESSION_MAX_AGE == 0 \\\n                        or session_age < configuration.base.SESSION_MAX_AGE:\n                    # This is a valid session cookie.\n                    user = dbutils.User.fromId(db, user_id)\n                    if labels is None:\n                        labels = auth.DATABASE.getAuthenticationLabels(user)\n                    else:\n                        labels = labels.split(\"|\") if labels else ()\n                    db.setUser(user, labels)\n                    return\n\n                # The session cookie is too old.  
Delete it from the database.\n                with db.updating_cursor(\"usersessions\") as cursor:\n                    cursor.execute(\"\"\"DELETE FROM usersessions\n                                            WHERE key=%s\"\"\",\n                                   (sid,))\n\n            # The session cookie is not valid.  Delete it from the browser.\n            req.deleteCookie(\"sid\")\n            # Also delete the has_sid cookie, if there is one.\n            req.deleteCookie(\"has_sid\")\n\n            # Since the session seems to have expired, offer the user to sign in\n            # again by redirecting to the login page.  Signing in is optional\n            # though, meaning the login page will have a \"Continue anonymously\"\n            # link (if anonymous access is allowed.)\n            #\n            # Exception: Don't do this if /login is being requested.\n            if req.allowRedirect(307) and req.path != \"login\":\n                raise request.NeedLogin(req, optional=True)\n\n        elif req.cookies.get(\"has_sid\") == \"1\":\n            # The request had no session cookie, but had the has_sid cookie that\n            # indicates the browser ought to have a session cookie.  Typically,\n            # this means a signed in user accesses a mixed HTTP/HTTPS system\n            # over HTTP.  If so, redirect the user to HTTPS.\n            req.ensureSecure()\n\n            # The above call would have raised if a redirect was meaningful.  If\n            # it didn't, the has_sid cookie is bogus, so delete it.\n            req.deleteCookie(\"has_sid\")\n\n        elif req.cookies.get(\"has_sid\") == \"0\":\n            # This indicates that the user just signed out.  
If anonymous access\n            # is not allowed, we'll redirect the user to the login page again,\n            # which is rather unhelpful.\n            #\n            # Worse yet, if use of an external authentication provider is\n            # enforced, the login page will redirect there, which might sign the\n            # user back in, non-interactively.  In that case, signing out would\n            # be impossible.\n            #\n            # So, instead, detect the sign-out and return a simple \"you have\n            # signed out\" page in this case.\n\n            # Delete the cookie.  This means that on reload, the user is\n            # redirected to the login page again.  (This is to prevent the user\n            # from getting stuck on this \"you have signed out\" page.)\n            req.deleteCookie(\"has_sid\")\n\n            # Show the sign-out message if anonymous access isn't allowed.\n            # Also don't show it on the actual login page.\n            if not configuration.base.ALLOW_ANONYMOUS_USER \\\n                    and req.path != \"login\":\n                raise request.DisplayMessage(\n                    title=\"You have signed out\",\n                    body=\"To use this system, you will need to sign in again.\")\n\n    # Step 3(a): Check if there's a valid HTTP Authorization header (even if\n    #            cookie based sessions are typically used.)  
If there is such a\n    #            header, we assume HTTP authentication was meant to be used, and\n    #            respond with a 401 Unauthorized response if authentication\n    #            using the header fails.\n    authorization_header = req.getRequestHeader(\"Authorization\")\n    if authorization_header:\n        import base64\n\n        try:\n            authtype, base64_credentials = authorization_header.split(None, 1)\n        except ValueError:\n            authtype = \"invalid\"\n        if authtype.lower() != \"basic\":\n            raise request.RequestHTTPAuthentication()\n\n        try:\n            credentials = base64.b64decode(base64_credentials)\n        except (ValueError, TypeError):\n            raise request.RequestHTTPAuthentication()\n\n        username, _, password = credentials.partition(\":\")\n        username = username.strip()\n\n        if username and password:\n            try:\n                auth.DATABASE.performHTTPAuthentication(db, username, password)\n                req.session_type = \"httpauth\"\n                return\n            except auth.AuthenticationFailed:\n                pass\n\n        raise request.RequestHTTPAuthentication()\n\n    # Step 3(b): If the request has a \"use_httpauth\" cookie, request/require\n    #            HTTP authentication.  This is just a convenience feature for\n    #            clients using HTTP stacks that only send credentials in\n    #            response to server challenges.  
(If cookie sessions are used,\n    #            no such challenge would normally be returned; we'd rather\n    #            redirect to the login page.)\n    if req.cookies.get(\"use_httpauth\"):\n        raise request.RequestHTTPAuthentication()\n    # Also do this for requests with a \"httpauth=yes\" query parameter.\n    if req.getParameter(\"httpauth\", \"no\") == \"yes\":\n        raise request.RequestHTTPAuthentication()\n\n    # Step 4: If anonymous access is supported or if it should be allowed as an\n    #         exception for the accessed path, leave the session anonymous.\n    if configuration.base.ALLOW_ANONYMOUS_USER or isInsecurePath(req):\n        db.setUser(dbutils.User.makeAnonymous())\n        req.session_type = None\n        return\n\n    # Step 5: If HTTP authentication is required (i.e. no session cookies) then\n    #         request that now.\n    if configuration.base.SESSION_TYPE == \"httpauth\":\n        raise request.RequestHTTPAuthentication()\n\n    # Step 6: Cookie based sessions are enabled, and not anonymous access.  If\n    #         this is a POST or PUT request, respond with 403 Forbidden, and\n    #         otherwise redirect to the login page.\n    if not req.allowRedirect(307):\n        raise request.Forbidden(\"Valid user session required\")\n\n    raise request.NeedLogin(req, optional=\"has_sid\" in req.cookies)\n"
  },
  {
    "path": "src/background/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n"
  },
  {
    "path": "src/background/branchtracker.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\nimport traceback\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport background.utils\nimport dbutils\nimport gitutils\nimport mailutils\nimport configuration\n\n# Git (git-send-pack) appends a line suffix to its output.  This suffix depends\n# on the $TERM value.  When $TERM is \"dumb\", the suffix is 8 spaces.  We strip\n# this suffix if it's present.  
(If we incorrectly strip 8 spaces not actually\n# added by Git, it's not the end of the world.)\n#\n# See https://github.com/git/git/blob/master/sideband.c for details.\nDUMB_SUFFIX = \"        \"\n\nclass BranchTracker(background.utils.BackgroundProcess):\n    def __init__(self):\n        super(BranchTracker, self).__init__(service=configuration.services.BRANCHTRACKER)\n\n    def update(self, trackedbranch_id, repository_id, local_name, remote, remote_name, forced):\n        repository = gitutils.Repository.fromId(self.db, repository_id)\n\n        try:\n            with repository.relaycopy(\"branchtracker\") as relay:\n                relay.run(\"remote\", \"add\", \"source\", remote)\n\n                current = None\n                new = None\n                tags = []\n\n                if local_name == \"*\":\n                    output = relay.run(\"fetch\", \"source\", \"refs/tags/*:refs/tags/*\", include_stderr=True)\n                    for line in output.splitlines():\n                        if \"[new tag]\" in line:\n                            tags.append(line.rsplit(\" \", 1)[-1])\n                else:\n                    relay.run(\"fetch\", \"--quiet\", \"--no-tags\", \"source\", \"refs/heads/%s:refs/remotes/source/%s\" % (remote_name, remote_name))\n                    try:\n                        current = repository.revparse(\"refs/heads/%s\" % local_name)\n                    except gitutils.GitReferenceError:\n                        # It's okay if the local branch doesn't exist (yet).\n                        pass\n                    new = relay.run(\"rev-parse\", \"refs/remotes/source/%s\" % remote_name).strip()\n\n                if current != new or tags:\n                    if local_name == \"*\":\n                        refspecs = [(\"refs/tags/%s\" % tag) for tag in tags]\n                    else:\n                        refspecs = [\"refs/remotes/source/%s:refs/heads/%s\"\n                                    % (remote_name, 
local_name)]\n\n                    returncode, stdout, stderr = relay.run(\n                        \"push\", \"--force\", \"origin\", *refspecs,\n                        env={ \"CRITIC_FLAGS\": \"trackedbranch_id=%d\" % trackedbranch_id,\n                              \"TERM\": \"dumb\" },\n                        check_errors=False)\n\n                    stderr_lines = []\n                    remote_lines = []\n\n                    for line in stderr.splitlines():\n                        if line.endswith(DUMB_SUFFIX):\n                            line = line[:-len(DUMB_SUFFIX)]\n                        stderr_lines.append(line)\n                        if line.startswith(\"remote: \"):\n                            line = line[8:]\n                            remote_lines.append(line)\n\n                    if returncode == 0:\n                        if local_name == \"*\":\n                            for tag in tags:\n                                self.info(\"  updated tag: %s\" % tag)\n                        elif current:\n                            self.info(\"  updated branch: %s: %s..%s\" % (local_name, current[:8], new[:8]))\n                        else:\n                            self.info(\"  created branch: %s: %s\" % (local_name, new[:8]))\n\n                        hook_output = \"\"\n\n                        for line in remote_lines:\n                            self.debug(\"  [hook] \" + line)\n                            hook_output += line + \"\\n\"\n\n                        if local_name != \"*\":\n                            cursor = self.db.cursor()\n                            cursor.execute(\"INSERT INTO trackedbranchlog (branch, from_sha1, to_sha1, hook_output, successful) VALUES (%s, %s, %s, %s, %s)\",\n                                           (trackedbranch_id, current if current else '0' * 40, new if new else '0' * 40, hook_output, True))\n                            self.db.commit()\n                    else:\n             
           if local_name == \"*\":\n                            error = \"update of tags from %s failed\" % remote\n                        else:\n                            error = \"update of branch %s from %s in %s failed\" % (local_name, remote_name, remote)\n\n                        hook_output = \"\"\n\n                        for line in stderr_lines:\n                            error += \"\\n    \" + line\n\n                        for line in remote_lines:\n                            hook_output += line + \"\\n\"\n\n                        self.error(error)\n\n                        cursor = self.db.cursor()\n\n                        if local_name != \"*\":\n                            cursor.execute(\"\"\"INSERT INTO trackedbranchlog (branch, from_sha1, to_sha1, hook_output, successful)\n                                                   VALUES (%s, %s, %s, %s, %s)\"\"\",\n                                           (trackedbranch_id, current, new, hook_output, False))\n                            self.db.commit()\n\n                        cursor.execute(\"SELECT uid FROM trackedbranchusers WHERE branch=%s\", (trackedbranch_id,))\n                        recipients = [dbutils.User.fromId(self.db, user_id) for (user_id,) in cursor]\n\n                        if local_name == \"*\":\n                            mailutils.sendMessage(recipients, \"%s: update of tags from %s stopped!\" % (repository.name, remote),\n                                                  \"\"\"\\\nThe automatic update of tags in\n  %s:%s\nfrom the remote\n  %s\nfailed, and has been disabled.  
Manual intervention is required to resume the\nautomatic updating.\n\nOutput from Critic's git hook\n-----------------------------\n\n%s\"\"\" % (configuration.base.HOSTNAME, repository.path, remote, hook_output))\n                        else:\n                            mailutils.sendMessage(recipients, \"%s: update from %s in %s stopped!\" % (local_name, remote_name, remote),\n                                                  \"\"\"\\\nThe automatic update of the branch '%s' in\n  %s:%s\nfrom the branch '%s' in\n  %s\nfailed, and has been disabled.  Manual intervention is required to resume the\nautomatic updating.\n\nOutput from Critic's git hook\n-----------------------------\n\n%s\"\"\" % (local_name, configuration.base.HOSTNAME, repository.path, remote_name, remote, hook_output))\n\n                        # Disable the tracking.\n                        return False\n                else:\n                    self.debug(\"  fetched %s in %s; no changes\" % (remote_name, remote))\n\n            # Everything went well; keep the tracking enabled.\n            return True\n        except:\n            exception = traceback.format_exc()\n\n            if local_name == \"*\":\n                error = \"  update of tags from %s failed\" % remote\n            else:\n                error = \"  update of branch %s from %s in %s failed\" % (local_name, remote_name, remote)\n\n            for line in exception.splitlines():\n                error += \"\\n    \" + line\n\n            self.error(error)\n\n            # The expected failure (in case of diverged branches, or review branch\n            # irregularities) is a failed \"git push\" and is handled above.  This is\n            # an unexpected failure, so might be intermittent.  
Leave the tracking\n            # enabled and spam the system administrator(s).\n            return True\n\n    def run(self):\n        self.db = dbutils.Database.forSystem()\n\n        while not self.terminated:\n            self.interrupted = False\n\n            cursor = self.db.cursor()\n            cursor.execute(\"\"\"SELECT id, repository, local_name, remote, remote_name, forced\n                                FROM trackedbranches\n                               WHERE NOT disabled\n                                 AND (next IS NULL OR next < NOW())\n                            ORDER BY next ASC NULLS FIRST\"\"\")\n            rows = cursor.fetchall()\n\n            for trackedbranch_id, repository_id, local_name, remote, remote_name, forced in rows:\n                if local_name == \"*\":\n                    self.info(\"checking tags in %s\" % remote)\n                else:\n                    self.info(\"checking %s in %s\" % (remote_name, remote))\n\n                cursor.execute(\"\"\"UPDATE trackedbranches\n                                     SET previous=NOW(),\n                                         next=NOW() + delay,\n                                         updating=TRUE\n                                   WHERE id=%s\"\"\",\n                               (trackedbranch_id,))\n\n                self.db.commit()\n\n                if self.update(trackedbranch_id, repository_id, local_name, remote, remote_name, forced):\n                    cursor.execute(\"\"\"UPDATE trackedbranches\n                                         SET updating=FALSE\n                                       WHERE id=%s\"\"\",\n                                   (trackedbranch_id,))\n                    cursor.execute(\"\"\"SELECT next::text\n                                        FROM trackedbranches\n                                       WHERE id=%s\"\"\",\n                                   (trackedbranch_id,))\n                    self.info(\"  next scheduled 
update at %s\" % cursor.fetchone())\n                else:\n                    cursor.execute(\"\"\"UPDATE trackedbranches\n                                         SET updating=FALSE,\n                                             disabled=TRUE\n                                       WHERE id=%s\"\"\",\n                                   (trackedbranch_id,))\n                    self.info(\"  tracking disabled\")\n\n                self.db.commit()\n\n                if self.terminated: break\n\n            cursor.execute(\"\"\"SELECT 1\n                                FROM trackedbranches\n                               WHERE NOT disabled\n                                 AND next IS NULL\"\"\")\n\n            if not cursor.fetchone():\n                maintenance_delay = self.run_maintenance()\n\n                if maintenance_delay is None:\n                    maintenance_delay = 3600\n\n                cursor.execute(\"\"\"SELECT COUNT(*), EXTRACT('epoch' FROM (MIN(next) - NOW()))\n                                    FROM trackedbranches\n                                   WHERE NOT disabled\"\"\")\n\n                enabled_branches, update_delay = cursor.fetchone()\n\n                if not enabled_branches:\n                    self.info(\"nothing to do\")\n                    update_delay = 3600\n                else:\n                    update_delay = max(0, int(update_delay))\n\n                delay = min(maintenance_delay, update_delay)\n\n                if delay:\n                    self.signal_idle_state()\n\n                    self.debug(\"sleeping %d seconds\" % delay)\n\n                    gitutils.Repository.forEach(self.db, lambda db, repository: repository.stopBatch())\n\n                    self.db.commit()\n\n                    before = time.time()\n                    time.sleep(delay)\n                    if self.interrupted:\n                        self.debug(\"sleep interrupted after %.2f seconds\" % (time.time() - before))\n\n    
        self.db.commit()\n\ndef start_service():\n    tracker = BranchTracker()\n    return tracker.start()\n\nbackground.utils.call(\"branchtracker\", start_service)\n"
  },
  {
    "path": "src/background/branchtrackerhook.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport signal\nimport time\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport dbutils\nfrom textutils import json_encode, json_decode\n\nif \"--wait-for-update\" in sys.argv:\n    data = json_decode(sys.stdin.read())\n\n    branch_id = data[\"branch_id\"]\n    timeout = data[\"timeout\"]\n    log_offset = data[\"log_offset\"]\n\n    db = dbutils.Database.forSystem()\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT MAX(time) FROM trackedbranchlog WHERE branch=%s\", (branch_id,))\n    last_log_entry = cursor.fetchone()[0]\n\n    start = time.time()\n\n    status = None\n    output = \"\"\n\n    while time.time() - start < timeout:\n        time.sleep(0.5)\n\n        db.commit()\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT hook_output FROM trackedbranchlog WHERE branch=%s ORDER BY time ASC OFFSET %s\", (branch_id, log_offset))\n        rows = cursor.fetchall()\n\n        if rows:\n            for (hook_output,) in rows: output += hook_output\n            status = \"output\"\n            break\n\n        cursor.execute(\"SELECT updating FROM trackedbranches WHERE id=%s\", (branch_id,))\n\n        if not cursor.fetchone()[0]:\n            # Update performed, but no log entries added.\n            status = 
\"no-output\"\n            break\n    else:\n        status = \"timeout\"\n\n    sys.stdout.write(json_encode({ \"status\": status, \"output\": output or None }))\n    sys.stdout.flush()\n\n    db.close()\nelse:\n    import configuration\n\n    from background.utils import PeerServer\n    from textutils import json_decode\n    from subprocess import STDOUT\n\n    class BranchTrackerHook(PeerServer):\n        class WaitForUpdate(PeerServer.ChildProcess):\n            def __init__(self, client, branch_id, timeout, log_offset):\n                super(BranchTrackerHook.WaitForUpdate, self).__init__(client.server, [sys.executable, sys.argv[0], \"--wait-for-update\"], stderr=STDOUT)\n                self.client = client\n                self.client.write(\"wait\\n\")\n                self.write(json_encode({ \"branch_id\": branch_id, \"timeout\": timeout, \"log_offset\": log_offset }))\n                self.close()\n\n            def handle_input(self, _file, data):\n                try: data = json_decode(data)\n                except ValueError:\n                    self.server.error(\"invalid response from wait-for-update child: %r\" % data)\n                    self.client.close()\n                    # Don't fall through; 'data' is still the undecoded string.\n                    return\n\n                if data[\"status\"] == \"output\":\n                    self.client.write(data[\"output\"])\n                    self.server.debug(\"  hook output written to client\")\n                elif data[\"status\"] == \"no-output\":\n                    self.server.debug(\"  update produced no hook output\")\n                else:\n                    self.server.debug(\"  timeout\")\n\n                self.client.close()\n\n        class Client(PeerServer.SocketPeer):\n            def __init__(self, server, peersocket, peeraddress):\n                super(BranchTrackerHook.Client, self).__init__(server, peersocket)\n                self.__peeraddress = peeraddress\n\n            def handle_input(self, _file, data):\n                try: data = json_decode(data)\n                
except ValueError: return\n\n                message = \"connection from %s:%d:\" % self.__peeraddress\n                message += \"\\n  repository: %s\" % data[\"repository\"]\n\n                if \"timeout\" in data:\n                    message += \"\\n  timeout:    %d\" % data[\"timeout\"]\n                if data[\"branches\"]:\n                    message += \"\\n  branches:   %s\" % \", \".join(data[\"branches\"])\n                if data[\"tags\"]:\n                    message += \"\\n  tags:       %s\" % \", \".join(data[\"tags\"])\n\n                self.server.info(message)\n\n                db = dbutils.Database.forSystem()\n\n                try:\n                    cursor = db.cursor()\n                    notify_tracker = False\n                    wait_for_reply = False\n\n                    # Make sure the 'knownremotes' table has this remote listed\n                    # as \"pushing\" since it obviously is.\n\n                    cursor.execute(\"\"\"SELECT pushing\n                                        FROM knownremotes\n                                       WHERE url=%s\"\"\",\n                                   (data[\"repository\"],))\n\n                    row = cursor.fetchone()\n\n                    if not row:\n                        cursor.execute(\"\"\"INSERT INTO knownremotes (url, pushing)\n                                               VALUES (%s, TRUE)\"\"\",\n                                       (data[\"repository\"],))\n                    elif not row[0]:\n                        cursor.execute(\"\"\"UPDATE knownremotes\n                                             SET pushing=TRUE\n                                           WHERE url=%s\"\"\",\n                                       (data[\"repository\"],))\n\n                    # If we just recorded this remote as \"pushing,\" adjust the\n                    # configured updating frequency of any existing tracked\n                    # branches from it.\n\n   
                 if not row or not row[0]:\n                        cursor.execute(\"\"\"UPDATE trackedbranches\n                                             SET delay='1 week'\n                                           WHERE remote=%s\"\"\",\n                                       (data[\"repository\"],))\n\n                    for branch in data[\"branches\"]:\n                        cursor.execute(\"\"\"SELECT id, local_name\n                                            FROM trackedbranches\n                                           WHERE remote=%s\n                                             AND remote_name=%s\n                                             AND NOT disabled\n                                             AND next IS NOT NULL\"\"\",\n                                       (data[\"repository\"], branch))\n\n                        row = cursor.fetchone()\n                        if row:\n                            branch_id, local_name = row\n\n                            cursor.execute(\"\"\"UPDATE trackedbranches\n                                                 SET next=NULL\n                                               WHERE id=%s\"\"\",\n                                           (branch_id,))\n\n                            notify_tracker = True\n                            self.server.debug(\"tracked branch: %s\" % local_name)\n\n                            if len(data[\"branches\"]) == 1 and local_name.startswith(\"r/\"):\n                                wait_for_reply = (True, branch_id)\n                                self.server.debug(\"  will wait for reply\")\n\n                    if data[\"tags\"]:\n                        cursor.execute(\"\"\"SELECT id\n                                            FROM trackedbranches\n                                           WHERE remote=%s\n                                             AND remote_name=%s\n                                             AND NOT disabled\n                           
                  AND next IS NOT NULL\"\"\",\n                                       (data[\"repository\"], \"*\"))\n\n                        row = cursor.fetchone()\n                        if row:\n                            branch_id = row[0]\n\n                            cursor.execute(\"\"\"UPDATE trackedbranches\n                                                 SET next=NULL\n                                               WHERE id=%s\"\"\",\n                                           (branch_id,))\n\n                            notify_tracker = True\n\n                    db.commit()\n\n                    if notify_tracker:\n                        if wait_for_reply:\n                            branch_id = wait_for_reply[1]\n                            cursor.execute(\"SELECT COUNT(*) FROM trackedbranchlog WHERE branch=%s\", (branch_id,))\n                            log_offset = cursor.fetchone()[0]\n                            self.server.add_peer(BranchTrackerHook.WaitForUpdate(self, branch_id, data.get(\"timeout\", 30), log_offset))\n\n                        try:\n                            branchtracker_pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n                            os.kill(branchtracker_pid, signal.SIGHUP)\n                        except:\n                            self.server.exception()\n                            return\n\n                        if wait_for_reply:\n                            return\n\n                    self.close()\n                finally:\n                    try: db.close()\n                    except: pass\n\n        def __init__(self):\n            super(BranchTrackerHook, self).__init__(service=configuration.services.BRANCHTRACKERHOOK)\n\n        def handle_peer(self, peersocket, peeraddress):\n            return BranchTrackerHook.Client(self, peersocket, peeraddress)\n\n    server = BranchTrackerHook()\n    sys.exit(server.start())\n"
  },
  {
    "path": "src/background/changeset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os.path\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport dbutils\nimport background.utils\n\nfrom textutils import json_decode, json_encode\n\nif \"--json-job\" in sys.argv[1:]:\n    from resource import getrlimit, setrlimit, RLIMIT_RSS\n    from traceback import print_exc\n\n    def perform_job():\n        soft_limit, hard_limit = getrlimit(RLIMIT_RSS)\n        rss_limit = configuration.services.CHANGESET[\"rss_limit\"]\n        if soft_limit < rss_limit:\n            setrlimit(RLIMIT_RSS, (rss_limit, hard_limit))\n\n        from changeset.create import createChangeset\n\n        request = json_decode(sys.stdin.read())\n\n        try:\n            db = dbutils.Database.forSystem()\n\n            createChangeset(db, request)\n\n            db.close()\n\n            sys.stdout.write(json_encode(request))\n        except:\n            print \"Request:\"\n            print json_encode(request, indent=2)\n            print\n\n            print_exc(file=sys.stdout)\n\n    background.utils.call(\"changeset_job\", perform_job)\nelse:\n    from background.utils import JSONJobServer\n\n    def describeRequest(request):\n        if request[\"changeset_type\"] in (\"direct\", \"merge\", \"conflicts\"):\n            
return \"%s (%s)\" % (request[\"changeset_type\"], request[\"child_sha1\"][:8])\n        else:\n            return \"custom (%s..%s)\" % (request[\"parent_sha1\"][:8], request[\"child_sha1\"][:8])\n\n    class ChangesetServer(JSONJobServer):\n        def __init__(self):\n            service = configuration.services.CHANGESET\n\n            super(ChangesetServer, self).__init__(service)\n\n            if \"purge_at\" in service:\n                hour, minute = service[\"purge_at\"]\n                self.register_maintenance(hour=hour, minute=minute, callback=self.__purge)\n\n        def execute_command(self, client, command):\n            if command[\"command\"] == \"purge\":\n                purged_count = self.__purge()\n\n                client.write(json_encode({ \"status\": \"ok\",\n                                           \"purged\": purged_count }))\n                client.close()\n            else:\n                super(ChangesetServer, self).execute_command(client, command)\n\n        def request_started(self, job, request):\n            super(ChangesetServer, self).request_started(job, request)\n\n            self.debug(\"started: %s in %s [pid=%d]\" % (describeRequest(request), request[\"repository_name\"], job.pid))\n\n        def request_finished(self, job, request, result):\n            super(ChangesetServer, self).request_finished(job, request, result)\n\n            if \"error\" not in result:\n                for parent_sha1, changeset_id in result[\"changeset_ids\"].items():\n                    self.info(\"finished: %d for %s (%s..%s) in %s [pid=%d]\"\n                              % (changeset_id, request[\"changeset_type\"],\n                                 parent_sha1[:8], request[\"child_sha1\"][:8],\n                                 request[\"repository_name\"], job.pid))\n\n        def __purge(self):\n            db = dbutils.Database.forSystem()\n\n            cursor = db.cursor()\n\n            cursor.execute(\"\"\"SELECT COUNT(*)\n    
                            FROM changesets\n                                JOIN customchangesets ON (customchangesets.changeset=changesets.id)\n                               WHERE time < NOW() - INTERVAL '3 months'\"\"\")\n            npurged = cursor.fetchone()[0]\n\n            if npurged:\n                self.info(\"purging %d old custom changesets\" % npurged)\n\n                cursor.execute(\"\"\"DELETE FROM changesets\n                                        WHERE id IN (SELECT changeset\n                                                       FROM customchangesets\n                                                      WHERE time < NOW() - INTERVAL '3 months')\"\"\")\n                db.commit()\n\n            db.close()\n\n            return npurged\n\n    def start_service():\n        server = ChangesetServer()\n        return server.start()\n\n    background.utils.call(\"changeset\", start_service)\n"
  },
  {
    "path": "src/background/daemon.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\n\ndef detach(parent_exit_hook=lambda: 0):\n    try:\n        if os.fork() != 0:\n            # Exit from parent process.\n            sys.exit(parent_exit_hook())\n    except OSError as error:\n        print >>sys.stderr, \"fork failed: %s\" % error.message\n        sys.exit(1)\n\n    os.setsid()\n    os.umask(0)\n\n    try:\n        if os.fork() != 0:\n            # Exit from parent process.\n            sys.exit(0)\n    except OSError as error:\n        print >>sys.stderr, \"fork failed: %s\" % error.message\n        sys.exit(1)\n\n    sys.stdout.flush()\n    sys.stderr.flush()\n\n    stdin = open(\"/dev/null\", \"r\")\n    stdout = open(\"/dev/null\", \"a+\")\n    stderr = open(\"/dev/null\", \"a+\")\n\n    os.dup2(stdin.fileno(), sys.stdin.fileno())\n    os.dup2(stdout.fileno(), sys.stdout.fileno())\n    os.dup2(stderr.fileno(), sys.stderr.fileno())\n"
  },
  {
    "path": "src/background/extensionrunner.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport textutils\nimport background.utils\nimport extensions.execute\nimport communicate\n\nclass ExtensionRunner(background.utils.PeerServer):\n    class Extension(background.utils.PeerServer.SpawnedProcess):\n        def __init__(self, server, client, process, timeout):\n            super(ExtensionRunner.Extension, self).__init__(\n                server, process, deadline=time.time() + timeout)\n            self.client = client\n            self.stdout = self.stderr = None\n            self.did_time_out = False\n\n        def handle_input(self, pipe, data):\n            assert pipe in (self.process.stdout, self.process.stderr)\n            if pipe == self.process.stdout:\n                self.stdout = data\n            else:\n                self.stderr = data\n\n        def timed_out(self):\n            super(ExtensionRunner.Extension, self).timed_out()\n            self.server.debug(\"Timeout, killing process [pid=%d]\" % self.pid)\n            self.did_time_out = True\n\n        def check_result(self):\n            self.client.finished(self)\n\n    class Client(background.utils.PeerServer.SocketPeer):\n        def __init__(self, server, 
peersocket):\n            super(ExtensionRunner.Client, self).__init__(server, peersocket)\n\n        def handle_input(self, _file, data):\n            data = textutils.json_decode(data)\n\n            process = self.server.get_process(data[\"flavor\"])\n\n            extension = ExtensionRunner.Extension(\n                self.server, self, process, data[\"timeout\"])\n            extension.write(data[\"stdin\"])\n            extension.close()\n\n            self.server.add_peer(extension)\n\n        def finished(self, process):\n            if process.did_time_out:\n                status = status_text = \"timeout\"\n            else:\n                status = \"ok\"\n                if process.returncode == 0:\n                    status_text = \"success\"\n                else:\n                    status_text = \"error(%d)\" % process.returncode\n\n            self.server.debug(\"Process finished: %s [pid=%d]\"\n                              % (status_text, process.pid))\n\n            if process.stdout:\n                self.server.debug(\"  stdout=%r\" % process.stdout)\n            if process.stderr:\n                self.server.debug(\"  stderr=%r\" % process.stderr)\n\n            self.write(textutils.json_encode({\n                \"status\": status,\n                \"stdout\": process.stdout,\n                \"stderr\": process.stderr,\n                \"returncode\": process.returncode\n            }))\n            self.close()\n\n    def __init__(self):\n        service = configuration.services.EXTENSIONRUNNER\n\n        super(ExtensionRunner, self).__init__(service=service)\n\n        self.target_cached_processes = service[\"cached_processes\"]\n        self.cached_processes = []\n\n    def run(self):\n        if not configuration.extensions.ENABLED:\n            self.info(\"service stopping: extension support not enabled\")\n            return\n\n        self.__fill_cache()\n\n        super(ExtensionRunner, self).run()\n\n    def handle_peer(self, 
peersocket, peeraddress):\n        return ExtensionRunner.Client(self, peersocket)\n\n    def peer_destroyed(self, peer):\n        self.__cache_process()\n\n    def signal_idle_state(self):\n        super(ExtensionRunner, self).signal_idle_state()\n        self.__fill_cache()\n\n    def get_process(self, flavor):\n        if flavor == configuration.extensions.DEFAULT_FLAVOR \\\n                and self.cached_processes:\n            process = self.cached_processes.pop(0)\n            self.debug(\"Using cached process [pid=%d]\" % process.pid)\n        else:\n            process = extensions.execute.startProcess(flavor)\n            self.debug(\"Started new process [pid=%d]\" % process.pid)\n        return process\n\n    def __fill_cache(self):\n        while self.__cache_process():\n            pass\n\n    def __cache_process(self):\n        if len(self.cached_processes) < self.target_cached_processes:\n            process = extensions.execute.startProcess(\n                configuration.extensions.DEFAULT_FLAVOR)\n            self.debug(\"Started cached process [pid=%d]\" % process.pid)\n            self.cached_processes.append(process)\n            return True\n        else:\n            return False\n\ndef start_service():\n    extensionrunner = ExtensionRunner()\n    return extensionrunner.start()\n\nbackground.utils.call(\"extensionrunner\", start_service)\n"
  },
  {
    "path": "src/background/extensiontasks.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport dbutils\nimport background.utils\nimport extensions.role.filterhook\n\nclass ExtensionTasks(background.utils.BackgroundProcess):\n    def __init__(self):\n        service = configuration.services.EXTENSIONTASKS\n\n        super(ExtensionTasks, self).__init__(service=service)\n\n    def run(self):\n        if not configuration.extensions.ENABLED:\n            self.info(\"service stopping: extension support not enabled\")\n            return\n\n        failed_events = set()\n\n        while not self.terminated:\n            self.interrupted = False\n\n            with dbutils.Database.forSystem() as db:\n                cursor = db.cursor()\n                cursor.execute(\"\"\"SELECT id\n                                    FROM extensionfilterhookevents\n                                ORDER BY id ASC\"\"\")\n\n                finished_events = []\n\n                for (event_id,) in cursor:\n                    if event_id not in failed_events:\n                        try:\n                            extensions.role.filterhook.processFilterHookEvent(\n                                db, event_id, self.debug)\n                        except 
Exception:\n                            self.exception()\n                            failed_events.add(event_id)\n                        else:\n                            finished_events.append(event_id)\n\n                cursor.execute(\"\"\"DELETE FROM extensionfilterhookevents\n                                        WHERE id=ANY (%s)\"\"\",\n                               (finished_events,))\n\n                db.commit()\n\n            timeout = self.run_maintenance()\n\n            if timeout is None:\n                timeout = 86400\n\n            self.debug(\"sleeping %d seconds\" % timeout)\n\n            self.signal_idle_state()\n\n            before = time.time()\n\n            time.sleep(timeout)\n\n            if self.interrupted:\n                self.debug(\"sleep interrupted after %.2f seconds\"\n                           % (time.time() - before))\n\ndef start_service():\n    extensiontasks = ExtensionTasks()\n    return extensiontasks.start()\n\nbackground.utils.call(\"extensiontasks\", start_service)\n"
  },
  {
    "path": "src/background/githook.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport os.path\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport background.utils\nimport dbutils\nimport auth\n\nfrom textutils import json_decode, json_encode\n\ntry:\n    from customization.email import getUserEmailAddress\nexcept ImportError:\n    def getUserEmailAddress(_username):\n        return None\n\ndef getUser(db, user_name):\n    if user_name == configuration.base.SYSTEM_USER_NAME:\n        return dbutils.User.makeSystem()\n    try:\n        return dbutils.User.fromName(db, user_name)\n    except dbutils.NoSuchUser:\n        if configuration.base.AUTHENTICATION_MODE == \"host\":\n            email = getUserEmailAddress(user_name)\n            return dbutils.User.create(\n                db, user_name, user_name, email, email_verified=None)\n        raise\n\nsys_stdout = sys.stdout\n\ndef slave():\n    import StringIO\n    import traceback\n\n    import dbutils\n    import gitutils\n    import index\n\n    def reject(message):\n        sys_stdout.write(json_encode({ \"status\": \"reject\", \"message\": message }))\n        sys.exit(0)\n\n    def error(message):\n        sys_stdout.write(json_encode({ \"status\": \"error\", \"error\": message }))\n        sys.exit(0)\n\n    db = 
dbutils.Database.forUser()\n\n    try:\n        data = sys.stdin.read()\n        request = json_decode(data)\n\n        create_branches = []\n        delete_branches = []\n        update_branches = []\n\n        create_tags = []\n        delete_tags = []\n        update_tags = []\n\n        user = getUser(db, request[\"user_name\"])\n        authentication_labels = auth.DATABASE.getAuthenticationLabels(user)\n\n        db.setUser(user, authentication_labels)\n\n        try:\n            repository = gitutils.Repository.fromName(\n                db, request[\"repository_name\"], for_modify=True)\n        except auth.AccessDenied as error:\n            reject(error.message)\n\n        if request[\"flags\"] and user.isSystem():\n            flags = dict(flag.split(\"=\", 1) for flag in request[\"flags\"].split(\",\"))\n        else:\n            flags = {}\n\n        sys.stdout = StringIO.StringIO()\n\n        commits_to_process = set()\n\n        for ref in request[\"refs\"]:\n            name = ref[\"name\"]\n            old_sha1 = ref[\"old_sha1\"]\n            new_sha1 = ref[\"new_sha1\"]\n\n            if \"//\" in name:\n                reject(\"invalid ref name: '%s'\" % name)\n            if not name.startswith(\"refs/\"):\n                reject(\"unexpected ref name: '%s'\" % name)\n\n            if new_sha1 != '0000000000000000000000000000000000000000':\n                commits_to_process.add(new_sha1)\n\n            name = name[len(\"refs/\"):]\n\n            if name.startswith(\"heads/\"):\n                name = name[len(\"heads/\"):]\n                if new_sha1 == '0000000000000000000000000000000000000000':\n                    delete_branches.append((name, old_sha1))\n                elif old_sha1 == '0000000000000000000000000000000000000000':\n                    create_branches.append((name, new_sha1))\n                else:\n                    update_branches.append((name, old_sha1, new_sha1))\n            elif name.startswith(\"tags/\"):\n       
         name = name[len(\"tags/\"):]\n                if old_sha1 == '0000000000000000000000000000000000000000':\n                    create_tags.append((name, new_sha1))\n                elif new_sha1 == '0000000000000000000000000000000000000000':\n                    delete_tags.append(name)\n                else:\n                    update_tags.append((name, old_sha1, new_sha1))\n            elif name.startswith(\"temporary/\") or name.startswith(\"keepalive/\"):\n                # len(\"temporary/\") == len(\"keepalive/\")\n                name = name[len(\"temporary/\"):]\n                if name != new_sha1:\n                    reject(\"invalid update of '%s'; value is not %s\" % (ref[\"name\"], name))\n            else:\n                reject(\"unexpected ref name: '%s'\" % ref[\"name\"])\n\n        multiple = (len(delete_branches) + len(update_branches) + len(create_branches) + len(delete_tags) + len(update_tags) + len(create_tags)) > 1\n        info = []\n\n        for sha1 in commits_to_process:\n            index.processCommits(db, repository, sha1)\n\n        for name, old in delete_branches:\n            index.deleteBranch(db, user, repository, name, old)\n            info.append(\"branch deleted: %s\" % name)\n\n        for name, old, new in update_branches:\n            index.updateBranch(db, user, repository, name, old, new, multiple, flags)\n            info.append(\"branch updated: %s (%s..%s)\" % (name, old[:8], new[:8]))\n\n        index.createBranches(db, user, repository, create_branches, flags)\n\n        for name, new in create_branches:\n            info.append(\"branch created: %s (%s)\" % (name, new[:8]))\n\n        for name in delete_tags:\n            index.deleteTag(db, user, repository, name)\n            info.append(\"tag deleted: %s\" % name)\n\n        for name, old, new in update_tags:\n            index.updateTag(db, user, repository, name, old, new)\n            info.append(\"tag updated: %s (%s..%s)\" % (name, old[:8], 
new[:8]))\n\n        for name, new in create_tags:\n            index.createTag(db, user, repository, name, new)\n            info.append(\"tag created: %s (%s)\" % (name, new[:8]))\n\n        sys_stdout.write(json_encode({ \"status\": \"ok\", \"accept\": True, \"output\": sys.stdout.getvalue(), \"info\": info }))\n\n        db.commit()\n    except index.IndexException as exception:\n        sys_stdout.write(json_encode({ \"status\": \"ok\", \"accept\": False, \"output\": exception.message, \"info\": info }))\n    except SystemExit:\n        raise\n    except:\n        exception = traceback.format_exc()\n        message = \"\"\"\\\n%s\n\nRequest:\n%s\n\n%s\"\"\" % (exception.splitlines()[-1], json_encode(request, indent=2), traceback.format_exc())\n\n        sys_stdout.write(json_encode({ \"status\": \"error\", \"error\": message }))\n    finally:\n        db.close()\n\nclass GitHookServer(background.utils.PeerServer):\n    class ChildProcess(background.utils.PeerServer.ChildProcess):\n        def __init__(self, server, client):\n            super(GitHookServer.ChildProcess, self).__init__(server, [sys.executable, sys.argv[0], \"--slave\"])\n            self.__client = client\n\n        def handle_input(self, _file, data):\n            try:\n                result = json_decode(data)\n            except ValueError:\n                result = { \"status\": \"error\",\n                           \"error\": (\"invalid response:\\n\" +\n                                     background.utils.indent(data)) }\n            if result[\"status\"] == \"ok\":\n                for item in result[\"info\"]:\n                    self.server.info(item)\n                if result[\"output\"]:\n                    self.__client.write(result[\"output\"].strip() + \"\\n\")\n                if result[\"accept\"]:\n                    self.__client.write(\"ok\\n\")\n            elif result[\"status\"] == \"reject\":\n                self.server.warning(result[\"message\"])\n               
 self.__client.write(result[\"message\"].strip() + \"\\n\")\n            else:\n                self.server.error(result[\"error\"])\n                self.__client.write(\"\"\"\\\nAn exception was raised while processing the request.  A message has\nbeen sent to the system administrator(s).\n\"\"\")\n                if configuration.debug.IS_DEVELOPMENT:\n                    self.__client.write(\"\\n\" + result[\"error\"].strip() + \"\\n\")\n            self.__client.close()\n\n    class Client(background.utils.PeerServer.SocketPeer):\n        def handle_input(self, _file, data):\n            lines = data.splitlines()\n\n            user_name = lines[0]\n\n            # The second line is the value of the REMOTE_USER environment\n            # variable (from the environment with which the git hook ran.)\n            #\n            # We use it as the actual user only if the actual user was the\n            # Critic system user, meaning the push was performed by the\n            # branch tracker service, the web front-end (for instance via\n            # 'git http-backend') or an extension.\n            if user_name == configuration.base.SYSTEM_USER_NAME and lines[1]:\n                user_name = lines[1]\n\n            self.__request = { \"user_name\": user_name,\n                               \"repository_name\": lines[2],\n                               \"flags\": lines[3],\n                               \"refs\": [{ \"name\": name,\n                                          \"old_sha1\": old_sha1,\n                                          \"new_sha1\": new_sha1 }\n                                        for old_sha1, new_sha1, name\n                                        in map(str.split, lines[4:])] }\n\n            self.server.info(\"session started: %s / %s\"\n                             % (self.__request[\"user_name\"],\n                                self.__request[\"repository_name\"]))\n\n            child_process = 
GitHookServer.ChildProcess(self.server, self)\n            child_process.write(json_encode(self.__request))\n            child_process.close()\n            self.server.add_peer(child_process)\n\n        def destroy(self):\n            self.server.info(\"session ended: %s / %s\"\n                             % (self.__request[\"user_name\"],\n                                self.__request[\"repository_name\"]))\n\n    def __init__(self):\n        super(GitHookServer, self).__init__(service=configuration.services.GITHOOK)\n\n    def startup(self):\n        super(GitHookServer, self).startup()\n\n        os.chmod(configuration.services.GITHOOK[\"address\"], 0770)\n\n    def handle_peer(self, peersocket, peeraddress):\n        return GitHookServer.Client(self, peersocket)\n\ndef start_service():\n    server = GitHookServer()\n    return server.start()\n\nif \"--slave\" in sys.argv[1:]:\n    background.utils.call(\"githook\", slave)\nelse:\n    background.utils.call(\"githook\", start_service)\n"
  },
  {
    "path": "src/background/highlight.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\nfrom subprocess import Popen as process\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport background.utils\nfrom textutils import json_decode, json_encode\n\nif \"--json-job\" in sys.argv[1:]:\n    def perform_job():\n        import syntaxhighlight.generate\n\n        request = json_decode(sys.stdin.read())\n        request[\"highlighted\"] = syntaxhighlight.generate.generateHighlight(\n            repository_path=request[\"repository_path\"],\n            sha1=request[\"sha1\"],\n            language=request[\"language\"],\n            mode=request[\"mode\"])\n        sys.stdout.write(json_encode(request))\n\n    background.utils.call(\"highlight_job\", perform_job)\nelse:\n    import background.utils\n    from syntaxhighlight import isHighlighted\n    from syntaxhighlight.context import importCodeContexts\n\n    import configuration\n    import dbutils\n\n    class HighlightServer(background.utils.JSONJobServer):\n        def __init__(self):\n            service = configuration.services.HIGHLIGHT\n\n            super(HighlightServer, self).__init__(service)\n\n            self.db = dbutils.Database.forSystem()\n\n            if \"compact_at\" in service:\n                hour, minute = service[\"compact_at\"]\n    
            self.register_maintenance(hour=hour, minute=minute, callback=self.__compact)\n\n        def request_result(self, request):\n            if isHighlighted(request[\"sha1\"], request[\"language\"], request[\"mode\"]):\n                result = request.copy()\n                result[\"highlighted\"] = True\n                return result\n\n        def request_started(self, job, request):\n            super(HighlightServer, self).request_started(job, request)\n\n            self.debug(\"started: %s:%s (%s) in %s [pid=%d]\" % (request[\"path\"], request[\"sha1\"][:8], request[\"language\"], request[\"repository_path\"], job.pid))\n\n        def request_finished(self, job, request, result):\n            super(HighlightServer, self).request_finished(job, request, result)\n\n            failed = \"\" if \"error\" not in result else \" (failed!)\"\n            self.info(\"finished: %s:%s (%s) in %s [pid=%d]%s\" % (request[\"path\"], request[\"sha1\"][:8], request[\"language\"], request[\"repository_path\"], job.pid, failed))\n\n            ncontexts = importCodeContexts(self.db, request[\"sha1\"], request[\"language\"])\n\n            if ncontexts: self.debug(\"  added %d code contexts\" % ncontexts)\n            else: self.debug(\"  no code contexts added\")\n\n        def execute_command(self, client, command):\n            if command[\"command\"] == \"compact\":\n                uncompressed_count, compressed_count, purged_files_count, purged_contexts_count = self.__compact()\n\n                client.write(json_encode({ \"status\": \"ok\",\n                                           \"uncompressed\": uncompressed_count,\n                                           \"compressed\": compressed_count,\n                                           \"purged_files\": purged_files_count,\n                                           \"purged_contexts\": purged_contexts_count }))\n                client.close()\n            else:\n                super(HighlightServer, 
self).execute_command(client, command)\n\n        def __compact(self):\n            import syntaxhighlight\n\n            cache_dir = configuration.services.HIGHLIGHT[\"cache_dir\"]\n\n            if not os.path.isdir(cache_dir):\n                # Newly installed system that hasn't highlighted anything.\n                return 0, 0, 0, 0\n\n            self.info(\"cache compacting started\")\n\n            now = time.time()\n\n            max_age_uncompressed = 7 * 24 * 60 * 60\n            max_age_compressed = 90 * 24 * 60 * 60\n\n            uncompressed_count = 0\n            compressed_count = 0\n\n            purged_paths = []\n\n            db = dbutils.Database.forSystem()\n\n            cursor = db.cursor()\n\n            cursor.execute(\"CREATE TEMPORARY TABLE purged (sha1 CHAR(40) PRIMARY KEY)\")\n            cursor.execute(\"INSERT INTO purged (sha1) SELECT DISTINCT sha1 FROM codecontexts\")\n\n            for section in sorted(os.listdir(cache_dir)):\n                if len(section) == 2:\n                    for filename in os.listdir(\"%s/%s\" % (cache_dir, section)):\n                        fullname = \"%s/%s/%s\" % (cache_dir, section, filename)\n                        age = now - os.stat(fullname).st_mtime\n\n                        parts = filename.split(\".\")\n\n                        if len(parts) < 2 \\\n                                or len(parts[0]) != 38 \\\n                                or parts[1] not in syntaxhighlight.LANGUAGES:\n                            os.unlink(fullname)\n                            continue\n\n                        sha1 = section + parts[0]\n\n                        if parts[-1] == \"bz2\":\n                            if age > max_age_compressed:\n                                self.debug(\"purging: %s/%s\" % (section, filename))\n                                purged_paths.append(fullname)\n                            else:\n                                cursor.execute(\"DELETE FROM purged WHERE 
sha1=%s\", (sha1,))\n                                compressed_count += 1\n                        elif parts[-1] == \"ctx\":\n                            self.debug(\"deleting context file: %s/%s\" % (section, filename))\n                            os.unlink(fullname)\n                        else:\n                            cursor.execute(\"DELETE FROM purged WHERE sha1=%s\", (sha1,))\n                            if age > max_age_uncompressed:\n                                self.debug(\"compressing: %s/%s\" % (section, filename))\n                                worker = process([\"/bin/bzip2\", fullname])\n                                worker.wait()\n                                compressed_count += 1\n                            else:\n                                uncompressed_count += 1\n\n            self.info(\"cache compacting finished: uncompressed=%d / compressed=%d / purged=%d\"\n                      % (uncompressed_count, compressed_count, len(purged_paths)))\n\n            if purged_paths:\n                for path in purged_paths: os.unlink(path)\n\n            cursor.execute(\"SELECT COUNT(*) FROM purged\")\n\n            purged_contexts = cursor.fetchone()[0]\n\n            cursor.execute(\"\"\"DELETE\n                                FROM codecontexts\n                               WHERE sha1 IN (SELECT sha1\n                                                FROM purged)\"\"\")\n\n            db.commit()\n            db.close()\n\n            return uncompressed_count, compressed_count, len(purged_paths), purged_contexts\n\n    def start_service():\n        server = HighlightServer()\n        return server.start()\n\n    background.utils.call(\"highlight\", start_service)\n"
  },
  {
    "path": "src/background/maildelivery.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\nimport json\n\nimport smtplib\nimport email.mime.text\nimport email.header\nimport email.utils\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport background.utils\n\nclass User:\n    def __init__(self, *args):\n        if len(args) > 1:\n            self.email, self.fullname = args[-2:]\n        else:\n            self.fullname, self.email = email.utils.parseaddr(args[0])\n\nclass MailDelivery(background.utils.PeerServer):\n    def __init__(self, credentials):\n        # We disable the automatic administrator mails (using the\n        # 'send_administrator_mails' argument) since\n        #\n        # 1) it's pretty pointless to report mail delivery problems\n        #    via mail, and\n        #\n        # 2) it can cause runaway mail generation, since failure to\n        #    timely deliver the mail delivery problem report emails\n        #    would trigger further automatic problem report emails.\n        #\n        # Instead, we keep track of having encountered any problems,\n        # and send a single administrator mail (\"check the logs\")\n        # after having successfully delivered an email.\n\n        service = configuration.services.MAILDELIVERY\n\n        super(MailDelivery, 
self).__init__(service=service,\n                                           send_administrator_mails=False)\n\n        self.__credentials = credentials\n        self.__connection = None\n        self.__connection_timeout = service.get(\"timeout\")\n        self.__has_logged_warning = 0\n        self.__has_logged_error = 0\n\n        self.register_maintenance(hour=3, minute=45, callback=self.__cleanup)\n\n    def __sendAdministratorMessage(self):\n        from_user = User(configuration.base.SYSTEM_USER_EMAIL, \"Critic System\")\n        recipients = []\n\n        for recipient in configuration.base.SYSTEM_RECIPIENTS:\n            recipients.append(User(recipient))\n\n        if self.__has_logged_warning and self.__has_logged_error:\n            what = \"%d warning%s and %d error%s\" % (self.__has_logged_warning,\n                                                    \"s\" if self.__has_logged_warning > 1 else \"\",\n                                                    self.__has_logged_error,\n                                                    \"s\" if self.__has_logged_error > 1 else \"\")\n        elif self.__has_logged_warning:\n            what = \"%d warning%s\" % (self.__has_logged_warning,\n                                     \"s\" if self.__has_logged_warning > 1 else \"\")\n        else:\n            what = \"%d error%s\" % (self.__has_logged_error,\n                                     \"s\" if self.__has_logged_error > 1 else \"\")\n\n        for to_user in recipients:\n            self.__send(message_id=None,\n                        parent_message_id=None,\n                        headers={},\n                        date=time.time(),\n                        from_user=from_user,\n                        to_user=to_user,\n                        recipients=recipients,\n                        subject=\"maildelivery: check the logs!\",\n                        body=\"%s have been logged.\\n\\n-- critic\\n\" % what,\n                        
try_once=True)\n\n        self.__has_logged_warning = 0\n        self.__has_logged_error = 0\n\n    def run(self):\n        try:\n            sleeptime = 0\n\n            while not self.terminated:\n                self.interrupted = False\n\n                filenames = os.listdir(configuration.paths.OUTBOX)\n                pending = []\n\n                for filename in filenames:\n                    if filename.endswith(\".txt\"):\n                        pending.append(\"%s/%s\" % (configuration.paths.OUTBOX, filename))\n\n                if pending:\n                    self.__connect()\n\n                    # We may have been terminated while attempting to connect.\n                    if self.terminated:\n                        return\n\n                    sleeptime = 0\n\n                    now = time.time()\n\n                    def age(filename):\n                        return now - os.stat(filename).st_ctime\n\n                    too_old = len(filter(lambda filename: age(filename) > 60, pending))\n                    oldest_age = max(map(age, pending))\n\n                    if too_old > 0:\n                        self.warning((\"%d files were created more than 60 seconds ago\\n\"\n                                      \"  The oldest is %s which is %d seconds old.\")\n                                     % (too_old, os.path.basename(filename), oldest_age))\n                        self.__has_logged_warning += 1\n\n                    for filename in sorted(pending):\n                        lines = open(filename).readlines()\n\n                        try:\n                            if self.__send(**eval(lines[0])):\n                                os.rename(filename, \"%s/sent/%s.sent\" % (configuration.paths.OUTBOX, os.path.basename(filename)))\n\n                                if self.__has_logged_warning or self.__has_logged_error:\n                                    try: self.__sendAdministratorMessage()\n                                
    except:\n                                        self.exception()\n                                        self.__has_logged_error += 1\n\n                            # We may have been terminated while attempting to send.\n                            if self.terminated:\n                                return\n                        except:\n                            self.exception()\n                            self.__has_logged_error += 1\n                            os.rename(filename, \"%s/%s.invalid\" % (configuration.paths.OUTBOX, os.path.basename(filename)))\n                            continue\n                else:\n                    self.signal_idle_state()\n\n                    if sleeptime > 25:\n                        self.__disconnect()\n\n                    before = time.time()\n                    timeout = (30 - sleeptime) if self.__connection else self.run_maintenance()\n\n                    self.debug(\"sleeping %d seconds\" % timeout)\n\n                    time.sleep(timeout)\n\n                    if self.interrupted:\n                        self.debug(\"sleep interrupted after %.2f seconds\" % (time.time() - before))\n\n                    sleeptime += (time.time() - before)\n        finally:\n            self.__disconnect()\n\n    def __connect(self):\n        if not self.__connection:\n            attempts = 0\n\n            while not self.terminated:\n                attempts += 1\n\n                try:\n                    if configuration.smtp.USE_SSL:\n                        self.__connection = smtplib.SMTP_SSL(timeout=self.__connection_timeout)\n                    else:\n                        self.__connection = smtplib.SMTP(timeout=self.__connection_timeout)\n\n                    self.__connection.connect(configuration.smtp.HOST, configuration.smtp.PORT)\n\n                    if configuration.smtp.USE_STARTTLS:\n                        self.__connection.starttls()\n\n                    if self.__credentials:\n  
                      self.__connection.login(self.__credentials[\"username\"],\n                                                self.__credentials[\"password\"])\n\n                    self.debug(\"connected\")\n                    return\n                except:\n                    self.debug(\"failed to connect to SMTP server\")\n                    if (attempts % 5) == 0:\n                        self.error(\"Failed to connect to SMTP server %d times.  \"\n                                   \"Will keep retrying.\" % attempts)\n                        self.__has_logged_error += 1\n                    self.__connection = None\n\n                seconds = min(60, 2 ** attempts)\n\n                self.debug(\"sleeping %d seconds\" % seconds)\n\n                time.sleep(seconds)\n\n    def __disconnect(self):\n        if self.__connection:\n            try:\n                self.__connection.quit()\n                self.debug(\"disconnected\")\n            except: pass\n\n            self.__connection = None\n\n    def __send(self, message_id, parent_message_id, headers, from_user, to_user, recipients, subject, body, **kwargs):\n        def isascii(s):\n            return all(ord(c) < 128 for c in s)\n\n        def usersAsHeader(users, header_name):\n            header = email.header.Header(header_name=header_name)\n\n            for index, user in enumerate(users):\n                if isascii(user.fullname):\n                    header.append(user.fullname, \"us-ascii\")\n                else:\n                    header.append(user.fullname, \"utf-8\")\n                if index < len(users) - 1:\n                    header.append(\"<%s>,\" % user.email, \"us-ascii\")\n                else:\n                    header.append(\"<%s>\" % user.email, \"us-ascii\")\n\n            return header\n\n        def stringAsHeader(s, name):\n            if isascii(s): return email.header.Header(s, \"us-ascii\", header_name=name)\n            else: return 
email.header.Header(s, \"utf-8\", header_name=name)\n\n        message = email.mime.text.MIMEText(body, \"plain\", \"utf-8\")\n        recipients = filter(lambda user: bool(user.email), recipients)\n\n        if not to_user.email:\n            return True\n\n        if message_id:\n            message_id = \"<%s@%s>\" % (message_id, configuration.base.HOSTNAME)\n            message[\"Message-ID\"] = message_id\n        else:\n            message_id = \"N/A\"\n\n        if parent_message_id:\n            message[\"In-Reply-To\"] = parent_message_id\n            message[\"References\"] = parent_message_id\n\n        message[\"From\"] = usersAsHeader([from_user], \"From\")\n        message[\"To\"] = usersAsHeader(recipients, \"To\")\n        message[\"Subject\"] = stringAsHeader(subject, \"Subject\")\n        message[\"Date\"] = email.utils.formatdate(kwargs.get(\"date\", time.time()))\n\n        for name, value in headers.items():\n            message[name] = value\n\n        self.debug(\"%s => %s (%s)\" % (from_user.email, to_user.email, message_id))\n\n        # Used from __sendAdministratorMessage(); we'll try once to send it even\n        # if self.terminated.\n        try_once = kwargs.get(\"try_once\", False)\n\n        attempts = 0\n\n        while try_once or not self.terminated:\n            try_once = False\n\n            try:\n                self.__connection.sendmail(configuration.base.SYSTEM_USER_EMAIL, [to_user.email], message.as_string())\n                return True\n            except:\n                self.exception()\n                self.__has_logged_error += 1\n\n                if self.terminated:\n                    return False\n\n                attempts += 1\n                sleeptime = min(60, 2 ** attempts)\n\n                self.error(\"delivery failure: sleeping %d seconds\" % sleeptime)\n\n                self.__disconnect()\n                time.sleep(sleeptime)\n                self.__connect()\n\n        # We were terminated 
before the mail was sent.  Return false to keep the\n        # mail in the outbox for later delivery.\n        return False\n\n    def __cleanup(self):\n        now = time.time()\n        deleted = 0\n\n        for filename in os.listdir(os.path.join(configuration.paths.OUTBOX, \"sent\")):\n            if filename.endswith(\".txt.sent\"):\n                filename = os.path.join(configuration.paths.OUTBOX, \"sent\", filename)\n                age = now - os.stat(filename).st_ctime\n\n                if age > 7 * 24 * 60 * 60:\n                    os.unlink(filename)\n                    deleted += 1\n\n        if deleted:\n            self.info(\"deleted %d files from %s\"\n                      % (deleted, os.path.join(configuration.paths.OUTBOX, \"sent\")))\n\ndef start_service():\n    stdin_data = sys.stdin.read()\n\n    if stdin_data:\n        credentials = json.loads(stdin_data)[\"credentials\"]\n        if not credentials.get(\"username\") or not credentials.get(\"password\"):\n            credentials = None\n    else:\n        credentials = None\n\n    maildelivery = MailDelivery(credentials)\n    return maildelivery.start()\n\nbackground.utils.call(\"maildelivery\", start_service)\n"
  },
  {
    "path": "src/background/maintenance.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\nimport shutil\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\nimport dbutils\nimport gitutils\nimport background.utils\n\nclass Maintenance(background.utils.BackgroundProcess):\n    def __init__(self):\n        service = configuration.services.MAINTENANCE\n\n        super(Maintenance, self).__init__(service=service)\n\n        hour, minute = service[\"maintenance_at\"]\n        self.register_maintenance(hour=hour, minute=minute, callback=self.__maintenance)\n\n    def run(self):\n        with dbutils.Database.forSystem() as db:\n            # Do an initial load/update of timezones.\n            #\n            # The 'timezones' table initially (post-installation) only contains\n            # the Universal/UTC timezone; this call adds all the others that the\n            # PostgreSQL database server knows about.\n            dbutils.loadTimezones(db)\n\n        super(Maintenance, self).run()\n\n    def __maintenance(self):\n        with dbutils.Database.forSystem() as db:\n            cursor = db.cursor()\n\n            # Update the UTC offsets of all timezones.\n            #\n            # The PostgreSQL database server has accurate (DST-adjusted) values,\n            # but is very slow to query, so 
we cache the UTC offsets in our\n            # 'timezones' table.  This call updates that cache every night.\n            # (This is obviously a no-op most nights, but we don't want to have\n            # to care about which nights it isn't.)\n            self.debug(\"updating timezones\")\n            dbutils.updateTimezones(db)\n\n            if self.terminated:\n                return\n\n            # Execute scheduled review branch archivals.\n            if configuration.base.ARCHIVE_REVIEW_BRANCHES:\n                repository = None\n\n                cursor.execute(\"\"\"SELECT branches.repository, branches.id, branches.name\n                                    FROM scheduledreviewbrancharchivals\n                                    JOIN reviews ON (reviews.id=scheduledreviewbrancharchivals.review)\n                                    JOIN branches ON (branches.id=reviews.branch)\n                                   WHERE scheduledreviewbrancharchivals.deadline <= NOW()\n                                     AND reviews.state IN ('closed', 'dropped')\n                                     AND NOT branches.archived\n                                ORDER BY branches.repository\"\"\",\n                               for_update=True)\n\n                for repository_id, branch_id, branch_name in cursor:\n                    if not repository or repository.id != repository_id:\n                        if repository:\n                            repository.stopBatch()\n                        repository = gitutils.Repository.fromId(db, repository_id)\n                        self.info(\"archiving branches in: \" + repository.name)\n\n                    self.info(\"  \" + branch_name)\n\n                    branch = dbutils.Branch.fromId(db, branch_id, repository=repository)\n\n                    try:\n                        branch.archive(db)\n                    except Exception:\n                        self.exception(as_warning=True)\n\n                # 
Since NOW() returns the same value each time within a single\n                # transaction, this is guaranteed to delete only the set of\n                # archivals we selected above.\n                cursor.execute(\"\"\"DELETE\n                                    FROM scheduledreviewbrancharchivals\n                                   WHERE deadline <= NOW()\"\"\")\n\n                db.commit()\n\n            # Run a garbage collect in all Git repositories, to keep them neat\n            # and tidy.  Also pack keepalive refs.\n            cursor.execute(\"SELECT name FROM repositories\")\n            for (repository_name,) in cursor:\n                self.debug(\"repository GC: %s\" % repository_name)\n                try:\n                    repository = gitutils.Repository.fromName(db, repository_name)\n                    repository.packKeepaliveRefs()\n                    repository.run(\"gc\", \"--prune=1 day\", \"--quiet\")\n                    repository.stopBatch()\n                except Exception:\n                    self.exception(\"repository GC failed: %s\" % repository_name)\n\n                if self.terminated:\n                    return\n\n            if configuration.extensions.ENABLED:\n                now = time.time()\n                max_age = 7 * 24 * 60 * 60\n\n                base_path = os.path.join(configuration.paths.DATA_DIR,\n                                         \"temporary\", \"EXTENSIONS\")\n\n                for user_name in os.listdir(base_path):\n                    user_dir = os.path.join(base_path, user_name)\n\n                    for extension_id in os.listdir(user_dir):\n                        extension_dir = os.path.join(user_dir, extension_id)\n\n                        for repository_name in os.listdir(extension_dir):\n                            repository_dir = os.path.join(extension_dir,\n                                                          repository_name)\n                            age = now - 
os.stat(repository_dir).st_mtime\n\n                            if age > max_age:\n                                self.info(\"Removing repository work copy: %s\"\n                                          % repository_dir)\n                                shutil.rmtree(repository_dir)\n\ndef start_service():\n    maintenance = Maintenance()\n    return maintenance.start()\n\nbackground.utils.call(\"maintenance\", start_service)\n"
  },
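The maintenance service above schedules its nightly callback through `register_maintenance()` / `run_maintenance()` in `background.utils`: each hook remembers when it last ran, the next HH:MM deadline is advanced past that point, and the service sleeps at least 60 seconds even if that slightly overshoots. A minimal self-contained sketch of that deadline arithmetic (Python 3; `seconds_until_daily` is a hypothetical helper name for illustration, not part of Critic):

```python
import datetime

def seconds_until_daily(hour, minute, last_run, now):
    """Seconds to sleep until the next daily HH:MM deadline.

    Mirrors run_maintenance(): advance the deadline in whole-day steps
    until it is strictly after the last run, then sleep at least 60
    seconds, matching the "maintenance tasks are not that sensitive"
    floor in the original code.
    """
    scheduled = datetime.datetime.combine(now.date(),
                                          datetime.time(hour, minute))
    while scheduled <= last_run:
        # Already ran the callback this day; try the next occurrence.
        scheduled += datetime.timedelta(days=1)
    return max((scheduled - now).total_seconds(), 60)
```

For example, with maintenance configured at 03:00 and a last run yesterday, a call at 02:00 yields a one-hour sleep; a call just after today's run yields roughly 23 hours.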
  {
    "path": "src/background/servicemanager.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport subprocess\nimport time\nimport signal\nimport os\nimport json\nimport glob\n\nimport configuration\n\n# Number of seconds to wait for startup synchronization.\nSTARTUP_SYNC_TIMEOUT = 30\n\nif \"--slave\" in sys.argv:\n    import background.utils\n\n    class ServiceManager(background.utils.PeerServer):\n        # The master process manages our pid file, so tell our base class to\n        # leave it alone.\n        manage_pidfile = False\n\n        class Service(object):\n            class Process(background.utils.PeerServer.ChildProcess):\n                def __init__(self, service, input_data):\n                    super(ServiceManager.Service.Process, self).__init__(\n                        service.manager, [sys.executable, \"-m\", service.module],\n                        stderr=subprocess.STDOUT)\n\n                    self.__service = service\n                    self.__output = None\n                    if input_data:\n                        self.write(json.dumps(input_data))\n                    self.close()\n\n                def handle_input(self, _file, data):\n                    self.__output = data\n\n                def destroy(self):\n                    super(ServiceManager.Service.Process, self).destroy()\n                    
self.__service.stopped(self.returncode, self.__output)\n\n            def __init__(self, manager, service_data):\n                self.manager = manager\n                self.name = service_data[\"name\"]\n                self.module = service_data[\"module\"]\n                self.started = None\n                self.process = None\n                self.callbacks = []\n\n            def signal_callbacks(self, event):\n                self.callbacks = filter(lambda callback: callback(event), self.callbacks)\n\n            def start(self, input_data):\n                self.process = ServiceManager.Service.Process(self, input_data)\n                self.started = time.time()\n                self.manager.add_peer(self.process)\n                self.manager.info(\"%s: started (pid=%d)\" % (self.name, self.process.pid))\n                self.input_data = input_data\n                self.signal_callbacks(\"started\")\n\n            def restart(self, callback=None):\n                if callback:\n                    self.callbacks.append(callback)\n                self.start(self.input_data)\n\n            def stop(self, callback=None):\n                if callback:\n                    self.callbacks.append(callback)\n                if self.process:\n                    self.manager.info(\"%s: sending process SIGTERM\" % self.name)\n                    self.process.kill(signal.SIGTERM)\n\n            def stopped(self, returncode, output):\n                restart = not self.manager.terminated and not self.manager.restart_requested\n                if returncode != 0:\n                    message = \"%s: exited with returncode %d\" % (self.name, returncode)\n                    if output:\n                        message += \"\\n\" + background.utils.indent(output)\n                    if time.time() - self.started < 1:\n                        message += \"\\n  Process restarted less than 1 second ago; not restarting.\"\n                        restart = False\n        
            self.manager.error(message)\n                else:\n                    self.manager.info(\"%s: exited normally\" % self.name)\n                    if not self.callbacks:\n                        restart = False\n                self.process = None\n                if restart:\n                    self.restart()\n                else:\n                    self.signal_callbacks(\"stopped\")\n\n        class Client(background.utils.PeerServer.SocketPeer):\n            def __init__(self, manager, peersocket):\n                super(ServiceManager.Client, self).__init__(manager, peersocket)\n                self.__manager = manager\n\n            def send_response(self, value):\n                self.write(background.utils.json_encode(value))\n                self.close()\n\n            def handle_input(self, _file, data):\n                result = self.send_response\n\n                try:\n                    request = background.utils.json_decode(data)\n                except:\n                    return result({ \"status\": \"error\",\n                                    \"error\": \"invalid input: JSON decode failed\" })\n\n                if type(request) is not dict:\n                    return result({ \"status\": \"error\",\n                                    \"error\": \"invalid input: expected object\" })\n\n                if request.get(\"query\") == \"status\":\n                    services = { \"manager\": { \"module\": \"background.servicemanager\",\n                                              \"uptime\": time.time() - self.__manager.started,\n                                              \"pid\": os.getpid() }}\n\n                    for service in self.__manager.services:\n                        uptime = time.time() - service.started if service.started else -1\n                        pid = service.process.pid if service.process else -1\n                        services[service.name] = { \"module\": service.module,\n                     
                              \"uptime\": uptime,\n                                                   \"pid\": pid }\n\n                    return result({ \"status\": \"ok\", \"services\": services })\n                elif request.get(\"command\") == \"restart\":\n                    if \"service\" not in request:\n                        return result({ \"status\": \"error\",\n                                        \"error\": \"invalid input: no service specified\" })\n                    if request[\"service\"] == \"manager\":\n                        self.__manager.info(\"restart requested\")\n                        self.__manager.requestRestart()\n                        return result({ \"status\": \"ok\" })\n                    for service in self.__manager.services:\n                        if service.name == request.get(\"service\"):\n                            self.__manager.info(\"%s: restart requested\" % service.name)\n                            def callback(event):\n                                self.send_response({ \"status\": \"ok\",\n                                                     \"event\": event })\n                            if service.process: service.stop(callback)\n                            else: service.restart(callback)\n                            break\n                    else:\n                        return result({ \"status\": \"error\", \"error\": \"%s: no such service\" % request.get(\"service\") })\n                else:\n                    return result({ \"status\": \"error\", \"error\": \"invalid input: unsupported data\" })\n\n        def __init__(self, input_data):\n            service = configuration.services.SERVICEMANAGER.copy()\n\n            super(ServiceManager, self).__init__(service=service)\n\n            self.input_data = input_data\n            self.services = []\n            self.started = time.time()\n\n        def handle_peer(self, peersocket, peeraddress):\n            return 
ServiceManager.Client(self, peersocket)\n\n        def startup(self):\n            super(ServiceManager, self).startup()\n\n            for service_data in configuration.services.SERVICEMANAGER[\"services\"]:\n                starting_path = service_data[\"pidfile_path\"] + \".starting\"\n                with open(starting_path, \"w\") as starting:\n                    starting.write(\"%s\\n\" % time.ctime())\n\n                service = ServiceManager.Service(self, service_data)\n                service.start(self.input_data.get(service.name))\n                self.services.append(service)\n\n        def shutdown(self):\n            for service in self.services:\n                service.stop()\n\n            super(ServiceManager, self).shutdown()\n\n        def requestRestart(self):\n            super(ServiceManager, self).requestRestart()\n\n            for service in self.services:\n                service.stop()\n\n    def start_service():\n        stdin_data = sys.stdin.read()\n\n        if stdin_data:\n            input_data = json.loads(stdin_data)\n        else:\n            input_data = {}\n\n        manager = ServiceManager(input_data)\n        return manager.start()\n\n    background.utils.call(\"servicemanager\", start_service)\nelse:\n    import errno\n    import pwd\n    import grp\n    import stat\n\n    pwentry = pwd.getpwnam(configuration.base.SYSTEM_USER_NAME)\n    grentry = grp.getgrnam(configuration.base.SYSTEM_GROUP_NAME)\n\n    uid = pwentry.pw_uid\n    gid = grentry.gr_gid\n    home = pwentry.pw_dir\n\n    import daemon\n\n    pidfile_path = configuration.services.SERVICEMANAGER[\"pidfile_path\"]\n\n    if os.path.isfile(pidfile_path):\n        print >>sys.stderr, \"%s: file exists; daemon already running?\" % pidfile_path\n        sys.exit(1)\n\n    # Our RUN_DIR (/var/run/critic/IDENTITY) is typically on a tmpfs that gets\n    # nuked on reboot, so recreate it with the right access if it doesn't exist.\n\n    def mkdir(path, mode):\n        
if not os.path.isdir(path):\n            if not os.path.isdir(os.path.dirname(path)):\n                mkdir(os.path.dirname(path), mode)\n            os.mkdir(path, mode)\n        else:\n            os.chmod(path, mode)\n        os.chown(path, uid, gid)\n\n    mkdir(configuration.paths.RUN_DIR, 0755 | stat.S_ISUID | stat.S_ISGID)\n    mkdir(os.path.join(configuration.paths.RUN_DIR, \"sockets\"), 0755)\n    mkdir(os.path.join(configuration.paths.RUN_DIR, \"wsgi\"), 0750)\n\n    os.environ[\"HOME\"] = home\n    os.chdir(home)\n\n    smtp_credentials_path = os.path.join(configuration.paths.CONFIG_DIR,\n                                         \"configuration\",\n                                         \"smtp-credentials.json\")\n    if os.path.isfile(smtp_credentials_path):\n        with open(smtp_credentials_path) as smtp_credentials_file:\n            smtp_credentials = json.load(smtp_credentials_file)\n    else:\n        smtp_credentials = None\n\n    input_data = { \"maildelivery\": { \"credentials\": smtp_credentials }}\n\n    os.setgid(gid)\n    os.setuid(uid)\n\n    starting_pattern = os.path.join(os.path.dirname(pidfile_path), \"*.starting\")\n\n    # Remove any stale/unexpected *.starting files that would otherwise break\n    # our startup synchronization.\n    for filename in glob.glob(starting_pattern):\n        try:\n            os.unlink(filename)\n        except OSError as error:\n            print >>sys.stderr, error\n\n    with open(pidfile_path + \".starting\", \"w\") as starting:\n        starting.write(\"%s\\n\" % time.ctime())\n\n    def wait_for_startup_sync():\n        deadline = time.time() + STARTUP_SYNC_TIMEOUT\n        while True:\n            filenames = glob.glob(starting_pattern)\n            if not filenames:\n                return 0\n            if time.time() > deadline:\n                break\n            time.sleep(0.1)\n        print >>sys.stderr\n        print >>sys.stderr, (\"Startup synchronization timeout after %d seconds!\"\n 
                            % STARTUP_SYNC_TIMEOUT)\n        print >>sys.stderr, \"Services still starting:\"\n        for filename in filenames:\n            print >>sys.stderr, \"  \" + os.path.basename(filename)\n        return 1\n\n    with open(pidfile_path, \"w\") as pidfile:\n        daemon.detach(parent_exit_hook=wait_for_startup_sync)\n        pidfile.write(\"%s\\n\" % os.getpid())\n\n    os.umask(022)\n\n    was_terminated = False\n\n    def terminated(signum, frame):\n        global was_terminated\n        was_terminated = True\n\n    signal.signal(signal.SIGTERM, terminated)\n\n    while not was_terminated:\n        process = subprocess.Popen(\n            [sys.executable, \"-m\", \"background.servicemanager\", \"--slave\"],\n            stdin=subprocess.PIPE)\n\n        process.stdin.write(json.dumps(input_data))\n        process.stdin.close()\n\n        while not was_terminated:\n            try:\n                pid, status = os.wait()\n                if pid == process.pid:\n                    process = None\n                    break\n            except OSError as error:\n                if error.errno == errno.EINTR: continue\n                else: break\n\n    if process:\n        try:\n            process.send_signal(signal.SIGTERM)\n            process.wait()\n        except:\n            pass\n\n    try:\n        os.unlink(pidfile_path)\n    except:\n        pass\n"
  },
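The restart policy buried in `Service.stopped()` above can be read as a pure predicate: a crashed service is restarted unless it died within a second of starting (a likely crash loop), while a clean exit only leads to a restart when one was explicitly requested via a pending callback, and nothing restarts while the manager itself is terminating or restarting. A sketch of that decision in isolation (Python 3; `should_restart` is a hypothetical name, not Critic API):

```python
def should_restart(returncode, uptime_seconds, terminating, has_callbacks):
    """Restart decision mirroring ServiceManager.Service.stopped()."""
    # Never restart while the manager is shutting down or restarting.
    if terminating:
        return False
    if returncode != 0:
        # A crash is retried, unless the process died within a second
        # of starting -- that usually means a crash loop, so give up.
        return uptime_seconds >= 1.0
    # A clean exit only triggers a restart when one was explicitly
    # requested (i.e. a restart callback is pending).
    return has_callbacks
```

The one-second threshold is what keeps a service with a broken import or bad configuration from being respawned in a tight loop and flooding the log (and the administrator's mailbox) with identical crash reports.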
  {
    "path": "src/background/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport subprocess\nimport os\nimport logging\nimport logging.handlers\nimport atexit\nimport socket\nimport errno\nimport select\nimport traceback\nimport signal\nimport fcntl\nimport time\nimport datetime\n\nimport configuration\nfrom textutils import json_encode, json_decode, indent\n\ndef freeze(d):\n    return tuple(sorted(d.items()))\ndef thaw(f):\n    return dict(f)\n\nclass AdministratorMailHandler(logging.Handler):\n    def __init__(self, logfile_path):\n        super(AdministratorMailHandler, self).__init__()\n        self.__logfile_name = os.path.basename(logfile_path)\n\n    def emit(self, record):\n        import mailutils\n        try:\n            import dbutils\n            db = dbutils.Database.forSystem()\n        except:\n            db = None\n        mailutils.sendAdministratorErrorReport(db, self.__logfile_name,\n                                                   record.message.splitlines()[0],\n                                                   self.formatter.format(record))\n        if db:\n            db.close()\n\nclass BackgroundProcess(object):\n    # Set to False in sub-class to disable pid file creation and deletion.\n    manage_pidfile = True\n\n    def __init__(self, service, send_administrator_mails=True):\n        try: loglevel = getattr(logging, 
service[\"loglevel\"].upper())\n        except: loglevel = logging.INFO\n\n        formatter = logging.Formatter(\"%(asctime)s - %(levelname)5s - %(message)s\")\n\n        file_handler = logging.handlers.RotatingFileHandler(service[\"logfile_path\"], maxBytes=1024**2, backupCount=5)\n        file_handler.setFormatter(formatter)\n        file_handler.setLevel(loglevel)\n\n        logger = logging.getLogger()\n        logger.setLevel(loglevel)\n        logger.addHandler(file_handler)\n\n        if send_administrator_mails:\n            mail_handler = AdministratorMailHandler(service[\"logfile_path\"])\n            mail_handler.setFormatter(formatter)\n            mail_handler.setLevel(logging.WARNING)\n            logger.addHandler(mail_handler)\n\n        self.terminated = False\n        self.interrupted = False\n        self.restart_requested = False\n        self.synchronize_when_idle = False\n        self.force_maintenance = False\n\n        self.__maintenance_hooks = []\n        self.__logger = logger\n        self.__pidfile_path = service[\"pidfile_path\"]\n        self.__create_pidfile()\n\n        signal.signal(signal.SIGHUP, self.__handle_SIGHUP)\n        signal.signal(signal.SIGTERM, self.__handle_SIGTERM)\n        signal.signal(signal.SIGUSR1, self.__handle_SIGUSR1)\n        signal.signal(signal.SIGUSR2, self.__handle_SIGUSR2)\n\n        self.info(\"service started\")\n\n        atexit.register(self.__stopped)\n\n    def __handle_SIGHUP(self, signum, frame):\n        self.interrupted = True\n    def __handle_SIGTERM(self, signum, frame):\n        self.terminated = True\n    def __handle_SIGUSR1(self, signum, frame):\n        # Used for synchronization during testing.\n        #\n        # Someone<TM> creates a file named \"<pidfile_path>.busy\", then sends\n        # SIGUSR1, and expects the file to be deleted as soon as this service\n        # reaches an idle point.\n        self.synchronize_when_idle = True\n    def __handle_SIGUSR2(self, signum, 
frame):\n        # Used for running maintenance tasks during testing.\n        #\n        # Works the same way as SIGUSR1, but additionally makes sure to run all\n        # scheduled maintenance tasks before reporting an idle state.\n        self.force_maintenance = True\n        self.synchronize_when_idle = True\n\n    def __create_pidfile(self):\n        if self.manage_pidfile:\n            try: os.makedirs(os.path.dirname(self.__pidfile_path))\n            except OSError as error:\n                if error.errno == errno.EEXIST: pass\n                else: raise\n            pidfile = open(self.__pidfile_path, \"w\")\n            pidfile.write(str(os.getpid()) + \"\\n\")\n            pidfile.close()\n\n    def __delete_pidfile(self):\n        if self.manage_pidfile:\n            try: os.unlink(self.__pidfile_path)\n            except: pass\n\n    def __signal_started(self):\n        try:\n            os.unlink(self.__pidfile_path + \".starting\")\n        except OSError as error:\n            if error.errno != errno.ENOENT:\n                self.exception()\n        except Exception:\n            self.exception()\n\n    def __stopped(self):\n        self.info(\"service stopped\")\n        self.__delete_pidfile()\n\n    def start(self):\n        try:\n            try:\n                self.startup()\n            except Exception:\n                self.exception()\n                self.__signal_started()\n                return 1\n\n            self.__signal_started()\n\n            try:\n                return self.run() or 0\n            except Exception:\n                self.exception()\n                return 1\n        finally:\n            try:\n                self.shutdown()\n            except Exception:\n                self.exception()\n                return 1\n\n    def startup(self):\n        pass\n    def shutdown(self):\n        pass\n\n    def signal_idle_state(self):\n        if self.synchronize_when_idle:\n            if 
self.force_maintenance:\n                self.run_maintenance()\n                self.force_maintenance = False\n            os.unlink(self.__pidfile_path + \".busy\")\n            self.synchronize_when_idle = False\n\n    def error(self, message):\n        self.__logger.error(message)\n\n    def warning(self, message):\n        self.__logger.warning(message)\n\n    def info(self, message):\n        self.__logger.info(message)\n\n    def debug(self, message):\n        self.__logger.debug(message)\n\n    def exception(self, message=None, as_warning=False):\n        backtrace = traceback.format_exc()\n        if message is None:\n            message = \"unhandled exception: \" + backtrace.splitlines()[-1]\n        if as_warning:\n            self.__logger.warning(message + \"\\n\" + indent(backtrace))\n        else:\n            self.__logger.error(message + \"\\n\" + indent(backtrace))\n\n    def register_maintenance(self, hour, minute, callback):\n        self.__maintenance_hooks.append(\n            [hour, minute, callback, datetime.datetime.now()])\n\n    def run_maintenance(self):\n        if self.__maintenance_hooks:\n            sleep_seconds = 86400\n\n            for hook in self.__maintenance_hooks:\n                hour, minute, callback, last = hook\n                now = datetime.datetime.now()\n\n                if hour is None:\n                    scheduled_at = datetime.time(now.hour, minute)\n                    interval = datetime.timedelta(seconds=3600)\n                    interval_type = \"hourly\"\n                else:\n                    scheduled_at = datetime.time(hour, minute)\n                    interval = datetime.timedelta(days=1)\n                    interval_type = \"daily\"\n\n                scheduled_at = datetime.datetime.combine(datetime.date.today(),\n                                                         scheduled_at)\n\n                while scheduled_at <= last:\n                    # We already ran the callback this 
hour/day.\n                    scheduled_at += interval\n\n                if scheduled_at <= now:\n                    self.info(\"performing %s maintenance task\" % interval_type)\n                    callback()\n                    hook[3] = scheduled_at\n                    scheduled_at += interval\n                elif self.force_maintenance:\n                    self.info(\"performing %s maintenance task (forced)\" % interval_type)\n                    callback()\n\n                now = datetime.datetime.now()\n                seconds_remaining = (scheduled_at - now).total_seconds()\n\n                # Wait at least 60 seconds, even if that would make us over-\n                # shoot the deadline slightly.  Maintenance tasks are not really\n                # that sensitive.\n                seconds_remaining = max(seconds_remaining, 60)\n\n                sleep_seconds = min(sleep_seconds, seconds_remaining)\n\n            return sleep_seconds\n\n    def run(self):\n        while not self.terminated:\n            # Aside from scheduled maintenance task, this service is always\n            # idle, so...\n            self.signal_idle_state()\n\n            timeout = self.run_maintenance()\n\n            if timeout is None:\n                # No configured maintenance hooks; nothing to do.  
Returning will\n                # probably cause service to terminate, and we just started, so the\n                # service manager will leave the service not running.\n                return 0\n\n            self.debug(\"sleeping %d seconds\" % timeout)\n            time.sleep(timeout)\n\n    def requestRestart(self):\n        self.restart_requested = True\n\nclass PeerServer(BackgroundProcess):\n    class Peer(object):\n        def __init__(self, server, writing, *reading, **kwargs):\n            self.server = server\n            self.deadline = kwargs.get(\"deadline\", None)\n\n            self.__writing = writing\n            self.__write_data = \"\"\n            self.__write_closed = False\n            self.__write_failed = False\n\n            if writing:\n                fcntl.fcntl(writing, fcntl.F_SETFL, fcntl.fcntl(writing, fcntl.F_GETFL) | os.O_NONBLOCK)\n\n            self.__reading = list(reading)\n            self.__read_data = [\"\"] * len(reading)\n            self.__read_closed = [False] * len(reading)\n\n            for readfile in reading:\n                if readfile and readfile.fileno() != writing.fileno():\n                    fcntl.fcntl(readfile, fcntl.F_SETFL, fcntl.fcntl(readfile, fcntl.F_GETFL) | os.O_NONBLOCK)\n\n            self.__timed_out = False\n\n        def timed_out(self):\n            self.__timed_out = True\n            self.__writing = None\n            self.__reading = []\n\n        def is_finished(self):\n            return not self.__writing and not any(self.__reading)\n\n        def writing(self):\n            if self.__write_data or self.__write_closed: return self.__writing\n            else: return None\n\n        def write(self, data):\n            assert self.__writing\n            assert not self.__write_closed\n            self.__write_data += data\n\n        def close(self):\n            assert self.__writing\n            assert not self.__write_closed\n            self.__write_closed = True\n\n        def 
do_write(self):\n            try:\n                while self.__write_data:\n                    nwritten = os.write(self.__writing.fileno(), self.__write_data)\n                    self.__write_data = self.__write_data[nwritten:]\n            except EnvironmentError as error:\n                if error.errno in (errno.EAGAIN, errno.EINTR):\n                    raise\n                self.server.warning(\"Failed to write to peer: %s\" % error)\n                if error.errno == errno.EPIPE:\n                    self.__write_failed = True\n                else:\n                    raise\n            if self.__write_closed or self.__write_failed:\n                self.writing_done(self.__writing)\n                self.__writing = None\n\n        def reading(self):\n            return [readfile if not closed else None\n                    for readfile, closed in zip(self.__reading,\n                                                self.__read_closed)]\n\n        def read(self):\n            return [data if closed else None\n                    for data, closed in zip(self.__read_data,\n                                            self.__read_closed)]\n\n        def do_read(self, index):\n            while True:\n                readfile = self.__reading[index]\n                read = os.read(readfile.fileno(), 4096)\n                if not read:\n                    self.reading_done(readfile)\n                    self.__reading[index] = None\n                    self.__read_closed[index] = True\n                    self.handle_input(readfile, self.__read_data[index])\n                    break\n                self.__read_data[index] += read\n\n        def writing_done(self, writing):\n            writing.close()\n\n        def reading_done(self, reading):\n            reading.close()\n\n        def destroy(self):\n            pass\n\n    class SocketPeer(Peer):\n        def __init__(self, server, clientsocket):\n            super(PeerServer.SocketPeer, 
self).__init__(server, clientsocket, clientsocket)\n\n        def reading_done(self, reading):\n            reading.shutdown(socket.SHUT_RD)\n\n        def writing_done(self, writing):\n            writing.shutdown(socket.SHUT_WR)\n\n        def handle_input(self, _file, data):\n            pass\n\n    class SpawnedProcess(Peer):\n        def __init__(self, server, process, **kwargs):\n            self.process = process\n            self.pid = process.pid\n            super(PeerServer.SpawnedProcess, self).__init__(\n                server,\n                self.process.stdin, self.process.stdout, self.process.stderr,\n                **kwargs)\n\n        def kill(self, signal):\n            self.process.send_signal(signal)\n\n        def destroy(self):\n            self.process.wait()\n            self.returncode = self.process.returncode\n            self.check_result()\n\n        def timed_out(self):\n            super(PeerServer.SpawnedProcess, self).timed_out()\n            self.kill(signal.SIGKILL)\n\n        def check_result(self):\n            if self.returncode:\n                self.server.error(\"child process exited (pid=%d, returncode=%d)\" % (self.pid, self.returncode))\n            else:\n                self.server.debug(\"child process exited (pid=%d, returncode=0)\" % self.pid)\n\n    class ChildProcess(SpawnedProcess):\n        def __init__(self, server, args, **kwargs):\n            process = subprocess.Popen(\n                args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, **kwargs)\n            super(PeerServer.ChildProcess, self).__init__(server, process)\n            self.server.debug(\"spawned child process (pid=%d)\" % self.process.pid)\n\n    def __init__(self, service, **kwargs):\n        super(PeerServer, self).__init__(service, **kwargs)\n\n        self.__peers = []\n        self.__address = service.get(\"address\")\n\n    def __create_listening_socket(self):\n        if type(self.__address) == str:\n            try: 
os.makedirs(os.path.dirname(self.__address))\n            except OSError as error:\n                if error.errno == errno.EEXIST: pass\n                else: raise\n\n            if os.path.exists(self.__address):\n                connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n                try:\n                    connection.connect(self.__address)\n                    connection.close()\n\n                    print >>sys.stderr, \"ERROR: Server already started!\"\n                    sys.exit(1)\n                except socket.error as error:\n                    if error[0] == errno.ECONNREFUSED:\n                        self.debug(\"removing stale socket\")\n                        os.unlink(self.__address)\n                    else: raise\n\n            self.__listening_socket = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n            self.__listening_socket.setblocking(0)\n            self.__listening_socket.bind(self.__address)\n            self.__listening_socket.listen(4)\n\n            os.chmod(self.__address, 0700)\n\n            self.debug(\"listening: %s\" % self.__address)\n        elif type(self.__address) == tuple:\n            host, port = self.__address\n\n            self.__listening_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n            self.__listening_socket.setblocking(0)\n            self.__listening_socket.bind((host, port))\n            self.__listening_socket.listen(4)\n\n            self.debug(\"listening: %s:%d\" % (host, port))\n        elif self.__address is None:\n            self.__listening_socket = open(\"/dev/null\", \"r\")\n\n            self.debug(\"not listening\")\n        else:\n            raise Exception(\"invalid address: %r\" % self.__address)\n\n        atexit.register(self.__destroy_listening_socket)\n\n    def __destroy_listening_socket(self):\n        try: self.__listening_socket.close()\n        except: pass\n\n        if type(self.__address) == str:\n            try: 
os.unlink(self.__address)\n            except: pass\n\n    def run(self):\n        while not self.terminated:\n            self.interrupted = False\n\n            if self.restart_requested:\n                if not self.__peers:\n                    break\n                else:\n                    self.debug(\"restart delayed; have %d peers\" % len(self.__peers))\n\n            poll = select.poll()\n            poll.register(self.__listening_socket, select.POLLIN)\n\n            writing_map = {}\n            reading_map = {}\n\n            def fileno(file):\n                if file:\n                    return file.fileno()\n                else:\n                    return None\n\n            nearest_peer_deadline = None\n\n            for peer in self.__peers:\n                if peer.writing():\n                    poll.register(peer.writing(), select.POLLOUT)\n                    writing_map[peer.writing().fileno()] = peer\n                for index, readfile in enumerate(peer.reading()):\n                    if readfile:\n                        poll.register(readfile, select.POLLIN)\n                        reading_map[readfile.fileno()] = (peer, index)\n                if peer.deadline is not None:\n                    if nearest_peer_deadline is None:\n                        nearest_peer_deadline = peer.deadline\n                    else:\n                        nearest_peer_deadline = min(peer.deadline,\n                                                    nearest_peer_deadline)\n\n            while not self.terminated:\n                timeout_seconds = self.run_maintenance()\n\n                if timeout_seconds:\n                    if not self.__peers:\n                        self.debug(\"next maintenance task check scheduled in %d seconds\"\n                                   % timeout_seconds)\n\n                if nearest_peer_deadline is not None:\n                    deadline_seconds = nearest_peer_deadline - time.time()\n                    if 
deadline_seconds < 0:\n                        deadline_seconds = 0\n                    if timeout_seconds is None:\n                        timeout_seconds = deadline_seconds\n                    else:\n                        timeout_seconds = min(timeout_seconds, deadline_seconds)\n\n                if timeout_seconds:\n                    timeout_ms = timeout_seconds * 1000\n                else:\n                    timeout_ms = None\n\n                if self.synchronize_when_idle and not self.__peers:\n                    # We seem to be idle, but poll once, non-blocking,\n                    # just to be sure.\n                    timeout_ms = 0\n\n                try:\n                    events = poll.poll(timeout_ms)\n                    break\n                except select.error as error:\n                    if error[0] == errno.EINTR:\n                        continue\n                    else:\n                        raise\n\n            if self.terminated:\n                break\n            elif not (self.__peers or events):\n                self.signal_idle_state()\n\n            def catch_error(fn, *args):\n                while True:\n                    try:\n                        fn(*args)\n                    except EnvironmentError as error:\n                        if error.errno == errno.EINTR:\n                            continue\n                        if error.errno == errno.EAGAIN:\n                            return\n                        raise\n                    else:\n                        return\n\n            def check_peer(peer):\n                if peer.is_finished():\n                    peer.destroy()\n                    self.peer_destroyed(peer)\n                    self.__peers.remove(peer)\n\n            for fd, event in events:\n                if fd == self.__listening_socket.fileno():\n                    peersocket, peeraddress = self.__listening_socket.accept()\n                    peer = 
self.handle_peer(peersocket, peeraddress)\n                    if peer:\n                        self.__peers.append(peer)\n                    else:\n                        try:\n                            peersocket.close()\n                        except Exception:\n                            pass\n                else:\n                    if event != select.POLLIN and fd in writing_map:\n                        peer = writing_map[fd]\n                        catch_error(peer.do_write)\n                        check_peer(peer)\n                    if event != select.POLLOUT and fd in reading_map:\n                        peer, index = reading_map[fd]\n                        catch_error(peer.do_read, index)\n                        check_peer(peer)\n\n            if nearest_peer_deadline is not None:\n                now = time.time()\n                for peer in self.__peers[:]:\n                    if peer.deadline is not None and peer.deadline < now:\n                        peer.timed_out()\n                        check_peer(peer)\n\n    def add_peer(self, peer):\n        self.__peers.append(peer)\n\n    def handle_peer(self, peersocket, peeraddress):\n        pass\n\n    def peer_destroyed(self, peer):\n        pass\n\n    def startup(self):\n        self.__create_listening_socket()\n\n    def shutdown(self):\n        for peer in self.__peers:\n            try:\n                peer.destroy()\n            except:\n                self.exception()\n            try:\n                self.peer_destroyed(peer)\n            except:\n                self.exception()\n\nclass SlaveProcessServer(PeerServer):\n    class SlaveChildProcess(PeerServer.ChildProcess):\n        def __init__(self, server, client):\n            super(SlaveProcessServer.SlaveChildProcess, self).__init__(server, [sys.executable, sys.argv[0], \"--slave\"])\n            self.__client = client\n\n        def handle_input(self, _file, value):\n            self.__client.write(value)\n         
   self.__client.close()\n\n    class SlaveClient(PeerServer.SocketPeer):\n        def __init__(self, server, peersocket):\n            super(SlaveProcessServer.SlaveClient, self).__init__(server, peersocket)\n\n        def handle_input(self, _file, value):\n            if value:\n                child_process = SlaveProcessServer.SlaveChildProcess(self.server, self)\n                child_process.write(value)\n                child_process.close()\n                self.server.add_peer(child_process)\n\n    def handle_peer(self, peersocket, peeraddress):\n        return SlaveProcessServer.SlaveClient(self, peersocket)\n\nclass JSONJobServer(PeerServer):\n    class Job(PeerServer.ChildProcess):\n        def __init__(self, server, client, request):\n            super(JSONJobServer.Job, self).__init__(server, [sys.executable, sys.argv[0], \"--json-job\"], stderr=subprocess.STDOUT)\n            self.clients = [client]\n            self.request = request\n            self.write(json_encode(request))\n            self.close()\n\n        def handle_input(self, _file, value):\n            try: result = json_decode(value)\n            except ValueError:\n                self.server.error(\"invalid response:\\n\" + indent(value))\n                result = self.request.copy()\n                result[\"error\"] = value\n            for client in self.clients: client.add_result(result)\n            self.server.request_finished(self, self.request, result)\n\n    class JobClient(PeerServer.SocketPeer):\n        def handle_input(self, _file, value):\n            decoded = json_decode(value)\n            assert isinstance(decoded, dict)\n            if \"requests\" in decoded:\n                self.__requests = decoded[\"requests\"]\n                self.__pending_requests = map(freeze, self.__requests)\n                self.__async = decoded.get(\"async\", False)\n                self.__results = []\n                self.server.add_requests(self)\n            else:\n               
 self.server.execute_command(self, decoded)\n            if self.__async:\n                self.close()\n\n        def has_requests(self):\n            return bool(self.__pending_requests)\n\n        def get_request(self):\n            return self.__pending_requests.pop()\n\n        def add_result(self, result):\n            if self.__async:\n                # Client is already gone, so we don't really care about the\n                # results.\n                return\n            self.__results.append(result)\n            if len(self.__results) == len(self.__requests):\n                self.write(json_encode(self.__results))\n                self.close()\n\n    def __init__(self, service):\n        super(JSONJobServer, self).__init__(service)\n        self.__clients_with_requests = []\n        self.__started_requests = {}\n        self.__max_workers = service.get(\"max_workers\", 4)\n\n    def __startJobs(self):\n        # Repeat \"start a job\" while there are jobs to start and we haven't\n        # reached the limit on number of concurrent jobs to run.\n        while self.__clients_with_requests and len(self.__started_requests) < self.__max_workers:\n            # Fetch next request from first client in list of clients with\n            # pending requests.\n            client = self.__clients_with_requests.pop(0)\n            frozen = client.get_request()\n\n            if client.has_requests():\n                # Client has more pending requests, so put it back at the end of\n                # the list of clients with pending requests.\n                self.__clients_with_requests.append(client)\n\n            if frozen in self.__started_requests:\n                # Another client has requested the same thing, piggy-back on\n                # that job instead of starting another.\n                self.__started_requests[frozen].clients.append(client)\n                continue\n\n            request = thaw(frozen)\n\n            # Check if this request is 
already finished.  Default implementation\n            # of this callback always returns None.\n            result = self.request_result(request)\n\n            if result:\n                # Request is already finished; don't bother starting a child\n                # process, just report result directly to the client.\n                client.add_result(result)\n            else:\n                # Start child process.\n                job = JSONJobServer.Job(self, client, request)\n                self.add_peer(job)\n                self.request_started(job, request)\n\n    def add_requests(self, client):\n        assert client.has_requests()\n        self.__clients_with_requests.append(client)\n        self.__startJobs()\n\n    def execute_command(self, client, command):\n        client.write(json_encode({ \"status\": \"error\", \"error\": \"command not supported\" }))\n        client.close()\n\n    def handle_peer(self, peersocket, peeraddress):\n        return JSONJobServer.JobClient(self, peersocket)\n\n    def peer_destroyed(self, peer):\n        if isinstance(peer, JSONJobServer.Job): self.__startJobs()\n\n    def request_result(self, request):\n        pass\n    def request_started(self, job, request):\n        self.__started_requests[freeze(request)] = job\n    def request_finished(self, job, request, result):\n        del self.__started_requests[freeze(request)]\n\ndef call(context, fn, *args, **kwargs):\n    if configuration.debug.COVERAGE_DIR:\n        import coverage\n        result = coverage.call(context, fn, *args, **kwargs)\n    else:\n        result = fn(*args, **kwargs)\n    sys.exit(result)\n"
  },
  {
    "path": "src/background/wait-for-pidfiles.py",
    "content": "# This script is no longer used, but is kept around in case an old version of\n# the SysV init script, which still calls this script, is in use.\n"
  },
  {
    "path": "src/background/watchdog.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\nimport signal\nimport errno\nimport multiprocessing\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport configuration\n\nimport background.utils\nimport mailutils\n\ndef getRSS(pid):\n    for line in open(\"/proc/%d/status\" % pid):\n        words = line.split()\n        if words[0] == \"VmRSS:\":\n            if words[2].lower() == \"kb\": unit = 1024\n            elif words[2].lower() == \"mb\": unit = 1024 ** 2\n            elif words[2].lower() == \"gb\": unit = 1024 ** 3\n            else: raise Exception(\"unknown unit: %s\" % words[2])\n            return int(words[1]) * unit\n    else: raise Exception(\"invalid pid\")\n\nclass Watchdog(background.utils.BackgroundProcess):\n    def __init__(self):\n        service = configuration.services.WATCHDOG\n\n        super(Watchdog, self).__init__(service=service)\n\n        cpu_count = multiprocessing.cpu_count()\n\n        self.load1_limit = service.get(\"load1_limit\", 0) * cpu_count\n        self.load5_limit = service.get(\"load5_limit\", 0) * cpu_count\n        self.load15_limit = service.get(\"load15_limit\", 0) * cpu_count\n\n    def run(self):\n        soft_restart_attempted = set()\n        previous = {}\n\n        getloadavg_failed = False\n        
load1_has_warned = 0\n        load1_last_time = 0\n        load5_has_warned = 0\n        load5_last_time = 0\n        load15_has_warned = 0\n        load15_last_time = 0\n\n        while not self.terminated:\n            self.interrupted = False\n\n            def sendLoadAverageWarning(interval, limit, load):\n                cpu_count = multiprocessing.cpu_count()\n                mailutils.sendAdministratorMessage(\n                    \"watchdog\", \"%d-minute load average\" % interval,\n                    (\"The current %d-minute load average is %.2f!\\n\"\n                     % (interval, load)) +\n                    (\"The configured limit is %.2f (%.2f x %d CPUs).\\n\"\n                     % (limit, limit / cpu_count, cpu_count)) +\n                    \"\\n-- critic\\n\")\n\n            try:\n                load1, load5, load15 = os.getloadavg()\n                self.debug(\"load average: %r, %r, %r\" % (load1, load5, load15))\n            except OSError:\n                load1, load5, load15 = 0, 0, 0\n\n                if not getloadavg_failed:\n                    self.exception(\"failed to detect system load average\")\n                    getloadavg_failed = True\n\n            now = time.time()\n\n            if self.load1_limit and load1 > self.load1_limit:\n                if load1 > load1_has_warned * 1.2 or now - load1_last_time > 60:\n                    sendLoadAverageWarning(1, self.load1_limit, load1)\n                    load1_has_warned = load1\n                    load1_last_time = now\n            else:\n                load1_has_warned = 0\n                load1_last_time = 0\n\n            if self.load5_limit and load5 > self.load5_limit:\n                if load5 > load5_has_warned * 1.2 or now - load5_last_time > 5 * 60:\n                    sendLoadAverageWarning(5, self.load5_limit, load5)\n                    load5_has_warned = load5\n                    load5_last_time = now\n            else:\n                load5_has_warned = 0\n                load5_last_time = 0\n\n            if self.load15_limit and load15 > self.load15_limit:\n                if load15 > load15_has_warned * 1.2 or now - load15_last_time > 15 * 60:\n                    sendLoadAverageWarning(15, self.load15_limit, load15)\n                    load15_has_warned = load15\n                    load15_last_time = now\n            else:\n                load15_has_warned = 0\n                load15_last_time = 0\n\n            pidfile_dir = configuration.paths.WSGI_PIDFILE_DIR\n\n            if os.path.isdir(pidfile_dir):\n                pids = set(map(int, os.listdir(pidfile_dir)))\n            else:\n                pids = set()\n\n            for pid in pids:\n                try: rss = getRSS(pid)\n                except IOError as error:\n                    if error.errno == errno.ENOENT:\n                        self.warning(\"unlinking stale pid-file: %s\" % os.path.join(pidfile_dir, str(pid)))\n                        os.unlink(os.path.join(pidfile_dir, str(pid)))\n                        continue\n                    else: raise\n\n                if previous.get(pid) != rss:\n                    self.debug(\"pid=%d, rss=%d bytes\" % (pid, rss))\n                    previous[pid] = rss\n\n                if rss > configuration.services.WATCHDOG[\"rss_hard_limit\"]:\n                    mailutils.sendAdministratorMessage(\n                        \"watchdog\", \"pid(%d): hard memory limit exceeded\" % pid,\n                        (\"Current RSS: %d bytes\\nSending process SIGKILL (%d).\\n\\n-- critic\"\n                         % (rss, signal.SIGKILL)))\n                    self.info(\"Killing pid(%d): hard memory limit exceeded, RSS: %d bytes\" % (pid, rss))\n                    os.kill(pid, signal.SIGKILL)\n                elif rss > configuration.services.WATCHDOG[\"rss_soft_limit\"] and pid not in soft_restart_attempted:\n                    mailutils.sendAdministratorMessage(\n                        \"watchdog\", \"pid(%d): soft memory limit exceeded\" % pid,\n                        (\"Current RSS: %d bytes\\nSending process SIGINT (%d).\\n\\n\"\n                         % (rss, signal.SIGINT)))\n                    self.info(\"Killing pid(%d): soft memory limit exceeded, RSS: %d bytes\" % (pid, rss))\n                    os.kill(pid, signal.SIGINT)\n                    soft_restart_attempted.add(pid)\n\n            for pid in previous.keys():\n                if pid not in pids: del previous[pid]\n\n            soft_restart_attempted = soft_restart_attempted & pids\n\n            self.signal_idle_state()\n\n            time.sleep(10)\n\ndef start_service():\n    watchdog = Watchdog()\n    return watchdog.start()\n\nbackground.utils.call(\"watchdog\", start_service)\n"
  },
  {
    "path": "src/base.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass Error(Exception):\n    pass\n\nclass ImplementationError(Error):\n    pass\n"
  },
  {
    "path": "src/base_unittest.py",
    "content": "def independence():\n    # Simply check that base can be imported.\n\n    import base\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/changeset/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n"
  },
  {
    "path": "src/changeset/client.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport socket\n\nimport base\nimport configuration\nfrom textutils import json_encode, json_decode, indent\n\nclass ChangesetBackgroundServiceError(base.ImplementationError):\n    def __init__(self, message):\n        super(ChangesetBackgroundServiceError, self).__init__(\n            \"Changeset background service failed: %s\" % message)\n\ndef requestChangesets(requests, async=False):\n    try:\n        connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n        connection.connect(configuration.services.CHANGESET[\"address\"])\n        connection.send(json_encode({\n            \"requests\": requests,\n            \"async\": async\n        }))\n        connection.shutdown(socket.SHUT_WR)\n\n        data = \"\"\n\n        while True:\n            received = connection.recv(4096)\n            if not received: break\n            data += received\n\n        connection.close()\n    except EnvironmentError as error:\n        raise ChangesetBackgroundServiceError(str(error))\n\n    if async:\n        return\n\n    if not data:\n        raise ChangesetBackgroundServiceError(\n            \"returned an invalid response: no response\")\n\n    try:\n        results = json_decode(data)\n    except ValueError:\n        raise ChangesetBackgroundServiceError(\n            \"returned an invalid 
response: %r\" % data)\n\n    if type(results) != list:\n        # If not a list, the result is probably an error message.\n        raise ChangesetBackgroundServiceError(str(results))\n\n    if len(results) != len(requests):\n        raise ChangesetBackgroundServiceError(\"didn't process all requests\")\n\n    errors = []\n\n    for result in results:\n        if \"error\" in result:\n            errors.append(result[\"error\"])\n\n    if errors:\n        raise ChangesetBackgroundServiceError(\n            \"one or more requests failed:\\n%s\" % \"\\n\".join(map(indent, errors)))\n"
  },
  {
    "path": "src/changeset/create.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport diff.merge\nimport diff.parse\n\ndef createChangeset(db, request):\n    repository_name = request[\"repository_name\"]\n    changeset_type = request[\"changeset_type\"]\n\n    repository = gitutils.Repository.fromName(db, repository_name)\n\n    def insertChangeset(db, parent, child, files):\n        while True:\n            # Inserting new files will often clash when creating multiple\n            # related changesets in parallel.  It's a simple operation, so if it\n            # fails with an integrity error, just try again until it doesn't\n            # fail.  
(It will typically succeed the second time because then the\n            # new files already exist, and it doesn't need to insert anything.)\n            try:\n                dbutils.find_files(db, files)\n                db.commit()\n                break\n            except dbutils.IntegrityError:\n                db.rollback()\n\n        cursor = db.cursor()\n        cursor.execute(\"INSERT INTO changesets (type, parent, child) VALUES (%s, %s, %s) RETURNING id\",\n                       (changeset_type, parent.getId(db) if parent else None, child.getId(db)))\n        changeset_id = cursor.fetchone()[0]\n\n        fileversions_values = []\n        chunks_values = []\n\n        file_ids = set()\n\n        for file in files:\n            if file.id in file_ids: raise Exception(\"duplicate:%d:%s\" % (file.id, file.path))\n            file_ids.add(file.id)\n\n            fileversions_values.append((changeset_id, file.id, file.old_sha1, file.new_sha1, file.old_mode, file.new_mode))\n\n            for index, chunk in enumerate(file.chunks):\n                chunk.analyze(file, index == len(file.chunks) - 1)\n                chunks_values.append((changeset_id, file.id, chunk.delete_offset, chunk.delete_count, chunk.insert_offset, chunk.insert_count, chunk.analysis, 1 if chunk.is_whitespace else 0))\n\n            file.clean()\n\n        if fileversions_values:\n            cursor.executemany(\"\"\"INSERT INTO fileversions (changeset, file, old_sha1, new_sha1, old_mode, new_mode)\n                                       VALUES (%s, %s, %s, %s, %s, %s)\"\"\",\n                               fileversions_values)\n        if chunks_values:\n            cursor.executemany(\"\"\"INSERT INTO chunks (changeset, file, deleteOffset, deleteCount, insertOffset, insertCount, analysis, whitespace)\n                                       VALUES (%s, %s, %s, %s, %s, %s, %s, %s)\"\"\",\n                               chunks_values)\n\n        return changeset_id\n\n    changeset_ids = 
request[\"changeset_ids\"] = {}\n\n    child = gitutils.Commit.fromSHA1(db, repository, request[\"child_sha1\"])\n\n    cursor = db.cursor()\n\n    if \"parent_sha1\" in request:\n        assert changeset_type in (\"custom\", \"conflicts\")\n\n        parent_sha1 = request[\"parent_sha1\"]\n\n        if parent_sha1 == \"0\" * 40:\n            parent = parent_id = None\n        else:\n            parent = gitutils.Commit.fromSHA1(db, repository, parent_sha1)\n            parent_id = parent.getId(db)\n\n        cursor.execute(\"\"\"SELECT id, %s\n                            FROM changesets\n                           WHERE type=%s\n                             AND parent=%s\n                             AND child=%s\"\"\",\n                       (parent_sha1, changeset_type, parent_id, child.getId(db)))\n    else:\n        assert changeset_type in (\"direct\", \"merge\")\n\n        cursor.execute(\"\"\"SELECT changesets.id, commits.sha1\n                            FROM changesets\n                 LEFT OUTER JOIN commits ON (commits.id=changesets.parent)\n                           WHERE type=%s\n                             AND child=%s\"\"\",\n                       (changeset_type, child.getId(db)))\n\n    rows = cursor.fetchall()\n\n    if rows:\n        # Changeset(s) already exists in database.\n\n        for changeset_id, parent_sha1 in rows:\n            changeset_ids[parent_sha1] = changeset_id\n    else:\n        # Parse diff and insert changeset(s) into the database.\n\n        if changeset_type == \"merge\":\n            changes = diff.merge.parseMergeDifferences(db, repository, child)\n        elif changeset_type == \"direct\":\n            changes = diff.parse.parseDifferences(repository, commit=child)\n        else:\n            changes = diff.parse.parseDifferences(repository, from_commit=parent, to_commit=child)\n\n        for parent_sha1, files in changes.items():\n            if parent_sha1 is None:\n                parent = None\n            
else:\n                parent = gitutils.Commit.fromSHA1(db, repository, parent_sha1)\n            changeset_ids[parent_sha1] = insertChangeset(db, parent, child, files)\n\n        db.commit()\n"
  },
  {
    "path": "src/changeset/detectmoves.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport difflib\nimport re\nimport diff\nimport diff.analyze\n\nre_ws = re.compile(\"\\\\s+\")\n\nSMALLEST_INSERT = 5\nMAXIMUM_GAP = 10\n\nclass Line:\n    def __init__(self, string):\n        self.string = string\n        self.wsnorm = re_ws.sub(\" \", string.strip())\n\n    def __str__(self):\n        return self.string\n\n    def __eq__(self, other):\n        return self.wsnorm == other.wsnorm\n\n    def __ne__(self, other):\n        return self.wsnorm != other.wsnorm\n\n    def __hash__(self):\n        return hash(self.wsnorm)\n\ndef compareChunks(source_file, source_chunk, target_file, target_chunk, extra_target_chunks, context_lines=3):\n    source_length = source_file.oldCount()\n    target_length = target_file.newCount()\n\n    source_lines = map(Line, source_chunk.deleted_lines)\n    target_lines = map(Line, target_chunk.inserted_lines)\n\n    sm = difflib.SequenceMatcher(None, source_lines, target_lines)\n\n    blocks = filter(lambda x: x[2], sm.get_matching_blocks())\n\n    if blocks:\n        chunks = []\n\n        i, j, n = blocks.pop(0)\n\n        current = [(i, j, n)]\n        matched = n\n\n        pi = i + n\n        pj = j + n\n\n        for i, j, n in blocks:\n            if i - pi > MAXIMUM_GAP or j - pj > MAXIMUM_GAP:\n                chunks.append((matched, current))\n             
   current = [(i, j, n)]\n                matched = n\n            else:\n                current.append((i, j, n))\n                matched += n\n            pi = i + n\n            pj = j + n\n\n        chunks.append((matched, current))\n        chunks.sort()\n\n        matched, blocks = chunks[-1]\n\n        if matched < SMALLEST_INSERT:\n            return None\n\n        source_begin = max(-(source_chunk.delete_offset - 1), blocks[0][0] - context_lines)\n        source_end = min(source_length + 1 - source_chunk.delete_offset, blocks[-1][0] + blocks[-1][2] + context_lines)\n\n        target_begin = max(-(target_chunk.insert_offset - 1), blocks[0][1] - context_lines)\n        target_end = min(target_length + 1 - target_chunk.insert_offset, blocks[-1][1] + blocks[-1][2] + context_lines)\n\n        new_chunk = diff.Chunk(source_chunk.delete_offset + source_begin,\n                               source_end - source_begin,\n                               target_chunk.insert_offset + target_begin,\n                               target_end - target_begin)\n\n        new_chunk.source_chunk = source_chunk\n        new_chunk.source_begin = source_begin\n        new_chunk.source_end = source_end\n        new_chunk.source_length = source_length\n\n        if blocks[0][1] >= SMALLEST_INSERT and blocks[0][1] < target_chunk.insert_count:\n            extra_before = diff.Chunk(0, 0, target_chunk.insert_offset, blocks[0][1])\n        else:\n            extra_before = None\n\n        match_end = blocks[-1][1] + blocks[-1][2]\n        if target_chunk.insert_count - match_end >= SMALLEST_INSERT:\n            extra_after = diff.Chunk(0, 0, target_chunk.insert_offset + match_end, target_chunk.insert_count - match_end)\n        else:\n            extra_after = None\n\n        new_chunk.deleted_lines = source_file.getOldLines(new_chunk)\n        new_chunk.inserted_lines = target_file.getNewLines(new_chunk)\n\n        if matched > len(new_chunk.inserted_lines) * 0.25:\n            
analysis = diff.analyze.analyzeChunk(new_chunk.deleted_lines, new_chunk.inserted_lines, moved=True)\n\n            if matched > len(new_chunk.inserted_lines) * 0.5 or (analysis and len(analysis.split(';')) >= len(new_chunk.inserted_lines) * 0.5):\n                new_chunk.analysis = analysis\n                if extra_before: extra_target_chunks.append(extra_before)\n                if extra_after: extra_target_chunks.append(extra_after)\n                return new_chunk\n\n    return None\n\ndef findSourceChunk(db, changeset, source_file_ids, target_file, target_chunk, extra_target_chunks):\n    for source_file in changeset.files:\n        if source_file_ids and not source_file.id in source_file_ids: continue\n        if source_file.chunks is None: continue\n\n        for source_chunk in source_file.chunks:\n            # Shouldn't compare chunk to itself, of course.\n            if target_file == source_file and target_chunk == source_chunk:\n                continue\n\n            # Much fewer deleted lines than inserted lines in the target chunk;\n            # unlikely to be a relevant source chunk.\n            #if source_chunk.delete_count * 1.5 < target_chunk.insert_count:\n            #    continue\n\n            if source_chunk.analysis:\n                # If more than half the deleted lines are mapped against\n                # inserted lines, most likely edited rather than moved code.\n                if source_chunk.delete_count < len(source_chunk.analysis.split(\";\")) * 2:\n                    continue\n\n            source_file.loadOldLines()\n            source_chunk.deleted_lines = source_file.getOldLines(source_chunk)\n\n            new_chunk = compareChunks(source_file, source_chunk, target_file, target_chunk, extra_target_chunks)\n\n            if new_chunk:\n                return source_file, new_chunk\n\n    return None, None\n\ndef detectMoves(db, changeset, source_file_ids=None, target_file_ids=None):\n    moves = []\n\n    for target_file 
in changeset.files:\n        if target_file_ids and not target_file.id in target_file_ids: continue\n\n        current_chunks = target_file.chunks\n\n        count = 0\n\n        while current_chunks:\n            extra_target_chunks = []\n            count += 1\n\n            for target_chunk in current_chunks:\n                # White-space only changes; unlikely target of moved code.\n                if target_chunk.is_whitespace:\n                    continue\n\n                # Too few inserted lines; couldn't possibly be an interesting target\n                # of moved code.\n                if target_chunk.insert_count < 5:\n                    continue\n\n                if target_chunk.analysis:\n                    # If more than half the inserted lines are mapped against\n                    # deleted lines, most likely edited rather than moved code.\n                    if target_chunk.insert_count < len(target_chunk.analysis.split(\";\")) * 2:\n                        continue\n\n                target_file.loadNewLines()\n                target_chunk.inserted_lines = target_file.getNewLines(target_chunk)\n\n                source_file, chunk = findSourceChunk(db, changeset, source_file_ids, target_file, target_chunk, extra_target_chunks)\n\n                if source_file and chunk:\n                    moves.append((source_file, target_file, chunk))\n                    continue\n\n            current_chunks = extra_target_chunks\n\n    if moves:\n        def orderChunks(a, b):\n            a_source_file, a_target_file, a_chunk = a\n            b_source_file, b_target_file, b_chunk = b\n\n            c = cmp(a_target_file.path, b_target_file.path)\n            if c != 0: return c\n            else: return cmp(a_chunk.insert_offset, b_chunk.insert_offset)\n\n        moves.sort(orderChunks)\n\n        move_changeset = diff.Changeset(None, changeset.parent, changeset.child, 'moves', [])\n\n        for source_file, target_file, chunk in moves:\n         
   move_file = diff.File(0, \"\",\n                                  source_file.old_sha1,\n                                  target_file.new_sha1,\n                                  source_file.repository,\n                                  chunks=[chunk],\n                                  move_source_file=source_file,\n                                  move_target_file=target_file)\n\n            move_changeset.files.append(move_file)\n\n        return move_changeset\n    else:\n        return None\n"
  },
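The move-detection heuristic in the file above (whitespace-normalized line comparison via the `Line` wrapper, `difflib.SequenceMatcher` matching blocks grouped until a gap exceeds `MAXIMUM_GAP`, groups with fewer than `SMALLEST_INSERT` matched lines discarded) can be sketched standalone. This is an illustrative sketch only, not part of the repository; the `candidate_moves` and `normalize` names are made up here, while the constants mirror the module above.

```python
# Standalone sketch of the move-detection grouping; names are
# illustrative, not part of Critic's API.
import difflib
import re

re_ws = re.compile(r"\s+")

SMALLEST_INSERT = 5   # minimum matched lines for a credible move
MAXIMUM_GAP = 10      # split groups when matches drift further apart

def normalize(line):
    # Compare lines ignoring indentation and runs of internal whitespace,
    # like the Line wrapper in the module above.
    return re_ws.sub(" ", line.strip())

def candidate_moves(deleted_lines, inserted_lines):
    """Return groups of (i, j, n) matching blocks that plausibly
    represent a block of code moved from one place to another."""
    sm = difflib.SequenceMatcher(
        None,
        [normalize(line) for line in deleted_lines],
        [normalize(line) for line in inserted_lines])
    groups = []
    current = []
    pi = pj = None
    for i, j, n in sm.get_matching_blocks():
        if n == 0:
            continue  # skip the zero-length sentinel block
        if current and (i - pi > MAXIMUM_GAP or j - pj > MAXIMUM_GAP):
            groups.append(current)
            current = []
        current.append((i, j, n))
        pi, pj = i + n, j + n
    if current:
        groups.append(current)
    # Discard groups too small to count as a moved block.
    return [g for g in groups if sum(n for _, _, n in g) >= SMALLEST_INSERT]
```

Because comparison happens on normalized lines, code that was moved and re-indented still matches; the actual module then re-expands the winning group with context lines and hands it to `diff.analyze.analyzeChunk` to confirm the match.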
  {
    "path": "src/changeset/html.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport itertools\nimport urllib\n\nfrom time import strftime\nfrom bisect import bisect_right\n\nimport textutils\nimport dbutils\nimport diff\nimport diff.context\nimport changeset.utils as changeset_utils\nimport reviewing.comment as review_comment\nimport htmlutils\nimport configuration\n\nfrom htmlutils import jsify, Generator, Text, HTML, stripStylesheet\n\nre_tag = re.compile(\"<([bi]) class='?([a-z]+)'?>\")\nre_tailws = re.compile(\"^(.*?)(\\s+)((?:<[^>]+>)*)$\")\n\nclass CodeContexts:\n    class Context:\n        def __init__(self, first_line, last_line, description):\n            self.first_line = first_line\n            self.last_line = last_line\n            self.description = description\n\n        def __cmp__(self, index):\n            return cmp(self.first_line, index)\n\n    def __init__(self, db, sha1, first_line, last_line):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT first_line, last_line, context\n                            FROM codecontexts\n                           WHERE sha1=%s\n                             AND first_line<=%s\n                             AND last_line>=%s\n                        ORDER BY first_line ASC, last_line DESC\"\"\",\n                       (sha1, last_line, first_line))\n        self.contexts = 
[CodeContexts.Context(first_line, last_line, description) for first_line, last_line, description in cursor]\n\n    def find(self, linenr):\n        index = bisect_right(self.contexts, linenr)\n        if index:\n            context = self.contexts[index - 1]\n            if context.last_line >= linenr:\n                return context.description\n        return None\n\ndef expandHTML(db, file, old_offset, new_offset, lines, target):\n    if old_offset == 1: where = 'top'\n    elif old_offset + lines - 1 == file.oldCount(): where = 'bottom'\n    else: where = 'middle'\n\n    select = target.select(onchange=('expand(this, %d, %r, %r, %r, %d, %d, %d);' % (file.id, file.path, file.new_sha1, where, old_offset, new_offset, lines)))\n    select.option(value='none').text(\"%s lines not shown\" % lines)\n\n    if where == 'middle': actualLines = lines\n    else: actualLines = lines * 2\n\n    if actualLines > 20: select.option(value=10).text(\"Show 10 more\")\n    if actualLines > 50: select.option(value=25).text(\"Show 25 more\")\n    if actualLines > 100: select.option(value=50).text(\"Show 50 more\")\n\n    select.option(value=lines).text(\"All\")\n\ndef generateDataScript(db, user, changeset, review, file_id_format, compact, parent_index):\n    data = { 'parent_id': changeset.parent.id if changeset.parent else None,\n             'parent_sha1': htmlutils.jsify(changeset.parent.sha1) if changeset.parent else None,\n             'child_id': changeset.child.id,\n             'child_sha1': htmlutils.jsify(changeset.child.sha1),\n             'changeset_id': jsify(changeset.id),\n             'commit_ids': \", \".join([str(commit.getId(db)) for commit in reversed(changeset.commits(db))]),\n             'parent_index': parent_index }\n\n    if parent_index is None:\n        commits = changeset.commits(db)\n\n        if review and commits:\n            if len(commits) > 1:\n                cursor = db.cursor()\n                cursor.execute(\"SELECT id FROM changesets JOIN 
reviewchangesets ON (changeset=id) WHERE review=%s AND child=ANY (%s)\",\n                               (review.id, [commit.getId(db) for commit in commits]))\n\n                changeset_ids = [changeset_id for (changeset_id,) in cursor]\n            else:\n                changeset_ids = [changeset.id]\n        else:\n            changeset_ids = None\n\n        if changeset_ids is None: changeset_ids = \"null\"\n        else: changeset_ids = \"[%s]\" % \", \".join(map(str, changeset_ids))\n\n        data[\"changeset_ids\"] = changeset_ids\n\n        if changeset.parent:\n            data_script = \"\"\"\nvar changeset = { parent: { id: %(parent_id)d, sha1: %(parent_sha1)s },\n                  child: { id: %(child_id)d, sha1: %(child_sha1)s },\n                  id: %(changeset_id)s,\n                  ids: %(changeset_ids)s,\n                  commits: [ %(commit_ids)s ] };\nvar useFiles = files;\n\"\"\" % data\n        else:\n            data_script = \"\"\"\nvar changeset = { parent: null,\n                  child: { id: %(child_id)d, sha1: %(child_sha1)s },\n                  id: %(changeset_id)s,\n                  ids: %(changeset_ids)s,\n                  commits: [ %(commit_ids)s ] };\nvar useFiles = files;\n\"\"\" % data\n    else:\n        data_script = \"\"\"\nvar changeset;\nvar parent_index = %(parent_index)d;\n\nif (!changeset)\n  changeset = { commits: [ %(child_id)d ] };\n\nif (!changeset[parent_index])\n  changeset[parent_index] = {};\n\nif (!changeset.child)\n  changeset.child = { id: %(child_id)d, sha1: %(child_sha1)s };\n\nchangeset[parent_index].parent = { id: %(parent_id)d, sha1: %(parent_sha1)s };\nchangeset[parent_index].child = { id: %(child_id)d, sha1: %(child_sha1)s };\nchangeset[parent_index].id = %(changeset_id)s;\n\nvar useFiles = files[parent_index] = {};\n\"\"\" % data\n\n    parent_index_property = \"parent: %d, \" % parent_index if parent_index is not None else \"\"\n\n    all_files = set()\n\n    for file in changeset.files:\n  
      if file.move_source_file and file.move_target_file:\n            all_files.add(file.move_source_file)\n            all_files.add(file.move_target_file)\n        elif file.hasChanges():\n            all_files.add(file)\n\n    for file in all_files:\n        data_script += \"\"\"\nuseFiles[%d] = { old_sha1: %r,\n               %snew_sha1: %r,\n               %spath: %s };\n\"\"\" % (file.id, file.old_sha1, \" \" * len(str(file.id)), file.new_sha1, \" \" * len(str(file.id)), jsify(file.path))\n        if file.old_sha1 != \"0\" * 40 and file.new_sha1 != \"0\" * 40:\n            data_script += \"\"\"files[%r] = { %sid: %d, side: 'o' };\n\"\"\" % (file.old_sha1, parent_index_property, file.id)\n            data_script += \"\"\"files[%r] = { id: %d, side: 'n' };\n\"\"\" % (file.new_sha1, file.id)\n        elif file.new_sha1 != \"0\" * 40:\n            data_script += \"\"\"files[%r] = { id: %d, side: 'n' };\n\"\"\" % (file.new_sha1, file.id)\n        else:\n            data_script += \"\"\"files[%r] = { %sid: %d, side: 'o' };\n\"\"\" % (file.old_sha1, parent_index_property, file.id)\n\n    if review:\n        data_script += \"\"\"\n%s\nvar commentChains;\n\"\"\" % review.getJS()\n\n        reviewable_files = \", \".join([(\"%d: %r\" % (file_id, state)) for file_id, (is_reviewer, state, reviewers) in changeset.getReviewFiles(db, user, review).items() if is_reviewer])\n\n        if parent_index is None:\n            data_script += \"\"\"\nchangeset.reviewableFiles = { %s };\n\"\"\" % reviewable_files\n        else:\n            data_script += \"\"\"\nchangeset[parent_index].reviewableFiles = { %s };\n\"\"\" % reviewable_files\n\n    if compact: return re.sub(r\"\\B\\s+\\B|\\b\\s+\\B|\\B\\s+\\b\", \"\", data_script).strip()\n    else: return data_script.strip()\n\ndef render(db, target, user, repository, changeset, review=None, review_mode=None,\n           context_lines=3, style=\"horizontal\", wrap=True, options={}, parent_index=None):\n    addResources(db, user, 
repository, review, options.get(\"compact\", False),\n                 options.get(\"tabify\", False), target)\n\n    compact = options.get(\"compact\", False)\n\n    cursor = db.cursor()\n\n    if options.get(\"merge\"):\n        local_comments_only = True\n    else:\n        local_comments_only = False\n\n    target.script(type='text/javascript').text(generateDataScript(db, user, changeset, review, options.get(\"file_id_format\", \"f%d\"), compact, parent_index), cdata=True)\n    target.addInternalStylesheet(\"\"\"\ntable.file > tbody.lines > tr > td.line {\n    white-space: pre%s !important\n}\"\"\" % (wrap and \"-wrap\" or \"\"))\n\n    comment_chains_per_file = {}\n\n    if review:\n        comment_chain_script = \"\"\n\n        for file in changeset.files:\n            if file.hasChanges() and not file.wasRemoved():\n                comment_chains = review_comment.loadCommentChains(db, review, user, file=file, changeset=changeset, local_comments_only=local_comments_only)\n                if comment_chains:\n                    comment_chains_per_file[file.path] = comment_chains\n\n                    for chain in comment_chains:\n                        if file.new_sha1 in chain.lines_by_sha1: sha1 = file.new_sha1\n                        else: sha1 = file.old_sha1\n\n                        comment_chain_script += \"commentChains.push(%s);\\n\" % chain.getJSConstructor(sha1)\n\n        if comment_chain_script:\n            target.script(type='text/javascript').text(comment_chain_script, cdata=True)\n\n    def join(a, b):\n        if a and b: return itertools.chain(a, b)\n        elif a: return a\n        elif b: return b\n        else: return []\n\n    local_options = { \"style\": style, \"count_chunks\": True }\n    for name, value in options.items():\n        local_options[name] = value\n\n    if local_options.get(\"expand\"):\n        limit = user.getPreference(db, \"commit.expandFilesLimit\")\n        if limit != 0 and limit < len(changeset.files):\n     
       del local_options[\"expand\"]\n\n    for index, file in enumerate(changeset.files):\n        if file.hasChanges():\n            if not file.wasRemoved() and not file.isBinaryChanges():\n                file.loadOldLines(True, request_highlight=True)\n                file.loadNewLines(True, request_highlight=True)\n\n                if not file.isEmptyChanges():\n                    lines = diff.context.ContextLines(\n                        file, file.chunks, [(chain, False) for chain in comment_chains_per_file.get(file.path, [])],\n                        merge=options.get(\"merge\", False), conflicts=changeset.conflicts)\n                    file.macro_chunks = lines.getMacroChunks(context_lines, highlight=True)\n                else:\n                    file.macro_chunks = []\n            else:\n                file.macro_chunks = []\n\n            renderFile(db, target, user, review, file, first_file=index == 0, options=local_options, conflicts=changeset.conflicts, add_resources=False)\n\n            file.clean()\n\n            yield target\n\ndef renderFile(db, target, user, review, file, first_file=False, options={}, conflicts=False, add_resources=True):\n    if add_resources:\n        addResources(db, user, file.repository, review, options.get(\"compact\", False), options.get(\"tabify\"), target)\n\n    if options.get(\"count_chunks\"):\n        deleted = 0\n        inserted = 0\n        if file.wasRemoved():\n            file.loadOldLines(False)\n            deleted = file.oldCount()\n        else:\n            for macro_chunk in file.macro_chunks:\n                for chunk in macro_chunk.chunks:\n                    deleted += chunk.delete_count\n                    inserted += chunk.insert_count\n        chunksText = \"-%d/+%d lines\" % (deleted, inserted)\n    else:\n        chunksText = \"\"\n\n    compact = options.get(\"compact\", False)\n\n    file_table_class = \"file sourcefont\"\n\n    if 
options.get(\"show\"):\n        file_table_class += \" show\"\n    if options.get(\"expand\"):\n        file_table_class += \" expanded\"\n        compact = False\n\n    if first_file:\n        file_table_class += \" first\"\n\n    file_id = \"f%d\" % file.id\n    customFileId = options.get(\"file_id\")\n    if customFileId:\n        file_id = customFileId(file_id)\n\n    if options.get(\"tabify\"):\n        target.script(type=\"text/javascript\").text(\"calculateTabWidth();\")\n\n    table = target.table(file_table_class, width='100%', cellspacing=0, cellpadding=0, id=file_id, critic_file_id=file.id, critic_parent_index=options.get(\"parent_index\"))\n\n    if not compact:\n        columns = table.colgroup()\n        columns.col('edge')\n        columns.col('linenr')\n        columns.col('line')\n        columns.col('middle')\n        columns.col('middle')\n        columns.col('line')\n        columns.col('linenr')\n        columns.col('edge')\n\n    row = table.thead().tr()\n\n    header_left = options.get(\"header_left\")\n    header_right = options.get(\"header_right\")\n\n    def make_url(url_path, path):\n        params = { \"sha1\": commit.sha1, \"path\": path }\n        if review is None:\n            params[\"repository\"] = str(file.repository.id)\n        else:\n            params[\"review\"] = str(review.id)\n        return \"/%s?%s\" % (url_path, urllib.urlencode(params))\n\n    if header_left:\n        header_left(db, row.td('left', colspan=4, align='left'), file)\n    else:\n        cell = row.td('left', colspan=4, align='left')\n\n        commit = options.get(\"commit\")\n        if commit:\n            cell.a(\"showtree root\", href=make_url(\"showtree\", \"/\")).text(\"root\")\n            cell.span(\"slash\").text('/')\n\n            components = file.path.split(\"/\")\n            for index, component in enumerate(components[:-1]):\n                cell.a(\"showtree\", href=make_url(\"showtree\", \"/\".join(components[:index + 
1]))).text(component, escape=True)\n                cell.span(\"slash\").text('/')\n\n            if not file.wasRemoved():\n                cell.a(\"showtree\", href=make_url(\"showfile\", \"/\".join(components))).text(components[-1], escape=True)\n            else:\n                cell.text(components[-1], escape=True)\n        else:\n            cell.text(file.path)\n\n        if not compact:\n            cell.comment(\"sha1: %s to %s\" % (file.old_sha1, file.new_sha1))\n\n    if header_right:\n        header_right(db, row.td('right', colspan=4, align='right'), file)\n    else:\n        row.td('right', colspan=4, align='right').text(chunksText)\n\n    next_old_offset = 1\n    next_new_offset = 1\n\n    display_type = options.get(\"display_type\", \"both\")\n    deleted_file = False\n    added_file = False\n\n    if not file.isBinaryChanges() and not file.isEmptyChanges():\n        if file.old_sha1 == 40 * '0':\n            display_type = \"new\"\n\n            if file.getLanguage() is None:\n                limit = configuration.limits.MAXIMUM_ADDED_LINES_UNRECOGNIZED\n            else:\n                limit = configuration.limits.MAXIMUM_ADDED_LINES_RECOGNIZED\n\n            count = file.newCount()\n            if count > limit and len(file.macro_chunks) == 1 and len(file.macro_chunks[0].lines) == count:\n                added_file = True\n        elif file.new_sha1 == 40 * '0':\n            display_type = \"old\"\n            deleted_file = not options.get(\"include_deleted\", False)\n\n    def baseFileId(file):\n        if file.move_source_file and file.move_target_file:\n            return \"fm%d_%d\" % (file.move_source_file.id, file.move_target_file.id)\n        else:\n            return \"f%d\" % file.id\n\n    def baseLineId(file, line, index):\n        file_id = fileId(file)\n        if line.type == diff.Line.DELETED:\n            return \"%so%dn0\" % (file_id, line.old_offset)\n        elif line.type == diff.Line.INSERTED:\n            return 
\"%so0n%d\" % (file_id, line.new_offset)\n        else:\n            return \"%so%dn%d\" % (file_id, line.old_offset, line.new_offset)\n\n    def baseLineCellId(file, version, line):\n        if file.move_source_file and file.move_target_file:\n            if version == \"o\": file_id = file.move_source_file.id\n            else: file_id = file.move_target_file.id\n        else:\n            file_id = file.id\n        if line: return \"f%d%s%d\" % (file_id, version, line)\n        else: return None\n\n    fileId = baseFileId\n\n    customLineId = options.get(\"line_id\")\n    if customLineId:\n        lineId = lambda file, line, index: customLineId(baseLineId(file, line, index))\n    else:\n        lineId = baseLineId\n\n    customLineCellId = options.get(\"line_cell_id\")\n    if customLineCellId:\n        lineCellId = lambda file, version, line: customLineCellId(baseLineCellId(file, version, line))\n    else:\n        lineCellId = baseLineCellId\n\n    def lineType(line, index):\n        type = line.type\n        if type == diff.Line.DELETED: return \"deleted\"\n        elif type == diff.Line.INSERTED: return \"inserted\"\n        elif type == diff.Line.MODIFIED: return \"modified whitespace\" if line.is_whitespace else \"modified\"\n        elif type == diff.Line.REPLACED: return \"replaced\"\n        else: return \"context\"\n\n    support_expand = options.get(\"support_expand\", True)\n    style = options.get(\"style\", \"horizontal\")\n    collapse_simple_hunks = user.getPreference(db, 'commit.diff.collapseSimpleHunks')\n\n    content_before = options.get(\"content_before\")\n    if content_before:\n        content = table.tbody('content')\n\n        row = content.tr('content')\n        row.td(colspan=2).text()\n        content_before(db, row.td(colspan=4))\n        row.td(colspan=2).text()\n\n    if added_file or deleted_file:\n        table.tbody('spacer').tr('spacer').td(colspan=8).text()\n\n        verb = \"added\" if added_file else \"deleted\"\n        
side = \"new\" if added_file else \"old\"\n\n        if added_file: count = file.newCount()\n        else: count = file.oldCount()\n\n        tbody = table.tbody('deleted')\n\n        row = tbody.tr('deleted')\n        row.td(colspan=2).text()\n        row.td(colspan=4).h2().text(\"File was %s.\" % verb)\n        row.td(colspan=2).text()\n\n        if not file.isEmptyChanges():\n            row = tbody.tr('deleted')\n            row.td(colspan=2).text()\n            parent_index = options.get(\"parent_index\", -1)\n            if parent_index != -1:\n                fileset = \"files[%d]\" % parent_index\n            else:\n                fileset = \"files\"\n            row.td(colspan=4).button(onclick=\"fetchFile(%s, %d, '%s', event.currentTarget.parentNode.parentNode.parentNode);\" % (fileset, file.id, side)).text(\"Fetch %d %s Lines\" % (count, verb.capitalize()))\n            row.td(colspan=2).text()\n\n        table.tbody('spacer').tr('spacer').td(colspan=8).text()\n    elif file.isBinaryChanges() or file.isEmptyChanges():\n        table.tbody('spacer').tr('spacer').td(colspan=8).text()\n\n        if file.isBinaryChanges():\n            title = \"Binary\"\n            class_name = \"binary\"\n        else:\n            title = \"Empty\"\n            class_name = \"empty\"\n\n        tbody = table.tbody(class_name)\n\n        if file.wasAdded():\n            title += \" file added.\"\n        elif file.wasRemoved():\n            title += \" file removed.\"\n        else:\n            title += \" file modified.\"\n\n        row = tbody.tr(class_name)\n        row.td(colspan=2).text()\n        row.td(colspan=4).h2().text(title)\n        row.td(colspan=2).text()\n\n        if file.isBinaryChanges():\n            row = tbody.tr('download')\n            row.td(colspan=2).text()\n            cell = row.td(colspan=4)\n\n            def linkToFile(target, file, sha1):\n                is_image = False\n\n                base, _, extension = 
file.path.rpartition(\".\")\n                if base and configuration.mimetypes.MIMETYPES.get(extension, \"\").startswith(\"image/\"):\n                    is_image = True\n\n                url = \"/download/%s?sha1=%s&repository=%d\" % (file.path, sha1, file.repository.id)\n                link = target.a(href=url)\n\n                if is_image: link.img(src=url)\n                else: link.text(sha1)\n\n            if file.wasAdded():\n                linkToFile(cell, file, file.new_sha1)\n            elif file.wasRemoved():\n                linkToFile(cell, file, file.old_sha1)\n            else:\n                linkToFile(cell, file, file.old_sha1)\n                cell.innerHTML(\" &#8594; \")\n                linkToFile(cell, file, file.new_sha1)\n\n            row.td(colspan=2).text()\n\n        table.tbody('spacer').tr('spacer').td(colspan=8).text()\n    else:\n        if options.get(\"tabify\"):\n            tabwidth = file.getTabWidth()\n            indenttabsmode = file.getIndentTabsMode()\n            tabify = lambda line: htmlutils.tabify(line, tabwidth, indenttabsmode)\n        else:\n            tabify = lambda line: line\n\n        code_contexts = CodeContexts(db, file.new_sha1,\n                                     file.macro_chunks[0].lines[0].new_offset,\n                                     file.macro_chunks[-1].lines[-1].new_offset)\n\n        blocks = [(\"[%d,%d]\" % (macro_chunk.lines[0].new_offset, macro_chunk.lines[-1].new_offset))\n                  for macro_chunk in file.macro_chunks]\n\n        target.script(type=\"text/javascript\").text(\"blocks[%d] = [%s];\" % (file.id, \",\".join(blocks)))\n\n        for index, macro_chunk in enumerate(file.macro_chunks):\n            first_line = macro_chunk.lines[0]\n            last_line = macro_chunk.lines[-1]\n\n            spacer = table.tbody('spacer')\n\n            if support_expand and next_old_offset < first_line.old_offset and next_new_offset < first_line.new_offset:\n                
row = spacer.tr('expand').td(colspan='8')\n                expandHTML(db, file, next_old_offset, next_new_offset, first_line.old_offset - next_old_offset, row)\n\n            code_context = code_contexts.find(first_line.new_offset)\n            if code_context: spacer.tr('context').td(colspan='8').text(code_context)\n\n            spacer.tr('spacer').td(colspan='8').text()\n\n            lines = table.tbody('lines')\n\n            local_display_type = display_type\n\n            for line in macro_chunk.lines:\n                if line.type != diff.Line.INSERTED:\n                    match = re_tailws.match(line.old_value)\n                    if match:\n                        line.old_value = match.group(1) + \"<i class='tailws'>\" + match.group(2) + \"</i>\" + match.group(3)\n                if line.type != diff.Line.DELETED:\n                    match = re_tailws.match(line.new_value)\n                    if match:\n                        line.new_value = match.group(1) + \"<i class='tailws'>\" + match.group(2) + \"</i>\" + match.group(3)\n\n                if line.old_value:\n                    line.old_value = line.old_value.replace(\"\\r\", \"<i class='cr'></i>\")\n                    if line.old_offset == 1:\n                        line.old_value = line.old_value.replace(\n                            \"&#65279;\", \"<i class='bom'></i>\", 1)\n                if line.new_value:\n                    line.new_value = line.new_value.replace(\"\\r\", \"<i class='cr'></i>\")\n                    if line.new_offset == 1:\n                        line.new_value = line.new_value.replace(\n                            \"&#65279;\", \"<i class='bom'></i>\", 1)\n\n            if collapse_simple_hunks:\n                if local_display_type == \"both\":\n                    deleted = False\n                    inserted = False\n\n                    for line in macro_chunk.lines:\n                        if line.type == diff.Line.MODIFIED or line.type == 
diff.Line.REPLACED:\n                            break\n                        elif line.type == diff.Line.DELETED:\n                            if inserted: break\n                            deleted = True\n                        elif line.type == diff.Line.INSERTED:\n                            if deleted: break\n                            inserted = True\n                    else:\n                        if deleted: local_display_type = \"old\"\n                        if inserted: local_display_type = \"new\"\n\n            if compact:\n                def packSyntaxHighlighting(line):\n                    return re_tag.sub(lambda m: \"<%s%s>\" % (m.group(1), m.group(2)), line)\n\n                items = []\n                for line in macro_chunk.lines:\n                    if line.type == diff.Line.MODIFIED and line.is_whitespace:\n                        line_type = diff.Line.WHITESPACE\n                    elif conflicts and line.type == diff.Line.DELETED and line.isConflictMarker():\n                        line_type = diff.Line.CONFLICT\n                    else:\n                        line_type = line.type\n                    data = [str(line_type)]\n                    if line.type != diff.Line.INSERTED:\n                        data.append(jsify(packSyntaxHighlighting(tabify(line.old_value)),\n                                          as_json=True))\n                    if line.type != diff.Line.DELETED:\n                        data.append(jsify(packSyntaxHighlighting(tabify(line.new_value)),\n                                          as_json=True))\n                    items.append(\"[%s]\" % \",\".join(data))\n                data = \"[%d,%d,%d,%d,%s]\" % (file.id,\n                                             2 if local_display_type == \"both\" else 1,\n                                             macro_chunk.lines[0].old_offset,\n                                             macro_chunk.lines[0].new_offset,\n                                 
            \"[%s]\" % \",\".join(items))\n                lines.comment(data.replace(\"--\", \"-\\u002d\"))\n            elif style == \"vertical\" or local_display_type != \"both\":\n                linesIterator = iter(macro_chunk.lines)\n                line = linesIterator.next()\n\n                def lineHTML(what, file, line, is_whitespace, target):\n                    line_class = what\n\n                    if is_whitespace and line.type == diff.Line.MODIFIED:\n                        line_class = \"modified\"\n\n                    if what == \"deleted\":\n                        linenr = line.old_offset\n                    else:\n                        linenr = line.new_offset\n\n                    row = target.tr(\"line single \" + line_class, id=lineId(file, line, 0))\n                    row.td(\"edge\").text()\n                    row.td(\"linenr old\").text(linenr)\n\n                    if what == \"deleted\" or local_display_type == \"old\":\n                        code = line.old_value\n                        lineClass = \"old\"\n                    else:\n                        code = line.new_value\n                        lineClass = \"new\"\n\n                    if not code: code = \"&nbsp;\"\n\n                    row.td('line single ' + lineClass, colspan=4, id=lineCellId(file, lineClass[0], linenr)).innerHTML(tabify(code))\n                    row.td('linenr new').text(linenr)\n\n                    row.td(\"edge\").text()\n\n                try:\n                    while line:\n                        while line.type == diff.Line.CONTEXT:\n                            lineHTML(\"context\", file, line, False, lines)\n                            line = linesIterator.next()\n\n                        deleted = []\n                        inserted = []\n\n                        while line.is_whitespace:\n                            lineHTML(\"modified\", file, line, True, lines)\n                            line = 
linesIterator.next()\n\n                        previous_type = diff.Line.DELETED\n\n                        try:\n                            while line.type >= previous_type and not line.is_whitespace:\n                                if line.type != diff.Line.INSERTED: deleted.append(line)\n                                if line.type != diff.Line.DELETED: inserted.append(line)\n                                previous_type = line.type\n                                line = None\n                                line = linesIterator.next()\n                        except StopIteration:\n                            line = None\n\n                        for deletedLine in deleted:\n                            lineHTML(\"deleted\", file, deletedLine, False, lines)\n                        for insertedLine in inserted:\n                            lineHTML(\"inserted\", file, insertedLine, False, lines)\n                except StopIteration:\n                    pass\n            elif style == \"horizontal\":\n                for line in macro_chunk.lines:\n                    old_offset = None\n                    new_offset = None\n                    old_line = None\n                    new_line = None\n\n                    if line.type != diff.Line.INSERTED:\n                        old_offset = line.old_offset\n                        old_line = tabify(line.old_value)\n\n                    if line.type != diff.Line.DELETED:\n                        new_offset = line.new_offset\n                        new_line = tabify(line.new_value)\n\n                    if not old_line: old_line = \"&nbsp;\"\n                    if old_line is None: old_offset = None\n                    if not new_line: new_line = \"&nbsp;\"\n                    if new_line is None: new_offset = None\n\n                    line_type = lineType(line, 0)\n\n                    if conflicts and line.isConflictMarker():\n                        line_type += \" conflict\"\n\n                 
   row = (\"<tr class='line %s' id='%s'>\"\n                             \"<td class='edge'>&nbsp;</td>\"\n                             \"<td class='linenr old'>%s</td>\"\n                             \"<td class='line old'%s>%s</td>\"\n                             \"<td class='middle' colspan=2>&nbsp;</td>\"\n                             \"<td class='line new'%s>%s</td>\"\n                             \"<td class='linenr new'>%s</td>\"\n                             \"<td class='edge'>&nbsp;</td>\"\n                           \"</tr>\\n\") % (line_type, lineId(file, line, 0),\n                                         str(old_offset) if old_offset else \"&nbsp;\",\n                                         \" id='%s'\" % lineCellId(file, \"o\", old_offset) if old_offset else \"\", old_line,\n                                         \" id='%s'\" % lineCellId(file, \"n\", new_offset) if new_offset else \"\", new_line,\n                                         str(new_offset) if new_offset else \"&nbsp;\")\n\n                    lines.innerHTML(row)\n\n            next_old_offset = last_line.old_offset + 1\n            next_new_offset = last_line.new_offset + 1\n\n        spacer = table.tbody('spacer')\n\n        if support_expand and next_old_offset < file.oldCount() + 1 and next_new_offset < file.newCount() + 1:\n            row = spacer.tr('expand').td(colspan='8')\n            expandHTML(db, file, next_old_offset, next_new_offset, 1 + file.oldCount() - next_old_offset, row)\n\n        spacer.tr('spacer').td(colspan='8').text()\n\n    content_after = options.get(\"content_after\")\n    if content_after:\n        content = table.tbody('content')\n\n        row = content.tr('content')\n        row.td(colspan=2).text()\n        content_after(db, row.td(colspan=4), file=file)\n        row.td(colspan=2).text()\n\n        content.tr('spacer').td(colspan=8).text()\n\n    row = table.tfoot().tr()\n    cell = row.td('left', colspan=4)\n\n    commit = options.get(\"commit\")\n 
   if commit:\n        cell.a(\"showtree root\", href=make_url(\"showtree\", \"/\")).text(\"root\")\n        cell.span(\"slash\").text('/')\n\n        components = file.path.split(\"/\")\n        for index, component in enumerate(components[:-1]):\n            cell.a(\"showtree\", href=make_url(\"showtree\", \"/\".join(components[:index + 1]))).text(component, escape=True)\n            cell.span(\"slash\").text('/')\n\n        if not file.wasRemoved():\n            cell.a(\"showtree\", href=make_url(\"showfile\", \"/\".join(components))).text(components[-1], escape=True)\n        else:\n            cell.text(components[-1], escape=True)\n    else:\n        cell.text(file.path)\n\n    row.td('right', colspan=4).text(chunksText)\n\ndef addResources(db, user, repository, review, compact, tabify, target):\n    target.addExternalStylesheet(\"resource/changeset.css\")\n    target.addExternalScript(\"resource/changeset.js\")\n\n    target.addInternalStylesheet(stripStylesheet(user.getResource(db, \"syntax.css\")[1], compact))\n    target.addInternalStylesheet(stripStylesheet(user.getResource(db, \"diff.css\")[1], compact))\n\n    if user.getPreference(db, \"commit.diff.highlightIllegalWhitespace\"):\n        target.addInternalStylesheet(stripStylesheet(user.getResource(db, \"whitespace.css\")[1], compact))\n\n    ruler_column = user.getPreference(db, \"commit.diff.rulerColumn\", repository=repository)\n\n    if ruler_column > 0:\n        target.addExternalScript(\"resource/ruler.js\")\n\n    # Injected unconditionally (for tests).\n    target.addInternalScript(\"var rulerColumn = %d;\" % ruler_column)\n\n    if review:\n        target.addExternalStylesheet(\"resource/comment.css\")\n        target.addExternalStylesheet(\"resource/review.css\")\n        target.addExternalScript(\"resource/comment.js\")\n        target.addExternalScript(\"resource/review.js\")\n\n    if tabify:\n        target.addExternalScript(\"resource/tabify.js\")\n"
  },
  {
    "path": "src/changeset/load.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport diff\nimport dbutils\nimport gitutils\n\ndef loadChangeset(db, repository, changeset_id, filtered_file_ids=None, load_chunks=True):\n    return loadChangesets(db, repository,\n                          changesets=[diff.Changeset.fromId(db, repository, changeset_id)],\n                          filtered_file_ids=filtered_file_ids,\n                          load_chunks=load_chunks)[0]\n\ndef loadChangesetsForCommits(db, repository, commits, filtered_file_ids=None, load_chunks=True):\n    commit_ids = dict([(commit.getId(db), commit) for commit in commits])\n\n    def getCommit(commit_id):\n        return commit_ids.get(commit_id) or gitutils.Commit.fromId(db, repository, commit_id)\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT id, parent, child FROM changesets WHERE child=ANY (%s) AND type='direct'\", (commit_ids.keys(),))\n\n    changesets = []\n\n    for changeset_id, parent_id, child_id in cursor:\n        changesets.append(diff.Changeset(changeset_id, getCommit(parent_id), getCommit(child_id), \"direct\"))\n\n    return loadChangesets(db, repository, changesets, filtered_file_ids=filtered_file_ids, load_chunks=load_chunks)\n\ndef loadChangesets(db, repository, changesets, filtered_file_ids=None, load_chunks=True):\n    cursor = db.cursor()\n\n    changeset_ids = [changeset.id for 
changeset in changesets]\n    filtered_file_ids = list(filtered_file_ids) if filtered_file_ids else None\n\n    if filtered_file_ids is None:\n        cursor.execute(\"\"\"SELECT changeset, file, path, old_sha1, new_sha1, old_mode, new_mode\n                            FROM fileversions\n                            JOIN files ON (files.id=fileversions.file)\n                           WHERE changeset=ANY (%s)\"\"\",\n                       (changeset_ids,))\n    else:\n        cursor.execute(\"\"\"SELECT changeset, file, path, old_sha1, new_sha1, old_mode, new_mode\n                            FROM fileversions\n                            JOIN files ON (files.id=fileversions.file)\n                           WHERE changeset=ANY (%s)\n                             AND file=ANY (%s)\"\"\",\n                       (changeset_ids, filtered_file_ids))\n\n    files = dict([(changeset.id, {}) for changeset in changesets])\n\n    for changeset_id, file_id, file_path, file_old_sha1, file_new_sha1, file_old_mode, file_new_mode in cursor.fetchall():\n        files[changeset_id][file_id] = diff.File(file_id, file_path,\n                                                 file_old_sha1, file_new_sha1,\n                                                 repository,\n                                                 old_mode=file_old_mode,\n                                                 new_mode=file_new_mode,\n                                                 chunks=[])\n\n    if load_chunks:\n        if filtered_file_ids is None:\n            cursor.execute(\"\"\"SELECT id, changeset, file, deleteOffset, deleteCount, insertOffset, insertCount, analysis, whitespace\n                                FROM chunks\n                                WHERE changeset=ANY (%s)\n                                ORDER BY file, deleteOffset ASC\"\"\",\n                           (changeset_ids,))\n        else:\n            cursor.execute(\"\"\"SELECT id, changeset, file, deleteOffset, deleteCount, 
insertOffset, insertCount, analysis, whitespace\n                                FROM chunks\n                                WHERE changeset=ANY (%s)\n                                  AND file=ANY (%s)\n                                ORDER BY file, deleteOffset ASC\"\"\",\n                           (changeset_ids, filtered_file_ids))\n\n        for chunk_id, changeset_id, file_id, delete_offset, delete_count, insert_offset, insert_count, analysis, is_whitespace in cursor:\n            files[changeset_id][file_id].chunks.append(diff.Chunk(delete_offset, delete_count,\n                                                                  insert_offset, insert_count,\n                                                                  id=chunk_id,\n                                                                  is_whitespace=is_whitespace,\n                                                                  analysis=analysis))\n\n    for changeset in changesets:\n        changeset.files = diff.File.sorted(files[changeset.id].values())\n\n    return changesets\n"
  },
  {
    "path": "src/changeset/process.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport gitutils\nimport stat\n\ndef joinPaths(dirname, basename):\n    return \"%s/%s\" % (dirname, basename) if dirname else basename\n\nclass ChangedPath:\n    def __init__(self, path, oldEntry, newEntry):\n        self.path = path\n        self.oldEntry = oldEntry\n        self.newEntry = newEntry\n\ndef removedTree(repository, path, sha1):\n    changedPaths = []\n    for entry in gitutils.Tree.fromSHA1(repository, sha1):\n        changedPaths.extend(\n            removedEntry(repository, path, entry))\n    return changedPaths\n\ndef removedEntry(repository, path, entry):\n    path = joinPaths(path, entry.name)\n\n    changedPaths = [ChangedPath(path, entry, None)]\n    if stat.S_ISDIR(entry.mode):\n        changedPaths.extend(\n            removedTree(repository, path, entry.sha1))\n    return changedPaths\n\ndef addedTree(repository, path, sha1):\n    changedPaths = []\n    for entry in gitutils.Tree.fromSHA1(repository, sha1):\n        changedPaths.extend(\n            addedEntry(repository, path, entry))\n    return changedPaths\n\ndef addedEntry(repository, path, entry):\n    path = joinPaths(path, entry.name)\n\n    changedPaths = [ChangedPath(path, None, entry)]\n    if stat.S_ISDIR(entry.mode):\n        changedPaths.extend(\n            addedTree(repository, path, entry.sha1))\n    return 
changedPaths\n\ndef diffTrees(repository, path, oldTree, newTree):\n    oldNames = set(oldTree.keys())\n    newNames = set(newTree.keys())\n\n    commonNames = oldNames & newNames\n    removedNames = oldNames - commonNames\n    addedNames = newNames - commonNames\n\n    changedPaths = []\n\n    for name in removedNames:\n        changedPaths.extend(\n            removedEntry(repository, joinPaths(path, name), oldTree[name]))\n    for name in addedNames:\n        changedPaths.extend(\n            addedEntry(repository, joinPaths(path, name), newTree[name]))\n\n    for name in commonNames:\n        oldEntry = oldTree[name]\n        newEntry = newTree[name]\n\n        if oldEntry.sha1 != newEntry.sha1 or oldEntry.mode != newEntry.mode:\n            changedPath = joinPaths(path, name)\n            changedPaths.append(ChangedPath(changedPath, oldEntry, newEntry))\n\n            commonMode = oldEntry.mode & newEntry.mode\n            removedMode = oldEntry.mode - commonMode\n            addedMode = newEntry.mode - commonMode\n\n            if stat.S_ISDIR(removedMode):\n                changedPaths.extend(\n                    removedTree(repository, changedPath, oldEntry.sha1))\n            elif stat.S_ISDIR(addedMode):\n                changedPaths.extend(\n                    addedTree(repository, changedPath, newEntry.sha1))\n            elif stat.S_ISDIR(commonMode) and oldEntry.sha1 != newEntry.sha1:\n                changedPaths.extend(\n                    diffTrees(repository, changedPath,\n                              gitutils.Tree.fromSHA1(repository, oldEntry.sha1),\n                              gitutils.Tree.fromSHA1(repository, newEntry.sha1)))\n\n    return changedPaths\n\ndef diffCommits(repository, commitA, commitB):\n    return diffTrees(repository,\n                     None,\n                     gitutils.Tree.fromSHA1(repository, commitA.tree),\n                     gitutils.Tree.fromSHA1(repository, commitB.tree))\n"
  },
  {
    "path": "src/changeset/text.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport diff\nimport diff.context\n\nfrom changeset.utils import getCodeContext\n\ndef unified(db, changeset, context_lines=3):\n    result = \"\"\n\n    for file in changeset.files:\n        file.loadOldLines()\n        file.loadNewLines()\n\n        try:\n            lines = diff.context.ContextLines(file, file.chunks)\n\n            file.macro_chunks = lines.getMacroChunks(context_lines, highlight=False)\n\n            oldPath = file.path if not file.wasAdded() else \"dev/null\"\n            newPath = file.path if not file.wasRemoved() else \"dev/null\"\n\n            result += \"--- a/%s\\n+++ b/%s\\n\" % (oldPath, newPath)\n\n            if file.isBinaryChanges():\n                result += \"  Binary file.\\n\"\n                continue\n\n            for chunk in file.macro_chunks:\n                deleteOffset = chunk.lines[0].old_offset\n                deleteCount = len(filter(lambda line: line.type != diff.Line.INSERTED, chunk.lines))\n                insertOffset = chunk.lines[0].new_offset\n                insertCount = len(filter(lambda line: line.type != diff.Line.DELETED, chunk.lines))\n\n                chunkHeader = \"@@ -%d,%d +%d,%d @@\" % (deleteOffset, deleteCount, insertOffset, insertCount)\n\n                if db: codeContext = getCodeContext(db, file.new_sha1, insertOffset, 
minimized=True)\n                else: codeContext = None\n\n                if codeContext: chunkHeader += \" %s\" % codeContext[:80 - len(chunkHeader)]\n\n                result += chunkHeader + \"\\n\"\n\n                lines = iter(chunk.lines)\n                line = lines.next()\n\n                try:\n                    while line:\n                        while line.type == diff.Line.CONTEXT:\n                            result += \"  %s\\n\" % line.new_value\n                            line = lines.next()\n\n                        deleted = []\n                        inserted = []\n\n                        try:\n                            while line.type != diff.Line.CONTEXT:\n                                if line.type != diff.Line.INSERTED: deleted.append(line)\n                                if line.type != diff.Line.DELETED: inserted.append(line)\n                                line = lines.next()\n                        except StopIteration:\n                            line = None\n\n                        for deletedLine in deleted:\n                            result += \"- %s\\n\" % deletedLine.old_value\n                        for insertedLine in inserted:\n                            result += \"+ %s\\n\" % insertedLine.new_value\n                except StopIteration:\n                    pass\n        finally:\n            file.cleanLines()\n\n    return result\n"
  },
  {
    "path": "src/changeset/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom subprocess import Popen as process, PIPE\nfrom sys import argv, stderr, exit\nimport re\nfrom dbutils import find_file, describe_file\nimport gitutils\nimport syntaxhighlight\nimport syntaxhighlight.request\nfrom diffutils import expandWithContext\nfrom htmlutils import htmlify, jsify\nfrom time import strftime\n\nimport diff\nimport diff.analyze\nimport diff.parse\nimport diff.merge\n\nimport load\nimport dbutils\nimport client\n\ndef createFullMergeChangeset(db, user, repository, commit, **kwargs):\n    assert len(commit.parents) > 1\n\n    changesets = createChangeset(db, user, repository, commit, **kwargs)\n\n    assert len(changesets) == len(commit.parents)\n\n    replayed = createChangeset(db, user, repository, commit, conflicts=True, **kwargs)\n\n    assert len(replayed) == 1\n\n    changesets.append(replayed[0])\n\n    return changesets\n\ndef createChangesets(db, repository, commits):\n    cursor = db.cursor()\n    requests = []\n\n    for commit in commits:\n        if len(commit.parents) > 1: changeset_type = 'merge'\n        else: changeset_type = 'direct'\n\n        cursor.execute(\"SELECT 1 FROM changesets WHERE child=%s AND type=%s\", (commit.getId(db), changeset_type))\n\n        if not cursor.fetchone():\n            requests.append({ \"repository_name\": repository.name,\n       
                       \"changeset_type\": changeset_type,\n                              \"child_sha1\": commit.sha1 })\n\n    if requests:\n        client.requestChangesets(requests)\n\ndef createChangeset(db, user, repository, commit=None, from_commit=None, to_commit=None, rescan=False, reanalyze=False, conflicts=False, filtered_file_ids=None, review=None, do_highlight=True, load_chunks=True):\n    cursor = db.cursor()\n\n    if conflicts:\n        if commit:\n            assert len(commit.parents) > 1\n\n            cursor.execute(\"SELECT replay FROM mergereplays WHERE original=%s\", (commit.getId(db),))\n            row = cursor.fetchone()\n\n            if row:\n                replay = gitutils.Commit.fromId(db, repository, row[0])\n            else:\n                replay = repository.replaymerge(db, user, commit)\n                if not replay: return None\n                cursor.execute(\"INSERT INTO mergereplays (original, replay) VALUES (%s, %s)\", (commit.getId(db), replay.getId(db)))\n\n            from_commit = replay\n            to_commit = commit\n\n            parents = [replay]\n        else:\n            parents = [from_commit]\n            commit = to_commit\n\n        changeset_type = 'conflicts'\n    elif commit:\n        parents = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in commit.parents] or [None]\n        changeset_type = 'merge' if len(parents) > 1 else 'direct'\n    else:\n        parents = [from_commit]\n        commit = to_commit\n        changeset_type = 'direct' if len(to_commit.parents) == 1 and from_commit == to_commit.parents[0] else 'custom'\n\n    changesets = []\n\n    thin_diff = False\n\n    changeset_ids = []\n\n    for parent in parents:\n        if parent:\n            cursor.execute(\"SELECT id FROM changesets WHERE parent=%s AND child=%s AND type=%s\",\n                           (parent.getId(db), commit.getId(db), changeset_type))\n        else:\n            cursor.execute(\"SELECT id FROM 
changesets WHERE parent IS NULL AND child=%s AND type=%s\",\n                           (commit.getId(db), changeset_type))\n\n        row = cursor.fetchone()\n\n        if row: changeset_ids.append(row[0])\n        else: break\n\n    assert len(changeset_ids) in (0, len(parents))\n\n    if changeset_ids:\n        if rescan and user.hasRole(db, \"developer\"):\n            cursor.executemany(\"DELETE FROM changesets WHERE id=%s\", [(changeset_id,) for changeset_id in changeset_ids])\n            db.commit()\n            changeset_ids = []\n        else:\n            for changeset_id in changeset_ids:\n                if changeset_type == 'custom':\n                    cursor.execute(\"UPDATE customchangesets SET time=NOW() WHERE changeset=%s\", (changeset_id,))\n\n                changeset = load.loadChangeset(db, repository, changeset_id, filtered_file_ids=filtered_file_ids, load_chunks=load_chunks)\n                changeset.conflicts = conflicts\n\n                if reanalyze and user.hasRole(db, \"developer\"):\n                    analysis_values = []\n\n                    for file in changeset.files:\n                        if not filtered_file_ids or file.id in filtered_file_ids:\n                            for index, chunk in enumerate(file.chunks):\n                                old_analysis = chunk.analysis\n                                chunk.analyze(file, index == len(file.chunks) - 1, True)\n                                if old_analysis != chunk.analysis:\n                                    analysis_values.append((chunk.analysis, chunk.id))\n\n                    if reanalyze == \"commit\" and analysis_values:\n                        cursor.executemany(\"UPDATE chunks SET analysis=%s WHERE id=%s\", analysis_values)\n\n                changesets.append(changeset)\n\n    if not changesets:\n        if len(parents) == 1 and from_commit and to_commit and filtered_file_ids:\n            if from_commit.isAncestorOf(to_commit):\n                
iter_commit = to_commit\n                while iter_commit != from_commit:\n                    if len(iter_commit.parents) > 1:\n                        thin_diff = True\n                        break\n                    iter_commit = gitutils.Commit.fromSHA1(db, repository, iter_commit.parents[0])\n            else:\n                thin_diff = True\n\n        if not thin_diff:\n            if changeset_type == \"direct\":\n                request = { \"changeset_type\": \"direct\",\n                            \"child_sha1\": commit.sha1 }\n            elif changeset_type == \"custom\":\n                request = { \"changeset_type\": \"custom\",\n                            \"parent_sha1\": from_commit.sha1 if from_commit else \"0\" * 40,\n                            \"child_sha1\": to_commit.sha1 }\n            elif changeset_type == \"merge\":\n                request = { \"changeset_type\": \"merge\",\n                            \"child_sha1\": commit.sha1 }\n            else:\n                request = { \"changeset_type\": \"conflicts\",\n                            \"parent_sha1\": from_commit.sha1,\n                            \"child_sha1\": to_commit.sha1 }\n\n            request[\"repository_name\"] = repository.name\n\n            db.commit()\n\n            client.requestChangesets([request])\n\n            db.commit()\n\n            for parent in parents:\n                if parent:\n                    cursor.execute(\"SELECT id FROM changesets WHERE parent=%s AND child=%s AND type=%s\",\n                                   (parent.getId(db), commit.getId(db), changeset_type))\n                else:\n                    cursor.execute(\"SELECT id FROM changesets WHERE parent IS NULL AND child=%s AND type=%s\",\n                                   (commit.getId(db), changeset_type))\n\n                changeset_id = cursor.fetchone()[0]\n                changeset = load.loadChangeset(db, repository, changeset_id, filtered_file_ids=filtered_file_ids, 
load_chunks=load_chunks)\n                changeset.conflicts = conflicts\n\n                changesets.append(changeset)\n        else:\n            changes = diff.parse.parseDifferences(repository, from_commit=from_commit, to_commit=to_commit, filter_paths=[describe_file(db, file_id) for file_id in filtered_file_ids])[from_commit.sha1]\n\n            dbutils.find_files(db, changes)\n\n            for file in changes:\n                for index, chunk in enumerate(file.chunks):\n                    chunk.analyze(file, index == len(file.chunks) - 1)\n\n            changeset = diff.Changeset(None, from_commit, to_commit, changeset_type)\n            changeset.conflicts = conflicts\n            changeset.files = diff.File.sorted(changes)\n\n            changesets.append(changeset)\n\n    if do_highlight:\n        highlights = {}\n\n        for changeset in changesets:\n            for file in changeset.files:\n                if file.canHighlight():\n                    if file.old_sha1 and file.old_sha1 != '0' * 40:\n                        highlights[file.old_sha1] = (file.path, file.getLanguage())\n                    if file.new_sha1 and file.new_sha1 != '0' * 40:\n                        highlights[file.new_sha1] = (file.path, file.getLanguage())\n\n        syntaxhighlight.request.requestHighlights(repository, highlights, \"legacy\")\n\n    return changesets\n\ndef getCodeContext(db, sha1, line, minimized=False):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT context FROM codecontexts WHERE sha1=%s AND first_line<=%s AND last_line>=%s ORDER BY first_line DESC LIMIT 1\", [sha1, line, line])\n    row = cursor.fetchone()\n    if row:\n        context = row[0]\n        if minimized: context = re.sub(\"\\\\(.*(?:\\\\)|...$)\", \"(...)\", context)\n        return context\n    else: return None\n"
  },
  {
    "path": "src/cli.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport argparse\nimport sys\nimport traceback\n\nimport dbutils\nimport gitutils\nimport mailutils\nimport reviewing.utils\nimport reviewing.filters\nimport reviewing.mail\nimport reviewing.comment\nimport reviewing.comment.propagate\n\nfrom textutils import json_encode, json_decode\n\ndb = None\n\ndef init(user_id, authentication_labels):\n    global db\n\n    db = dbutils.Database.forUser()\n\n    if user_id is None:\n        user = dbutils.User.makeAnonymous()\n    else:\n        user = dbutils.User.fromId(db, user_id)\n\n    db.setUser(user, authentication_labels)\n\ndef finish():\n    global db\n\n    if db:\n        db.commit()\n        db.close()\n        db = None\n\ndef abort():\n    global db\n\n    if db:\n        db.rollback()\n        db.close()\n        db = None\n\ndef sendCustomMail(from_user, recipients, subject, headers, body, review):\n    assert recipients is not None or review is not None\n\n    if review:\n        if recipients is None:\n            recipients = review.getRecipients(db)\n\n    files = []\n\n    for to_user in recipients:\n        if not to_user.getPreference(db, \"email.activated\") \\\n               or to_user.status == \"retired\":\n            continue\n\n        if review:\n            parent_message_id = reviewing.mail.getReviewMessageId(\n                
db, to_user, review, files)\n\n        message_id = mailutils.generateMessageId(len(files) + 1)\n\n        if review:\n            filename = reviewing.mail.sendMail(\n                db, review, message_id, from_user, to_user, recipients, subject,\n                body, parent_message_id, headers)\n        else:\n            filename = mailutils.queueMail(\n                from_user, to_user, recipients, subject, body,\n                message_id=message_id, headers=headers)\n\n        files.append(filename)\n\n    return files\n\ndef propagateComment(data):\n    try:\n        review = dbutils.Review.fromId(db, data[\"review_id\"])\n        commit = gitutils.Commit.fromId(db, review.repository, data[\"commit_id\"])\n        propagation = reviewing.comment.propagate.Propagation(db)\n        if \"chain_id\" in data:\n            chain = reviewing.comment.CommentChain.fromId(\n                db, data[\"chain_id\"], user=None, review=review)\n            if chain is None:\n                return \"invalid chain id\"\n            if commit != chain.addressed_by:\n                return \"wrong commit: must be current addressed_by\"\n            propagation.setExisting(\n                review, chain.id, commit, data[\"file_id\"],\n                data[\"first_line\"], data[\"last_line\"], True)\n            commits = review.getCommitSet(db).without(commit.parents)\n            propagation.calculateAdditionalLines(\n                commits, review.branch.getHead(db))\n        else:\n            if not propagation.setCustom(\n                    review, commit, data[\"file_id\"],\n                    data[\"first_line\"], data[\"last_line\"]):\n                return \"invalid location\"\n            propagation.calculateInitialLines()\n        data = {\n            \"status\": \"clean\" if propagation.active else \"modified\",\n            \"lines\": [[sha1, first_line, last_line]\n                      for sha1, (first_line, last_line)\n                      in 
propagation.new_lines.items()]\n            }\n        if not propagation.active:\n            data[\"addressed_by\"] = propagation.addressed_by[0].child.getId(db)\n        return data\n    except dbutils.NoSuchReview:\n        return \"invalid review id\"\n    except gitutils.GitReferenceError:\n        return \"invalid commit id\"\n    except Exception as exception:\n        return str(exception)\n\ndef checkRepositoryAccess(data):\n    import auth\n\n    read = modify = False\n\n    try:\n        gitutils.Repository.fromId(db, data[\"repository_id\"], for_modify=False)\n    except auth.AccessDenied:\n        pass\n    else:\n        read = True\n\n    try:\n        gitutils.Repository.fromId(db, data[\"repository_id\"], for_modify=True)\n    except auth.AccessDenied:\n        pass\n    else:\n        modify = True\n\n    return {\n        \"read\": read,\n        \"modify\": modify,\n    }\n\nHANDLERS = { \"propagate-comment\": propagateComment,\n             \"check-repository-access\": checkRepositoryAccess }\n\ndef main():\n    parser = argparse.ArgumentParser()\n\n    parser.add_argument(\"-u\", dest=\"user_id\", type=int)\n    parser.add_argument(\"-l\", dest=\"auth_labels\", action=\"append\", default=[])\n    parser.add_argument(\"command\", nargs=\"*\")\n\n    arguments = parser.parse_args()\n\n    try:\n        init(arguments.user_id, arguments.auth_labels)\n\n        for command in arguments.command:\n            pending_mails = None\n\n            if command == \"generate-mails-for-batch\":\n                data = json_decode(sys.stdin.readline())\n                batch_id = data[\"batch_id\"]\n                was_accepted = data[\"was_accepted\"]\n                is_accepted = data[\"is_accepted\"]\n                pending_mails = reviewing.utils.generateMailsForBatch(db, batch_id, was_accepted, is_accepted)\n            elif command == \"generate-mails-for-assignments-transaction\":\n                data = json_decode(sys.stdin.readline())\n         
       transaction_id = data[\"transaction_id\"]\n                pending_mails = reviewing.utils.generateMailsForAssignmentsTransaction(db, transaction_id)\n            elif command == \"apply-filters\":\n                data = json_decode(sys.stdin.readline())\n                filters = reviewing.filters.Filters()\n                user = dbutils.User.fromId(db, data[\"user_id\"]) if \"user_id\" in data else None\n                if \"review_id\" in data:\n                    review = dbutils.Review.fromId(db, data[\"review_id\"])\n                    filters.setFiles(db, review=review)\n                    filters.load(db, review=review, user=user,\n                                 added_review_filters=data.get(\"added_review_filters\", []),\n                                 removed_review_filters=data.get(\"removed_review_filters\", []))\n                else:\n                    repository = gitutils.Repository.fromId(db, data[\"repository_id\"])\n                    filters.setFiles(db, file_ids=data[\"file_ids\"])\n                    filters.load(db, repository=repository, recursive=data.get(\"recursive\", False), user=user)\n                sys.stdout.write(json_encode(filters.data) + \"\\n\")\n            elif command == \"generate-custom-mails\":\n                pending_mails = []\n                for data in json_decode(sys.stdin.readline()):\n                    from_user = dbutils.User.fromId(db, data[\"sender\"])\n                    if data.get(\"recipients\"):\n                        recipients = [dbutils.User.fromId(db, user_id)\n                                      for user_id in data[\"recipients\"]]\n                    else:\n                        recipients = None\n                    subject = data[\"subject\"]\n                    headers = data.get(\"headers\")\n                    body = data[\"body\"]\n                    if \"review_id\" in data:\n                        review = dbutils.Review.fromId(db, data[\"review_id\"])\n     
               else:\n                        review = None\n                    pending_mails.extend(sendCustomMail(\n                        from_user, recipients, subject, headers, body, review))\n            elif command == \"set-review-state\":\n                data = json_decode(sys.stdin.readline())\n                error = \"\"\n                try:\n                    user = dbutils.User.fromId(db, data[\"user_id\"])\n                    review = dbutils.Review.fromId(db, data[\"review_id\"])\n                    if review.state != data[\"old_state\"]:\n                        error = \"invalid old state\"\n                    elif data[\"new_state\"] == \"open\":\n                        review.reopen(db, user)\n                    elif data[\"new_state\"] == \"closed\":\n                        review.close(db, user)\n                    elif data[\"new_state\"] == \"dropped\":\n                        review.drop(db, user)\n                    else:\n                        error = \"invalid new state\"\n                except dbutils.NoSuchUser:\n                    error = \"invalid user id\"\n                except dbutils.NoSuchReview:\n                    error = \"invalid review id\"\n                except Exception as error:\n                    error = str(error)\n                sys.stdout.write(error + \"\\n\")\n            elif command in HANDLERS:\n                data_in = json_decode(sys.stdin.readline())\n                data_out = HANDLERS[command](data_in)\n                sys.stdout.write(json_encode(data_out) + \"\\n\")\n            else:\n                sys.stdout.write(json_encode(\"unknown command: %s\" % command) + \"\\n\")\n                sys.exit(0)\n\n            if pending_mails is not None:\n                sys.stdout.write(json_encode(pending_mails) + \"\\n\")\n\n        finish()\n    except Exception:\n        sys.stdout.write(json_encode(traceback.format_exc()) + \"\\n\")\n    finally:\n        abort()\n\nif __name__ 
== \"__main__\":\n    main()\n"
  },
  {
    "path": "src/communicate.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport fcntl\nimport select\nimport cStringIO\nimport time\nimport errno\n\nclass ProcessTimeout(Exception):\n    def __init__(self, timeout):\n        super(ProcessTimeout, self).__init__(\n            \"Process timed out (after %d seconds)\" % timeout)\n        self.timeout = timeout\n\nclass ProcessError(Exception):\n    def __init__(self, process, stderr):\n        super(ProcessError, self).__init__(\n            \"Process returned non-zero exit status %d\" % process.returncode)\n        self.returncode = process.returncode\n        self.stderr = stderr\n\ndef setnonblocking(fd):\n    fcntl.fcntl(fd, fcntl.F_SETFL, fcntl.fcntl(fd, fcntl.F_GETFL) | os.O_NONBLOCK)\n\nclass Communicate(object):\n    def __init__(self, process):\n        self.process = process\n        self.timeout = None\n        self.deadline = None\n        self.stdin_data = None\n        self.stdout_callbacks = [None, None]\n        self.stderr_callbacks = [None, None]\n        self.returncode = None\n\n    def setTimeout(self, timeout):\n        self.timeout = timeout\n        self.deadline = time.time() + timeout\n\n    def setInput(self, data):\n        self.stdin_data = data\n\n    def setCallbacks(self, stdout=None, stdout_line=None, stderr=None, stderr_line=None):\n        assert stdout is None or stdout_line is 
None\n        assert stderr is None or stderr_line is None\n        self.stdout_callbacks[:] = stdout, stdout_line\n        self.stderr_callbacks[:] = stderr, stderr_line\n\n    def __read(self, source, target, callbacks):\n        while True:\n            cb_data, cb_line = callbacks\n            try:\n                if cb_line:\n                    line = source.readline()\n                    if not line:\n                        return True\n                    cb_line(line)\n                else:\n                    data = source.read()\n                    if not data:\n                        return True\n                    if cb_data:\n                        cb_data(data)\n                    else:\n                        target.write(data)\n            except IOError as error:\n                if error.errno == errno.EAGAIN:\n                    return False\n                raise\n\n    def run(self):\n        process = self.process\n        poll = select.poll()\n\n        if callable(self.stdin_data):\n            stdin_data = \"\"\n        else:\n            stdin_data = self.stdin_data\n            self.stdin_data = None\n        stdin_done = False\n\n        stdout = cStringIO.StringIO()\n        stdout_done = False\n\n        stderr = cStringIO.StringIO()\n        stderr_done = False\n\n        if process.stdin:\n            setnonblocking(process.stdin)\n            poll.register(process.stdin, select.POLLOUT)\n        else:\n            stdin_done = True\n\n        if process.stdout:\n            setnonblocking(process.stdout)\n            poll.register(process.stdout, select.POLLIN)\n        else:\n            stdout_done = True\n\n        if process.stderr:\n            setnonblocking(process.stderr)\n            poll.register(process.stderr, select.POLLIN)\n        else:\n            stderr_done = True\n\n        while (not stdin_done or not stdout_done or not stderr_done) \\\n                and (self.deadline is None or time.time() < 
self.deadline):\n            if self.deadline is None:\n                timeout = None\n            else:\n                timeout = 1000 * (self.deadline - time.time())\n\n            while True:\n                try:\n                    events = poll.poll(timeout)\n                except select.error as (errnum, _):\n                    if errnum == errno.EINTR:\n                        continue\n                    raise\n                else:\n                    break\n\n            for fd, event in events:\n                if not stdin_done and fd == process.stdin.fileno():\n                    if callable(self.stdin_data):\n                        data = self.stdin_data()\n                        if data is None:\n                            self.stdin_data = None\n                        else:\n                            stdin_data += data\n\n                    if stdin_data:\n                        nwritten = os.write(process.stdin.fileno(), stdin_data)\n                        stdin_data = stdin_data[nwritten:]\n\n                    if not stdin_data and self.stdin_data is None:\n                        process.stdin.close()\n                        stdin_done = True\n                        poll.unregister(fd)\n\n                if not stdout_done and fd == process.stdout.fileno():\n                    stdout_done = self.__read(process.stdout, stdout,\n                                              self.stdout_callbacks)\n                    if stdout_done:\n                        poll.unregister(fd)\n\n                if not stderr_done and fd == process.stderr.fileno():\n                    stderr_done = self.__read(process.stderr, stderr,\n                                              self.stderr_callbacks)\n                    if stderr_done:\n                        poll.unregister(fd)\n\n        if stdin_done and stdout_done and stderr_done:\n            process.wait()\n\n            stdout_data = stdout.getvalue() if process.stdout else 
None\n            stderr_data = stderr.getvalue() if process.stderr else None\n\n            self.returncode = process.returncode\n\n            if self.returncode == 0:\n                return stdout_data, stderr_data\n            else:\n                raise ProcessError(process, stderr_data)\n\n        process.kill()\n        process.wait()\n\n        raise ProcessTimeout(self.timeout)\n"
  },
  {
    "path": "src/coverage.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport trace\nimport errno\nimport tempfile\nimport shutil\nimport re\nimport json\n\ndef call(context, fn, *args, **kwargs):\n    import configuration\n\n    if not configuration.debug.COVERAGE_DIR:\n        return fn(*args, **kwargs)\n\n    context_dir = os.path.join(configuration.debug.COVERAGE_DIR, context)\n\n    try:\n        os.makedirs(context_dir)\n    except OSError as error:\n        if error.errno != errno.EEXIST:\n            raise\n\n    output_dir = tempfile.mkdtemp(dir=context_dir)\n    counts = output_dir + \".counts\"\n    tracer = trace.Trace(count=1, trace=0, outfile=counts)\n\n    try:\n        return tracer.runfunc(fn, *args, **kwargs)\n    finally:\n        results = tracer.results()\n        results.write_results(show_missing=False, coverdir=output_dir)\n        shutil.rmtree(output_dir)\n\nif __name__ == \"__main__\":\n    import argparse\n\n    parser = argparse.ArgumentParser(\"Critic Code Coverage Collection\")\n    parser.add_argument(\"--coverage-dir\")\n    parser.add_argument(\"--critic-dir\", action=\"append\")\n\n    arguments = parser.parse_args()\n\n    ignore_dirs = filter(None, sys.path)\n\n    coverage_dir = arguments.coverage_dir\n    sys.path[:0] = arguments.critic_dir\n\n    data = { \"contexts\": [] }\n\n    for context in 
os.listdir(coverage_dir):\n        context_dir = os.path.join(coverage_dir, context)\n\n        if not os.path.isdir(context_dir):\n            continue\n\n        context_index = len(data[\"contexts\"])\n        data[\"contexts\"].append(context)\n\n        tracer = trace.Trace()\n        results = tracer.results()\n\n        for filename in os.listdir(context_dir):\n            if filename.endswith(\".counts\"):\n                counts = os.path.join(context_dir, filename)\n                results.update(trace.Trace(infile=counts).results())\n                os.unlink(counts)\n\n        results.write_results(show_missing=False, coverdir=context_dir)\n\n        for filename in os.listdir(context_dir):\n            if filename.endswith(\".cover\"):\n                module_filename = filename[:-6].replace(\".\", \"/\") + \".py\"\n                if os.path.isfile(module_filename):\n                    counts = {}\n                    with open(os.path.join(context_dir, filename)) as coverage:\n                        lines = coverage.read().splitlines()\n                    executed = []\n                    for index, line in enumerate(lines):\n                        match = re.match(\" *\\d+:\", line)\n                        if match:\n                            executed.append(index)\n                    data.setdefault(module_filename, {})[context] = executed\n\n    json.dump(data, sys.stdout)\n\n    print\n"
  },
  {
    "path": "src/critic.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport gitutils\nimport time\nimport re\nimport itertools\nimport traceback\nimport cStringIO\nimport wsgiref.util\nimport calendar\n\nfrom htmlutils import htmlify, Document\nfrom profiling import formatDBProfiling\nfrom textutils import json_encode, reflow\n\nimport request\nimport dbutils\nimport reviewing.filters as review_filters\nimport log.commitset as log_commitset\nimport diff\nimport mailutils\nimport configuration\nimport auth\nimport htmlutils\nimport api\nimport jsonapi\n\nimport operation.createcomment\nimport operation.createreview\nimport operation.manipulatecomment\nimport operation.manipulatereview\nimport operation.manipulatefilters\nimport operation.manipulateuser\nimport operation.manipulateassignments\nimport operation.fetchlines\nimport operation.markfiles\nimport operation.draftchanges\nimport operation.blame\nimport operation.trackedbranch\nimport operation.rebasereview\nimport operation.recipientfilter\nimport operation.editresource\nimport operation.autocompletedata\nimport operation.servicemanager\nimport operation.addrepository\nimport operation.news\nimport operation.checkrebase\nimport operation.applyfilters\nimport operation.savesettings\nimport operation.searchreview\nimport operation.registeruser\nimport operation.brancharchiving\nimport 
operation.miscellaneous\n\nimport page.utils\nimport page.createreview\nimport page.branches\nimport page.showcomment\nimport page.showcommit\nimport page.showreview\nimport page.showreviewlog\nimport page.showbatch\nimport page.showbranch\nimport page.showtree\nimport page.showfile\nimport page.config\nimport page.dashboard\nimport page.home\nimport page.managereviewers\nimport page.filterchanges\nimport page.tutorial\nimport page.news\nimport page.editresource\nimport page.statistics\nimport page.confirmmerge\nimport page.addrepository\nimport page.checkbranch\nimport page.search\nimport page.repositories\nimport page.services\nimport page.rebasetrackingreview\nimport page.createuser\nimport page.verifyemail\nimport page.manageextensions\nimport page.showfilters\n\ntry:\n    from customization.email import getUserEmailAddress\nexcept ImportError:\n    def getUserEmailAddress(_username):\n        return None\n\nif configuration.extensions.ENABLED:\n    RE_EXTENSION_RESOURCE = re.compile(\"^extension-resource/([a-z0-9][-._a-z0-9]+(?:/[a-z0-9][-._a-z0-9]+)+)$\", re.IGNORECASE)\n\nfrom operation import OperationResult, OperationError, OperationFailureMustLogin\n\ndef setContentTypeFromPath(req):\n    match = re.search(\"\\\\.([a-z]+)$\", req.path)\n    if match:\n        req.setContentType(configuration.mimetypes.MIMETYPES.get(match.group(1), \"text/plain\"))\n    else:\n        req.setContentType(\"text/plain\")\n\ndef handleStaticResource(req):\n    if req.path == \"static-resource/\":\n        raise request.Forbidden(\"Directory listing disabled!\")\n    resources_path = os.path.join(\n        configuration.paths.INSTALL_DIR, \"resources\")\n    resource_path = os.path.abspath(os.path.join(\n        resources_path, req.path.split(\"/\", 1)[1]))\n    if not resource_path.startswith(resources_path + \"/\"):\n        raise request.Forbidden()\n    if not os.path.isfile(resource_path):\n        raise request.NotFound()\n    last_modified = 
htmlutils.mtime(resource_path)\n    HTTP_DATE = \"%a, %d %b %Y %H:%M:%S GMT\"\n    if_modified_since = req.getRequestHeader(\"If-Modified-Since\")\n    if if_modified_since:\n        try:\n            if_modified_since = time.strptime(if_modified_since, HTTP_DATE)\n        except ValueError:\n            pass\n        else:\n            if last_modified <= calendar.timegm(if_modified_since):\n                raise request.NotModified()\n    req.addResponseHeader(\"Last-Modified\", time.strftime(HTTP_DATE, time.gmtime(last_modified)))\n    if req.query and req.query == htmlutils.base36(last_modified):\n        req.addResponseHeader(\"Expires\", time.strftime(HTTP_DATE, time.gmtime(time.time() + 2592000)))\n        req.addResponseHeader(\"Cache-Control\", \"max-age=2592000\")\n    setContentTypeFromPath(req)\n    req.start()\n    with open(resource_path, \"r\") as resource_file:\n        return [resource_file.read()]\n\ndef handleDownload(db, req, user):\n    sha1 = req.getParameter(\"sha1\")\n\n    try:\n        repository_arg = req.getParameter(\"repository\")\n        repository = gitutils.Repository.fromParameter(db, repository_arg)\n    except gitutils.NoSuchRepository as error:\n        raise page.utils.DisplayMessage(\n            title=\"No such repository\",\n            body=error.message,\n            status=404)\n\n    try:\n        git_object = repository.fetch(sha1)\n    except gitutils.GitReferenceError as error:\n        raise page.utils.DisplayMessage(\n            title=\"File not found\",\n            body=error.message,\n            status=404)\n\n    if git_object.type != \"blob\":\n        raise page.utils.DisplayMessage(\n            title=\"File not found\",\n            body=(\"%s is a %s, not a blob\"\n                  % (git_object.sha1[:8], git_object.type)),\n            status=404)\n\n    setContentTypeFromPath(req)\n    req.start()\n    return [git_object.data]\n\ndef findreview(req, db):\n    sha1 = req.getParameter(\"sha1\")\n\n    
try:\n        repository = gitutils.Repository.fromSHA1(db, sha1)\n        commit = gitutils.Commit.fromSHA1(db, repository, repository.revparse(sha1))\n    except gitutils.GitReferenceError as error:\n        raise request.DisplayMessage(error.message)\n\n    cursor = db.readonly_cursor()\n    cursor.execute(\"\"\"SELECT reviews.id\n                        FROM reviews\n                        JOIN branches ON (branches.id=reviews.branch)\n                        JOIN reachable ON (reachable.branch=branches.id)\n                       WHERE reachable.commit=%s\"\"\",\n                   (commit.getId(db),))\n\n    row = cursor.fetchone()\n\n    if row:\n        review_id = row[0]\n    else:\n        cursor.execute(\"\"\"SELECT reviewchangesets.review\n                            FROM reviewchangesets\n                            JOIN changesets ON (changesets.id=reviewchangesets.changeset)\n                           WHERE changesets.child=%s\"\"\",\n                       (commit.getId(db),))\n\n        row = cursor.fetchone()\n\n        if row:\n            review_id = row[0]\n        else:\n            raise request.DisplayMessage(\"No review found!\")\n\n    raise request.MovedTemporarily(\"/r/%d?highlight=%s#%s\" % (review_id, sha1, sha1))\n\nOPERATIONS = { \"fetchlines\": operation.fetchlines.FetchLines(),\n               \"reviewersandwatchers\": operation.createreview.ReviewersAndWatchers(),\n               \"submitreview\": operation.createreview.SubmitReview(),\n               \"fetchremotebranches\": operation.createreview.FetchRemoteBranches(),\n               \"fetchremotebranch\": operation.createreview.FetchRemoteBranch(),\n               \"validatecommentchain\": operation.createcomment.ValidateCommentChain(),\n               \"createcommentchain\": operation.createcomment.CreateCommentChain(),\n               \"createcomment\": operation.createcomment.CreateComment(),\n               \"reopenresolvedcommentchain\": 
operation.manipulatecomment.ReopenResolvedCommentChain(),\n               \"reopenaddressedcommentchain\": operation.manipulatecomment.ReopenAddressedCommentChain(),\n               \"resolvecommentchain\": operation.manipulatecomment.ResolveCommentChain(),\n               \"morphcommentchain\": operation.manipulatecomment.MorphCommentChain(),\n               \"updatecomment\": operation.manipulatecomment.UpdateComment(),\n               \"deletecomment\": operation.manipulatecomment.DeleteComment(),\n               \"markchainsasread\": operation.manipulatecomment.MarkChainsAsRead(),\n               \"closereview\": operation.manipulatereview.CloseReview(),\n               \"dropreview\": operation.manipulatereview.DropReview(),\n               \"reopenreview\": operation.manipulatereview.ReopenReview(),\n               \"pingreview\": operation.manipulatereview.PingReview(),\n               \"updatereview\": operation.manipulatereview.UpdateReview(),\n               \"setfullname\": operation.manipulateuser.SetFullname(),\n               \"setgitemails\": operation.manipulateuser.SetGitEmails(),\n               \"changepassword\": operation.manipulateuser.ChangePassword(),\n               \"requestverificationemail\": operation.manipulateuser.RequestVerificationEmail(),\n               \"deleteemailaddress\": operation.manipulateuser.DeleteEmailAddress(),\n               \"selectemailaddress\": operation.manipulateuser.SelectEmailAddress(),\n               \"addemailaddress\": operation.manipulateuser.AddEmailAddress(),\n               \"getassignedchanges\": operation.manipulateassignments.GetAssignedChanges(),\n               \"setassignedchanges\": operation.manipulateassignments.SetAssignedChanges(),\n               \"watchreview\": operation.manipulatereview.WatchReview(),\n               \"unwatchreview\": operation.manipulatereview.UnwatchReview(),\n               \"addreviewfilters\": operation.manipulatefilters.AddReviewFilters(),\n               
\"removereviewfilter\": operation.manipulatefilters.RemoveReviewFilter(),\n               \"queryglobalfilters\": operation.applyfilters.QueryGlobalFilters(),\n               \"applyglobalfilters\": operation.applyfilters.ApplyGlobalFilters(),\n               \"queryparentfilters\": operation.applyfilters.QueryParentFilters(),\n               \"applyparentfilters\": operation.applyfilters.ApplyParentFilters(),\n               \"suggestupstreams\": operation.rebasereview.SuggestUpstreams(),\n               \"checkrebase\": operation.rebasereview.CheckRebase(),\n               \"preparerebase\": operation.rebasereview.PrepareRebase(),\n               \"cancelrebase\": operation.rebasereview.CancelRebase(),\n               \"rebasereview\": operation.rebasereview.RebaseReview(),\n               \"revertrebase\": operation.rebasereview.RevertRebase(),\n               \"addfilter\": operation.manipulatefilters.AddFilter(),\n               \"deletefilter\": operation.manipulatefilters.DeleteFilter(),\n               \"reapplyfilters\": operation.manipulatefilters.ReapplyFilters(),\n               \"countmatchedpaths\": operation.manipulatefilters.CountMatchedPaths(),\n               \"getmatchedpaths\": operation.manipulatefilters.GetMatchedPaths(),\n               \"markfiles\": operation.markfiles.MarkFiles(),\n               \"submitchanges\": operation.draftchanges.SubmitChanges(),\n               \"abortchanges\": operation.draftchanges.AbortChanges(),\n               \"reviewstatechange\": operation.draftchanges.ReviewStateChange(),\n               \"savesettings\": operation.savesettings.SaveSettings(),\n               \"rebasebranch\": operation.miscellaneous.RebaseBranch(),\n               \"checkserial\": operation.miscellaneous.CheckSerial(),\n               \"suggestreview\": operation.miscellaneous.SuggestReview(),\n               \"blame\": operation.blame.Blame(),\n               \"addcheckbranchnote\": page.checkbranch.addNote,\n               
\"deletecheckbranchnote\": page.checkbranch.deleteNote,\n               \"addrepository\": operation.addrepository.AddRepository(),\n               \"storeresource\": operation.editresource.StoreResource(),\n               \"resetresource\": operation.editresource.ResetResource(),\n               \"restoreresource\": operation.editresource.RestoreResource(),\n               \"addnewsitem\": operation.news.AddNewsItem(),\n               \"editnewsitem\": operation.news.EditNewsItem(),\n               \"getautocompletedata\": operation.autocompletedata.GetAutoCompleteData(),\n               \"getrepositorypaths\": operation.autocompletedata.GetRepositoryPaths(),\n               \"addrecipientfilter\": operation.recipientfilter.AddRecipientFilter(),\n               \"trackedbranchlog\": operation.trackedbranch.TrackedBranchLog(),\n               \"disabletrackedbranch\": operation.trackedbranch.DisableTrackedBranch(),\n               \"triggertrackedbranchupdate\": operation.trackedbranch.TriggerTrackedBranchUpdate(),\n               \"enabletrackedbranch\": operation.trackedbranch.EnableTrackedBranch(),\n               \"deletetrackedbranch\": operation.trackedbranch.DeleteTrackedBranch(),\n               \"addtrackedbranch\": operation.trackedbranch.AddTrackedBranch(),\n               \"restartservice\": operation.servicemanager.RestartService(),\n               \"getservicelog\": operation.servicemanager.GetServiceLog(),\n               \"checkmergestatus\": operation.checkrebase.CheckMergeStatus(),\n               \"checkconflictsstatus\": operation.checkrebase.CheckConflictsStatus(),\n               \"checkhistoryrewritestatus\": operation.checkrebase.CheckHistoryRewriteStatus(),\n               \"searchreview\": operation.searchreview.SearchReview(),\n               \"registeruser\": operation.registeruser.RegisterUser(),\n               \"archivebranch\": operation.brancharchiving.ArchiveBranch(),\n               \"resurrectbranch\": 
operation.brancharchiving.ResurrectBranch(),\n               \"schedulebrancharchival\": operation.brancharchiving.ScheduleBranchArchival() }\n\nPAGES = { \"showreview\": page.showreview.renderShowReview,\n          \"showcommit\": page.showcommit.renderShowCommit,\n          \"dashboard\": page.dashboard.renderDashboard,\n          \"showcomment\": page.showcomment.renderShowComment,\n          \"showcomments\": page.showcomment.renderShowComments,\n          \"showfile\": page.showfile.renderShowFile,\n          \"statistics\": page.statistics.renderStatistics,\n          \"home\": page.home.renderHome,\n          \"config\": page.config.renderConfig,\n          \"branches\": page.branches.renderBranches,\n          \"tutorial\": page.tutorial.renderTutorial,\n          \"news\": page.news.renderNews,\n          \"managereviewers\": page.managereviewers.renderManageReviewers,\n          \"log\": page.showbranch.renderShowBranch,\n          \"checkbranch\": page.checkbranch.renderCheckBranch,\n          \"checkbranchtext\": page.checkbranch.renderCheckBranch,\n          \"filterchanges\": page.filterchanges.renderFilterChanges,\n          \"showtree\": page.showtree.renderShowTree,\n          \"showbatch\": page.showbatch.renderShowBatch,\n          \"showreviewlog\": page.showreviewlog.renderShowReviewLog,\n          \"createreview\": page.createreview.renderCreateReview,\n          \"newrepository\": page.addrepository.renderNewRepository,\n          \"confirmmerge\": page.confirmmerge.renderConfirmMerge,\n          \"editresource\": page.editresource.renderEditResource,\n          \"search\": page.search.renderSearch,\n          \"repositories\": page.repositories.renderRepositories,\n          \"services\": page.services.renderServices,\n          \"rebasetrackingreview\": page.rebasetrackingreview.RebaseTrackingReview(),\n          \"createuser\": page.createuser.CreateUser(),\n          \"verifyemail\": page.verifyemail.renderVerifyEmail,\n          
\"manageextensions\": page.manageextensions.renderManageExtensions,\n          \"showfilters\": page.showfilters.renderShowFilters }\n\nif configuration.extensions.ENABLED:\n    import extensions\n    import extensions.role.page\n    import extensions.role.processcommits\n    import operation.extensioninstallation\n    import page.loadmanifest\n    import page.processcommits\n\n    OPERATIONS[\"installextension\"] = operation.extensioninstallation.InstallExtension()\n    OPERATIONS[\"uninstallextension\"] = operation.extensioninstallation.UninstallExtension()\n    OPERATIONS[\"reinstallextension\"] = operation.extensioninstallation.ReinstallExtension()\n    OPERATIONS[\"clearextensionstorage\"] = operation.extensioninstallation.ClearExtensionStorage()\n    OPERATIONS[\"addextensionhookfilter\"] = operation.extensioninstallation.AddExtensionHookFilter()\n    OPERATIONS[\"deleteextensionhookfilter\"] = operation.extensioninstallation.DeleteExtensionHookFilter()\n    PAGES[\"loadmanifest\"] = page.loadmanifest.renderLoadManifest\n    PAGES[\"processcommits\"] = page.processcommits.renderProcessCommits\n\nif configuration.base.AUTHENTICATION_MODE != \"host\" and configuration.base.SESSION_TYPE == \"cookie\":\n    import operation.usersession\n    import page.login\n\n    if configuration.base.AUTHENTICATION_MODE == \"critic\":\n        OPERATIONS[\"validatelogin\"] = operation.usersession.ValidateLogin()\n\n    OPERATIONS[\"endsession\"] = operation.usersession.EndSession()\n    PAGES[\"login\"] = page.login.Login()\n\ndef handleException(db, req, user, as_html=False):\n    error_message = traceback.format_exc()\n    environ = req.getEnvironment()\n\n    environ[\"wsgi.errors\"].write(error_message)\n\n    if not user or not db or not user.hasRole(db, \"developer\"):\n        url = wsgiref.util.request_uri(environ)\n\n        x_forwarded_host = req.getRequestHeader(\"X-Forwarded-Host\")\n        if x_forwarded_host:\n            original_host = 
x_forwarded_host.split(\",\")[0].strip()\n            def replace_host(match):\n                return match.group(1) + original_host\n            url = re.sub(\"^([a-z]+://)[^/]+\", replace_host, url)\n\n        if user and not user.isAnonymous():\n            user_string = str(user)\n        else:\n            user_string = \"<anonymous>\"\n\n        mailutils.sendExceptionMessage(db,\n            \"wsgi\", \"\\n\".join([\"User:   %s\" % user_string,\n                               \"Method: %s\" % req.method,\n                               \"URL:    %s\" % url,\n                               \"\",\n                               error_message]))\n\n        admin_message_sent = True\n    else:\n        admin_message_sent = False\n\n    if not user or not db or user.hasRole(db, \"developer\") \\\n            or configuration.debug.IS_DEVELOPMENT \\\n            or configuration.debug.IS_TESTING:\n        error_title = \"Unexpected error!\"\n        error_message = error_message.strip()\n        if as_html:\n            error_message = \"<p class='pre inset'>%s</p>\" % htmlify(error_message)\n        error_body = [error_message]\n        if admin_message_sent:\n            admin_message_sent = (\"A message has been sent to the system \"\n                                  \"administrator(s) with details about the \"\n                                  \"problem.\")\n            if as_html:\n                admin_message_sent = \"<p>%s</p>\" % admin_message_sent\n            error_body.append(admin_message_sent)\n    else:\n        error_title = \"Request failed!\"\n        error_message = (\"An unexpected error occurred while handling the \"\n                         \"request.  A message has been sent to the system \"\n                         \"administrator(s) with details about the problem.  
\"\n                         \"Please contact them for further information and/or \"\n                         \"assistance.\")\n        if as_html:\n            error_message = \"<p>%s</p>\" % error_message\n        error_body = [error_message]\n\n    return error_title, error_body\n\nclass WrappedResult(object):\n    def __init__(self, db, req, user, result):\n        self.db = db\n        self.req = req\n        self.user = user\n        self.result = iter(result)\n        # Fetch the first block \"prematurely,\" so that errors from it are raised\n        # early, and handled by the normal error handling code in main().\n        self.first = self.result.next()\n        self.failed = False\n\n    def __iter__(self):\n        return self\n\n    def next(self):\n        if self.failed:\n            raise StopIteration\n\n        try:\n            if self.first:\n                value = self.first\n                self.first = None\n            else:\n                value = self.result.next()\n\n            self.db.rollback()\n            return value\n        except StopIteration:\n            self.db.close()\n            raise\n        except Exception:\n            error_title, error_body = handleException(\n                self.db, self.req, self.user)\n\n            self.db.close()\n\n            if self.req.getContentType().startswith(\"text/html\"):\n                self.failed = True\n\n                error_body = \"\".join(\"<p>%s</p>\" % htmlify(part)\n                                     for part in error_body)\n\n                # Close a bunch of tables, in case we're in any.  
Not\n                # pretty, but probably makes the end result prettier.\n                return (\"</table></table></table></table></div>\"\n                        \"<div class='fatal'><table align=center><tr>\"\n                        \"<td><h1>%s</h1>%s</td></tr></table></div>\"\n                        % (error_title, error_body))\n            else:\n                raise StopIteration\n\ndef handleRepositoryPath(db, req, user, suffix):\n    if \"http\" not in configuration.base.REPOSITORY_URL_TYPES:\n        return False\n\n    components = req.path.split(\"/\")\n\n    for index in range(1, len(components) + 1):\n        repository_path = \"/\".join(components[:index])\n        additional_path = \"/\".join(components[index:])\n\n        if suffix is not None:\n            if not repository_path.endswith(suffix):\n                continue\n\n        repository_path = os.path.join(configuration.paths.GIT_DIR,\n                                       repository_path)\n\n        if not repository_path.endswith(\".git\"):\n            repository_path += \".git\"\n\n        try:\n            repository = gitutils.Repository.fromPath(db, repository_path)\n        except gitutils.NoSuchRepository:\n            continue\n        else:\n            db.close()\n\n            try:\n                repository.invokeGitHttpBackend(req, user, additional_path)\n            except gitutils.GitHttpBackendNeedsUser:\n                req.requestHTTPAuthentication()\n\n            return True\n\n    return False\n\ndef handleDisplayMessage(db, req, message):\n    user = db.user\n\n    if user is None:\n        user = dbutils.User.makeAnonymous()\n\n    document = page.utils.displayMessage(\n        db, req, user, title=message.title, message=message.body,\n        review=message.review, is_html=message.html)\n\n    req.setContentType(\"text/html\")\n    req.setStatus(message.status)\n    req.start()\n\n    return [str(document)]\n\ndef handleDisplayFormattedText(db, req, 
formatted_text):\n    user = db.user\n\n    if user is None:\n        user = dbutils.User.makeAnonymous()\n\n    document = page.utils.displayFormattedText(\n        db, req, user, formatted_text.source)\n\n    req.setContentType(\"text/html\")\n    req.start()\n\n    return [str(document)]\n\ndef handleMissingWSGIRemoteUser(db, req):\n    return handleDisplayMessage(\n        db, req, request.DisplayMessage(\n            title=\"Configuration error\",\n            body=\"\"\"\\\n<p>\nCritic was configured with \"<code>--auth-mode host</code>\", meaning the host web\nserver will authenticate users, but there was no <code>REMOTE_USER</code>\nvariable in the WSGI environment provided by the web server, indicating it is\nnot actually configured to authenticate users.\n</p>\n\n<p>\nTo fix this you can either reinstall Critic using \"<code>--auth-mode\ncritic</code>\" (to let Critic handle user authentication internally using its\nown user database), or you can configure user authentication properly in the web\nserver.  For Apache 2.x, the latter can be done by adding something like the\nfollowing to the Apache site configuration for Critic:\n</p>\n\n<pre>\n  &lt;Location /&gt;\n    AuthType Basic\n    AuthName \"Authentication Required\"\n    AuthUserFile \"/path/to/critic-main.htpasswd.users\"\n    Require valid-user\n  &lt;/Location&gt;\n</pre>\n\n<p>\nIf you need more dynamic HTTP authentication you can instead set up mod_wsgi\nwith a custom <code>WSGIAuthUserScript</code> directive.  This will cause the\nprovided credentials to be passed to a Python function called check_password()\nthat you can implement yourself.  
This way you can validate the username/password\npair against any existing database or, for example, an LDAP server.\n</p>\n\n<p>\nFor more information on setting up such authentication in Apache 2.x, see:\n<a href=\"%(url)s\">Apache Authentication Provider</a>.\n</p>\"\"\"\n            % { \"url\": (\"http://code.google.com/p/modwsgi/wiki/\"\n                        \"AccessControlMechanisms#Apache_Authentication_Provider\") },\n            html=True,\n            status=500))\n\ndef finishOAuth(db, req, provider):\n    try:\n        provider.finish(db, req)\n    except (auth.InvalidRequest, auth.Failure):\n        _, error_body = handleException(\n            db, req, dbutils.User.makeAnonymous(), as_html=True)\n        raise page.utils.DisplayMessage(\n            title=\"Authentication failed\",\n            body=\"\".join(error_body),\n            html=True)\n\ndef process_request(environ, start_response):\n    request_start = time.time()\n\n    critic = api.critic.startSession(for_user=True)\n    db = critic.database\n    user = None\n\n    try:\n        try:\n            req = request.Request(db, environ, start_response)\n\n            # Handle static resources very early.  
We don't bother with checking\n            # for an authenticated user; static resources aren't sensitive, and\n            # are referenced from special-case resources like the login page and\n            # error messages that we want to display even if something\n            # went wrong with the authentication.\n            if req.path.startswith(\"static-resource/\"):\n                return handleStaticResource(req)\n\n            if req.path.startswith(\"externalauth/\"):\n                provider_name = req.path[len(\"externalauth/\"):]\n                if provider_name in auth.PROVIDERS:\n                    provider = auth.PROVIDERS[provider_name]\n                    authorize_url = provider.start(db, req)\n                    if authorize_url:\n                        raise request.Found(authorize_url)\n\n            if req.path.startswith(\"oauth/\"):\n                provider_name = req.path[len(\"oauth/\"):]\n                if provider_name in auth.PROVIDERS:\n                    provider = auth.PROVIDERS[provider_name]\n                    if isinstance(provider, auth.OAuthProvider):\n                        finishOAuth(db, req, provider)\n\n            auth.checkSession(db, req)\n            auth.AccessControl.accessHTTP(db, req)\n\n            user = req.user\n            user.loadPreferences(db)\n\n            if user.status == 'retired':\n                # If a retired user accesses the system, change the status back\n                # to 'current' again.\n                with db.updating_cursor(\"users\") as cursor:\n                    cursor.execute(\"\"\"UPDATE users\n                                         SET status='current'\n                                       WHERE id=%s\"\"\",\n                                   (user.id,))\n                user.status = 'current'\n\n            if not user.getPreference(db, \"debug.profiling.databaseQueries\"):\n                db.disableProfiling()\n\n            original_path = 
req.path\n\n            if not req.path:\n                if user.isAnonymous():\n                    location = \"tutorial\"\n                else:\n                    location = user.getPreference(db, \"defaultPage\")\n\n                if req.query:\n                    location += \"?\" + req.query\n\n                raise request.MovedTemporarily(location)\n\n            if req.path == \"redirect\":\n                target = req.getParameter(\"target\", \"/\")\n                raise request.SeeOther(target)\n\n            if req.path == \"findreview\":\n                # This raises either DisplayMessage or MovedTemporarily.\n                findreview(req, db)\n\n            # Require a .git suffix on HTTP(S) repository URLs unless the user-\n            # agent starts with \"git/\" (as Git's normally does.)\n            #\n            # Our objectives are:\n            #\n            # 1) Not to require Git's user-agent to be its default value, since\n            #    the user might have to override it to get through firewalls.\n            # 2) Never to send regular user requests to 'git http-backend' by\n            #    mistake.\n            #\n            # This is a compromise.\n\n            if req.getRequestHeader(\"User-Agent\", \"\").startswith(\"git/\"):\n                suffix = None\n            else:\n                suffix = \".git\"\n\n            if handleRepositoryPath(db, req, user, suffix):\n                db = None\n                return []\n\n            # Extension \"page\" roles.  Prefixing a path with \"!/\" bypasses all\n            # extensions.\n            #\n            # Also bypass extensions if the user is anonymous unless general\n            # anonymous access is allowed.  
If it's not and the user is still\n            # anonymous, access was allowed because of a path-based exception,\n            # which was not intended to allow access to an extension.\n            if req.path.startswith(\"!/\"):\n                req.path = req.path[2:]\n            elif configuration.extensions.ENABLED:\n                handled = extensions.role.page.execute(db, req, user)\n                if isinstance(handled, basestring):\n                    req.start()\n                    return [handled]\n\n            if req.path.startswith(\"r/\"):\n                req.updateQuery({ \"id\": [req.path[2:]] })\n                req.path = \"showreview\"\n\n            if configuration.extensions.ENABLED:\n                match = RE_EXTENSION_RESOURCE.match(req.path)\n                if match:\n                    content_type, resource = extensions.resource.get(req, db, user, match.group(1))\n                    if resource:\n                        req.setContentType(content_type)\n                        if content_type.startswith(\"image/\"):\n                            req.addResponseHeader(\"Cache-Control\", \"max-age=3600\")\n                        req.start()\n                        return [resource]\n                    else:\n                        req.setStatus(404)\n                        req.start()\n                        return []\n\n            if req.path.startswith(\"download/\"):\n                return handleDownload(db, req, user)\n\n            if req.path == \"api\" or req.path.startswith(\"api/\"):\n                try:\n                    result = jsonapi.handleRequest(critic, req)\n                except jsonapi.Error as error:\n                    req.setStatus(error.http_status)\n                    result = { \"error\": { \"title\": error.title,\n                                          \"message\": error.message }}\n                else:\n                    req.setStatus(200)\n\n                accept_header = 
req.getRequestHeader(\"Accept\")\n                if accept_header == \"application/vnd.api+json\":\n                    default_indent = None\n                else:\n                    default_indent = 2\n                indent = req.getParameter(\"indent\", default_indent, filter=int)\n                if indent == 0:\n                    # json.encode(..., indent=0) still gives line-breaks, just\n                    # no indentation.  This is not so useful, so set indent to\n                    # None instead, which disables formatting entirely.\n                    indent = None\n\n                req.setContentType(\"application/vnd.api+json\")\n                req.start()\n                return [json_encode(result, indent=indent)]\n\n            operationfn = OPERATIONS.get(req.path)\n            if operationfn:\n                result = operationfn(req, db, user)\n\n                if isinstance(result, (OperationResult, OperationError)):\n                    req.setContentType(\"text/json\")\n\n                    if isinstance(result, OperationResult):\n                        if db.profiling:\n                            result.set(\"__profiling__\", formatDBProfiling(db))\n                            result.set(\"__time__\", time.time() - request_start)\n                elif not req.hasContentType():\n                    req.setContentType(\"text/plain\")\n\n                req.start()\n\n                if isinstance(result, unicode):\n                    return [result.encode(\"utf8\")]\n                else:\n                    return [str(result)]\n\n            impersonate_user = user\n\n            if not user.isAnonymous():\n                user_parameter = req.getParameter(\"user\", None)\n                if user_parameter:\n                    impersonate_user = dbutils.User.fromName(db, user_parameter)\n\n            while True:\n                pagefn = PAGES.get(req.path)\n                if pagefn:\n                    try:\n               
         result = pagefn(req, db, impersonate_user)\n\n                        if db.profiling and not (isinstance(result, str) or\n                                                 isinstance(result, Document)):\n                            source = \"\"\n                            for fragment in result:\n                                source += fragment\n                            result = source\n\n                        if isinstance(result, page.utils.ResponseBody):\n                            req.setContentType(result.content_type)\n                            req.start()\n                            return [result.data]\n\n                        if isinstance(result, str) or isinstance(result, Document):\n                            req.setContentType(\"text/html\")\n                            req.start()\n                            result = str(result)\n                            result += (\"<!-- total request time: %.2f ms -->\"\n                                       % ((time.time() - request_start) * 1000))\n                            if db.profiling:\n                                result += (\"<!--\\n\\n%s\\n\\n -->\"\n                                           % formatDBProfiling(db))\n                            return [result]\n\n                        result = WrappedResult(db, req, user, result)\n\n                        req.setContentType(\"text/html\")\n                        req.start()\n\n                        # Prevent the finally clause below from closing the\n                        # connection.  
WrappedResult does it instead.\n                        db = None\n\n                        return result\n                    except gitutils.NoSuchRepository as error:\n                        raise page.utils.DisplayMessage(\n                            title=\"Invalid URI Parameter!\",\n                            body=error.message)\n                    except gitutils.GitReferenceError as error:\n                        if error.ref:\n                            raise page.utils.DisplayMessage(\n                                title=\"Specified ref not found\",\n                                body=(\"There is no ref named \\\"%s\\\" in %s.\"\n                                      % (error.ref, error.repository)))\n                        elif error.sha1:\n                            raise page.utils.DisplayMessage(\n                                title=\"SHA-1 not found\",\n                                body=error.message)\n                        else:\n                            raise\n                    except dbutils.NoSuchUser as error:\n                        raise page.utils.DisplayMessage(\n                            title=\"Invalid URI Parameter!\",\n                            body=error.message)\n                    except dbutils.NoSuchReview as error:\n                        raise page.utils.DisplayMessage(\n                            title=\"Invalid URI Parameter!\",\n                            body=error.message)\n\n                if \"/\" in req.path:\n                    repository_name, _, rest = req.path.partition(\"/\")\n                    repository = gitutils.Repository.fromName(db, repository_name)\n                    if repository:\n                        req.path = rest\n                else:\n                    repository = None\n\n                def revparsePlain(item):\n                    try: return gitutils.getTaggedCommit(repository, repository.revparse(item))\n                    except: raise\n               
 revparse = revparsePlain\n\n                if repository is None:\n                    review_id = req.getParameter(\"review\", None, filter=int)\n\n                    if review_id:\n                        cursor = db.cursor()\n                        cursor.execute(\"\"\"SELECT repository\n                                            FROM branches\n                                            JOIN reviews ON (reviews.branch=branches.id)\n                                           WHERE reviews.id=%s\"\"\",\n                                       (review_id,))\n                        row = cursor.fetchone()\n                        if row:\n                            repository = gitutils.Repository.fromId(db, row[0])\n                            def revparseWithReview(item):\n                                if re.match(\"^[0-9a-f]+$\", item):\n                                    cursor.execute(\"\"\"SELECT sha1\n                                                        FROM commits\n                                                        JOIN changesets ON (changesets.child=commits.id)\n                                                        JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                                                       WHERE reviewchangesets.review=%s\n                                                         AND commits.sha1 LIKE %s\"\"\",\n                                                   (review_id, item + \"%\"))\n                                    row = cursor.fetchone()\n                                    if row: return row[0]\n                                    else: return revparsePlain(item)\n                            revparse = revparseWithReview\n\n                if repository is None:\n                    repository = user.getDefaultRepository(db)\n\n                    if gitutils.re_sha1.match(req.path):\n                        if repository and not repository.iscommit(req.path):\n                 
           repository = None\n\n                        if not repository:\n                            try:\n                                repository = gitutils.Repository.fromSHA1(db, req.path)\n                            except gitutils.GitReferenceError:\n                                repository = None\n\n                if repository:\n                    try:\n                        items = filter(None, map(revparse, req.path.split(\"..\")))\n                        updated_query = {}\n\n                        if len(items) == 1:\n                            updated_query[\"repository\"] = [repository.name]\n                            updated_query[\"sha1\"] = [items[0]]\n                        elif len(items) == 2:\n                            updated_query[\"repository\"] = [repository.name]\n                            updated_query[\"from\"] = [items[0]]\n                            updated_query[\"to\"] = [items[1]]\n\n                        if updated_query:\n                            req.updateQuery(updated_query)\n                            req.path = \"showcommit\"\n                            continue\n                    except gitutils.GitReferenceError:\n                        pass\n\n                break\n\n            raise page.utils.DisplayMessage(\n                title=\"Not found!\",\n                body=\"Page not handled: /%s\" % original_path,\n                status=404)\n        except GeneratorExit:\n            raise\n        except auth.AccessDenied as error:\n            return handleDisplayMessage(\n                db, req, request.DisplayMessage(\n                    title=\"Access denied\",\n                    body=error.message,\n                    status=403))\n        except request.HTTPResponse as response:\n            return response.execute(db, req)\n        except request.MissingWSGIRemoteUser as error:\n            return handleMissingWSGIRemoteUser(db, req)\n        except page.utils.DisplayMessage 
as message:\n            return handleDisplayMessage(db, req, message)\n        except page.utils.DisplayFormattedText as formatted_text:\n            return handleDisplayFormattedText(db, req, formatted_text)\n        except Exception:\n            # crash might be psycopg2.ProgrammingError so rollback to avoid\n            # \"InternalError: current transaction is aborted\" inside handleException()\n            if db and db.closed():\n                db = None\n            elif db:\n                db.rollback()\n\n            error_title, error_body = handleException(db, req, user)\n            error_body = reflow(\"\\n\\n\".join(error_body))\n            error_message = \"\\n\".join([error_title,\n                                       \"=\" * len(error_title),\n                                       \"\",\n                                       error_body])\n\n            assert not req.isStarted()\n\n            req.setStatus(500)\n            req.setContentType(\"text/plain\")\n            req.start()\n\n            return [error_message]\n    finally:\n        if db:\n            db.close()\n\nif configuration.debug.COVERAGE_DIR:\n    def main(environ, start_response):\n        import coverage\n\n        def do_process_request(environ, start_response):\n            # Apply list() to force the request to be fully performed by this\n            # call.  It might return an iterator whose .next() does all the\n            # work, and if we just return that from here, the actual work is not\n            # subject to coverage measurement.\n            return list(process_request(environ, start_response))\n\n        return coverage.call(\"wsgi\", do_process_request, environ, start_response)\nelse:\n    main = process_request\n"
  },
  {
    "path": "src/data/preferences.json",
    "content": "{\n  \"comment.diff.contextLines\": {\n    \"type\": \"integer\",\n    \"default\": 3,\n    \"description\": \"Default number of context lines added to diffs when displaying comment chains.  Can be overridden by 'context' URI parameter.\"\n  },\n  \"commit.diff.collapseSimpleHunks\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"When a hunk in a diff contains only deleted or only inserted lines, collapse the \\\"other\\\" side, so that the hunk is displayed as a single wide column.  NOT FULLY FUNCTIONAL!\"\n  },\n  \"commit.diff.compactMode\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Generate more compact HTML for diffs, and generate some HTML on-demand client-side.  Faster download and initial rendering, and slower interaction.\"\n  },\n  \"commit.diff.contextLines\": {\n    \"type\": \"integer\",\n    \"default\": 5,\n    \"description\": \"Default number of context lines added to diffs when displaying commits.  Can be overridden by 'context' URI parameter.\"\n  },\n  \"commit.diff.highlightIllegalWhitespace\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Use an angry red color scheme for illegal whitespace (trailing whitespace or tabs in files with \\\"intent-tabs-mode: nil\\\".)\"\n  },\n  \"commit.diff.rulerColumn\": {\n    \"type\": \"integer\",\n    \"default\": 0,\n    \"description\": \"The column at which a ruler is shown. 
Can be set to 0 to disable the ruler.\",\n    \"relevance\": { \"repository\": true }\n  },\n  \"commit.diff.visualTabs\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Replace tab characters with U+2192 (RIGHTWARDS ARROW) styled to the correct width (taking the file's Emacs mode-line into account.)\"\n  },\n  \"commit.expandAllFiles\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"On the 'showcommit' page, expand the diffs in all files on page load.\"\n  },\n  \"commit.expandFilesLimit\": {\n    \"type\": \"integer\",\n    \"default\": 0,\n    \"description\": \"If 'commit.expandAllFiles' is enabled, and this limit is non-zero, all files are initially collapsed if the diff contains changes in more files than this limit.\"\n  },\n  \"dashboard.defaultGroups\": {\n    \"type\": \"string\",\n    \"default\": \"owned,draft,active,watched\",\n    \"description\": \"Review groups to show on the dashboard by default.  Available groups are owned, draft, active, watched, open and closed.\"\n  },\n  \"debug.enabled\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable debugging preferences.\"\n  },\n  \"debug.extensions.customProcessCommits\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable button for performing installed ProcessCommits hooks on arbitrary sets of commits for testing.\"\n  },\n  \"debug.profiling.abortChanges\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable profiling of /abortchanges.\"\n  },\n  \"debug.profiling.databaseQueries\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable profiling of database queries.\"\n  },\n  \"debug.profiling.pageGeneration\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable profiling of generation of various pages.  
Results are emitted as a comment at the end of the HTML.\"\n  },\n  \"debug.profiling.submitChanges\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable profiling of /submitchanges.\"\n  },\n  \"defaultPage\": {\n    \"type\": \"string\",\n    \"default\": \"home\",\n    \"description\": \"The default page displayed when accessing the system.\"\n  },\n  \"defaultRepository\": {\n    \"type\": \"string\",\n    \"default\": \"\",\n    \"description\": \"Name of default repository.  In situations where the repository is not implied, this is the one that is used, or preferred.\"\n  },\n  \"email.activated\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Must be enabled before the system sends any emails to the user.\"\n  },\n  \"email.comment.contextLines\": {\n    \"type\": \"integer\",\n    \"default\": 0,\n    \"description\": \"Default number of context lines added to code excerpts when displaying comment threads in emails.\"\n  },\n  \"email.enableAssociationRecipients\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Add phony recipients to review emails representing your associations with the review.\",\n    \"relevance\": { \"repository\": true }\n  },\n  \"email.ignoreOwnChanges\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Don't send emails about own changes (reviewing, commits added to reviews and rebased reviews.)\"\n  },\n  \"email.lineLength\": {\n    \"type\": \"integer\",\n    \"default\": 80,\n    \"description\": \"Maximum line length in emails sent.  Plain text will be reflowed to adhere to this.  Diffs and other non-prose items are never reflowed.\"\n  },\n  \"email.listId\": {\n    \"type\": \"string\",\n    \"default\": \"\",\n    \"description\": \"Identifier used to construct a List-Id header for review emails.  
Used for the $id in \\\"<$id.$hostname>\\\" in the header value.\",\n    \"relevance\": { \"repository\": true }\n  },\n  \"email.newReview.diff.contextLines\": {\n    \"type\": \"integer\",\n    \"default\": 3,\n    \"description\": \"Number of context lines added to diffs in the email sent when a new review is submitted.\"\n  },\n  \"email.newReview.diff.maxLines\": {\n    \"type\": \"integer\",\n    \"default\": 250,\n    \"description\": \"Maximum number of lines of diffs to include in the email sent when a new review is submitted.  If exceeded, no diffs are included at all.\"\n  },\n  \"email.newReview.displayCommits\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Include a list of all commits to be reviewed in the email sent when a new review is submitted.\"\n  },\n  \"email.newReview.displayDiffs\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Include diffs of each commit to be reviewed in the email sent when a new review is submitted.\"\n  },\n  \"email.newReview.displayReviewers\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Include a list of all assigned reviewers in the email sent when a new review is submitted.\"\n  },\n  \"email.newReview.displayStats\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Include --stat output for commits added to the review.\"\n  },\n  \"email.newReview.displayWatchers\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Include a list of all additional watchers in the email sent when a new review is submitted.\"\n  },\n  \"email.newReview.stats.maxLines\": {\n    \"type\": \"integer\",\n    \"default\": 250,\n    \"description\": \"Maximum number of lines of commit stats to include in the email sent when a review is submitted.  
If exceeded, no stats are included at all.\"\n  },\n  \"email.subjectLine.newReview\": {\n    \"type\": \"string\",\n    \"default\": \"New Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent for newly created reviews.\"\n  },\n  \"email.subjectLine.newishReview\": {\n    \"type\": \"string\",\n    \"default\": \"New(ish) Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent for new(ish) reviews.\"\n  },\n  \"email.subjectLine.pingedReview\": {\n    \"type\": \"string\",\n    \"default\": \"Pinged Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent when someone pings a review.\"\n  },\n  \"email.subjectLine.updatedReview.assignmentsChanged\": {\n    \"type\": \"string\",\n    \"default\": \"Updated Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent when someone changes review assignments.\"\n  },\n  \"email.subjectLine.updatedReview.commitsPushed\": {\n    \"type\": \"string\",\n    \"default\": \"Updated Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent when someone pushes additional commits to a review.\"\n  },\n  \"email.subjectLine.updatedReview.parentFiltersApplied\": {\n    \"type\": \"string\",\n    \"default\": \"Updated Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent when someone applies parent repository filters to a review.\"\n  },\n  \"email.subjectLine.updatedReview.reviewRebased\": {\n    \"type\": \"string\",\n    \"default\": \"Updated Review: %(summary)s\",\n    \"description\": \"Python format string for subject line of email sent when someone rebases a review branch.\"\n  },\n  \"email.subjectLine.updatedReview.submittedChanges\": {\n    \"type\": \"string\",\n    \"default\": \"Updated Review: %(summary)s\",\n    \"description\": \"Python format string for subject 
line of email sent when someone submits changes to a review.\"\n  },\n  \"email.updatedReview.commentThreading\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Send emails containing comments so that comment threads form email threads.  This will increase the number of emails sent from Critic to the user.\"\n  },\n  \"email.updatedReview.diff.contextLines\": {\n    \"type\": \"integer\",\n    \"default\": 3,\n    \"description\": \"Number of context lines added to diffs in the email sent when a review is updated.\"\n  },\n  \"email.updatedReview.diff.maxLines\": {\n    \"type\": \"integer\",\n    \"default\": 250,\n    \"description\": \"Maximum number of lines of diffs to include in the email sent when a review is updated.  If exceeded, no diffs are included at all.\"\n  },\n  \"email.updatedReview.displayCommits\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Include a list of all new commits to be reviewed in the email sent when a review is updated.\"\n  },\n  \"email.updatedReview.displayStats\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Include --stat output for commits added to the review.\"\n  },\n  \"email.updatedReview.quotedComments\": {\n    \"type\": \"string\",\n    \"default\": \"all\",\n    \"description\": \"Selection of comments in a comment thread that are quoted when new replies are submitted.\"\n  },\n  \"email.updatedReview.relevantChangesOnly\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Only generate emails about reviewed files and written comments that are relevant.\"\n  },\n  \"email.updatedReview.stats.maxLines\": {\n    \"type\": \"integer\",\n    \"default\": 250,\n    \"description\": \"Maximum number of lines of commit stats to include in the email sent when a review is updated.  
If exceeded, no stats are included at all.\"\n  },\n  \"email.urlType\": {\n    \"type\": \"string\",\n    \"default\": \"main\",\n    \"description\": \"Type of URLs used in emails.\"\n  },\n  \"repository.urlType\": {\n    \"type\": \"string\",\n    \"default\": \"http\",\n    \"description\": \"Type of repository URL to display.\"\n  },\n  \"review.applyUpstreamFilters\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"If enabled, filters from upstream repositories are applied when creating a review via push.\"\n  },\n  \"review.branchArchiveDelay.closed\": {\n    \"type\": \"integer\",\n    \"default\": 7,\n    \"description\": \"If non-zero, archive review branches belonging to closed reviews this number of days after the review was (last) closed.\",\n    \"relevance\": { \"repository\": true }\n  },\n  \"review.branchArchiveDelay.dropped\": {\n    \"type\": \"integer\",\n    \"default\": 1,\n    \"description\": \"If non-zero, archive review branches belonging to dropped reviews this number of days after the review was (last) dropped.\",\n    \"relevance\": { \"repository\": true }\n  },\n  \"review.createViaPush\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"If enabled, reviews can be requested by pushing a new branch whose name starts with \\\"r/\\\" to the repository.\"\n  },\n  \"review.defaultOptOut\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Opt out of receiving review emails (except for \\\"New review\\\" emails) by default.\",\n    \"relevance\": { \"repository\": true,\n                   \"filter\": true }\n  },\n  \"review.dropAnyReview\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Show the \\\"Drop Review\\\" button on the front-page of every review, instead of only those you own.\"\n  },\n  \"review.pingAnyReview\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Show the \\\"Ping 
Review\\\" button on the front-page of every review, instead of only those you own.\"\n  },\n  \"review.updateCheckInterval\": {\n    \"type\": \"integer\",\n    \"default\": 300,\n    \"description\": \"Check for updates of reviews every N seconds while displaying review front pages.  If zero, the check is disabled.\"\n  },\n  \"review.useMustRevalidate\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Deliver review front-pages with \\\"Cache-Control: must-revalidate\\\" to prevent history navigation to stale versions.\"\n  },\n  \"style.defaultFont\": {\n    \"type\": \"string\",\n    \"default\": \"font-size: 10pt; font-family: sans-serif\",\n    \"description\": \"Font setting applied to the BODY element on every page.\"\n  },\n  \"style.sourceFont\": {\n    \"type\": \"string\",\n    \"default\": \"font-size: 10pt; font-family: monospace\",\n    \"description\": \"Font setting applied to source code text in diff display.\"\n  },\n  \"style.tutorialFont\": {\n    \"type\": \"string\",\n    \"default\": \"font-size: 11pt; font-family: serif\",\n    \"description\": \"Font setting applied to tutorial text.\"\n  },\n  \"timezone\": {\n    \"type\": \"string\",\n    \"default\": \"Universal/UTC\",\n    \"description\": \"Timezone to present (most) dates in.\"\n  },\n  \"ui.asynchronousReviewMarking\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Mark changes as reviewed (or not reviewed) using an asynchronous XMLHttpRequest.\"\n  },\n  \"ui.convertIssueToNote\": {\n    \"type\": \"boolean\",\n    \"default\": false,\n    \"description\": \"Enable \\\"Convert To Note\\\" operation for issues.  
This operation is considered an inferior alternative to resolving an issue; use of it is not recommended.\"\n  },\n  \"ui.keyboardShortcuts\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Enable keyboard shortcuts on those pages that define any.\"\n  },\n  \"ui.resolveIssueWarning\": {\n    \"type\": \"boolean\",\n    \"default\": true,\n    \"description\": \"Show a warning when resolving an issue raised by another user.\"\n  },\n  \"ui.title.showReview\": {\n    \"type\": \"string\",\n    \"default\": \"%(id)s (%(progress)s) - %(summary)s - Opera Critic\",\n    \"description\": \"Python format string for title of review front-page documents.\"\n  }\n}\n"
  },
  {
    "path": "src/dbaccess.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ntry:\n    import configuration\nexcept ImportError:\n    IntegrityError = ProgrammingError = OperationalError = Exception\n    TransactionRollbackError = Exception\n\n    def connect():\n        raise Exception(\"not supported\")\nelse:\n    if configuration.database.DRIVER == \"postgresql\":\n        import psycopg2 as driver\n\n        TransactionRollbackError = driver.extensions.TransactionRollbackError\n    else:\n        import sys\n        import os\n\n        sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))\n\n        import installation.qs.sqlite as driver\n\n        # SQLite doesn't appear to be throwing this type of error.\n        class TransactionRollbackError(Exception):\n            pass\n\n    IntegrityError = driver.IntegrityError\n    OperationalError = driver.OperationalError\n    ProgrammingError = driver.ProgrammingError\n\n    def connect():\n        return driver.connect(**configuration.database.PARAMETERS)\n"
  },
  {
    "path": "src/dbutils/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom dbaccess import (IntegrityError, OperationalError, ProgrammingError,\n                      TransactionRollbackError)\n\nfrom dbutils.session import Session\nfrom dbutils.database import (InvalidCursorError, FailedToLock, NOWAIT,\n                              Database, boolean)\nfrom dbutils.user import InvalidUserId, NoSuchUser, User\nfrom dbutils.review import NoSuchReview, ReviewState, Review\nfrom dbutils.branch import Branch\nfrom dbutils.paths import (InvalidFileId, InvalidPath, File, find_file,\n                           find_files, describe_file)\nfrom dbutils.timezones import (loadTimezones, updateTimezones, sortedTimezones,\n                               adjustTimestamp)\nfrom dbutils.system import (getInstalledSHA1, getURLPrefix,\n                           getAdministratorContacts)\n"
  },
  {
    "path": "src/dbutils/branch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass Branch(object):\n    def __init__(self, id, repository, name, head_sha1, base, tail_sha1, branch_type, archived):\n        self.id = id\n        self.repository = repository\n        self.name = name\n        self.head_sha1 = head_sha1\n        self.base = base\n        self.tail_sha1 = tail_sha1\n        self.type = branch_type\n        self.archived = archived\n        self.review = None\n        self.__commits = None\n        self.__head = None\n        self.__tail = None\n\n    def __eq__(self, other):\n        return self.id == other.id\n\n    def __ne__(self, other):\n        return self.id != other.id\n\n    def contains(self, db, commit):\n        import gitutils\n        cursor = db.cursor()\n        if isinstance(commit, gitutils.Commit) and commit.id is not None:\n            cursor.execute(\"SELECT 1 FROM reachable WHERE branch=%s AND commit=%s\", [self.id, commit.id])\n        else:\n            cursor.execute(\"SELECT 1 FROM reachable, commits WHERE branch=%s AND commit=id AND sha1=%s\", [self.id, str(commit)])\n        return cursor.fetchone() is not None\n\n    def getHead(self, db):\n        import gitutils\n        if not self.__head:\n            self.__head = gitutils.Commit.fromSHA1(db, self.repository, self.head_sha1)\n        return self.__head\n\n    def getTail(self, 
db):\n        import gitutils\n        if not self.__tail:\n            self.__tail = gitutils.Commit.fromSHA1(db, self.repository, self.tail_sha1)\n        return self.__tail\n\n    def getJSConstructor(self):\n        from htmlutils import jsify\n        if self.base:\n            return \"new Branch(%d, %s, %s)\" % (self.id, jsify(self.name), self.base.getJSConstructor())\n        else:\n            return \"new Branch(%d, %s, null)\" % (self.id, jsify(self.name))\n\n    def getJS(self):\n        return \"var branch = %s;\" % self.getJSConstructor()\n\n    def getCommits(self, db):\n        import gitutils\n        if self.__commits is None:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT commits.id, commits.sha1\n                                FROM reachable\n                                JOIN commits ON (commits.id=reachable.commit)\n                               WHERE reachable.branch=%s\"\"\",\n                           (self.id,))\n            self.__commits = [gitutils.Commit.fromSHA1(db, self.repository, sha1, commit_id=commit_id)\n                              for commit_id, sha1 in cursor]\n        return self.__commits\n\n    def rebase(self, db, base):\n        import gitutils\n\n        cursor = db.cursor()\n\n        def findReachable(head, base_branch_id, force_include=set()):\n            bases = [base_branch_id]\n\n            while True:\n                cursor.execute(\"SELECT base FROM branches WHERE id=%s\", (bases[-1],))\n                branch_id = cursor.fetchone()[0]\n                if branch_id is None: break\n                bases.append(branch_id)\n\n            expression = \"SELECT 1 FROM reachable, commits WHERE branch IN (%s) AND commit=id AND sha1=%%s\" % \", \".join([\"%s\"] * len(bases))\n\n            def exclude(sha1):\n                cursor.execute(expression, bases + [sha1])\n                return cursor.fetchone() is not None\n\n            stack = [head.sha1]\n            processed = 
set()\n            values = []\n\n            while stack:\n                sha1 = stack.pop(0)\n\n                if sha1 not in processed:\n                    processed.add(sha1)\n\n                    commit = gitutils.Commit.fromSHA1(db, self.repository, sha1)\n\n                    if sha1 in force_include or not exclude(sha1):\n                        values.append(commit.getId(db))\n\n                        for sha1 in commit.parents:\n                            if sha1 not in processed and (sha1 in force_include or not exclude(sha1)):\n                                stack.append(sha1)\n\n            return values\n\n        cursor.execute(\"SELECT COUNT(*) FROM reachable WHERE branch=%s\", (self.id,))\n        old_count = cursor.fetchone()[0]\n\n        if base.base and base.base.id == self.id:\n            cursor.execute(\"SELECT count(*) FROM reachable WHERE branch=%s\", (base.id,))\n            base_old_count = cursor.fetchone()[0]\n\n            base_reachable = findReachable(base.getHead(db), self.base.id, set(commit.sha1 for commit in self.getCommits(db)))\n            base_new_count = len(base_reachable)\n\n            cursor.execute(\"DELETE FROM reachable WHERE branch=%s\", [base.id])\n            cursor.executemany(\"INSERT INTO reachable (branch, commit) VALUES (%s, %s)\", [(base.id, commit) for commit in base_reachable])\n            cursor.execute(\"UPDATE branches SET base=%s WHERE id=%s\", [self.base.id, base.id])\n\n            base.base = self.base\n            base.__commits = None\n        else:\n            base_old_count = None\n            base_new_count = None\n\n        our_reachable = findReachable(self.getHead(db), base.id)\n        new_count = len(our_reachable)\n\n        cursor.execute(\"DELETE FROM reachable WHERE branch=%s\", [self.id])\n        cursor.executemany(\"INSERT INTO reachable (branch, commit) VALUES (%s, %s)\", [(self.id, commit) for commit in our_reachable])\n        cursor.execute(\"UPDATE branches SET 
base=%s WHERE id=%s\", [base.id, self.id])\n\n        self.base = base\n        self.__commits = None\n\n        return old_count, new_count, base_old_count, base_new_count\n\n    def archive(self, db):\n        import gitutils\n\n        try:\n            head = self.getHead(db)\n        except gitutils.GitReferenceError:\n            # The head commit appears to be missing from the repository.\n            head = None\n        else:\n            self.repository.keepalive(head.sha1)\n\n        if head:\n            try:\n                self.repository.deleteref(\"refs/heads/\" + self.name, head)\n            except gitutils.GitError:\n                # Branch either doesn't exist, or points to the wrong commit.\n                try:\n                    sha1 = self.repository.revparse(\"refs/heads/\" + self.name)\n                except gitutils.GitReferenceError:\n                    # Branch doesn't exist.  Pretend it's been archived already.\n                    pass\n                else:\n                    # Branch points to the wrong commit.  
Don't delete the ref.\n                    return\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"UPDATE branches\n                             SET archived=TRUE\n                           WHERE id=%s\"\"\",\n                       (self.id,))\n\n        self.archived = True\n\n    def resurrect(self, db):\n        self.repository.createref(\"refs/heads/\" + self.name, self.getHead(db))\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"UPDATE branches\n                             SET archived=FALSE\n                           WHERE id=%s\"\"\",\n                       (self.id,))\n\n        self.archived = False\n\n    @staticmethod\n    def fromId(db, branch_id, load_review=False, repository=None, for_update=False, profiler=None):\n        import gitutils\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT name, repository, head, base, tail, branches.type, archived\n                            FROM branches\n                           WHERE branches.id=%s\"\"\",\n                       (branch_id,),\n                       for_update=for_update)\n        row = cursor.fetchone()\n\n        if not row: return None\n        else:\n            branch_name, repository_id, head_commit_id, base_branch_id, tail_commit_id, type, archived = row\n\n            def commit_sha1(commit_id):\n                cursor.execute(\"SELECT sha1 FROM commits WHERE id=%s\", (commit_id,))\n                return cursor.fetchone()[0]\n\n            head_commit_sha1 = commit_sha1(head_commit_id)\n            tail_commit_sha1 = (commit_sha1(tail_commit_id)\n                                if tail_commit_id is not None else None)\n\n            if profiler: profiler.check(\"Branch.fromId: basic\")\n\n            if repository is None:\n                repository = gitutils.Repository.fromId(db, repository_id)\n\n            assert repository.id == repository_id\n\n            if profiler: profiler.check(\"Branch.fromId: repository\")\n\n            
base_branch = (Branch.fromId(db, base_branch_id, repository=repository)\n                           if base_branch_id is not None else None)\n\n            if profiler: profiler.check(\"Branch.fromId: base\")\n\n            branch = Branch(branch_id, repository, branch_name, head_commit_sha1, base_branch, tail_commit_sha1, type, archived)\n\n            if load_review:\n                from dbutils import Review\n\n                branch.review = Review.fromBranch(db, branch)\n\n                if profiler: profiler.check(\"Branch.fromId: review\")\n\n            return branch\n\n    @staticmethod\n    def fromName(db, repository, name, **kwargs):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT id\n                            FROM branches\n                           WHERE repository=%s\n                             AND name=%s\"\"\",\n                       (repository.id, name))\n        row = cursor.fetchone()\n        if not row:\n            return None\n        else:\n            return Branch.fromId(db, row[0], **kwargs)\n"
  },
  {
    "path": "src/dbutils/database.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport contextlib\nimport re\nimport time\n\nimport base\nimport dbaccess\n\nfrom dbutils.session import Session\n\nclass InvalidCursorError(base.ImplementationError):\n    pass\n\n# Raised when \"SELECT ... FOR UPDATE NOWAIT\" fails to acquire row locks (without\n# blocking.)\nclass FailedToLock(Exception):\n    pass\n\n# Singleton used as the value to Database.Cursor.execute()'s 'for_update'\n# argument to request NOWAIT behavior (fail instead of blocking if rows are\n# already locked.)\nclass NoWait:\n    pass\nNOWAIT = NoWait()\n\nclass _CursorIterator(object):\n    def __init__(self, base):\n        self.__base = base\n        self.__invalid = False\n\n    def next(self):\n        if self.__invalid:\n            raise InvalidCursorError(\"cursor re-used during iteration\")\n        return next(self.__base)\n\n    def invalidate(self):\n        self.__invalid = True\n\nclass _CursorBase(object):\n    def __init__(self, db, cursor, profiling):\n        self.db = db\n        self.__cursor = cursor\n        self.__profiling = profiling is not None\n        self.__rows = None\n        self.__iterators = []\n\n    def __iter__(self):\n        if self.__rows:\n            base = iter(self.__rows)\n            self.__rows = None\n        else:\n            base = iter(self.__cursor)\n        iterator = 
_CursorIterator(base)\n        self.__iterators.append(iterator)\n        return iterator\n\n    def __getitem__(self, index):\n        if not self.__profiling:\n            return self.__cursor[index]\n        else:\n            return self.__rows[index]\n\n    @property\n    def description(self):\n        return self.__cursor.description\n\n    def fetchone(self):\n        if not self.__profiling:\n            return self.__cursor.fetchone()\n        elif self.__rows:\n            row = self.__rows[0]\n            self.__rows = self.__rows[1:]\n            return row\n        else:\n            return None\n\n    def fetchall(self):\n        if not self.__profiling:\n            return self.__cursor.fetchall()\n        else:\n            return self.__rows\n\n    def execute(self, query, params=(), for_update=False):\n        self.validate(query, for_update)\n        if for_update:\n            assert query.upper().startswith(\"SELECT \")\n            query += \" FOR UPDATE\"\n            if for_update is NOWAIT:\n                query += \" NOWAIT\"\n        try:\n            if not self.__profiling:\n                self.__cursor.execute(query, params)\n            else:\n                map(_CursorIterator.invalidate, self.__iterators)\n                self.__iterators = []\n                before = time.time()\n                self.__cursor.execute(query, params)\n                try:\n                    self.__rows = self.__cursor.fetchall()\n                except dbaccess.ProgrammingError:\n                    self.__rows = None\n                after = time.time()\n                self.db.recordProfiling(query, after - before, rows=len(self.__rows) if self.__rows else 0)\n        except dbaccess.OperationalError:\n            if for_update is NOWAIT:\n                raise FailedToLock()\n            raise\n\n    def executemany(self, query, params=()):\n        self.validate(query, False)\n        if not self.__profiling:\n            
self.__cursor.executemany(query, params)\n        else:\n            before = time.time()\n            params = list(params)\n            self.__cursor.executemany(query, params)\n            after = time.time()\n            self.db.recordProfiling(query, after - before, repetitions=len(params))\n\n    def mogrify(self, *args):\n        return self.__cursor.mogrify(*args)\n\n    def validate(self, query, for_update):\n        raise InvalidCursorError(\"invalid use of _CursorBase\")\n\nclass _UnsafeCursor(_CursorBase):\n    def validate(self, query, for_update):\n        try:\n            command, _ = Database.analyzeQuery(query)\n        except ValueError:\n            command = None\n        if command != \"SELECT\" or for_update:\n            self.db.unsafe_queries = True\n\nclass _ReadOnlyCursor(_CursorBase):\n    def validate(self, query, for_update):\n        try:\n            command, _ = Database.analyzeQuery(query)\n        except ValueError as error:\n            raise InvalidCursorError(error.message)\n        if command != \"SELECT\" or for_update:\n            raise InvalidCursorError(\n                \"invalid SQL query for read-only cursor: \" +\n                query.split(None, 1)[0])\n\nclass _UpdatingCursor(_ReadOnlyCursor):\n    def __init__(self, tables, *args):\n        super(_UpdatingCursor, self).__init__(*args)\n        self.__disabled = False\n        self.__tables = set(tables)\n\n    @property\n    def disabled(self):\n        return self.__disabled\n\n    def validate(self, query, for_update):\n        if self.__disabled:\n            raise InvalidCursorError(\"disabled updating cursor used\")\n        try:\n            command, table = Database.analyzeQuery(query)\n        except ValueError as error:\n            raise InvalidCursorError(error.message)\n        if command == \"SELECT\":\n            return True\n        elif command not in (\"INSERT\", \"UPDATE\", \"DELETE\"):\n            raise InvalidCursorError(\n                
\"invalid query for updating cursor: \" + command)\n        elif table not in self.__tables:\n            raise InvalidCursorError(\n                \"invalid table for updating cursor: \" + table)\n        else:\n            return True\n\n    def disable(self):\n        self.__disabled = True\n\nRE_COMMAND = re.compile(\n    # Optional WITH clause first:\n    r\"(?:WITH\\s+\\w+\\s+\\(\\)\\s+AS\\s+\\(\\)(?:\\s*,\\s*\\w+\\s+\\(\\)\\s+AS\\s+\\(\\))*\\s*)?\"\n    # Then query start.\n    r\"(INSERT(?=\\s+INTO)|UPDATE|DELETE(?=\\s+FROM)|SELECT)\\s+(.*)\",\n    # Let . match line breaks, and ignore case.\n    re.DOTALL | re.IGNORECASE)\n\nclass Database(Session):\n    def __init__(self, critic=None, allow_unsafe_cursors=True):\n        super(Database, self).__init__(critic)\n\n        self.__connection = dbaccess.connect()\n        self.__transaction_callbacks = []\n        self.__allow_unsafe_cursors = allow_unsafe_cursors\n        self.__updating_cursor = None\n        self.unsafe_queries = False\n\n    def __call_transaction_callbacks(self, *args):\n        keep_transaction_callbacks = []\n        for callback in self.__transaction_callbacks:\n            if callback(*args):\n                keep_transaction_callbacks.append(callback)\n        self.__transaction_callbacks = keep_transaction_callbacks\n\n    def cursor(self):\n        if not self.__allow_unsafe_cursors:\n            raise InvalidCursorError(\"unsafe cursors are not allowed\")\n        return _UnsafeCursor(self, self.__connection.cursor(), self.profiling)\n\n    def readonly_cursor(self):\n        return _ReadOnlyCursor(self, self.__connection.cursor(), self.profiling)\n\n    @contextlib.contextmanager\n    def updating_cursor(self, *tables):\n        if self.__updating_cursor:\n            raise InvalidCursorError(\"concurrent updating cursor requested\")\n        if self.unsafe_queries:\n            raise InvalidCursorError(\"mixed unsafe and updating cursors\")\n        # Commit the current 
transaction.  It's guaranteed to have made no\n        # modifications at this point, but might hold locks from earlier\n        # queries that could increase the likelihood of deadlocks.\n        self.commit()\n        self.__updating_cursor = _UpdatingCursor(\n            tables, self, self.__connection.cursor(), self.profiling)\n        rolled_back = False\n        try:\n            yield self.__updating_cursor\n            if self.unsafe_queries:\n                raise InvalidCursorError(\"mixed unsafe and updating cursors\")\n        except: # Yes, we really mean to handle *all* exceptions here.\n            rolled_back = True\n            self.rollback()\n            raise\n        finally:\n            do_commit = not (rolled_back or self.__updating_cursor.disabled)\n            self.__updating_cursor.disable()\n            self.__updating_cursor = None\n            if do_commit:\n                self.commit()\n\n    def commit(self):\n        if self.__updating_cursor:\n            raise InvalidCursorError(\"manual commit when using updating cursor\")\n        before = time.time()\n        self.__connection.commit()\n        after = time.time()\n        self.recordProfiling(\"<commit>\", after - before, 0)\n        self.__call_transaction_callbacks(\"commit\")\n        self.unsafe_queries = False\n\n    def rollback(self):\n        if self.__updating_cursor:\n            self.__updating_cursor.disable()\n        before = time.time()\n        self.__connection.rollback()\n        after = time.time()\n        self.recordProfiling(\"<rollback>\", after - before, 0)\n        self.__call_transaction_callbacks(\"rollback\")\n        self.unsafe_queries = False\n\n    def close(self):\n        super(Database, self).close()\n        if self.__connection:\n            self.__connection.rollback()\n            self.__connection.close()\n            self.__connection = None\n\n    def closed(self):\n        return self.__connection is None\n\n    def 
__enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        self.close()\n        return False\n\n    def registerTransactionCallback(self, callback):\n        self.__transaction_callbacks.append(callback)\n\n    @staticmethod\n    def analyzeQuery(query):\n        \"\"\"Extract the SQL command and affected table (if any) from a query\n\n           Supported commands are SELECT, UPDATE, INSERT and DELETE.  Any other\n           kind of query will raise a ValueError.\"\"\"\n\n        level = 0\n        top_level = \"\"\n\n        for part in re.split(\"([()])\", query):\n            if part == \")\":\n                level -= 1\n            if level == 0:\n                top_level += part\n            if part == \"(\":\n                level += 1\n\n        match = RE_COMMAND.match(top_level)\n\n        if not match:\n            raise ValueError(\"unrecognized query: %s\" % query.split()[0])\n\n        command, rest = match.groups()\n\n        if command in (\"INSERT\", \"UPDATE\", \"DELETE\"):\n            rest = rest.split()\n            if command in (\"INSERT\", \"DELETE\"):\n                table = rest[1]\n            else:\n                table = rest[0]\n        else:\n            table = None\n\n        return command, table\n\n    @staticmethod\n    def forUser(critic=None):\n        return Database(critic)\n\n    @staticmethod\n    def forSystem(critic=None):\n        import dbutils\n\n        db = Database(critic)\n        db.setUser(dbutils.User.makeSystem())\n        return db\n\n    @staticmethod\n    def forTesting(critic):\n        try:\n            import configuration\n        except ImportError:\n            # Not an installed system.\n            pass\n        else:\n            assert configuration.debug.IS_TESTING\n\n        return Database.forSystem(critic)\n\n# This function performs a NULL-safe conversion from a \"truth\" value or\n# arbitrary type to True/False (or None.)  
It's a utility for working around the\n# fact that SQLite stores booleans as integers (zero or one.)\ndef boolean(value):\n    return None if value is None else bool(value)\n"
  },
  {
    "path": "src/dbutils/database_unittest.py",
    "content": "def cursors():\n    import api\n    import dbutils\n\n    class TestException(Exception):\n        pass\n\n    critic = api.critic.startSession(for_testing=True)\n\n    # Create some playground tables.  We'll drop them later if all goes well,\n    # but it doesn't really matter if we don't.\n    with dbutils.Database.forTesting(critic) as db:\n        db.cursor().execute(\n            \"CREATE TABLE playground1 ( x INTEGER PRIMARY KEY, y INTEGER )\")\n        db.cursor().execute(\n            \"CREATE TABLE playground2 ( x INTEGER PRIMARY KEY, y INTEGER )\")\n        db.commit()\n\n    # Basic testing of read-only / updating cursors.\n    with dbutils.Database.forTesting(critic) as db:\n        ro_cursor = db.readonly_cursor()\n\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.executemany(\"INSERT INTO playground1 (x, y) VALUES (%s, %s)\",\n                               [(1, 1),\n                                (2, 2),\n                                (3, 3)])\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 3\n\n        try:\n            with db.updating_cursor(\"playground1\") as cursor:\n                cursor.execute(\"INSERT INTO playground1 (x, y) VALUES (%s, %s)\",\n                               (4, 4))\n                raise TestException\n        except TestException:\n            pass\n\n        db.commit()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 3\n\n        try:\n            with db.updating_cursor(\"playground1\") as cursor:\n                cursor.execute(\"INSERT INTO playground2 (x, y) VALUES (%s, %s)\",\n                               (1, 1))\n        except dbutils.InvalidCursorError as error:\n            assert error.message == \"invalid table for updating cursor: playground2\"\n\n        db.commit()\n        ro_cursor.execute(\"SELECT x, y FROM 
playground2\")\n        assert len(list(ro_cursor)) == 0\n\n        try:\n            with db.updating_cursor(\"playground2\") as cursor:\n                cursor.execute(\"DELETE FROM playground1 WHERE x=1\")\n        except dbutils.InvalidCursorError as error:\n            assert error.message == \"invalid table for updating cursor: playground1\"\n\n        db.commit()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 3\n\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.execute(\"DELETE FROM playground1 WHERE x=1\")\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 2\n\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.execute(\"UPDATE playground1 SET y=-2 WHERE x=2\")\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT y FROM playground1 WHERE x=2\")\n        assert ro_cursor.fetchone()[0] == -2\n\n        with db.updating_cursor(\"playground1\") as cursor:\n            try:\n                with db.updating_cursor(\"playground2\"):\n                    assert False\n            except dbutils.InvalidCursorError as error:\n                assert error.message == \"concurrent updating cursor requested\"\n\n        stored_cursor = None\n        with db.updating_cursor(\"playground1\") as cursor:\n            stored_cursor = cursor\n        try:\n            stored_cursor.execute(\"UPDATE playground1 SET y=-3 WHERE x=3\")\n        except dbutils.InvalidCursorError as error:\n            assert error.message == \"disabled updating cursor used\"\n\n        db.commit()\n        ro_cursor.execute(\"SELECT y FROM playground1 WHERE x=3\")\n        assert ro_cursor.fetchone()[0] == 3\n\n        try:\n            with db.updating_cursor(\"playground1\") as cursor:\n                cursor.execute(\"DROP TABLE playground1\")\n        except dbutils.InvalidCursorError as error:\n    
        assert error.message == \"unrecognized query: DROP\", error.message\n\n        try:\n            with db.updating_cursor(\"playground1\") as cursor:\n                cursor.execute(\"DELETE FROM playground1\")\n                db.commit()\n        except dbutils.InvalidCursorError as error:\n            assert error.message == \"manual commit when using updating cursor\", error.message\n\n        db.commit()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 2\n\n    # Test mixing of unsafe cursor and updating cursor.\n    with dbutils.Database.forTesting(critic) as db:\n        ro_cursor = db.readonly_cursor()\n        unsafe_cursor = db.cursor()\n\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.execute(\"DELETE FROM playground1\")\n            cursor.executemany(\"INSERT INTO playground1 (x, y) VALUES (%s, %s)\",\n                               [(1, 1),\n                                (2, 2),\n                                (3, 3)])\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 3\n\n        # Can't create an updating cursor after executing an updating query\n        # using an unsafe cursor.\n        try:\n            unsafe_cursor.execute(\"DELETE FROM playground1\")\n            with db.updating_cursor(\"playground1\") as cursor:\n                assert False\n        except dbutils.InvalidCursorError as error:\n            assert error.message == \"mixed unsafe and updating cursors\"\n\n        db.rollback()\n\n        # Can't commit an updating cursor after executing an updating query\n        # using an unsafe cursor.\n        try:\n            with db.updating_cursor(\"playground1\") as cursor:\n                cursor.execute(\"INSERT INTO playground1 (x, y) VALUES (%s, %s)\",\n                               (4, 4))\n                unsafe_cursor.execute(\"DELETE FROM playground1\")\n    
    except dbutils.InvalidCursorError as error:\n            assert error.message == \"mixed unsafe and updating cursors\"\n\n        db.commit()\n        ro_cursor.execute(\"SELECT x, y FROM playground1\")\n        assert len(list(ro_cursor)) == 3\n\n        # If the transaction is committed or rolled back after execution of\n        # updating query using unsafe cursor, then use of updating cursor is\n        # fine.\n        unsafe_cursor.execute(\"DELETE FROM playground1\")\n        db.rollback()\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.execute(\"UPDATE playground1 SET y=-2 WHERE x=2\")\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT y FROM playground1\")\n        assert set(y for (y,) in ro_cursor) == set([1, -2, 3])\n\n        # If the transaction is committed or rolled back after execution of\n        # updating query using unsafe cursor, then use of updating cursor is\n        # fine.\n        unsafe_cursor.execute(\"UPDATE playground1 SET y=-1 WHERE x=1\")\n        db.commit()\n        with db.updating_cursor(\"playground1\") as cursor:\n            cursor.execute(\"UPDATE playground1 SET y=-3 WHERE x=3\")\n\n        db.rollback()\n        ro_cursor.execute(\"SELECT y FROM playground1\")\n        assert set(y for (y,) in ro_cursor) == set([-1, -2, -3])\n\n    # Drop the playground table.\n    with dbutils.Database.forTesting(critic) as db:\n        db.cursor().execute(\"DROP TABLE playground1\")\n        db.cursor().execute(\"DROP TABLE playground2\")\n        db.commit()\n\n    print \"cursors: ok\"\n\ndef analyzeQuery():\n    import dbutils\n\n    # Trivial cases.\n    assert dbutils.Database.analyzeQuery(\n        \"SELECT foo FROM bar WHERE fie\") == (\"SELECT\", None)\n    assert dbutils.Database.analyzeQuery(\n        \"UPDATE foo SET bar=10 WHERE fie\") == (\"UPDATE\", \"foo\")\n    assert dbutils.Database.analyzeQuery(\n        \"INSERT INTO foo (bar) VALUES (10)\") == (\"INSERT\", \"foo\")\n    
assert dbutils.Database.analyzeQuery(\n        \"DELETE FROM foo WHERE bar AND fie\") == (\"DELETE\", \"foo\")\n\n    # Something more complex.\n    assert dbutils.Database.analyzeQuery(\n        \"\"\"WITH allpaths (path) AS (VALUES (%s)),\n                missingpaths (path) AS (SELECT allpaths.path\n                                          FROM allpaths\n                               LEFT OUTER JOIN files ON (MD5(files.path)=MD5(allpaths.path))\n                                         WHERE files.path IS NULL)\n           INSERT INTO files (path)\n                SELECT path\n                  FROM missingpaths\"\"\") == (\"INSERT\", \"files\")\n\n    print \"analyzeQuery: ok\"\n"
  },
  {
    "path": "src/dbutils/paths.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass InvalidFileId(Exception):\n    def __init__(self, file_id):\n        super(InvalidFileId, self).__init__(\"Invalid file id: %d\" % file_id)\n\nclass InvalidPath(Exception):\n    pass\n\nclass File(object):\n    def __init__(self, file_id, path):\n        self.id = file_id\n        self.path = path\n\n    def __int__(self):\n        return self.id\n    def __str__(self):\n        return self.path\n\n    @staticmethod\n    def fromId(db, file_id):\n        return File(file_id, describe_file(db, file_id))\n\n    @staticmethod\n    def fromPath(db, path, insert=True):\n        file_id = find_file(db, path, insert)\n        if file_id is None:\n            # Only happens when insert=False.\n            raise InvalidPath(\"Path does not exist: %s\" % path)\n        return File(file_id, path)\n\ndef find_file(db, path, insert=True):\n    path = path.lstrip(\"/\")\n\n    if path.endswith(\"/\"):\n        raise InvalidPath(\"Trailing path separator: %r\" % path)\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT id, path FROM files WHERE MD5(path)=MD5(%s)\", (path,))\n\n    row = cursor.fetchone()\n\n    if row:\n        file_id, found_path = row\n        assert path == found_path, \"MD5 collision in files table: %r != %r\" % (path, found_path)\n        return file_id\n\n    if insert:\n        
cursor.execute(\"INSERT INTO files (path) VALUES (%s) RETURNING id\", (path,))\n        return cursor.fetchone()[0]\n\n    return None\n\ndef find_files(db, files):\n    for file in files:\n        file.id = find_file(db, file.path)\n\ndef describe_file(db, file_id):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT path FROM files WHERE id=%s\", (file_id,))\n    row = cursor.fetchone()\n    if not row:\n        raise InvalidFileId(file_id)\n    return row[0]\n"
  },
  {
    "path": "src/dbutils/review.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\n\nimport base\n\ndef countDraftItems(db, user, review):\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT reviewfilechanges.to_state, SUM(deleted) + SUM(inserted)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE reviewfiles.review=%s\n                         AND reviewfilechanges.uid=%s\n                         AND reviewfilechanges.state='draft'\n                    GROUP BY reviewfilechanges.to_state\"\"\",\n                   (review.id, user.id))\n\n    reviewed = unreviewed = 0\n\n    for to_state, lines in cursor:\n        if to_state == \"reviewed\": reviewed = lines\n        else: unreviewed = lines\n\n    cursor.execute(\"\"\"SELECT reviewfilechanges.to_state, COUNT(*)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE reviewfiles.review=%s\n                         AND reviewfiles.deleted=0\n                         AND reviewfiles.inserted=0\n                         AND reviewfilechanges.uid=%s\n                         AND reviewfilechanges.state='draft'\n                    GROUP BY reviewfilechanges.to_state\"\"\",\n                   
(review.id, user.id))\n\n    reviewedBinary = unreviewedBinary = 0\n\n    for to_state, lines in cursor:\n        if to_state == \"reviewed\": reviewedBinary = lines\n        else: unreviewedBinary = lines\n\n    cursor.execute(\"SELECT count(*) FROM commentchains, comments WHERE commentchains.review=%s AND comments.chain=commentchains.id AND comments.uid=%s AND comments.state='draft'\", [review.id, user.id])\n    comments = cursor.fetchone()[0]\n\n    cursor.execute(\"\"\"SELECT DISTINCT commentchains.id\n                        FROM commentchains\n                        JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                       WHERE commentchains.review=%s\n                         AND commentchainchanges.uid=%s\n                         AND commentchainchanges.state='draft'\n                         AND ((commentchains.state=commentchainchanges.from_state\n                           AND commentchainchanges.from_state IN ('addressed', 'closed')\n                           AND commentchainchanges.to_state='open')\n                          OR (commentchainchanges.from_addressed_by IS NOT NULL\n                           AND commentchainchanges.to_addressed_by IS NOT NULL))\"\"\",\n                   [review.id, user.id])\n    reopened = len(cursor.fetchall())\n\n    cursor.execute(\"\"\"SELECT count(*) FROM commentchains, commentchainchanges\n                       WHERE commentchains.review=%s\n                         AND commentchains.state='open'\n                         AND commentchainchanges.chain=commentchains.id\n                         AND commentchainchanges.uid=%s\n                         AND commentchainchanges.state='draft'\n                         AND commentchainchanges.from_state='open'\n                         AND commentchainchanges.to_state='closed'\"\"\",\n                   [review.id, user.id])\n    closed = cursor.fetchone()[0]\n\n    cursor.execute(\"\"\"SELECT count(*) FROM commentchains, 
commentchainchanges\n                       WHERE commentchains.review=%s\n                         AND commentchainchanges.chain=commentchains.id\n                         AND commentchainchanges.uid=%s\n                         AND commentchainchanges.state='draft'\n                         AND commentchainchanges.from_type=commentchains.type\n                         AND commentchainchanges.to_type!=commentchains.type\"\"\",\n                   [review.id, user.id])\n    morphed = cursor.fetchone()[0]\n\n    return { \"reviewedNormal\": reviewed,\n             \"unreviewedNormal\": unreviewed,\n             \"reviewedBinary\": reviewedBinary,\n             \"unreviewedBinary\": unreviewedBinary,\n             \"writtenComments\": comments,\n             \"reopenedIssues\": reopened,\n             \"resolvedIssues\": closed,\n             \"morphedChains\": morphed }\n\nclass NoSuchReview(base.Error):\n    def __init__(self, review_id):\n        super(NoSuchReview, self).__init__(\"No such review: r/%d\" % review_id)\n        self.id = review_id\n\nclass ReviewState(object):\n    def __init__(self, review, accepted, pending, reviewed, issues):\n        self.review = review\n        self.accepted = accepted\n        self.pending = pending\n        self.reviewed = reviewed\n        self.issues = issues\n\n    def getPercentReviewed(self):\n        if self.pending + self.reviewed:\n            return 100.0 * self.reviewed / (self.pending + self.reviewed)\n        else:\n            return 50.0\n\n    def getProgress(self):\n        if self.pending + self.reviewed == 0:\n            return \"?? 
%\"\n        percent = self.getPercentReviewed()\n        if int(percent) > 0 and (percent < 99.0 or percent == 100.0):\n            return \"%d %%\" % int(percent)\n        elif percent > 0:\n            precision = 1\n            while precision < 10:\n                progress = (\"%%.%df\" % precision) % percent\n                if progress[-1] != '0': break\n                precision += 1\n            return progress + \" %\"\n        else:\n            return \"No progress\"\n\n    def getIssues(self):\n        if self.issues: return \"%d issue%s\" % (self.issues, \"s\" if self.issues > 1 else \"\")\n        else: return \"\"\n\n    def __str__(self):\n        if self.review.state == 'dropped': return \"Dropped...\"\n        elif self.review.state == 'closed': return \"Finished!\"\n        elif self.accepted: return \"Accepted!\"\n        else:\n            progress = self.getProgress()\n            issues = self.getIssues()\n            if issues: return \"%s and %s\" % (progress, issues)\n            else: return progress\n\nclass ReviewRebase(object):\n    def __init__(self, review, old_head, new_head, old_upstream, new_upstream, user,\n                 equivalent_merge, replayed_rebase):\n        self.review = review\n        self.old_head = old_head\n        self.new_head = new_head\n        self.old_upstream = old_upstream\n        self.new_upstream = new_upstream\n        self.user = user\n        self.equivalent_merge = equivalent_merge\n        self.replayed_rebase = replayed_rebase\n\nclass ReviewRebases(list):\n    def __init__(self, db, review):\n        import gitutils\n        from dbutils import User\n\n        self.__old_head_map = {}\n        self.__new_head_map = {}\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT old_head, new_head, old_upstream, new_upstream, uid,\n                                 equivalent_merge, replayed_rebase\n                            FROM reviewrebases\n                           WHERE 
review=%s\n                             AND new_head IS NOT NULL\"\"\",\n                       (review.id,))\n\n        for (old_head_id, new_head_id, old_upstream_id, new_upstream_id, user_id,\n             equivalent_merge_id, replayed_rebase_id) in cursor:\n            old_head = gitutils.Commit.fromId(db, review.repository, old_head_id)\n            new_head = gitutils.Commit.fromId(db, review.repository, new_head_id)\n\n            if old_upstream_id is not None and new_upstream_id is not None:\n                old_upstream = gitutils.Commit.fromId(db, review.repository, old_upstream_id)\n                new_upstream = gitutils.Commit.fromId(db, review.repository, new_upstream_id)\n            else:\n                old_upstream = new_upstream = None\n\n            if equivalent_merge_id:\n                equivalent_merge = gitutils.Commit.fromId(db, review.repository, equivalent_merge_id)\n            else:\n                equivalent_merge = None\n\n            if replayed_rebase_id:\n                replayed_rebase = gitutils.Commit.fromId(db, review.repository, replayed_rebase_id)\n            else:\n                replayed_rebase = None\n\n            user = User.fromId(db, user_id)\n            rebase = ReviewRebase(review, old_head, new_head, old_upstream, new_upstream, user,\n                                  equivalent_merge, replayed_rebase)\n\n            self.append(rebase)\n            self.__old_head_map[old_head] = rebase\n            self.__new_head_map[new_head] = rebase\n\n            if equivalent_merge:\n                self.__old_head_map[equivalent_merge] = rebase\n\n        if review.performed_rebase:\n            self.__old_head_map[review.performed_rebase.old_head] = review.performed_rebase\n            self.__new_head_map[review.performed_rebase.new_head] = review.performed_rebase\n\n    def fromOldHead(self, commit):\n        return self.__old_head_map.get(commit)\n\n    def fromNewHead(self, commit):\n        return 
self.__new_head_map.get(commit)\n\nclass ReviewTrackedBranch(object):\n    def __init__(self, review, trackedbranch_id, remote, name, disabled):\n        self.id = trackedbranch_id\n        self.review = review\n        self.remote = remote\n        self.name = name\n        self.disabled = disabled\n\nclass Review(object):\n    def __init__(self, review_id, owners, review_type, branch, state, serial, summary, description, applyfilters, applyparentfilters):\n        self.id = review_id\n        self.owners = owners\n        self.type = review_type\n        self.repository = branch.repository\n        self.branch = branch\n        self.state = state\n        self.serial = serial\n        self.summary = summary\n        self.description = description\n        self.reviewers = []\n        self.watchers = {}\n        self.commentchains = None\n        self.applyfilters = applyfilters\n        self.applyparentfilters = applyparentfilters\n        self.filters = None\n        self.relevant_files = None\n        self.draft_status = None\n        self.performed_rebase = None\n\n    @staticmethod\n    def isAccepted(db, review_id):\n        cursor = db.cursor()\n\n        cursor.execute(\"SELECT 1 FROM reviewfiles WHERE review=%s AND state='pending' LIMIT 1\", (review_id,))\n        if cursor.fetchone(): return False\n\n        cursor.execute(\"SELECT 1 FROM commentchains WHERE review=%s AND type='issue' AND state='open' LIMIT 1\", (review_id,))\n        if cursor.fetchone(): return False\n\n        return True\n\n    def accepted(self, db):\n        if self.state != 'open': return False\n        else: return Review.isAccepted(db, self.id)\n\n    def getReviewState(self, db):\n        cursor = db.cursor()\n\n        cursor.execute(\"\"\"SELECT state, SUM(deleted) + SUM(inserted)\n                            FROM reviewfiles\n                           WHERE reviewfiles.review=%s\n                        GROUP BY state\"\"\",\n                       (self.id,))\n\n        
pending = 0\n        reviewed = 0\n\n        for state, count in cursor.fetchall():\n            if state == \"pending\": pending = count\n            else: reviewed = count\n\n        cursor.execute(\"\"\"SELECT count(id)\n                            FROM commentchains\n                           WHERE review=%s\n                             AND type='issue'\n                             AND state='open'\"\"\",\n                       (self.id,))\n\n        issues = cursor.fetchone()[0]\n\n        return ReviewState(self, self.accepted(db), pending, reviewed, issues)\n\n    def setPerformedRebase(self, old_head, new_head, old_upstream, new_upstream, user,\n                           equivalent_merge, replayed_rebase):\n        self.performed_rebase = ReviewRebase(self, old_head, new_head, old_upstream, new_upstream, user,\n                                             equivalent_merge, replayed_rebase)\n\n    def getReviewRebases(self, db):\n        return ReviewRebases(db, self)\n\n    def getTrackedBranch(self, db):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT trackedbranches.id, remote, remote_name, disabled\n                            FROM trackedbranches\n                            JOIN branches ON (trackedbranches.repository=branches.repository\n                                          AND trackedbranches.local_name=branches.name)\n                            JOIN reviews ON (branches.id=reviews.branch)\n                           WHERE reviews.id=%s\"\"\",\n                       (self.id,))\n\n        for trackedbranch_id, remote, name, disabled in cursor:\n            return ReviewTrackedBranch(self, trackedbranch_id, remote, name, disabled)\n\n    def getCommitSet(self, db):\n        import gitutils\n        import log.commitset\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT DISTINCT commits.id, commits.sha1\n                            FROM commits\n                            JOIN changesets ON 
(changesets.child=commits.id)\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                           WHERE reviewchangesets.review=%s\"\"\",\n                       (self.id,))\n\n        commits = []\n\n        for commit_id, commit_sha1 in cursor:\n            commits.append(gitutils.Commit.fromSHA1(db, self.repository, commit_sha1, commit_id))\n\n        return log.commitset.CommitSet(commits)\n\n    def containsCommit(self, db, commit, include_head_and_tails=False, include_actual_log=False):\n        import gitutils\n\n        commit_id = None\n        commit_sha1 = None\n\n        if isinstance(commit, gitutils.Commit):\n            commit_id = commit.id\n            commit_sha1 = commit.sha1\n        elif isinstance(commit, str):\n            commit_sha1 = self.repository.revparse(commit)\n            commit = None\n        elif isinstance(commit, int):\n            commit_id = commit\n            commit = None\n        else:\n            raise TypeError\n\n        cursor = db.cursor()\n\n        if commit_id is not None:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviewchangesets\n                                JOIN changesets ON (id=changeset)\n                               WHERE reviewchangesets.review=%s\n                                 AND changesets.child=%s\n                                 AND changesets.type!='conflicts'\"\"\",\n                           (self.id, commit_id))\n        else:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviewchangesets\n                                JOIN changesets ON (changesets.id=reviewchangesets.changeset)\n                                JOIN commits ON (commits.id=changesets.child)\n                               WHERE reviewchangesets.review=%s\n                                 AND changesets.type!='conflicts'\n                                 AND commits.sha1=%s\"\"\",\n         
                  (self.id, commit_sha1))\n\n        if cursor.fetchone() is not None:\n            return True\n\n        if include_head_and_tails:\n            head_and_tails = set([self.branch.getHead(db)])\n\n            commitset = self.getCommitSet(db)\n\n            if commitset:\n                head_and_tails |= commitset.getTails()\n\n            if commit_sha1 is None:\n                if commit is None:\n                    commit = gitutils.Commit.fromId(db, self.repository, commit_id)\n                commit_sha1 = commit.sha1\n\n            if commit_sha1 in head_and_tails:\n                return True\n\n        if include_actual_log:\n            if commit_id is not None:\n                cursor.execute(\"\"\"SELECT 1\n                                    FROM reachable\n                                    JOIN branches ON (branches.id=reachable.branch)\n                                    JOIN reviews ON (reviews.branch=branches.id)\n                                   WHERE reachable.commit=%s\n                                     AND reviews.id=%s\"\"\",\n                               (commit_id, self.id))\n            else:\n                cursor.execute(\"\"\"SELECT 1\n                                    FROM commits\n                                    JOIN reachable ON (reachable.commit=commits.id)\n                                    JOIN branches ON (branches.id=reachable.branch)\n                                    JOIN reviews ON (reviews.branch=branches.id)\n                                   WHERE commits.sha1=%s\n                                     AND reviews.id=%s\"\"\",\n                               (commit_sha1, self.id))\n\n            if cursor.fetchone() is not None:\n                return True\n\n        return False\n\n    def getJS(self):\n        return \"var review = critic.review = { id: %d, branch: { id: %d, name: %r }, owners: [ %s ], serial: %d };\" % (self.id, self.branch.id, self.branch.name, \", 
\".join(owner.getJSConstructor() for owner in self.owners), self.serial)\n\n    def getETag(self, db, user=None):\n        import configuration\n\n        cursor = db.cursor()\n        etag = \"\"\n\n        if configuration.debug.IS_DEVELOPMENT:\n            cursor.execute(\"SELECT installed_at FROM systemidentities WHERE name=%s\", (configuration.base.SYSTEM_IDENTITY,))\n            installed_at = cursor.fetchone()[0]\n            etag += \"install%s.\" % time.mktime(installed_at.timetuple())\n\n        if user and not user.isAnonymous():\n            etag += \"user%d.\" % user.id\n\n        etag += \"review%d.serial%d\" % (self.id, self.serial)\n\n        if user:\n            items = self.getDraftStatus(db, user)\n            if any(items.values()):\n                etag += \".draft%d\" % hash(tuple(sorted(items.items())))\n\n            cursor.execute(\"SELECT id FROM reviewrebases WHERE review=%s AND uid=%s AND new_head IS NULL\", (self.id, user.id))\n            row = cursor.fetchone()\n            if row:\n                etag += \".rebase%d\" % row[0]\n\n        return '\"%s\"' % etag\n\n    def getURL(self, db, user=None, indent=0, separator=\"\\n\"):\n        import dbutils\n\n        indent = \" \" * indent\n\n        if user:\n            url_prefixes = user.getCriticURLs(db)\n        else:\n            url_prefixes = [dbutils.getURLPrefix(db)]\n\n        return separator.join([\"%s%s/r/%d\" % (indent, url_prefix, self.id) for url_prefix in url_prefixes])\n\n    def getRecipients(self, db):\n        from dbutils import User\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT uid, include FROM reviewrecipientfilters WHERE review=%s\", (self.id,))\n\n        default_include = True\n        included = set(owner.id for owner in self.owners)\n        excluded = set()\n\n        for uid, include in cursor:\n            if uid is None:\n                default_include = include\n            elif include:\n                included.add(uid)\n       
     elif uid not in self.owners:\n                excluded.add(uid)\n\n        cursor.execute(\"SELECT uid FROM reviewusers WHERE review=%s\", (self.id,))\n\n        recipients = []\n        for (user_id,) in cursor:\n            if user_id in excluded:\n                continue\n            elif user_id not in included and not default_include:\n                continue\n\n            user = User.fromId(db, user_id)\n            if user.status != \"retired\":\n                recipients.append(user)\n\n        return recipients\n\n    def getDraftStatus(self, db, user):\n        if self.draft_status is None:\n            self.draft_status = countDraftItems(db, user, self)\n        return self.draft_status\n\n    def incrementSerial(self, db):\n        self.serial += 1\n        db.cursor().execute(\"UPDATE reviews SET serial=%s WHERE id=%s\", [self.serial, self.id])\n\n    def scheduleBranchArchival(self, db, delay=None):\n        import dbutils\n\n        # First, cancel current scheduled archival, if there is one.\n        self.cancelScheduledBranchArchival(db)\n\n        # If review is not closed or dropped, don't schedule a branch archival.\n        # Also don't schedule one if the branch has already been archived.\n        if self.state not in (\"closed\", \"dropped\") or self.branch.archived:\n            return\n\n        if delay is None:\n            # Configuration policy:\n            #\n            # Any owner of a review can, by having changed the relevant\n            # preference setting, increase the time before a review branch is\n            # archived, or disable archival entirely, but they can't make it\n            # happen sooner than the system or repository default, or what any\n            # other owner has requested.\n\n            # Find configured value for each owner, and also the per-repository\n            # (or per-system) default, in case each owner has changed the\n            # setting.\n            preference_item = 
\"review.branchArchiveDelay.\" + self.state\n            repository_default = dbutils.User.fetchPreference(\n                db, preference_item, repository=self.repository)\n            delays = set([repository_default])\n            for owner in self.owners:\n                delays.add(owner.getPreference(db, preference_item,\n                                               repository=self.repository))\n\n            # If configured to zero (by any owner,) don't schedule a branch\n            # archival.\n            if min(delays) <= 0:\n                return\n\n            # Otherwise, use maximum configured value for any owner.\n            delay = max(delays)\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO scheduledreviewbrancharchivals (review, deadline)\n                               VALUES (%s, NOW() + INTERVAL %s)\"\"\",\n                       (self.id, \"%d DAYS\" % delay))\n\n        return delay\n\n    def cancelScheduledBranchArchival(self, db):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"DELETE FROM scheduledreviewbrancharchivals\n                                WHERE review=%s\"\"\",\n                       (self.id,))\n\n    def close(self, db, user):\n        self.serial += 1\n        self.state = \"closed\"\n        db.cursor().execute(\"UPDATE reviews SET state='closed', serial=%s, closed_by=%s WHERE id=%s\", (self.serial, user.id, self.id))\n        self.scheduleBranchArchival(db)\n\n    def drop(self, db, user):\n        self.serial += 1\n        self.state = \"dropped\"\n        db.cursor().execute(\"UPDATE reviews SET state='dropped', serial=%s, closed_by=%s WHERE id=%s\", (self.serial, user.id, self.id))\n        self.scheduleBranchArchival(db)\n\n    def reopen(self, db, user):\n        self.serial += 1\n        if self.branch.archived:\n            self.branch.resurrect(db)\n        db.cursor().execute(\"UPDATE reviews SET state='open', serial=%s, closed_by=NULL WHERE id=%s\", (self.serial, 
self.id))\n        self.cancelScheduledBranchArchival(db)\n\n    def disableTracking(self, db):\n        db.cursor().execute(\"UPDATE trackedbranches SET disabled=TRUE WHERE repository=%s AND local_name=%s\", (self.repository.id, self.branch.name))\n\n    def setSummary(self, db, summary):\n        self.serial += 1\n        self.summary = summary\n        db.cursor().execute(\"UPDATE reviews SET summary=%s, serial=%s WHERE id=%s\", [self.summary, self.serial, self.id])\n\n    def setDescription(self, db, description):\n        self.serial += 1\n        self.description = description\n        db.cursor().execute(\"UPDATE reviews SET description=%s, serial=%s WHERE id=%s\", [self.description, self.serial, self.id])\n\n    def addOwner(self, db, owner):\n        if owner not in self.owners:\n            self.serial += 1\n            self.owners.append(owner)\n\n            cursor = db.cursor()\n            cursor.execute(\"SELECT 1 FROM reviewusers WHERE review=%s AND uid=%s\", (self.id, owner.id))\n\n            if cursor.fetchone():\n                cursor.execute(\"UPDATE reviewusers SET owner=TRUE WHERE review=%s AND uid=%s\", (self.id, owner.id))\n            else:\n                cursor.execute(\"INSERT INTO reviewusers (review, uid, owner) VALUES (%s, %s, TRUE)\", (self.id, owner.id))\n\n            cursor.execute(\"SELECT id FROM trackedbranches WHERE repository=%s AND local_name=%s\", (self.repository.id, self.branch.name))\n\n            row = cursor.fetchone()\n            if row:\n                trackedbranch_id = row[0]\n                cursor.execute(\"INSERT INTO trackedbranchusers (branch, uid) VALUES (%s, %s)\", (trackedbranch_id, owner.id))\n\n    def removeOwner(self, db, owner):\n        if owner in self.owners:\n            self.serial += 1\n            self.owners.remove(owner)\n\n            cursor = db.cursor()\n            cursor.execute(\"UPDATE reviewusers SET owner=FALSE WHERE review=%s AND uid=%s\", (self.id, owner.id))\n            
cursor.execute(\"SELECT id FROM trackedbranches WHERE repository=%s AND local_name=%s\", (self.repository.id, self.branch.name))\n\n            row = cursor.fetchone()\n            if row:\n                trackedbranch_id = row[0]\n                cursor.execute(\"DELETE FROM trackedbranchusers WHERE branch=%s AND uid=%s\", (trackedbranch_id, owner.id))\n\n    def getReviewFilters(self, db):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT uid, path, type, NULL FROM reviewfilters WHERE review=%s\", (self.id,))\n        return cursor.fetchall() or None\n\n    def getFilteredTails(self, db):\n        import log.commitset\n        commitset = log.commitset.CommitSet(self.branch.getCommits(db))\n        return commitset.getFilteredTails(self.branch.repository)\n\n    def getRelevantFiles(self, db, user):\n        if not self.filters:\n            from reviewing.filters import Filters\n\n            self.filters = Filters()\n            self.filters.setFiles(db, review=self)\n            self.filters.load(db, review=self)\n            self.relevant_files = self.filters.getRelevantFiles()\n\n            cursor = db.cursor()\n            cursor.execute(\"SELECT assignee, file FROM fullreviewuserfiles WHERE review=%s\", (self.id,))\n            for user_id, file_id in cursor:\n                self.relevant_files.setdefault(user_id, set()).add(file_id)\n\n        return self.relevant_files.get(user.id, set())\n\n    def getUserAssociation(self, db, user):\n        cursor = db.cursor()\n\n        association = []\n\n        if user in self.owners:\n            association.append(\"owner\")\n\n        cursor.execute(\"\"\"SELECT 1\n                            FROM reviewchangesets\n                            JOIN changesets ON (changesets.id=reviewchangesets.changeset)\n                            JOIN commits ON (commits.id=changesets.child)\n                            JOIN gitusers ON (gitusers.id=commits.author_gituser)\n                            JOIN 
usergitemails USING (email)\n                           WHERE reviewchangesets.review=%s\n                             AND usergitemails.uid=%s\"\"\",\n                       (self.id, user.id))\n        if cursor.fetchone():\n            association.append(\"author\")\n\n        cursor.execute(\"SELECT COUNT(*) FROM fullreviewuserfiles WHERE review=%s AND assignee=%s\", (self.id, user.id))\n        if cursor.fetchone()[0] != 0:\n            association.append(\"reviewer\")\n        elif user not in self.owners:\n            cursor.execute(\"SELECT 1 FROM reviewusers WHERE review=%s AND uid=%s\", (self.id, user.id))\n            if cursor.fetchone():\n                association.append(\"watcher\")\n\n        if not association:\n            association.append(\"none\")\n\n        return \", \".join(association)\n\n    @staticmethod\n    def fromId(db, review_id, branch=None, profiler=None):\n        from dbutils import User\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT type, branch, state, serial, summary, description, applyfilters, applyparentfilters FROM reviews WHERE id=%s\", [review_id])\n        row = cursor.fetchone()\n        if not row: raise NoSuchReview(review_id)\n\n        type, branch_id, state, serial, summary, description, applyfilters, applyparentfilters = row\n\n        if profiler: profiler.check(\"Review.fromId: basic\")\n\n        if branch is None:\n            from dbutils import Branch\n            branch = Branch.fromId(db, branch_id, load_review=False, profiler=profiler)\n\n        cursor.execute(\"SELECT uid FROM reviewusers WHERE review=%s AND owner\", (review_id,))\n\n        owners = User.fromIds(db, [user_id for (user_id,) in cursor])\n\n        if profiler: profiler.check(\"Review.fromId: owners\")\n\n        review = Review(review_id, owners, type, branch, state, serial, summary, description, applyfilters, applyparentfilters)\n        branch.review = review\n\n        # Reviewers: all users that have at least one 
review file assigned to them.\n        cursor.execute(\"\"\"SELECT DISTINCT uid, assignee IS NOT NULL, type\n                            FROM reviewusers\n                 LEFT OUTER JOIN fullreviewuserfiles ON (fullreviewuserfiles.review=reviewusers.review AND assignee=uid)\n                           WHERE reviewusers.review=%s\"\"\",\n                       (review_id,))\n\n        reviewers = []\n        watchers = []\n        watcher_types = {}\n\n        for user_id, is_reviewer, user_type in cursor.fetchall():\n            if is_reviewer:\n                reviewers.append(user_id)\n            elif user_id not in review.owners:\n                watchers.append(user_id)\n                watcher_types[user_id] = user_type\n\n        review.reviewers = User.fromIds(db, reviewers)\n\n        for watcher in User.fromIds(db, watchers):\n            review.watchers[watcher] = watcher_types[watcher]\n\n        if profiler: profiler.check(\"Review.fromId: users\")\n\n        return review\n\n    @staticmethod\n    def fromBranch(db, branch):\n        if branch:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT id FROM reviews WHERE branch=%s\", [branch.id])\n            row = cursor.fetchone()\n            if not row: return None\n            else: return Review.fromId(db, row[0], branch)\n        else:\n            return None\n\n    @staticmethod\n    def fromName(db, repository, name):\n        from dbutils import Branch\n        return Review.fromBranch(db, Branch.fromName(db, repository, name))\n\n    @staticmethod\n    def fromArgument(db, argument):\n        try:\n            return Review.fromId(db, int(argument))\n        except:\n            from dbutils import Branch\n            branch = Branch.fromName(db, str(argument))\n            if not branch: return None\n            return Review.fromBranch(db, branch)\n\n    @staticmethod\n    def fromAPI(api_review):\n        return Review.fromId(api_review.critic.database, api_review.id)\n"
  },
  {
    "path": "src/dbutils/session.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nclass Session(object):\n    def __init__(self, critic):\n        self.critic = critic\n\n        self.__atexit = []\n        self.storage = { \"Repository\": {},\n                         \"User\": {},\n                         \"Commit\": {},\n                         \"CommitUserTime\": {},\n                         \"Timezones\": {} }\n        self.profiling = {}\n\n        self.__user = None\n        self.__authentication_labels = set()\n        self.__profiles = set()\n\n    def atexit(self, fn):\n        self.__atexit.append(fn)\n\n    def close(self):\n        for fn in self.__atexit:\n            try: fn(self)\n            except: pass\n        self.__atexit = []\n\n    def disableProfiling(self):\n        self.profiling = None\n\n    def recordProfiling(self, item, duration, rows=None, repetitions=1):\n        if self.profiling is not None:\n            count, accumulated_ms, maximum_ms, accumulated_rows, maximum_rows = self.profiling.get(item, (0, 0.0, 0.0, None, None))\n\n            count += repetitions\n            accumulated_ms += 1000 * duration\n            maximum_ms = max(maximum_ms, 1000 * duration)\n\n            if rows is not None:\n                if accumulated_rows is None: accumulated_rows = 0\n                if maximum_rows is None: maximum_rows = 0\n                
accumulated_rows += rows\n                maximum_rows = max(maximum_rows, rows)\n\n            self.profiling[item] = count, accumulated_ms, maximum_ms, accumulated_rows, maximum_rows\n\n    @property\n    def user(self):\n        return self.__user\n\n    @property\n    def authentication_labels(self):\n        return self.__authentication_labels\n\n    @property\n    def profiles(self):\n        return frozenset(self.__profiles)\n\n    def setUser(self, user, authentication_labels=()):\n        import auth\n        import api\n        assert not self.__user or self.__user.isAnonymous()\n        self.__user = user\n        self.__authentication_labels.update(authentication_labels)\n        self.__profiles.add(auth.AccessControlProfile.forUser(\n            self, user, self.__authentication_labels))\n        if self.critic and not (user.isAnonymous() or user.isSystem()):\n            self.critic.setActualUser(api.user.fetch(self.critic, user_id=user.id))\n\n    def addProfile(self, profile):\n        self.__profiles.add(profile)\n"
  },
  {
    "path": "src/dbutils/system.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Martin Olsson\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ndef getInstalledSHA1(db):\n    import configuration\n    cursor = db.cursor()\n    cursor.execute(\"SELECT installed_sha1 FROM systemidentities WHERE name=%s\",\n                   (configuration.base.SYSTEM_IDENTITY,))\n    return cursor.fetchone()[0]\n\ndef getURLPrefix(db, user=None):\n    import configuration\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT anonymous_scheme, authenticated_scheme, hostname\n                        FROM systemidentities\n                       WHERE name=%s\"\"\",\n                   (configuration.base.SYSTEM_IDENTITY,))\n    anonymous_scheme, authenticated_scheme, hostname = cursor.fetchone()\n    if user and not user.isAnonymous():\n        scheme = authenticated_scheme\n    else:\n        scheme = anonymous_scheme\n    return \"%s://%s\" % (scheme, hostname)\n\ndef getAdministratorContacts(db, indent=0, as_html=False):\n    import dbutils\n    administrators = dbutils.User.withRole(db, \"administrator\")\n\n    # Sort by id, IOW, by user creation time.  Probably gives \"primary\"\n    # administrator first and auxiliary administrators second, but might also\n    # just be arbitrary.  
If nothing else, it makes the order stable.\n    administrators = sorted(administrators, key=lambda user: user.id)\n\n    # Skip administrators with no email addresses, since those are unhelpful in\n    # this context.\n    administrators = filter(lambda user: user.email, administrators)\n\n    if as_html:\n        result = \"the system administrator\"\n\n        if not administrators:\n            return result\n        if len(administrators) > 1:\n            result += \"s\"\n\n        result += \" (%s)\"\n\n        mailto_links = \\\n            [(\"<a href='mailto:%(email)s'>%(fullname)s</a>\"\n              % { \"email\": user.email, \"fullname\": user.fullname })\n             for user in administrators]\n\n        if len(mailto_links) == 1:\n            return result % mailto_links[0]\n        else:\n            return result % (\"one of %s or %s\" % (\", \".join(mailto_links[:-1]),\n                                                  mailto_links[-1]))\n    else:\n        if not administrators:\n            return \"\"\n\n        administrators = [\"%s <%s>\" % (user.fullname, user.email)\n                          for user in administrators]\n\n        prefix = \" \" * indent\n        return prefix + (\"\\n\" + prefix).join(administrators)\n"
  },
  {
    "path": "src/dbutils/timezones.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\nimport datetime\n\ndef loadTimezones(db):\n    \"\"\"\n    Insert (interesting) timezones from 'pg_timezone_names' into 'timezones'\n\n    The 'pg_timezone_names' table contains all the information we want (but\n    typically unnecessarily many different timezones) but is very slow to query,\n    so can't be used during normal operations.\n    \"\"\"\n\n    import configuration\n\n    def add(name, abbrev, utc_offset):\n        cursor.execute(\"\"\"INSERT INTO timezones (name, abbrev, utc_offset)\n                               VALUES (%s, %s, %s)\"\"\",\n                       (name, abbrev, utc_offset))\n\n    if configuration.database.DRIVER == \"postgresql\":\n        cursor = db.cursor()\n        cursor.execute(\"DELETE FROM timezones\")\n\n        add(\"Universal/UTC\", \"UTC\", datetime.timedelta())\n\n        cursor.execute(\"SELECT name, abbrev, utc_offset FROM pg_timezone_names\")\n        for full_name, abbrev, utc_offset in cursor.fetchall():\n            region, _, name = full_name.partition(\"/\")\n            if region not in (\"posix\", \"Etc\") and name and \"/\" not in name:\n                add(full_name, abbrev, utc_offset)\n\n        db.commit()\n\ndef updateTimezones(db):\n    \"\"\"\n    Update UTC offses in 'timezones' with values in 'pg_timezone_names'\n\n    The 
UTC offsets in 'pg_timezone_names' are DST adjusted (for the timezones\n    we care about) so we need to copy the values regularly to keep the cached\n    values in 'timezones' up-to-date.\n    \"\"\"\n\n    import configuration\n\n    if configuration.database.DRIVER == \"postgresql\":\n        cursor = db.cursor()\n        cursor.execute(\"\"\"UPDATE timezones\n                             SET utc_offset=pg_timezone_names.utc_offset\n                            FROM pg_timezone_names\n                           WHERE pg_timezone_names.name=timezones.name\"\"\")\n        db.commit()\n\ndef __fetchTimezones(db):\n    groups = db.storage[\"Timezones\"].get(None, {})\n\n    if not groups:\n        cursor = db.cursor()\n        cursor.execute(\"SELECT name, abbrev, utc_offset FROM timezones\")\n\n        for full_name, abbrev, utc_offset in cursor.fetchall():\n            group, name = full_name.split(\"/\")\n            if isinstance(utc_offset, int):\n                utc_offset = datetime.timedelta(utc_offset)\n            groups.setdefault(group, {})[name] = (abbrev, utc_offset)\n\n        db.storage[\"Timezones\"][None] = groups\n\n    return groups\n\ndef sortedTimezones(db):\n    groups = __fetchTimezones(db)\n    result = []\n\n    for key in sorted(groups.keys()):\n        result.append((key, sorted([(name, abbrev, utc_offset) for name, (abbrev, utc_offset) in groups[key].items()])))\n\n    return result\n\ndef __fetchUTCOffset(db, timezone):\n    utc_offset = db.storage[\"Timezones\"].get(timezone)\n\n    if utc_offset is None:\n        groups = db.storage[\"Timezones\"].get(None)\n\n        if groups:\n            group, name = timezone.split(\"/\")\n            utc_offset = groups[group][name][2]\n        else:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT utc_offset FROM timezones WHERE name=%s\", (timezone,))\n\n            row = cursor.fetchone()\n            if row:\n                utc_offset = row[0]\n            else:\n       
         return datetime.timedelta()\n\n        db.storage[\"Timezones\"][timezone] = utc_offset\n\n    return utc_offset\n\ndef adjustTimestamp(db, timestamp, timezone):\n    return timestamp + __fetchUTCOffset(db, timezone)\n\ndef formatTimestamp(db, timestamp, timezone):\n    utc_offset = __fetchUTCOffset(db, timezone)\n    seconds = utc_offset.total_seconds()\n    offset = \" %s%02d:%02d\" % (\"-\" if seconds < 0 else \"+\", abs(seconds) / 3600, (abs(seconds) % 3600) / 60)\n\n    return time.strftime(\"%Y-%m-%d %H:%M\", (timestamp + utc_offset).timetuple()) + offset\n\ndef validTimezone(db, timezone):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT 1 FROM timezones WHERE name=%s\", (timezone,))\n    return bool(cursor.fetchone())\n"
  },
  {
    "path": "src/dbutils/unittest.py",
    "content": "def independence():\n    # Simply check that dbutils can be imported.  This is run in a test flagged\n    # as \"local\" since we want dbutils to be possible to import in standalone\n    # unit tests.\n    #\n    # Hardly anything in dbutils can actually be used, of course, but that's not\n    # a problem; the unit tests simply need to make sure not to depend on that.\n\n    import dbutils\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/dbutils/user.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport base\n\ndef _preferenceCacheKey(item, repository, filter_id):\n    cache_key = item\n    if filter_id is not None:\n        cache_key += \":f%d\" % filter_id\n    if repository is not None:\n        cache_key += \":r%d\" % repository.id\n    return cache_key\n\nclass InvalidUserId(base.Error):\n    def __init__(self, user_id):\n        super(InvalidUserId, self).__init__(\"Invalid user id: %d\" % user_id)\n        self.user_id = user_id\n\nclass NoSuchUser(base.Error):\n    def __init__(self, name):\n        super(NoSuchUser, self).__init__(\"No such user: %s\" % name)\n        self.name = name\n\nclass User(object):\n    def __init__(self, user_id, name, fullname, status, email, email_verified):\n        self.id = user_id\n        self.name = name\n        self.email = email\n        self.email_verified = email_verified\n        self.fullname = fullname\n        self.status = status\n        self.preferences = {}\n        self.__resources = {}\n\n    def __eq__(self, other):\n        if self.isAnonymous(): return False\n        elif isinstance(other, User):\n            if other.isAnonymous(): return False\n            else: return self.id == other.id\n        elif isinstance(other, int):\n            return self.id == other\n        elif isinstance(other, basestring):\n            
return self.name == other\n        else:\n            raise base.Error(\"invalid comparison\")\n\n    def __ne__(self, other):\n        return not (self == other)\n\n    def __int__(self):\n        assert not self.isAnonymous()\n        return self.id\n\n    def __str__(self):\n        assert not self.isAnonymous()\n        return self.name\n\n    def __repr__(self):\n        return \"User(%r, %r, %r, %r)\" % (self.id, self.name, self.email, self.fullname)\n\n    def __hash__(self):\n        return hash(self.id)\n\n    def isAnonymous(self):\n        return self.status == 'anonymous'\n\n    def isSystem(self):\n        return self.status == 'system'\n\n    def hasRole(self, db, role):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT 1 FROM userroles WHERE uid=%s AND role=%s\", (self.id, role))\n        return bool(cursor.fetchone())\n\n    def loadPreferences(self, db):\n        if not self.preferences:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT uid, item, type, integer, string\n                                FROM preferences\n                                JOIN userpreferences USING (item)\n                               WHERE (uid=%s OR uid IS NULL)\n                                 AND repository IS NULL\n                                 AND filter IS NULL\"\"\",\n                           (self.id,))\n\n            rows = sorted(cursor, key=lambda row: row[0], reverse=True)\n\n            for _, item, preference_type, integer, string in rows:\n                cache_key = _preferenceCacheKey(item, None, None)\n                if cache_key not in self.preferences:\n                    if preference_type == \"boolean\":\n                        self.preferences[cache_key] = bool(integer)\n                    elif preference_type == \"integer\":\n                        self.preferences[cache_key] = integer\n                    else:\n                        self.preferences[cache_key] = string\n\n    @staticmethod\n 
   def fetchPreference(db, item, user=None, repository=None, filter_id=None):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT type FROM preferences WHERE item=%s\", (item,))\n        row = cursor.fetchone()\n        if not row:\n            raise base.ImplementationError(\"invalid preference: %s\" % item)\n        preference_type = row[0]\n\n        arguments = [item]\n        where = [\"item=%s\"]\n\n        if preference_type in (\"boolean\", \"integer\"):\n            columns = [\"integer\"]\n        else:\n            columns = [\"string\"]\n\n        if user is not None and not user.isAnonymous():\n            arguments.append(user.id)\n            where.append(\"uid=%s OR uid IS NULL\")\n            columns.append(\"uid\")\n        else:\n            where.append(\"uid IS NULL\")\n\n        if repository is not None:\n            arguments.append(repository.id)\n            where.append(\"repository=%s OR repository IS NULL\")\n            columns.append(\"repository\")\n        else:\n            where.append(\"repository IS NULL\")\n\n        if filter_id is not None:\n            arguments.append(filter_id)\n            where.append(\"filter=%s OR filter IS NULL\")\n            columns.append(\"filter\")\n        else:\n            where.append(\"filter IS NULL\")\n\n        query = (\"\"\"SELECT %(columns)s\n                      FROM userpreferences\n                     WHERE %(where)s\"\"\"\n                 % { \"columns\": \", \".join(columns),\n                     \"where\": \" AND \".join(\"(%s)\" % condition\n                                           for condition in where) })\n\n        cursor.execute(query, arguments)\n\n        rows = cursor.fetchall()\n        if not rows:\n            raise base.ImplementationError(\n                \"invalid preference read: %s (no value found)\" % item)\n\n        value = sorted(rows, key=lambda row: row[1:])[-1][0]\n        if preference_type == \"boolean\":\n            return 
bool(value)\n        return value\n\n    @staticmethod\n    def storePreference(db, item, value, user=None, repository=None, filter_id=None):\n        # A preference value can be set for either a repository or a filter, but\n        # not for both at the same time.  A filter implies a repository anyway,\n        # so there would be no point.\n        assert repository is None or filter_id is None\n        assert filter_id is None or user is not None\n\n        # If all are None, we'd be deleting the global default and not setting a\n        # new one, which would be bad.\n        if value is None and user is None \\\n                and repository is None and filter_id is None:\n            raise base.ImplementationError(\"attempted to delete global default\")\n\n        if User.fetchPreference(db, item, user, repository, filter_id) != value:\n            cursor = db.cursor()\n\n            arguments = [item]\n            where = [\"item=%s\"]\n\n            user_id = repository_id = None\n\n            if user is not None:\n                user_id = user.id\n                arguments.append(user_id)\n                where.append(\"uid=%s\")\n            else:\n                where.append(\"uid IS NULL\")\n\n            if repository is not None:\n                repository_id = repository.id\n                arguments.append(repository_id)\n                where.append(\"repository=%s\")\n            else:\n                where.append(\"repository IS NULL\")\n\n            if filter_id is not None:\n                arguments.append(filter_id)\n                where.append(\"filter=%s\")\n            else:\n                where.append(\"filter IS NULL\")\n\n            query = (\"DELETE FROM userpreferences WHERE %s\"\n                     % (\" AND \".join(\"(%s)\" % condition\n                                     for condition in where)))\n\n            cursor.execute(query, arguments)\n\n            if value is not None:\n                cursor.execute(\"SELECT 
type FROM preferences WHERE item=%s\", (item,))\n\n                (value_type,) = cursor.fetchone()\n                integer = string = None\n\n                if value_type == \"boolean\":\n                    value = bool(value)\n                    integer = int(value)\n                elif value_type == \"integer\":\n                    integer = int(value)\n                else:\n                    string = str(value)\n\n                cursor.execute(\"\"\"INSERT INTO userpreferences (item, uid, repository, filter, integer, string)\n                                       VALUES (%s, %s, %s, %s, %s, %s)\"\"\",\n                               (item, user_id, repository_id, filter_id, integer, string))\n\n            if user is not None:\n                cache_key = _preferenceCacheKey(item, repository, filter_id)\n                if cache_key in user.preferences:\n                    del user.preferences[cache_key]\n\n            return True\n        else:\n            return False\n\n    def getPreference(self, db, item, repository=None, filter_id=None):\n        cache_key = _preferenceCacheKey(item, repository, filter_id)\n\n        if cache_key not in self.preferences:\n            self.preferences[cache_key] = User.fetchPreference(\n                db, item, self, repository, filter_id)\n\n        return self.preferences[cache_key]\n\n    def setPreference(self, db, item, value, repository=None, filter_id=None):\n        return User.storePreference(db, item, value, self, repository, filter_id)\n\n    def getDefaultRepository(self, db):\n        import auth\n        import gitutils\n\n        default_repo = self.getPreference(db, \"defaultRepository\")\n\n        if not default_repo:\n            return None\n\n        try:\n            return gitutils.Repository.fromName(db, default_repo)\n        except auth.AccessDenied:\n            return None\n\n    def getResource(self, db, name):\n        import configuration\n\n        if name in 
self.__resources:\n            return self.__resources[name]\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT revision, source FROM userresources WHERE uid=%s AND name=%s ORDER BY revision DESC FETCH FIRST ROW ONLY\", (self.id, name))\n\n        row = cursor.fetchone()\n\n        if row and row[1] is not None:\n            resource = self.__resources[name] = (\"\\\"critic.rev.%d\\\"\" % row[0], row[1])\n            return resource\n\n        path = os.path.join(configuration.paths.INSTALL_DIR, \"resources\", name)\n        mtime = os.stat(path).st_mtime\n\n        resource = self.__resources[name] = (\"\\\"critic.mtime.%d\\\"\" % mtime, open(path).read())\n        return resource\n\n    def adjustTimestamp(self, db, timestamp):\n        import dbutils.timezones\n        return dbutils.timezones.adjustTimestamp(db, timestamp, self.getPreference(db, \"timezone\"))\n\n    def formatTimestamp(self, db, timestamp):\n        import dbutils.timezones\n        return dbutils.timezones.formatTimestamp(db, timestamp, self.getPreference(db, \"timezone\"))\n\n    def getCriticURLs(self, db):\n        url_types = self.getPreference(db, 'email.urlType').split(\",\")\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT key, anonymous_scheme, authenticated_scheme, hostname\n                            FROM systemidentities\"\"\")\n\n        url_prefixes = dict((row[0], row[1:]) for row in cursor)\n        urls = []\n\n        for url_type in url_types:\n            if url_prefixes.has_key(url_type):\n                anonymous_scheme, authenticated_scheme, hostname = url_prefixes[url_type]\n                if self.isAnonymous():\n                    scheme = anonymous_scheme\n                else:\n                    scheme = authenticated_scheme\n                urls.append(\"%s://%s\" % (scheme, hostname))\n\n        return urls\n\n    def getFirstName(self):\n        return self.fullname.split(\" \")[0]\n\n    def getJSConstructor(self, 
db=None):\n        from htmlutils import jsify\n        if self.isAnonymous():\n            return \"new User(null, null, null, null, null, { ui: {} })\"\n        if db:\n            options = (\"{ ui: { keyboardShortcuts: %s, resolveIssueWarning: %s, convertIssueToNote: %s, asynchronousReviewMarking: %s } }\" %\n                       (\"true\" if self.getPreference(db, \"ui.keyboardShortcuts\") else \"false\",\n                        \"true\" if self.getPreference(db, \"ui.resolveIssueWarning\") else \"false\",\n                        \"true\" if self.getPreference(db, \"ui.convertIssueToNote\") else \"false\",\n                        \"true\" if self.getPreference(db, \"ui.asynchronousReviewMarking\") else \"false\"))\n        else:\n            options = \"{ ui: {} }\"\n        return \"new User(%d, %s, %s, %s, %s, %s)\" % (self.id, jsify(self.name), jsify(self.email), jsify(self.fullname), jsify(self.status), options)\n\n    def getJS(self, db=None, name=\"user\"):\n        return \"var %s = %s;\" % (name, self.getJSConstructor(db))\n\n    def getJSON(self):\n        return { \"id\": self.id,\n                 \"name\": self.name,\n                 \"email\": self.email,\n                 \"displayName\": self.fullname }\n\n    def getAbsence(self, db):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT until FROM userabsence WHERE uid=%s\", (self.id,))\n        row = cursor.fetchone()\n        if row[0] is None:\n            return \"absent\"\n        else:\n            return \"absent until %04d-%02d-%02d\" % (row[0].year, row[0].month, row[0].day)\n\n    def hasGitEmail(self, db, address):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM usergitemails\n                           WHERE email=%s\n                             AND uid=%s\"\"\",\n                       (address, self.id))\n        return bool(cursor.fetchone())\n\n    @staticmethod\n    def cache(db, user):\n        storage = 
db.storage[\"User\"]\n        storage[user.id] = user\n        if user.name: storage[\"n:\" + user.name] = user\n        if user.email: storage[\"e:\" + user.email] = user\n        return user\n\n    @staticmethod\n    def makeAnonymous():\n        return User(None, None, None, 'anonymous', None, None)\n\n    @staticmethod\n    def makeSystem():\n        import configuration\n        return User(0, configuration.base.SYSTEM_USER_NAME, \"Critic System\",\n                    \"system\", configuration.base.SYSTEM_USER_EMAIL, None)\n\n    @staticmethod\n    def _fromQuery(db, where, *values):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT users.id, name, fullname, status,\n                                 useremails.email, verified\n                            FROM users\n                 LEFT OUTER JOIN useremails ON (useremails.id=users.email)\n                           \"\"\" + where,\n                       values)\n        return [User.cache(db, User(*row)) for row in cursor]\n\n    @staticmethod\n    def fromId(db, user_id):\n        cached_user = db.storage[\"User\"].get(user_id)\n        if cached_user:\n            return cached_user\n        else:\n            found = User._fromQuery(db, \"WHERE users.id=%s\", user_id)\n            if not found:\n                raise InvalidUserId(user_id)\n            return found[0]\n\n    @staticmethod\n    def fromIds(db, user_ids):\n        need_fetch = []\n        cache = db.storage[\"User\"]\n        for user_id in user_ids:\n            if user_id not in cache:\n                need_fetch.append(user_id)\n        if need_fetch:\n            User._fromQuery(db, \"WHERE users.id=ANY (%s)\", need_fetch)\n        return [cache.get(user_id) for user_id in user_ids]\n\n    @staticmethod\n    def fromName(db, name):\n        cached_user = db.storage[\"User\"].get(\"n:\" + name)\n        if cached_user:\n            return cached_user\n        else:\n            found = User._fromQuery(db, \"WHERE 
users.name=%s\", name)\n            if not found:\n                raise NoSuchUser(name)\n            return found[0]\n\n    @staticmethod\n    def fromAPI(api_user):\n        if api_user.is_anonymous:\n            return User.makeAnonymous()\n        return User.fromId(api_user.critic.database, api_user.id)\n\n    @staticmethod\n    def withRole(db, role):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT uid\n                            FROM userroles\n                           WHERE role=%s\"\"\",\n                       (role,))\n        return User.fromIds(db, [user_id for user_id, in cursor])\n\n    @staticmethod\n    def create(db, name, fullname, email, email_verified, password=None,\n               status=\"current\", external_user_id=None):\n        tables = [\"users\"]\n        if email is not None:\n            tables.extend([\"useremails\", \"usergitemails\"])\n        if external_user_id is not None:\n            tables.append(\"externalusers\")\n        with db.updating_cursor(*tables) as cursor:\n            cursor.execute(\n                \"\"\"INSERT INTO users (name, fullname, password, status)\n                        VALUES (%s, %s, %s, %s)\n                     RETURNING id\"\"\",\n                (name, fullname, password, status))\n            user_id, = cursor.fetchone()\n            if email is not None:\n                cursor.execute(\n                    \"\"\"INSERT INTO useremails (uid, email, verified)\n                            VALUES (%s, %s, %s)\n                         RETURNING id\"\"\",\n                    (user_id, email, email_verified))\n                email_id = cursor.fetchone()[0]\n                cursor.execute(\"UPDATE users SET email=%s WHERE id=%s\",\n                               (email_id, user_id))\n                cursor.execute(\"\"\"INSERT INTO usergitemails (email, uid)\n                                       VALUES (%s, %s)\"\"\",\n                               (email, user_id))\n    
        if external_user_id is not None:\n                cursor.execute(\"\"\"UPDATE externalusers\n                                     SET uid=%s\n                                   WHERE id=%s\"\"\",\n                               (user_id, external_user_id))\n        return User.fromId(db, user_id)\n\n    def sendUserCreatedMail(self, source, external=None):\n        import mailutils\n\n        if self.email_verified is False:\n            email_status = \" (pending verification)\"\n        else:\n            email_status = \"\"\n\n        message = \"\"\"\\\nA new user has been created:\n\nUser name: %(username)r\nFull name: %(fullname)r\nEmail:     %(email)r%(email_status)s\n\"\"\" % { \"username\": self.name,\n        \"fullname\": self.fullname,\n        \"email\": self.email,\n        \"email_status\": email_status }\n\n        if external:\n            import auth\n\n            provider = auth.PROVIDERS[external[\"provider\"]]\n            message += \"\"\"\\\n\nExternal:  %(provider)s %(account)r\n\"\"\" % { \"provider\": provider.getTitle(),\n        \"account\": external[\"account\"] }\n\n        message += \"\"\"\\\n\n-- critic\n\"\"\"\n\n        mailutils.sendAdministratorMessage(\n            source, \"User '%s' registered\" % self.name, message)\n"
  },
  {
    "path": "src/diff/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport gitutils\nimport diff.analyze\nimport syntaxhighlight\nimport syntaxhighlight.request\nimport htmlutils\nimport textutils\n\nre_modeline = re.compile(r\"-\\*-\\s*(.*?)\\s*-\\*-\")\nre_tabwidth = re.compile(r\"(?:^|[ \\t;])tab-width:\\s*([0-9]+)(?:$|;)\", re.I)\nre_indent_tabs_mode = re.compile(r\"(?:^|[ \\t;])indent-tabs-mode:\\s*(t|nil)(?:$|;)\", re.I)\nre_mode = re.compile(r\"(?:^|[ \\t;])mode:\\s*([^;]+)(?:$|;)\", re.I)\n\n# Low-level chunk of difference between two versions of a file.  One chunk\n# represents a possibly empty set of consecutive lines in the old version of the\n# file being replaced by another possibly empty set of consecutive lines in the\n# new version of the file.  (Both sets are never empty at the same time, of\n# course.)\nclass Chunk:\n    def __init__(self, delete_offset, delete_count, insert_offset, insert_count, **kwargs):\n        # Primary information: identifying the line numbers of deleted lines and\n        # the line numbers of inserted lines.  If lines are only inserted,\n        # delete_count is zero and delete_offset marks where, in the old version\n        # of the file, this chunk adds its lines.  
If lines are only deleted,\n        # insert_count is zero and insert_offset marks where, in the new version\n        # of the file, the deleted lines would have been.\n        #\n        # Line numbers are 1-based, that is, the first line is number 1.\n        self.delete_offset = delete_offset\n        self.delete_count = delete_count\n        self.insert_offset = insert_offset\n        self.insert_count = insert_count\n\n        # Optional: the ID of the chunk.\n        self.id = kwargs.get(\"id\")\n\n        # Optional: True if the chunk contains only white-space changes.\n        self.is_whitespace = kwargs.get(\"is_whitespace\")\n\n        # Optional: the actual deleted and/or inserted lines.\n        self.deleted_lines = kwargs.get(\"deleted_lines\")\n        self.inserted_lines = kwargs.get(\"inserted_lines\")\n\n        # Optional: chunk analysis, linking together \"matching\" lines within the\n        # chunk and describing how such matching lines changed from the old\n        # version to the new version.\n        self.analysis = kwargs.get(\"analysis\")\n\n    def copy(self):\n        return Chunk(self.delete_offset, self.delete_count,\n                     self.insert_offset, self.insert_count,\n                     deleted_lines=self.deleted_lines,\n                     inserted_lines=self.inserted_lines,\n                     analysis=self.analysis)\n\n    def isBinary(self):\n        return self.delete_count == self.insert_count == 0\n\n    def analyze(self, file, last_chunk=False, reanalyze=False):\n        if (reanalyze or not self.analysis) and self.delete_count != 0 and self.insert_count != 0:\n            if not self.deleted_lines:\n                file.loadOldLines()\n                self.deleted_lines = file.getOldLines(self)\n\n            if not self.inserted_lines:\n                file.loadNewLines()\n                self.inserted_lines = file.getNewLines(self)\n\n            if self.is_whitespace:\n                self.analysis = 
diff.analyze.analyzeWhiteSpaceChanges(self.deleted_lines, self.inserted_lines, last_chunk and self.delete_offset + self.delete_count + file.oldCount())\n            else:\n                self.analysis = diff.analyze.analyzeChunk(self.deleted_lines, self.inserted_lines)\n\n    def deleteEnd(self):\n        return self.delete_offset + self.delete_count\n\n    def insertEnd(self):\n        return self.insert_offset + self.insert_count\n\n    def delta(self):\n        return self.insert_count - self.delete_count\n\n    def __str__(self):\n        return \"@@ -%d,%d +%d,%d @@\" % (self.delete_offset, self.delete_count, self.insert_offset, self.insert_count)\n\n    def __repr__(self):\n        if self.analysis: analysis = \", analysis=%r\" % self.analysis\n        else: analysis = \"\"\n        return \"Chunk(delete_offset=%d, delete_count=%d, insert_offset=%d, insert_count=%d%s)\" % (self.delete_offset, self.delete_count, self.insert_offset, self.insert_count, analysis)\n\n    def __eq__(self, other):\n        return self.delete_offset == other.delete_offset and self.insert_offset == other.insert_offset\n\n    def getLines(self):\n        assert not (self.deleted_lines is None or self.inserted_lines is None)\n\n        lines = []\n        terminator = \"%d=%d\" % (self.delete_count, self.insert_count)\n\n        if self.analysis: analysis = self.analysis + \";\" + terminator\n        else: analysis = terminator\n\n        mappings = analysis.split(\";\")\n        old_offset = self.delete_offset\n        new_offset = self.insert_offset\n\n        for mapping in mappings:\n            old_line, new_line = mapping.split(\":\")[0].split(\"=\")\n            old_line = self.delete_offset + int(old_line)\n            new_line = self.insert_offset + int(new_line)\n\n            while old_offset < old_line and new_offset < new_line:\n                old_value = self.deleted_lines[old_offset - self.delete_offset]\n                new_value = self.inserted_lines[new_offset - 
self.insert_offset]\n                line_type = Line.CONTEXT if old_value == new_value else Line.REPLACED\n                lines.append(Line(line_type, old_offset, old_value, new_offset, new_value))\n                old_offset += 1\n                new_offset += 1\n\n            while old_offset < old_line:\n                old_value = self.deleted_lines[old_offset - self.delete_offset]\n                lines.append(Line(Line.DELETED, old_offset, old_value, new_offset, None))\n                old_offset += 1\n\n            while new_offset < new_line:\n                new_value = self.inserted_lines[new_offset - self.insert_offset]\n                lines.append(Line(Line.INSERTED, old_offset, None, new_offset, new_value))\n                new_offset += 1\n\n            if old_line == self.deleteEnd(): break\n\n            old_value = self.deleted_lines[old_line - self.delete_offset]\n            new_value = self.inserted_lines[new_line - self.insert_offset]\n            lines.append(Line(Line.MODIFIED, old_line, old_value, new_line, new_value))\n            old_offset += 1\n            new_offset += 1\n\n        return lines\n\nre_conflict = re.compile(\"^(?:(?:&lt;){7}|={7}|(?:<b[^>]*>=</b>){7}|(?:&gt;){7})\")\n\n# Line in \"macro chunk\".  Representing either a context line, or a line that has\n# been changed (modified, deleted or inserted.)\nclass Line:\n    CONTEXT    = 1\n    DELETED    = 2\n    MODIFIED   = 3\n    REPLACED   = 4\n    INSERTED   = 5\n    WHITESPACE = 6\n    CONFLICT   = 7\n\n    def __init__(self, type, old_offset, old_value, new_offset, new_value, **kwargs):\n        # The type of line.  One of CONTEXT, MODIFIED, DELETED, INSERTED and\n        # REPLACED.\n        self.type = type\n\n        # The line number of this line in the old and new versions of the file,\n        # and the actual line value.  
If the line represents an inserted line,\n        # old_offset will be the line number of the next non-deleted line in the\n        # old version of the file and old_value will be None.  If the line\n        # represents a deleted line, new_offset will be the line number of the\n        # next non-inserted line in the new version of the file and new_value\n        # will be None.\n        self.old_offset = old_offset\n        self.old_value = old_value\n        self.new_offset = new_offset\n        self.new_value = new_value\n\n        # The difference between old_value and new_value is only in white-space.\n        self.is_whitespace = kwargs.get(\"is_whitespace\", False)\n        self.analysis = kwargs.get(\"analysis\", None)\n\n    def __repr__(self):\n        if self.type == Line.CONTEXT: type_string = \"CONTEXT\"\n        elif self.type == Line.DELETED: type_string = \"DELETED\"\n        elif self.type == Line.MODIFIED: type_string = \"MODIFIED\"\n        elif self.type == Line.REPLACED: type_string = \"REPLACED\"\n        elif self.type == Line.INSERTED: type_string = \"INSERTED\"\n        elif self.type == Line.WHITESPACE: type_string = \"WHITESPACE\"\n        else: type_string = \"CONFLICT\"\n        return \"Line(%s, %d:%d)\" % (type_string, self.old_offset, self.new_offset)\n\n    def isConflictMarker(self):\n        return self.old_value and bool(re_conflict.match(self.old_value))\n\n# Higher-level chunk of differences between two versions of a file.  Constructed\n# by padding low-level chunks with a variable number of context lines.  Chunks\n# whose contexts overlap are merged into a single \"macro chunk.\"\nclass MacroChunk:\n    def __init__(self, chunks, lines):\n        # List of low-level chunks that make up this macro chunk.\n        self.chunks = chunks\n\n        # List of lines in the macro chunk.\n        self.lines = lines\n\n        # Line numbers and size of this macro chunk in the old and new versions\n        # of the file.  
Note that this includes the context lines, and thus does\n        # not only represent actual changes.\n        self.old_offset = lines[0].old_offset\n        self.old_count = lines[-1].old_offset - lines[0].old_offset + 1\n        self.new_offset = lines[0].new_offset\n        self.new_count = lines[-1].new_offset - lines[0].new_offset + 1\n\n# Container for difference information per file.\nclass File:\n    def __init__(self, id=None, path=None, old_sha1=None, new_sha1=None, repository=None, **kwargs):\n        self.id = id\n        self.path = path\n        self.old_sha1 = old_sha1\n        self.new_sha1 = new_sha1\n        self.old_mode = kwargs.get(\"old_mode\")\n        self.new_mode = kwargs.get(\"new_mode\")\n        self.repository = repository\n\n        if isinstance(self.old_mode, int):\n            self.old_mode = \"%o\" % self.old_mode\n        if isinstance(self.new_mode, int):\n            self.new_mode = \"%o\" % self.new_mode\n\n        # List of low-level chunks affecting the file.\n        self.chunks = kwargs.get(\"chunks\")\n\n        # List of macro chunks affecting the file.\n        self.macro_chunks = kwargs.get(\"macro_chunks\")\n\n        # Lists of actual lines in the old and new versions of the file.  
Each\n        # line is a string, not including the linebreak character.\n        self.old_plain = kwargs.get(\"old_plain\")\n        self.new_plain = kwargs.get(\"new_plain\")\n        self.old_highlighted = kwargs.get(\"old_highlighted\")\n        self.old_is_highlighted = bool(self.old_highlighted)\n        self.new_highlighted = kwargs.get(\"new_highlighted\")\n        self.new_is_highlighted = bool(self.new_highlighted)\n\n        # List of comment chains that apply to the whole file.\n        self.file_comment_chains = []\n\n        # List of comment chains that apply to the selected lines in the file.\n        self.code_comment_chains = []\n\n        self.move_source_file = kwargs.get(\"move_source_file\")\n        self.move_target_file = kwargs.get(\"move_target_file\")\n\n        self.modeline = {}\n        self.interpreter = {}\n\n    def clean(self):\n        self.chunks = None\n        self.macro_chunks = None\n        self.cleanLines()\n\n    def cleanLines(self):\n        self.old_plain = None\n        self.new_plain = None\n        self.old_highlighted = None\n        self.new_highlighted = None\n\n    def __hash__(self):\n        return hash(self.id)\n\n    def __int__(self):\n        return self.id\n\n    def __repr__(self):\n        return \"diff.File(id=%d, path=%r)\" % (self.id or -1, self.path)\n\n    def hasChanges(self):\n        return self.old_sha1 is not None and self.new_sha1 is not None\n\n    def isEmptyChanges(self):\n        \"\"\"Return true if empty diff information is recorded.\"\"\"\n        return self.hasChanges() \\\n            and len(self.chunks) == 1 \\\n            and self.chunks[0].delete_count == 0 \\\n            and self.chunks[0].insert_count == 0\n\n    def isEmptyFile(self):\n        \"\"\"Return true if this is an added or deleted empty (zero-length) file.\"\"\"\n        if self.isEmptyChanges():\n            if self.wasAdded() and self.newSize() == 0:\n                return True\n            elif 
self.wasRemoved() and self.oldSize() == 0:\n                return True\n        return False\n\n    def isBinaryChanges(self):\n        \"\"\"Return true if this is a binary file.\"\"\"\n        return self.isEmptyChanges() and not self.isEmptyFile()\n\n    def wasAdded(self):\n        \"\"\"Return true if this file was added.\"\"\"\n        return self.old_sha1 == '0' * 40\n\n    def wasRemoved(self):\n        \"\"\"Return true if this file was deleted.\"\"\"\n        return self.new_sha1 == '0' * 40\n\n    def oldSize(self):\n        \"\"\"Return size of old version of file, or None if file was added.\"\"\"\n        if self.old_sha1 != '0' * 40:\n            return self.repository.fetch(self.old_sha1, fetchData=False).size\n        else:\n            return None\n\n    def newSize(self):\n        \"\"\"Return size of new version of file, or None if file was deleted.\"\"\"\n        if self.new_sha1 != '0' * 40:\n            return self.repository.fetch(self.new_sha1, fetchData=False).size\n        else:\n            return None\n\n    def ensureHighlight(self, highlight_mode=\"legacy\"):\n        \"\"\"Ensure that the old and new versions are syntax highlighted\n\n           If they are, True is returned. 
If they are not, an asynchronous\n           request to syntax highlight them is made, and False is returned.\"\"\"\n        sha1s = {}\n        if self.old_sha1 \\\n                and self.old_sha1 != \"0\" * 40 \\\n                and self.old_mode != \"160000\":\n            old_language = self.getLanguage(use_content=\"old\")\n            if old_language:\n                sha1s[self.old_sha1] = (self.path, old_language)\n        if self.new_sha1 \\\n                and self.new_sha1 != \"0\" * 40 \\\n                and self.new_mode != \"160000\":\n            new_language = self.getLanguage(use_content=\"new\")\n            if new_language:\n                sha1s[self.new_sha1] = (self.path, new_language)\n        return not syntaxhighlight.request.requestHighlights(\n            self.repository, sha1s, highlight_mode, async=True)\n\n    def loadOldLines(self, highlighted=False, request_highlight=False, highlight_mode=\"legacy\"):\n        \"\"\"Load the lines of the old version of the file, optionally highlighted.\"\"\"\n\n        from diff.parse import splitlines\n\n        if self.old_sha1 is None or self.old_sha1 == '0' * 40:\n            self.old_plain = []\n            self.old_highlighted = []\n            return\n        elif self.old_mode and self.old_mode == \"160000\":\n            self.old_plain = \"Subproject commit %s\" % self.old_sha1\n            self.old_highlighted = splitlines(syntaxhighlight.wrap(\n                self.old_plain, highlight_mode))\n            return\n\n        if highlighted:\n            if self.old_highlighted and self.old_is_highlighted:\n                return\n            else:\n                self.old_is_highlighted = True\n                language = self.getLanguage(use_content=\"old\")\n                data = syntaxhighlight.readHighlight(\n                    self.repository, self.old_sha1, self.path, language,\n                    request=request_highlight, mode=highlight_mode)\n                
self.old_highlighted = splitlines(data)\n                self.old_eof_eol = data and data[-1] in \"\\n\\r\"\n        else:\n            if self.old_plain: return\n            else:\n                data = self.repository.fetch(self.old_sha1).data\n                self.old_plain = splitlines(data)\n                self.old_eof_eol = data and data[-1] in \"\\n\\r\"\n\n    def loadNewLines(self, highlighted=False, request_highlight=False, highlight_mode=\"legacy\"):\n        \"\"\"Load the lines of the new version of the file, optionally highlighted.\"\"\"\n\n        from diff.parse import splitlines\n\n        if self.new_sha1 is None or self.new_sha1 == '0' * 40:\n            self.new_plain = []\n            self.new_highlighted = []\n            return\n        elif self.new_mode and self.new_mode == \"160000\":\n            self.new_plain = \"Subproject commit %s\" % self.new_sha1\n            self.new_highlighted = splitlines(syntaxhighlight.wrap(\n                self.new_plain, highlight_mode))\n            return\n\n        if highlighted:\n            if self.new_highlighted and self.new_is_highlighted:\n                return\n            else:\n                self.new_is_highlighted = True\n                language = self.getLanguage(use_content=\"new\")\n                data = syntaxhighlight.readHighlight(\n                    self.repository, self.new_sha1, self.path, language,\n                    request=request_highlight, mode=highlight_mode)\n                self.new_highlighted = splitlines(data)\n                self.new_eof_eol = data and data[-1] in \"\\n\\r\"\n        else:\n            if self.new_plain: return\n            else:\n                data = self.repository.fetch(self.new_sha1).data\n                self.new_plain = splitlines(data)\n                self.new_eof_eol = data and data[-1] in \"\\n\\r\"\n\n    def getOldLines(self, chunk, highlighted=False):\n        begin = chunk.delete_offset - 1\n        end = begin + 
chunk.delete_count\n        return self.oldLines(highlighted)[begin:end]\n\n    def getNewLines(self, chunk, highlighted=False):\n        begin = chunk.insert_offset - 1\n        end = begin + chunk.insert_count\n        return self.newLines(highlighted)[begin:end]\n\n    def oldLines(self, highlighted):\n        if highlighted: return self.old_highlighted\n        else: return self.old_plain\n\n    def oldCount(self):\n        if self.old_highlighted is not None: return len(self.old_highlighted)\n        else: return len(self.old_plain)\n\n    def newLines(self, highlighted):\n        if highlighted: return self.new_highlighted\n        else: return self.new_plain\n\n    def newCount(self):\n        if self.new_highlighted is not None: return len(self.new_highlighted)\n        else: return len(self.new_plain)\n\n    def canHighlight(self):\n        return self.getLanguage() is not None\n\n    def getLanguage(self, use_content=False):\n        if (self.path.endswith(\".h\") or\n            self.path.endswith(\".c\") or\n            self.path.endswith(\".cpp\") or\n            self.path.endswith(\".hh\") or\n            self.path.endswith(\".cc\")):\n            return \"c++\"\n        elif (self.path.endswith(\".py\") or\n              self.path.endswith(\".gyp\") or\n              self.path.endswith(\".gypi\")):\n            return \"python\"\n        elif (self.path.endswith(\".pl\") or\n              self.path.endswith(\".pm\")):\n            return \"perl\"\n        elif self.path.endswith(\".java\"):\n            return \"java\"\n        elif self.path.endswith(\".rb\"):\n            return \"ruby\"\n        elif self.path.endswith(\".js\"):\n            return \"javascript\"\n        elif self.path.endswith(\".php\"):\n            return \"php\"\n        elif (self.path.endswith(\".mk\") or\n              self.path.endswith(\"/Makefile\")):\n            return \"makefile\"\n        elif (self.path.endswith(\".m\") or\n              
self.path.endswith(\".mm\")):\n            return \"objective-c\"\n        elif self.path.endswith(\".sql\"):\n            return \"sql\"\n        # XML syntax highlighting is disabled due to issues (the pygments\n        # lexer messes with the line-endings in the file.)\n        #elif self.path.endswith(\".xml\"):\n        #    return \"xml\"\n\n        if use_content:\n            interpreter = self.getInterpreter(use_content)\n            if interpreter:\n                executable = interpreter.split(\"/\")[-1]\n                if executable.startswith(\"python\"):\n                    return \"python\"\n                elif executable.startswith(\"perl\"):\n                    return \"perl\"\n\n            modeline = self.getModeLine(use_content)\n            if modeline:\n                match = re_mode.search(modeline)\n                if match:\n                    mode = match.group(1).strip()\n                    if mode in (\"c++\", \"python\", \"perl\", \"java\", \"ruby\", \"js\", \"php\", \"makefile\"):\n                        return mode\n\n        return None\n\n    def getInterpreter(self, side=\"new\"):\n        if side not in self.interpreter:\n            if side == \"new\":\n                self.loadNewLines()\n                lines = self.new_plain\n            else:\n                self.loadOldLines()\n                lines = self.old_plain\n            self.interpreter[side] = \"\"\n            for line in lines:\n                if line.startswith(\"#!\"):\n                    words = line[2:].split()\n                    if re.search(\"(^|/)env$\", words[0]): self.interpreter[side] = words[1]\n                    else: self.interpreter[side] = words[0]\n                    break\n        return self.interpreter[side]\n\n    def getModeLine(self, side=\"new\"):\n        if side not in self.modeline:\n            if side == \"new\":\n                self.loadNewLines()\n                lines = self.new_plain\n            else:\n           
     self.loadOldLines()\n                lines = self.old_plain\n            self.modeline[side] = \"\"\n            for line in lines:\n                if line.startswith(\"#!\"): continue\n                match = re_modeline.search(line)\n                if match: self.modeline[side] = match.group(1)\n                break\n        return self.modeline[side]\n\n    def getTabWidth(self, side=\"new\", default=8):\n        modeline = self.getModeLine(side)\n        try: return int(re_tabwidth.search(modeline).group(1))\n        except: return default\n\n    def getIndentTabsMode(self, side=\"new\", default=True):\n        modeline = self.getModeLine(side)\n        try: return re_indent_tabs_mode.search(modeline).group(1) == \"t\"\n        except: return default\n\n    @staticmethod\n    def sorted(files, key=lambda file: file.path):\n        def compareFilenames(a, b):\n            def isSource(name): return name.endswith(\".cpp\") or name.endswith(\".cc\")\n            def isHeader(name): return name.endswith(\".h\")\n\n            if isHeader(a) and isSource(b) and a.rsplit(\".\", 1)[0] == b.rsplit(\".\", 1)[0]: return -1\n            elif isSource(a) and isHeader(b) and a.rsplit(\".\", 1)[0] == b.rsplit(\".\", 1)[0]: return 1\n            else: return cmp(a, b)\n\n        return sorted(files, key=key, cmp=compareFilenames)\n\n    @staticmethod\n    def eliminateCommonPrefixes(files, text=False, getpath=None, setpath=None):\n        assert (getpath is None) == (setpath is None)\n\n        if getpath is None:\n            def defaultGetPath(x): return x\n            getpath = defaultGetPath\n        if setpath is None:\n            def defaultSetPath(x, p): files[index] = p\n            setpath = defaultSetPath\n\n        def commonPrefixLength(pathA, pathB):\n            componentsA = pathA.split('/')\n            componentsB = pathB.split('/')\n            for index in range(min(len(componentsA), len(componentsB))):\n                if componentsA[index] != 
componentsB[index]:\n                    return sum(map(len, componentsA[:index])) + index\n\n        if files:\n            previous = None\n            for index in range(len(files)):\n                current = getpath(files[index])\n\n                if index > 0:\n                    length = commonPrefixLength(previous, current)\n                else:\n                    length = 0\n\n                if text:\n                    if length > 4:\n                        updated = (\" \" * (length - 4) + \".../\" +\n                                   textutils.escape(current[length:]))\n                    else:\n                        updated = textutils.escape(current)\n                else:\n                    if length > 2:\n                        updated = (\" \" * (length - 2) + \"&#8230;/\" +\n                                   htmlutils.htmlify(textutils.escape(current[length:])))\n                    else:\n                        updated = htmlutils.htmlify(textutils.escape(current))\n\n                if updated != current:\n                    setpath(files[index], updated)\n\n                previous = current\n\n        return files\n\nclass Changeset:\n    def __init__(self, id, parent, child, type, files=None, commits=None):\n        self.id = id\n        self.parent = parent\n        self.child = child\n        self.type = type\n        self.files = files\n        self.conflicts = False\n\n        self.__commits = commits if commits else [child] if type == \"direct\" else None\n        self.__file_by_id = {}\n\n    def __hash__(self): return hash(self.id)\n    def __eq__(self, other): return self.id == other.id\n\n    def commits(self, db):\n        if self.__commits is None:\n            iter = self.child\n            self.__commits = [iter]\n            while self.parent not in iter.parents:\n                if len(iter.parents) != 1: return []\n                iter = gitutils.Commit.fromSHA1(db, iter.repository, iter.parents[0])\n         
       self.__commits.append(iter)\n        return self.__commits\n\n    def setCommits(self, commits):\n        self.__commits = commits\n\n    def getFile(self, file_id):\n        if self.files and not self.__file_by_id:\n            for file in self.files:\n                self.__file_by_id[file.id] = file\n        return self.__file_by_id.get(file_id)\n\n    def getReviewFiles(self, db, user, review):\n        files = {}\n\n        if self.files and not self.__file_by_id:\n            for file in self.files:\n                self.__file_by_id[file.id] = file\n\n        def process(cursor):\n            for file_id, state, reviewer, is_reviewer, draft_from, draft_to in cursor:\n                if file_id not in self.__file_by_id: continue\n                if draft_from == state: state = draft_to\n                if file_id in files:\n                    existing = files[file_id]\n                    reviewers = existing[2]\n                    files[file_id] = (existing[0] or is_reviewer, state if existing[1] == state else \"mixed\", existing[2])\n                else:\n                    reviewers = set()\n                    files[file_id] = (is_reviewer, state, reviewers)\n                if reviewer is not None: reviewers.add(reviewer)\n\n        if self.type in (\"merge\", \"conflicts\"):\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT reviewfiles.file, reviewfiles.state, reviewfiles.reviewer, reviewuserfiles.uid IS NOT NULL, reviewfilechanges.from_state, reviewfilechanges.to_state\n                                FROM reviewfiles\n                     LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id AND reviewuserfiles.uid=%s)\n                     LEFT OUTER JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id AND reviewfilechanges.uid=%s AND reviewfilechanges.state='draft')\n                               WHERE reviewfiles.review=%s\n                                 AND 
reviewfiles.changeset=%s\"\"\",\n                           (user.id, user.id, review.id, self.id))\n            process(cursor)\n        elif self.__commits:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT reviewfiles.file, reviewfiles.state, reviewfiles.reviewer, reviewuserfiles.uid IS NOT NULL, reviewfilechanges.from_state, reviewfilechanges.to_state\n                                FROM reviewfiles\n                                JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                     LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id AND reviewuserfiles.uid=%s)\n                     LEFT OUTER JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id AND reviewfilechanges.uid=%s AND reviewfilechanges.state='draft')\n                               WHERE reviewfiles.review=%s\n                                 AND changesets.child=ANY (%s)\"\"\",\n                           (user.id, user.id, review.id, [commit.getId(db) for commit in self.__commits]))\n            process(cursor)\n\n        return files\n\n    @staticmethod\n    def fromId(db, repository, id):\n        cursor = db.cursor()\n\n        cursor.execute(\"SELECT parent, child, type FROM changesets WHERE id=%s\", [id])\n        parent_id, child_id, type = cursor.fetchone()\n\n        parent = gitutils.Commit.fromId(db, repository, parent_id) if parent_id else None\n        child = gitutils.Commit.fromId(db, repository, child_id)\n\n        return Changeset(id, parent, child, type)\n"
  },
  {
    "path": "src/diff/analyze.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport difflib\nimport re\n\nimport textutils\n\nre_ignore = re.compile(\"^\\\\s*(?:[{}*]|else|do|\\\\*/)?\\\\s*$\")\nre_words = re.compile(\"([0-9]+|[A-Z][a-z]+|[A-Z]+|[a-z]+|[\\\\[\\\\]{}()]|\\\\s+|.)\")\nre_ws = re.compile(\"\\\\s+\")\nre_conflict = re.compile(\"^<<<<<<< .*$|^=======$|^>>>>>>> .*$\")\n\ndef analyzeChunk(deletedLines, insertedLines, moved=False):\n    # Pure delete or pure insert, nothing to analyze.\n    if not deletedLines or not insertedLines: return None\n\n    deletedLines = map(textutils.decode, deletedLines)\n    insertedLines = map(textutils.decode, insertedLines)\n\n    # Small enough chunk (and not a moved block): full analysis is\n    # affordable.  Otherwise, fall back to cheaper whitespace-insensitive\n    # line matching.\n    if len(deletedLines) * len(insertedLines) <= 10000 and not moved:\n        analysis = analyzeChunk1(deletedLines, insertedLines)\n    else:\n        deletedLinesNoWS = [re_ws.sub(\" \", line.strip()) for line in deletedLines]\n        insertedLinesNoWS = [re_ws.sub(\" \", line.strip()) for line in insertedLines]\n\n        sm = difflib.SequenceMatcher(None, deletedLinesNoWS, insertedLinesNoWS)\n        blocks = sm.get_matching_blocks()\n\n        analysis = []\n\n        pi = 0\n        pj = 0\n\n        for i, j, n in blocks:\n            if not n: continue\n\n            if i > pi and j > pj:\n                
analysis.append(analyzeChunk1(deletedLines[pi:i], insertedLines[pj:j],\n                                              offsetA=pi, offsetB=pj))\n\n            analysis.append(analyzeWhiteSpaceChanges(deletedLines[i:i+n], insertedLines[j:j+n],\n                                                     offsetA=i, offsetB=j, full=moved))\n\n            pi = i + n\n            pj = j + n\n\n        if pi < len(deletedLines) and pj < len(insertedLines):\n            analysis.append(analyzeChunk1(deletedLines[pi:], insertedLines[pj:],\n                                          offsetA=pi, offsetB=pj))\n\n        analysis = \";\".join(filter(None, analysis))\n\n    if analysis: return analysis\n    else: return None\n\ndef analyzeChunk1(deletedLines, insertedLines, offsetA=0, offsetB=0):\n    matches = []\n    equals = []\n\n    if len(deletedLines) * len(insertedLines) > 10000: return \"\"\n\n    def ratio(sm, a, b, aLength, bLength):\n        matching = 0\n        for i, j, n in sm.get_matching_blocks():\n            matching += sum(map(len, map(unicode.strip, a[i:i+n])))\n        if aLength > 5 and len(sm.get_matching_blocks()) == 2:\n            return float(matching) / aLength\n        else:\n            return 2.0 * matching / (aLength + bLength)\n\n    for deletedIndex, deleted in enumerate(deletedLines):\n        deletedStripped = deleted.strip()\n        deletedNoWS = re_ws.sub(\"\", deletedStripped)\n\n        # Don't match conflict lines against anything.\n        if re_conflict.match(deleted): continue\n\n        if not re_ignore.match(deleted):\n            deletedWords = re_words.findall(deleted)\n\n            for insertedIndex, inserted in enumerate(insertedLines):\n                insertedStripped = inserted.strip()\n                insertedNoWS = re_ws.sub(\"\", insertedStripped)\n\n                if not re_ignore.match(inserted):\n                    insertedWords = re_words.findall(inserted)\n                    sm = difflib.SequenceMatcher(None, 
deletedWords, insertedWords)\n                    r = ratio(sm, deletedWords, insertedWords, len(deletedNoWS), len(insertedNoWS))\n                    if r > 0.5: matches.append((r, deletedIndex, insertedIndex, deletedWords, insertedWords, sm))\n                elif deletedStripped == insertedStripped:\n                    equals.append((deletedIndex, insertedIndex))\n        else:\n            for insertedIndex, inserted in enumerate(insertedLines):\n                if deletedStripped == inserted.strip():\n                    equals.append((deletedIndex, insertedIndex))\n\n    if matches:\n        matches.sort(key=lambda x: x[0], reverse=True)\n\n        final = []\n\n        while matches:\n            r, deletedIndex, insertedIndex, deletedWords, insertedWords, sm = matches.pop(0)\n            final.append((deletedIndex, insertedIndex, deletedWords, insertedWords, sm))\n            matches = filter(lambda data: data[1] != deletedIndex and data[2] != insertedIndex and (data[1] < deletedIndex) == (data[2] < insertedIndex), matches)\n            equals = filter(lambda data: (data[0] < deletedIndex) == (data[1] < insertedIndex), equals)\n\n        final.sort()\n        equals.sort()\n        result = []\n\n        previousDeletedIndex = -1\n        previousInsertedIndex = -1\n\n        final.append((len(deletedLines), len(insertedLines), None, None, None))\n\n        for deletedIndex, insertedIndex, deletedWords, insertedWords, sm in final:\n            while equals and (equals[0][0] < deletedIndex or equals[0][1] < insertedIndex):\n                di, ii = equals.pop(0)\n                if previousDeletedIndex < di < deletedIndex and previousInsertedIndex < ii < insertedIndex:\n                    deletedLine = deletedLines[di]\n                    insertedLine = insertedLines[ii]\n                    lineDiff = analyzeWhiteSpaceLine(deletedLine, insertedLine)\n                    if lineDiff: result.append(\"%d=%d:ws,%s\" % (di + offsetA, ii + offsetB, 
lineDiff))\n                    else: result.append(\"%d=%d\" % (di + offsetA, ii + offsetB))\n                    previousDeletedIndex = di\n                    previousInsertedIndex = ii\n                while equals and (di == equals[0][0] or ii == equals[0][1]): equals.pop(0)\n\n            if sm is None: break\n\n            lineDiff = []\n            deletedLine = deletedLines[deletedIndex]\n            insertedLine = insertedLines[insertedIndex]\n            if deletedLine != insertedLine and deletedLine.strip() == insertedLine.strip():\n                lineDiff.append(\"ws\")\n                lineDiff.append(analyzeWhiteSpaceLine(deletedLine, insertedLine))\n            else:\n                for tag, i1, i2, j1, j2 in sm.get_opcodes():\n                    if tag == 'replace': lineDiff.append(\"r%d-%d=%d-%d\" % (offsetInLine(deletedWords, i1), offsetInLine(deletedWords, i2), offsetInLine(insertedWords, j1), offsetInLine(insertedWords, j2)))\n                    elif tag == 'delete': lineDiff.append(\"d%d-%d\" % (offsetInLine(deletedWords, i1), offsetInLine(deletedWords, i2)))\n                    elif tag == 'insert': lineDiff.append(\"i%d-%d\" % (offsetInLine(insertedWords, j1), offsetInLine(insertedWords, j2)))\n            if lineDiff:\n                result.append(\"%d=%d:%s\" % (deletedIndex + offsetA, insertedIndex + offsetB, \",\".join(lineDiff)))\n            else:\n                result.append(\"%d=%d\" % (deletedIndex + offsetA, insertedIndex + offsetB))\n\n            previousDeletedIndex = deletedIndex\n            previousInsertedIndex = insertedIndex\n\n        return \";\".join(result)\n    elif deletedLines[-1] == insertedLines[-1]:\n        ndeleted = len(deletedLines)\n        ninserted = len(insertedLines)\n        result = []\n        index = 1\n\n        while index <= ndeleted and index <= ninserted and deletedLines[-index] == insertedLines[-index]:\n            result.append(\"%d=%d\" % (ndeleted - index + offsetA, ninserted - 
index + offsetB))\n            index += 1\n\n        return \";\".join(reversed(result))\n    else:\n        return \"\"\n\ndef offsetInLine(words, offset):\n    return sum(map(lambda word: len(word.encode(\"utf-8\")), words[0:offset]))\n\nre_ws_words = re.compile(\"( |\\t|\\\\s+|\\\\S+)\")\n\ndef analyzeWhiteSpaceLine(deletedLine, insertedLine):\n    deletedLine = textutils.decode(deletedLine)\n    insertedLine = textutils.decode(insertedLine)\n\n    deletedWords = filter(None, re_ws_words.findall(deletedLine))\n    insertedWords = filter(None, re_ws_words.findall(insertedLine))\n\n    sm = difflib.SequenceMatcher(None, deletedWords, insertedWords)\n    lineDiff = []\n\n    for tag, i1, i2, j1, j2 in sm.get_opcodes():\n        if tag == 'replace':\n            lineDiff.append(\"r%d-%d=%d-%d\" % (offsetInLine(deletedWords, i1),\n                                              offsetInLine(deletedWords, i2),\n                                              offsetInLine(insertedWords, j1),\n                                              offsetInLine(insertedWords, j2)))\n        elif tag == 'delete':\n            lineDiff.append(\"d%d-%d\" % (offsetInLine(deletedWords, i1),\n                                        offsetInLine(deletedWords, i2)))\n        elif tag == 'insert':\n            lineDiff.append(\"i%d-%d\" % (offsetInLine(insertedWords, j1),\n                                        offsetInLine(insertedWords, j2)))\n\n    return \",\".join(lineDiff)\n\ndef analyzeWhiteSpaceChanges(deletedLines, insertedLines, at_eof=False,\n                             offsetA=0, offsetB=0, full=False):\n    result = []\n\n    for index, (deletedLine, insertedLine) in enumerate(zip(deletedLines, insertedLines)):\n        if deletedLine != insertedLine:\n            result.append(\"%d=%d:%s\"\n                          % (index + offsetA, index + offsetB,\n                             analyzeWhiteSpaceLine(deletedLine, insertedLine)))\n        elif index == len(deletedLines) - 1 
and at_eof:\n            result.append(\"%d=%d:eol\" % (index + offsetA, index + offsetB))\n        elif full:\n            result.append(\"%d=%d\" % (index + offsetA, index + offsetB))\n\n    if not result and (offsetA or offsetB):\n        result.append(\"%d=%d\" % (offsetA, offsetB))\n\n    return \";\".join(result)\n"
  },
  {
    "path": "src/diff/context.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport diff\nimport diff.html\n\nclass ContextLines:\n    def __init__(self, file, chunks, chains=None, merge=False, conflicts=False):\n        self.file = file\n        self.chunks = chunks\n        self.chains = chains\n        self.merge = merge\n        self.conflicts = conflicts\n\n    def getMacroChunks(self, context_lines=3, minimum_gap=3, highlight=True, lineFilter=None, skip_interline_diff=False):\n        old_lines = self.file.oldLines(highlight)\n        new_lines = self.file.newLines(highlight)\n\n        lines = []\n\n        def addLine(line):\n            if not lineFilter or lineFilter(line): lines.append(line)\n\n        for chunk in self.chunks:\n            old_offset = chunk.delete_offset\n            new_offset = chunk.insert_offset\n\n            if chunk.analysis:\n                mappings = chunk.analysis.split(';')\n\n                for mapping in mappings:\n                    if ':' in mapping:\n                        mapped_lines, ops = mapping.split(':')\n                        ops_list = ops.split(\",\")\n                    else:\n                        mapped_lines = mapping\n                        ops = None\n                        ops_list = None\n\n                    old_line, new_line = mapped_lines.split('=')\n                    old_line = 
chunk.delete_offset + int(old_line)\n                    new_line = chunk.insert_offset + int(new_line)\n\n                    while old_offset < old_line and new_offset < new_line:\n                        if old_lines[old_offset - 1] == new_lines[new_offset - 1]:\n                            line_type = diff.Line.CONTEXT\n                        else:\n                            line_type = diff.Line.REPLACED\n\n                        line = diff.Line(line_type,\n                                         old_offset, old_lines[old_offset - 1],\n                                         new_offset, new_lines[new_offset - 1],\n                                         is_whitespace=chunk.is_whitespace)\n\n                        if self.conflicts and line_type == diff.Line.REPLACED and line.isConflictMarker():\n                            addLine(diff.Line(diff.Line.DELETED,\n                                              old_offset, old_lines[old_offset - 1],\n                                              new_offset, None))\n                        else:\n                            addLine(line)\n                            new_offset += 1\n\n                        old_offset += 1\n\n                    while old_offset < old_line:\n                        addLine(diff.Line(diff.Line.DELETED,\n                                          old_offset, old_lines[old_offset - 1],\n                                          new_offset, None))\n                        old_offset += 1\n\n                    while new_offset < new_line:\n                        addLine(diff.Line(diff.Line.INSERTED,\n                                          old_offset, None,\n                                          new_offset, new_lines[new_offset - 1]))\n                        new_offset += 1\n\n                    try:\n                        deleted_line = old_lines[old_offset - 1]\n                        inserted_line = new_lines[new_offset - 1]\n                    except:\n          
              raise Exception(repr((self.file.path, self.file.old_sha1, self.file.new_sha1, new_offset, len(new_lines))))\n\n                    if deleted_line == inserted_line:\n                        line_type = diff.Line.CONTEXT\n                        is_whitespace = False\n                    else:\n                        if ops_list and ops_list[0] == \"ws\":\n                            is_whitespace = True\n                            if len(ops_list) > 1:\n                                ops_list = ops_list[1:]\n                            else:\n                                ops_list = None\n                        else:\n                            is_whitespace = False\n\n                        line_type = diff.Line.MODIFIED\n\n                        if highlight and ops_list and not skip_interline_diff:\n                            if len(ops_list) == 1 and ops_list[0] == \"eol\":\n                                line_type = diff.Line.REPLACED\n                                if highlight:\n                                    if not self.file.old_eof_eol: deleted_line += \"<i class='eol'>[missing linebreak]</i>\"\n                                    if not self.file.new_eof_eol: inserted_line += \"<i class='eol'>[missing linebreak]</i>\"\n                            else:\n                                deleted_line, inserted_line = diff.html.lineDiffHTML(ops_list, deleted_line, inserted_line)\n\n                    addLine(diff.Line(line_type,\n                                      old_offset, deleted_line,\n                                      new_offset, inserted_line,\n                                      is_whitespace=chunk.is_whitespace or is_whitespace,\n                                      analysis=ops_list))\n\n                    old_offset += 1\n                    new_offset += 1\n\n            old_line = chunk.delete_offset + chunk.delete_count\n            new_line = chunk.insert_offset + chunk.insert_count\n\n            while old_offset 
< old_line and new_offset < new_line:\n                if old_lines[old_offset - 1] == new_lines[new_offset - 1]:\n                    line_type = diff.Line.CONTEXT\n                else:\n                    line_type = diff.Line.REPLACED\n\n                line = diff.Line(line_type,\n                                 old_offset, old_lines[old_offset - 1],\n                                 new_offset, new_lines[new_offset - 1],\n                                 is_whitespace=chunk.is_whitespace)\n\n                if self.conflicts and line_type == diff.Line.REPLACED and line.isConflictMarker():\n                    addLine(diff.Line(diff.Line.DELETED,\n                                      old_offset, old_lines[old_offset - 1],\n                                      new_offset, None))\n                else:\n                    addLine(line)\n                    new_offset += 1\n\n                old_offset += 1\n\n            while old_offset < old_line:\n                try:\n                    addLine(diff.Line(diff.Line.DELETED,\n                                      old_offset, old_lines[old_offset - 1],\n                                      new_offset, None))\n                except:\n                    addLine(diff.Line(diff.Line.DELETED,\n                                      old_offset, \"\",\n                                      new_offset, None))\n\n                old_offset += 1\n\n            while new_offset < new_line:\n                try:\n                    addLine(diff.Line(diff.Line.INSERTED,\n                                      old_offset, None,\n                                      new_offset, new_lines[new_offset - 1]))\n                except:\n                    addLine(diff.Line(diff.Line.INSERTED,\n                                      old_offset, None,\n                                      new_offset, \"\"))\n\n                new_offset += 1\n\n        old_table = {}\n        new_table = {}\n\n        for line in lines:\n   
         if line.old_value is not None:\n                old_table[line.old_offset] = line\n            if line.new_value is not None:\n                new_table[line.new_offset] = line\n\n        def translateInChunk(chunk, old_delta=None, new_delta=None):\n            if chunk.analysis:\n                mappings = chunk.analysis.split(';')\n\n                previous_old_line = 0\n                previous_new_line = 0\n\n                for mapping in mappings:\n                    if ':' in mapping:\n                        mapped_lines, ops = mapping.split(':')\n                    else:\n                        mapped_lines = mapping\n\n                    old_line, new_line = mapped_lines.split('=')\n                    old_line = int(old_line)\n                    new_line = int(new_line)\n\n                    if old_delta is not None:\n                        if old_line == old_delta:\n                            return new_line\n                        elif old_line > old_delta:\n                            return previous_new_line\n                    else:\n                        if new_line == new_delta:\n                            return old_line\n                        elif new_line > new_delta:\n                            return previous_old_line\n\n                    previous_old_line = old_line\n                    previous_new_line = new_line\n\n            if old_delta is not None: return min(old_delta, chunk.insert_count)\n            else: return min(new_delta, chunk.delete_count)\n\n        def findMatchingOldOffset(offset):\n            precedingChunk = None\n            for chunk in self.chunks:\n                if chunk.insert_offset + chunk.insert_count > offset:\n                    if chunk.insert_offset <= offset:\n                        delta = translateInChunk(chunk, new_delta=offset - chunk.insert_offset)\n                        offset = chunk.delete_offset + delta\n                        return offset\n                    
break\n                precedingChunk = chunk\n            if precedingChunk:\n                offset -= precedingChunk.insert_offset + precedingChunk.insert_count\n                offset += precedingChunk.delete_offset + precedingChunk.delete_count\n            return offset\n\n        def findMatchingNewOffset(offset):\n            precedingChunk = None\n            for chunk in self.chunks:\n                if chunk.delete_offset + chunk.delete_count > offset:\n                    if chunk.delete_offset <= offset:\n                        delta = translateInChunk(chunk, old_delta=offset - chunk.delete_offset)\n                        offset = chunk.insert_offset + delta\n                        return offset\n                    break\n                precedingChunk = chunk\n            if precedingChunk:\n                offset -= precedingChunk.delete_offset + precedingChunk.delete_count\n                offset += precedingChunk.insert_offset + precedingChunk.insert_count\n            return offset\n\n        if self.chains and not self.merge:\n            for (chain, use_old) in self.chains:\n                if chain.comments:\n                    if not use_old and self.file.new_sha1 in chain.lines_by_sha1:\n                        chain_offset, chain_count = chain.lines_by_sha1[self.file.new_sha1]\n                        old_offset = findMatchingOldOffset(chain_offset)\n                        new_offset = chain_offset\n                        first_line = new_table.get(new_offset)\n                    else:\n                        chain_offset, chain_count = chain.lines_by_sha1[self.file.old_sha1]\n                        old_offset = chain_offset\n                        new_offset = findMatchingNewOffset(chain_offset)\n                        first_line = old_table.get(old_offset)\n\n                    count = chain_count\n\n                    while count:\n                        if old_offset not in old_table and new_offset not in new_table:\n      
                      try:\n                                line = diff.Line(diff.Line.CONTEXT,\n                                                 old_offset, old_lines[old_offset - 1],\n                                                 new_offset, new_lines[new_offset - 1])\n                            except IndexError:\n                                break\n                            if not lineFilter or lineFilter(line):\n                                if not first_line: first_line = line\n                                old_table[old_offset] = line\n                                new_table[new_offset] = line\n\n                        if old_offset in old_table: old_offset += 1\n                        if new_offset in new_table: new_offset += 1\n                        count -= 1\n\n        class queue:\n            def __init__(self, iterable):\n                self.__list = list(iterable)\n                self.__offset = 0\n\n            def __getitem__(self, index): return self.__list[self.__offset + index]\n            def __nonzero__(self): return self.__offset < len(self.__list)\n            def __len__(self): return len(self.__list) - self.__offset\n            def __str__(self): return str(self.__list[self.__offset:])\n            def __repr__(self): return repr(self.__list[self.__offset:])\n\n            def pop(self):\n                self.__offset += 1\n                return self.__list[self.__offset - 1]\n\n        all_lines = queue(sorted([(key, value.new_offset, value) for key, value in old_table.items()] +\n                                 [(value.old_offset, key, value) for key, value in new_table.items() if value.type not in (diff.Line.CONTEXT, diff.Line.MODIFIED, diff.Line.REPLACED)]))\n        all_chunks = self.chunks[:]\n        all_chains = self.chains and self.chains[:] or None\n\n        macro_chunks = []\n\n        def lineOrNone(lines, index):\n            try: return lines[index]\n            except IndexError: return None\n\n     
   while all_lines:\n            old_offset, new_offset, first_line = all_lines.pop()\n\n            count = min(context_lines, max(old_offset - 1, new_offset - 1))\n            old_offset = max(1, old_offset - count)\n            new_offset = max(1, new_offset - count)\n            lines = []\n\n            while count:\n                if old_offset <= len(old_lines) and new_offset <= len(new_lines):\n                    addLine(diff.Line(diff.Line.CONTEXT,\n                                      old_offset, old_lines[old_offset - 1],\n                                      new_offset, new_lines[new_offset - 1]))\n                    old_offset += 1\n                    new_offset += 1\n                elif old_offset <= len(old_lines):\n                    old_offset += 1\n                else:\n                    new_offset += 1\n                count -= 1\n\n            lines.append(first_line)\n            if first_line.type != diff.Line.INSERTED: old_offset += 1\n            if first_line.type != diff.Line.DELETED: new_offset += 1\n\n            while all_lines:\n                while all_lines and (old_offset == all_lines[0][0] or new_offset == all_lines[0][1]):\n                    line = all_lines.pop()[2]\n                    lines.append(line)\n                    if line.type != diff.Line.INSERTED: old_offset += 1\n                    if line.type != diff.Line.DELETED: new_offset += 1\n\n                if all_lines and all_lines[0][1] - new_offset <= 2 * context_lines + minimum_gap:\n                    while old_offset != all_lines[0][0] and new_offset != all_lines[0][1]:\n                        line = diff.Line(diff.Line.CONTEXT,\n                                         old_offset, lineOrNone(old_lines, old_offset - 1),\n                                         new_offset, lineOrNone(new_lines, new_offset - 1))\n                        addLine(line)\n                        if line.old_value is not None: old_offset += 1\n                        if 
line.new_value is not None: new_offset += 1\n                else: break\n\n            count = context_lines\n\n            while count:\n                if old_offset <= len(old_lines) and new_offset <= len(new_lines):\n                    addLine(diff.Line(diff.Line.CONTEXT,\n                                      old_offset, old_lines[old_offset - 1],\n                                      new_offset, new_lines[new_offset - 1]))\n                    old_offset += 1\n                    new_offset += 1\n                elif old_offset <= len(old_lines):\n                    old_offset += 1\n                else:\n                    new_offset += 1\n                count -= 1\n\n            chunks = []\n\n            while all_chunks and (all_chunks[0].delete_offset < old_offset or all_chunks[0].insert_offset < new_offset):\n                chunks.append(all_chunks.pop(0))\n\n            chains = []\n\n            if all_chains:\n                index = 0\n                while index < len(all_chains):\n                    chain, use_old = all_chains[index]\n\n                    if not use_old and self.file.new_sha1 in chain.lines_by_sha1:\n                        chain_offset, chain_count = chain.lines_by_sha1[self.file.new_sha1]\n                        compare_offset = new_offset\n                    else:\n                        chain_offset, chain_count = chain.lines_by_sha1[self.file.old_sha1]\n                        compare_offset = old_offset\n\n                    if chain_offset < compare_offset:\n                        chains.append(chain)\n                        del all_chains[index]\n                    else:\n                        index += 1\n\n            macro_chunks.append(diff.MacroChunk(chunks, lines))\n\n        if not lineFilter:\n            return filter(lambda macro_chunk: bool(macro_chunk.chunks), macro_chunks)\n        else:\n            return macro_chunks\n"
  },
  {
    "path": "src/diff/html.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport textutils\nfrom htmlutils import htmlify\n\nre_tag = re.compile(\"(<[^>]*>)\")\nre_decimal_entity = re.compile(\"&#([0-9]+);\")\n\nclass Tag:\n    def __init__(self, value): self.value = value\n    def __str__(self): return self.value\n    def __nonzero__(self): return False\n    def __repr__(self): return \"Tag(%r)\" % self.value\n\ndef splitTags(line):\n    def process(token):\n        if token[0] == '<':\n            return Tag(token)\n        else:\n            def replace_decimal(match):\n                return unichr(int(match.group(1)))\n\n            token = textutils.decode(token)\n            token = re_decimal_entity.sub(replace_decimal, token)\n            token = token.encode(\"utf-8\")\n\n            return token.replace(\"&lt;\", \"<\").replace(\"&gt;\", \">\").replace(\"&amp;\", \"&\")\n\n    return map(process, filter(None, re_tag.split(line)))\n\ndef joinTags(tags):\n    def process(token):\n        if token: return htmlify(token)\n        else: return str(token)\n\n    return \"\".join(map(process, tags))\n\ndef insertTag(tags, offset, newTag):\n    newTag = Tag(newTag)\n    index = 0\n    while index < len(tags):\n        tag = tags[index]\n        if tag:\n            if len(tag) < offset: offset -= len(tag)\n            else:\n                if len(tag) == 
offset: tags.insert(index + 1, newTag)\n                elif not offset: tags.insert(index, newTag)\n                else: tags[index:index+1] = tag[:offset], newTag, tag[offset:]\n                return\n        index += 1\n    tags.append(newTag)\n\ndef lineDiffHTML(ops, old, new):\n    old = splitTags(old)\n    new = splitTags(new)\n\n    for op in ops:\n        old_lines = None\n        oldType = None\n        new_lines = None\n        newType = None\n\n        if op[0] == 'r':\n            old_lines, new_lines = op[1:].split('=')\n            oldType = 'r'\n            newType = 'r'\n        elif op[0] == 'd':\n            old_lines = op[1:]\n            oldType = 'd'\n        else:\n            new_lines = op[1:]\n            newType = 'i'\n\n        if old_lines:\n            start, end = old_lines.split('-')\n            insertTag(old, int(start), \"<i class='%s'>\" % oldType)\n            insertTag(old, int(end), \"</i>\")\n\n        if new_lines:\n            start, end = new_lines.split('-')\n            insertTag(new, int(start), \"<i class='%s'>\" % newType)\n            insertTag(new, int(end), \"</i>\")\n\n    return joinTags(old), joinTags(new)\n"
  },
  {
    "path": "src/diff/merge.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport diff\nimport diff.parse\nimport gitutils\n\n# Maximum number of lines allowed between a two chunks to consider\n# them near enough to warrant inclusion.\nPROXIMITY_LIMIT = 3\n\ndef filterChunks(log, file_on_branch, file_in_merge, path):\n    \"\"\"filterChunks([diff.Chunk, ...], [diff.Chunk, ...]) => [diff.Chunk, ...]\n\n    Filter the second list of chunks to only include chunks that affect lines\n    that are within PROXIMITY_LIMIT lines of a chunk in the first list of\n    chunks.\"\"\"\n\n    result = []\n\n    on_branch = iter(file_on_branch.chunks)\n    in_merge = iter(file_in_merge.chunks)\n\n    try:\n        chunk_on_branch = on_branch.next()\n        chunk_in_merge = in_merge.next()\n\n        while True:\n            if chunk_in_merge.delete_offset - chunk_on_branch.insertEnd() > PROXIMITY_LIMIT:\n                # Chunk_on_branch is significantly earlier than chunk_in_merge,\n                # so continue to next one from on_branch.\n                chunk_on_branch = on_branch.next()\n            elif chunk_on_branch.insert_offset - chunk_in_merge.deleteEnd() > PROXIMITY_LIMIT:\n                chunk_in_merge = in_merge.next()\n            else:\n                # The two chunks are near each other, or intersects, so include\n                # the one from the merge\n                
result.append(chunk_in_merge)\n\n                # ... and continue to the next one from in_merge.\n                chunk_in_merge = in_merge.next()\n    except StopIteration:\n        # We ran out of chunks from either on_branch or in_merge.  If we ran out\n        # of chunks from in_merge, we obviously don't need to include any more\n        # chunks in the result.  If we ran out of chunks from on_branch, we\n        # don't either, because the previous one was apparently significant\n        # earlier than the current, and thus all following, chunks from in_merge.\n        pass\n\n    return result\n\ndef parseMergeDifferences(db, repository, commit):\n    mergebase = gitutils.Commit.fromSHA1(db, repository, repository.mergebase(commit, db=db))\n\n    result = {}\n    log = [\"\"]\n\n    for parent_sha1 in commit.parents:\n        parent = gitutils.Commit.fromSHA1(db, repository, parent_sha1)\n\n        if parent_sha1 == mergebase:\n            result[parent_sha1] = diff.parse.parseDifferences(repository, from_commit=parent, to_commit=commit)[parent_sha1]\n        else:\n            paths_on_branch = set(repository.run('diff', '--name-only', \"%s..%s\" % (mergebase, parent)).splitlines())\n            paths_in_merge = set(repository.run('diff', '--name-only', \"%s..%s\" % (parent, commit)).splitlines())\n\n            filter_paths = paths_on_branch & paths_in_merge\n\n            on_branch = diff.parse.parseDifferences(repository, from_commit=mergebase, to_commit=parent, filter_paths=filter_paths)[mergebase.sha1]\n            in_merge = diff.parse.parseDifferences(repository, from_commit=parent, to_commit=commit, filter_paths=filter_paths)[parent_sha1]\n\n            files_on_branch = dict([(file.path, file) for file in on_branch])\n\n            result_for_parent = []\n\n            for file_in_merge in in_merge:\n                file_on_branch = files_on_branch.get(file_in_merge.path)\n                if file_on_branch:\n                    filtered_chunks = 
filterChunks(log, file_on_branch, file_in_merge, file_in_merge.path)\n\n                    if filtered_chunks:\n                        result_for_parent.append(diff.File(id=None,\n                                                           repository=repository,\n                                                           path=file_in_merge.path,\n                                                           old_sha1=file_in_merge.old_sha1,\n                                                           new_sha1=file_in_merge.new_sha1,\n                                                           old_mode=file_in_merge.old_mode,\n                                                           new_mode=file_in_merge.new_mode,\n                                                           chunks=filtered_chunks))\n\n            result[parent_sha1] = result_for_parent\n\n    return result\n"
  },
  {
    "path": "src/diff/parse.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\nimport subprocess\nimport gitutils\nimport diff\nimport re\nimport itertools\nimport analyze\n\nGIT_EMPTY_TREE = \"4b825dc642cb6eb9a060e54bf8d69288fbee4904\"\n\ndef demunge(path):\n    special = { \"a\": \"\\a\",\n                \"b\": \"\\b\",\n                \"t\": \"\\t\",\n                \"n\": \"\\n\",\n                \"v\": \"\\v\",\n                \"f\": \"\\f\",\n                \"r\": \"\\r\",\n                '\"': '\"',\n                \"'\": \"'\",\n                \"/\": \"/\",\n                \"\\\\\": \"\\\\\" }\n\n    def unescape(match):\n        escaped = match.group(1)\n        if escaped in special:\n            return special[escaped]\n        else:\n            return chr(int(escaped, 8))\n\n    return re.sub(r\"\"\"\\\\([abtnvfr\"'/\\\\]|[0-9]{3})\"\"\", unescape, path)\n\ndef splitlines(source):\n    if not source: return source\n    elif source[-1] == \"\\n\": return source[:-1].split(\"\\n\")\n    else: return source.split(\"\\n\")\n\ndef detectWhiteSpaceChanges(file, old_lines, begin_old_offset, end_old_offset, old_ending_linebreak, new_lines, begin_new_offset, end_new_offset, new_ending_linebreak):\n    start_old_offset = None\n\n    for old_offset, new_offset in itertools.izip(xrange(begin_old_offset, end_old_offset), xrange(begin_new_offset, 
end_new_offset)):\n        if old_lines[old_offset - 1] != new_lines[new_offset - 1] or (old_offset == len(old_lines) and old_ending_linebreak != new_ending_linebreak):\n            if start_old_offset is None:\n                start_old_offset = old_offset\n                start_new_offset = new_offset\n        elif start_old_offset is not None:\n            assert old_offset - start_old_offset != 0 and new_offset - start_new_offset != 0\n            chunk = diff.Chunk(start_old_offset, old_offset - start_old_offset,\n                               start_new_offset, new_offset - start_new_offset,\n                               is_whitespace=True)\n            chunk.is_whitespace = True\n            file.chunks.append(chunk)\n            start_old_offset = None\n\n    if start_old_offset is not None:\n        assert end_old_offset - start_old_offset != 0 and end_new_offset - start_new_offset != 0\n        chunk = diff.Chunk(start_old_offset, end_old_offset - start_old_offset,\n                           start_new_offset, end_new_offset - start_new_offset,\n                           is_whitespace=True)\n        chunk.is_whitespace = True\n        file.chunks.append(chunk)\n\nws = re.compile(\"\\\\s+\")\n\ndef isWhitespaceChange(deleted_line, inserted_line):\n    return ws.sub(\" \", deleted_line.strip()) == ws.sub(\" \", inserted_line.strip())\n\ndef createChunks(delete_offset, deleted_lines, insert_offset, inserted_lines):\n    ws_before = None\n    ws_after = None\n\n    if deleted_lines and inserted_lines and isWhitespaceChange(deleted_lines[0], inserted_lines[0]):\n        ws_lines = 1\n        max_lines = min(len(deleted_lines), len(inserted_lines))\n\n        while ws_lines < max_lines and isWhitespaceChange(deleted_lines[ws_lines], inserted_lines[ws_lines]):\n            ws_lines += 1\n\n        ws_before = diff.Chunk(delete_offset, ws_lines, insert_offset, ws_lines, is_whitespace=True)\n\n        delete_offset += ws_lines\n        del 
deleted_lines[:ws_lines]\n\n        insert_offset += ws_lines\n        del inserted_lines[:ws_lines]\n\n    if deleted_lines and inserted_lines and isWhitespaceChange(deleted_lines[-1], inserted_lines[-1]):\n        ws_lines = 1\n        max_lines = min(len(deleted_lines), len(inserted_lines))\n\n        while ws_lines < max_lines and isWhitespaceChange(deleted_lines[-(ws_lines + 1)], inserted_lines[-(ws_lines + 1)]):\n            ws_lines += 1\n\n        ws_after = diff.Chunk(delete_offset + len(deleted_lines) - ws_lines, ws_lines,\n                              insert_offset + len(inserted_lines) - ws_lines, ws_lines,\n                              is_whitespace=True)\n\n        del deleted_lines[-ws_lines:]\n        del inserted_lines[-ws_lines:]\n\n    if deleted_lines or inserted_lines:\n        chunks = [diff.Chunk(delete_offset, len(deleted_lines), insert_offset, len(inserted_lines))]\n    else:\n        chunks = []\n\n    if ws_before: chunks.insert(0, ws_before)\n    if ws_after: chunks.append(ws_after)\n\n    return chunks\n\ndef mergeChunks(file):\n    if len(file.chunks) > 1:\n        file.loadOldLines(False)\n        old_lines = file.oldLines(False)\n        file.loadNewLines(False)\n        new_lines = file.newLines(False)\n\n        merged = []\n        previous = file.chunks[0]\n\n        for chunk in file.chunks[1:]:\n            assert previous.delete_count != 0 or previous.insert_count != 0\n\n            offset = previous.delete_offset + previous.delete_count\n\n            while offset < chunk.delete_offset:\n                if not analyze.re_ignore.match(old_lines[offset - 1]):\n                    break\n                offset += 1\n            else:\n                previous.delete_count = (chunk.delete_offset - previous.delete_offset) + chunk.delete_count\n                previous.insert_count = (chunk.insert_offset - previous.insert_offset) + chunk.insert_count\n\n                assert previous.delete_count != 0 or previous.insert_count 
!= 0\n\n                previous.is_whitespace = previous.is_whitespace and chunk.is_whitespace\n                continue\n\n            merged.append(previous)\n            previous = chunk\n\n        merged.append(previous)\n\n        for chunk in merged:\n            while chunk.insert_count > 1 and chunk.delete_count > 1:\n                insert_last = new_lines[chunk.insert_offset + chunk.insert_count - 2]\n                delete_last = old_lines[chunk.delete_offset + chunk.delete_count - 2]\n\n                if insert_last == delete_last:\n                    chunk.delete_count -= 1\n                    chunk.insert_count -= 1\n                else:\n                    break\n\n        file.clean()\n        file.chunks = merged\n\ndef parseDifferences(repository, commit=None, from_commit=None, to_commit=None, filter_paths=None, selected_path=None, simple=False):\n    \"\"\"parseDifferences(repository, [commit] | [from_commit, to_commit][, selected_path]) =>\n         dict(parent_sha1 => [diff.File, ...] (if selected_path is None)\n         diff.File                            (if selected_path is not None)\"\"\"\n\n    options = []\n\n    if to_commit:\n        command = 'diff'\n        if from_commit:\n            what = [from_commit.sha1 + \"..\" + to_commit.sha1]\n        else:\n            what = [GIT_EMPTY_TREE, to_commit.sha1]\n    elif not commit.parents:\n        # Root commit.\n\n        command = \"show\"\n        what = [commit.sha1]\n\n        options.append(\"--pretty=format:\")\n    else:\n        assert len(commit.parents) == 1\n\n        command = 'diff'\n        what = [commit.parents[0] + '..' 
+ commit.sha1]\n\n    if filter_paths is None and selected_path is None and not simple:\n        names = repository.run(command, *(options + [\"--name-only\"] + what))\n        paths = set(filter(None, map(str.strip, names.splitlines())))\n    else:\n        paths = set()\n\n    if not simple:\n        options.append('--ignore-space-change')\n\n    options.extend(what)\n\n    if filter_paths is not None:\n        options.append('--')\n        options.extend(filter_paths)\n    elif selected_path is not None:\n        options.append('--')\n        options.append(selected_path)\n\n    stdout = repository.run(command, '--full-index', '--unified=1', '--patience', *options)\n    selected_file = None\n\n    re_chunk = re.compile('^@@ -(\\\\d+)(?:,\\\\d+)? \\\\+(\\\\d+)(?:,\\\\d+)? @@')\n    re_binary = re.compile('^Binary files ([\"\\']?)(?:a/(.+)\\\\1|/dev/null) and ([\"\\']?)(?:b/(.+)\\\\3|/dev/null) differ')\n    re_diff = re.compile(\"^diff --git ([\\\"']?)a/(.*)\\\\1 ([\\\"']?)b/(.*)\\\\3$\")\n    re_old_path = re.compile(\"--- ([\\\"']?)a/(.*?)\\\\1\\t?$\")\n    re_new_path = re.compile(\"\\\\+\\\\+\\\\+ ([\\\"']?)b/(.*?)\\\\1\\t?$\")\n\n    def isplitlines(text):\n        start = 0\n        length = len(text)\n\n        while start < length:\n            try:\n                end = text.index('\\n', start)\n                yield text[start:end]\n                start = end + 1\n            except ValueError:\n                yield text[start:]\n                break\n\n    lines = isplitlines(stdout)\n\n    included = set()\n    files = []\n    files_by_path = {}\n\n    def addFile(new_file):\n        assert new_file.path not in files_by_path, \"duplicate path: %s\" % new_file.path\n        files.append(new_file)\n        files_by_path[new_file.path] = new_file\n        included.add(new_file.path)\n\n    old_mode = None\n    new_mode = None\n\n    try:\n        line = lines.next()\n\n        names = None\n\n        while True:\n            old_mode = None\n         
   new_mode = None\n\n            # Scan to the 'index <sha1>..<sha1>' line that marks the beginning\n            # of the differences in one file.\n            while not line.startswith(\"index \"):\n                match = re_diff.match(line)\n                if match:\n                    if old_mode is not None and new_mode is not None:\n                        addFile(diff.File(None, names[0], None, None, repository, old_mode=old_mode, new_mode=new_mode, chunks=[]))\n                    old_name = match.group(2)\n                    if match.group(1):\n                        old_name = demunge(old_name)\n                    new_name = match.group(4)\n                    if match.group(3):\n                        new_name = demunge(new_name)\n                    names = (old_name, new_name)\n                elif line.startswith(\"old mode \"):\n                    old_mode = line[9:]\n                elif line.startswith(\"new mode \"):\n                    new_mode = line[9:]\n                elif line.startswith(\"new file mode \"):\n                    new_mode = line[14:]\n                elif line.startswith(\"deleted file mode \"):\n                    old_mode = line[18:]\n\n                line = lines.next()\n\n            is_submodule = False\n\n            try:\n                sha1range, mode = line[6:].split(' ', 2)\n                if mode == \"160000\":\n                    is_submodule = True\n                    old_mode = new_mode = mode\n                old_sha1, new_sha1 = sha1range.split('..')\n            except:\n                old_sha1, new_sha1 = line[6:].split(' ', 1)[0].split(\"..\")\n\n            try:\n                line = lines.next()\n            except StopIteration:\n                if old_mode is not None or new_mode is not None:\n                    assert names[0] == names[1]\n\n                    addFile(diff.File(None, names[0], old_sha1, new_sha1, repository,\n                                      old_mode=old_mode, 
new_mode=new_mode,\n                                      chunks=[diff.Chunk(0, 0, 0, 0)]))\n\n                    old_mode = new_mode = None\n                raise\n\n            if re_diff.match(line):\n                new_file = diff.File(None, names[0] or names[1], old_sha1, new_sha1, repository, old_mode=old_mode, new_mode=new_mode)\n\n                if '0' * 40 == old_sha1 or '0' * 40 == new_sha1:\n                    new_file.chunks = [diff.Chunk(0, 0, 0, 0)]\n                else:\n                    new_file.loadOldLines()\n                    new_file.loadNewLines()\n                    new_file.chunks = []\n\n                    detectWhiteSpaceChanges(new_file,\n                                            new_file.oldLines(False), 1, new_file.oldCount() + 1, True,\n                                            new_file.newLines(False), 1, new_file.newCount() + 1, True)\n\n\n                addFile(new_file)\n\n                old_mode = new_mode = False\n\n                continue\n\n            binary = re_binary.match(line)\n            if binary:\n                path = (binary.group(2) or binary.group(4)).strip()\n\n                if path in files_by_path:\n                    new_file = files_by_path[path]\n                    if old_sha1 != '0' * 40:\n                        assert new_file.old_sha1 == '0' * 40\n                        new_file.old_sha1 = old_sha1\n                        new_file.old_mode = old_mode\n                    if new_sha1 != '0' * 40:\n                        assert new_file.new_sha1 == '0' * 40\n                        new_file.new_sha1 = new_sha1\n                        new_file.new_mode = new_mode\n                    new_file.chunks = [diff.Chunk(0, 0, 0, 0)]\n                else:\n                    new_file = diff.File(None, path, old_sha1, new_sha1, repository, old_mode=old_mode, new_mode=new_mode)\n                    new_file.chunks = [diff.Chunk(0, 0, 0, 0)]\n                    addFile(new_file)\n\n       
         continue\n\n            match = re_old_path.match(line)\n            if match:\n                old_path = match.group(2)\n                if match.group(1):\n                    old_path = demunge(old_path)\n            else:\n                old_path = None\n\n            line = lines.next()\n\n            match = re_new_path.match(line)\n            if match:\n                new_path = match.group(2)\n                if match.group(1):\n                    new_path = demunge(new_path)\n            else:\n                new_path = None\n\n            assert (old_path is None) == ('0' * 40 == old_sha1)\n            assert (new_path is None) == ('0' * 40 == new_sha1)\n\n            if old_path:\n                path = old_path\n            else:\n                path = new_path\n\n            if is_submodule:\n                line = lines.next()\n                match = re_chunk.match(line)\n                assert match, repr(line)\n                assert match.group(1) == match.group(2) == \"1\", repr(match.groups())\n\n                line = lines.next()\n                assert line == \"-Subproject commit %s\" % old_sha1, repr(line)\n\n                line = lines.next()\n                assert line == \"+Subproject commit %s\" % new_sha1, repr(line)\n\n                new_file = diff.File(None, path, old_sha1, new_sha1, repository,\n                                     old_mode=old_mode, new_mode=new_mode,\n                                     chunks=[diff.Chunk(1, 1, 1, 1, analysis=\"0=0:r18-58=18-58\")])\n\n                if path not in files_by_path: addFile(new_file)\n\n                old_mode = new_mode = None\n\n                continue\n\n            try:\n                line = lines.next()\n\n                delete_offset = 1\n                deleted_lines = []\n                insert_offset = 1\n                inserted_lines = []\n\n                if old_path and new_path and not simple:\n                    old_lines = 
splitlines(repository.fetch(old_sha1).data)\n                    new_lines = splitlines(repository.fetch(new_sha1).data)\n                else:\n                    old_lines = None\n                    new_lines = None\n\n                if path in files_by_path:\n                    new_file = files_by_path[path]\n                    if old_sha1 != '0' * 40:\n                        assert new_file.old_sha1 == '0' * 40\n                        new_file.old_sha1 = old_sha1\n                        new_file.old_mode = old_mode\n                    if new_sha1 != '0' * 40:\n                        assert new_file.new_sha1 == '0' * 40\n                        new_file.new_sha1 = new_sha1\n                        new_file.new_mode = new_mode\n                    new_file.chunks = []\n                else:\n                    new_file = diff.File(None, path, old_sha1, new_sha1, repository, old_mode=old_mode, new_mode=new_mode, chunks=[])\n\n                old_mode = new_mode = None\n\n                if selected_path is not None and selected_path == path:\n                    selected_file = new_file\n\n                if path not in files_by_path: addFile(new_file)\n\n                previous_delete_offset = 1\n                previous_insert_offset = 1\n\n                while True:\n                    match = re_chunk.match(line)\n\n                    if not match: break\n\n                    groups = match.groups()\n\n                    delete_offset = int(groups[0])\n                    deleted_lines = []\n\n                    insert_offset = int(groups[1])\n                    inserted_lines = []\n\n                    while True:\n                        line = lines.next()\n\n                        if line == \"\\\\ No newline at end of file\": continue\n                        if line[0] not in (' ', '-', '+'): break\n\n                        if line[0] != ' ' and previous_delete_offset is not None and old_lines and new_lines and not simple:\n         
                   detectWhiteSpaceChanges(files[-1], old_lines, previous_delete_offset, delete_offset, True, new_lines, previous_insert_offset, insert_offset, True)\n                            previous_delete_offset = None\n\n                        if line[0] == ' ' and previous_delete_offset is None:\n                            previous_delete_offset = delete_offset\n                            previous_insert_offset = insert_offset\n\n                        type = line[0]\n\n                        if type == '-':\n                            delete_offset += 1\n                            deleted_lines.append(line[1:])\n                        elif type == '+':\n                            insert_offset += 1\n                            inserted_lines.append(line[1:])\n                        else:\n                            if deleted_lines or inserted_lines:\n                                chunks = createChunks(delete_offset - len(deleted_lines),\n                                                      deleted_lines,\n                                                      insert_offset - len(inserted_lines),\n                                                      inserted_lines)\n                                files[-1].chunks.extend(chunks)\n                                deleted_lines = []\n                                inserted_lines = []\n\n                            delete_offset += 1\n                            insert_offset += 1\n\n                    if deleted_lines or inserted_lines:\n                        chunks = createChunks(delete_offset - len(deleted_lines),\n                                              deleted_lines,\n                                              insert_offset - len(inserted_lines),\n                                              inserted_lines)\n                        files[-1].chunks.extend(chunks)\n                        deleted_lines = []\n                        inserted_lines = []\n\n                if 
previous_delete_offset is not None and old_lines and new_lines and not simple:\n                    detectWhiteSpaceChanges(files[-1], old_lines, previous_delete_offset, len(old_lines) + 1, True, new_lines, previous_insert_offset, len(new_lines) + 1, True)\n                    previous_delete_offset = None\n            except StopIteration:\n                if deleted_lines or inserted_lines:\n                    chunks = createChunks(delete_offset - len(deleted_lines),\n                                          deleted_lines,\n                                          insert_offset - len(inserted_lines),\n                                          inserted_lines)\n                    files[-1].chunks.extend(chunks)\n                    deleted_lines = []\n                    inserted_lines = []\n\n                if previous_delete_offset is not None and old_lines and new_lines and not simple:\n                    detectWhiteSpaceChanges(files[-1], old_lines, previous_delete_offset, len(old_lines) + 1, True, new_lines, previous_insert_offset, len(new_lines) + 1, True)\n\n                raise\n    except StopIteration:\n        if old_mode is not None and new_mode is not None:\n            assert names[0] == names[1]\n\n            addFile(diff.File(None, names[0], None, None, repository, old_mode=old_mode, new_mode=new_mode, chunks=[]))\n\n    for path in (paths - included):\n        lines = isplitlines(repository.run(command, '--full-index', '--unified=1', *(what + ['--', path])))\n\n        try:\n            line = lines.next()\n\n            while not line.startswith(\"index \"): line = lines.next()\n\n            try:\n                sha1range, mode = line[6:].split(' ')\n                if mode == \"160000\":\n                    continue\n                old_sha1, new_sha1 = sha1range.split(\"..\")\n            except:\n                old_sha1, new_sha1 = line[6:].split(' ', 1)[0].split(\"..\")\n\n            if old_sha1 == '0' * 40 or new_sha1 == '0' * 
40:\n                # Added or removed empty file.\n                continue\n\n            addFile(diff.File(None, path, old_sha1, new_sha1, repository, chunks=[]))\n\n            old_data = repository.fetch(old_sha1).data\n            old_lines = splitlines(old_data)\n            new_data = repository.fetch(new_sha1).data\n            new_lines = splitlines(new_data)\n\n            assert len(old_lines) == len(new_lines), \"%s:%d != %s:%d\" % (old_sha1, len(old_lines), new_sha1, len(new_lines))\n\n            def endsWithLinebreak(data): return data and data[-1] in \"\\n\\r\"\n\n            detectWhiteSpaceChanges(files[-1], old_lines, 1, len(old_lines) + 1, endsWithLinebreak(old_data), new_lines, 1, len(new_lines) + 1, endsWithLinebreak(new_data))\n        except StopIteration:\n            pass\n\n    if not simple:\n        for file in files:\n            mergeChunks(file)\n\n    if to_commit:\n        if selected_path is not None:\n            return selected_file\n        elif from_commit:\n            return { from_commit.sha1: files }\n        else:\n            return { None: files }\n    elif not commit.parents:\n        return { None: files }\n    else:\n        return { commit.parents[0]: files }\n"
  },
  {
    "path": "src/diffutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom difflib import SequenceMatcher\nfrom itertools import izip, repeat\n\nimport diff\nimport diff.html\n\ndef expandWithContext(chunks, old_lines, new_lines, context_lines, highlight=True):\n    if not chunks: return []\n\n    groups = []\n    group = []\n\n    chunks = iter(chunks)\n\n    try:\n        previousChunk = chunks.next()\n        group.append(previousChunk)\n\n        while True:\n            nextChunk = chunks.next()\n\n            distance = nextChunk.delete_offset - (previousChunk.delete_offset + previousChunk.delete_count)\n            gap_between = distance - 2 * context_lines\n\n            if gap_between >= 3:\n                groups.append(group)\n                group = []\n\n            group.append(nextChunk)\n            previousChunk = nextChunk\n    except StopIteration:\n        pass\n\n    groups.append(group)\n\n    macro_chunks = []\n\n    for group in groups:\n        delete_offset = max(1, group[0].delete_offset - context_lines)\n        insert_offset = max(1, group[0].insert_offset - context_lines)\n\n        lines = []\n\n        for chunk in group:\n            while delete_offset < chunk.delete_offset:\n                lines.append(diff.Line(diff.Line.CONTEXT, delete_offset, old_lines[delete_offset - 1], insert_offset, new_lines[insert_offset - 1]))\n              
  delete_offset += 1\n                insert_offset += 1\n\n            if chunk.analysis:\n                mappings = chunk.analysis.split(';')\n\n                for mapping in mappings:\n                    if ':' in mapping:\n                        mapped_lines, ops = mapping.split(':')\n                    else:\n                        mapped_lines = mapping\n                        ops = None\n\n                    delete_line, insert_line = mapped_lines.split('=')\n                    delete_line = chunk.delete_offset + int(delete_line)\n                    insert_line = chunk.insert_offset + int(insert_line)\n\n                    while delete_offset < delete_line and insert_offset < insert_line:\n                        lines.append(diff.Line(diff.Line.MODIFIED, delete_offset, old_lines[delete_offset - 1], insert_offset, new_lines[insert_offset - 1], is_whitespace=chunk.is_whitespace))\n                        delete_offset += 1\n                        insert_offset += 1\n\n                    while delete_offset < delete_line:\n                        lines.append(diff.Line(diff.Line.DELETED, delete_offset, old_lines[delete_offset - 1], insert_offset, None))\n                        delete_offset += 1\n\n                    while insert_offset < insert_line:\n                        lines.append(diff.Line(diff.Line.INSERTED, delete_offset, None, insert_offset, new_lines[insert_offset - 1]))\n                        insert_offset += 1\n\n                    deleted_line = old_lines[delete_offset - 1]\n                    inserted_line = new_lines[insert_offset - 1]\n\n                    if highlight and ops: deleted_line, inserted_line = diff.html.lineDiffHTML(ops, deleted_line, inserted_line)\n\n                    lines.append(diff.Line(diff.Line.MODIFIED, delete_offset, deleted_line, insert_offset, inserted_line, is_whitespace=chunk.is_whitespace))\n\n                    delete_offset += 1\n                    insert_offset += 1\n\n            
deleteStop = chunk.delete_offset + chunk.delete_count\n            insertStop = chunk.insert_offset + chunk.insert_count\n\n            while delete_offset < deleteStop and insert_offset < insertStop:\n                lines.append(diff.Line(diff.Line.REPLACED, delete_offset, old_lines[delete_offset - 1], insert_offset, new_lines[insert_offset - 1], is_whitespace=chunk.is_whitespace))\n                delete_offset += 1\n                insert_offset += 1\n\n            while delete_offset < deleteStop:\n                lines.append(diff.Line(diff.Line.DELETED, delete_offset, old_lines[delete_offset - 1], insert_offset, None))\n                delete_offset += 1\n\n            while insert_offset < insertStop:\n                lines.append(diff.Line(diff.Line.INSERTED, delete_offset, None, insert_offset, new_lines[insert_offset - 1]))\n                insert_offset += 1\n\n        deleteStop = min(len(old_lines) + 1, delete_offset + context_lines)\n\n        while delete_offset < deleteStop:\n            lines.append(diff.Line(diff.Line.CONTEXT, delete_offset, old_lines[delete_offset - 1], insert_offset, new_lines[insert_offset - 1]))\n            delete_offset += 1\n            insert_offset += 1\n\n        macro_chunks.append(diff.MacroChunk(group, lines))\n\n    return macro_chunks\n"
  },
  {
    "path": "src/extensions/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\n\ndef getExtensionPath(author_name, extension_name):\n    return extension.Extension(author_name, extension_name).getPath()\n\ndef getExtensionInstallPath(sha1):\n    import configuration\n    return os.path.join(configuration.extensions.INSTALL_DIR, sha1)\n\nimport manifest\nimport extension\nimport resource\n"
  },
  {
    "path": "src/extensions/execute.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport errno\nimport os\nimport socket\nimport subprocess\nimport time\n\nimport configuration\nimport auth\nimport dbutils\n\nfrom extensions.extension import Extension\nfrom textutils import json_encode, json_decode\n\ndef startProcess(flavor):\n    executable = configuration.extensions.FLAVORS[flavor][\"executable\"]\n    library = configuration.extensions.FLAVORS[flavor][\"library\"]\n\n    process = subprocess.Popen(\n        [executable, \"critic-launcher.js\"],\n        stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE,\n        cwd=library)\n\n    return process\n\nclass ProcessException(Exception):\n    pass\n\nclass ProcessError(ProcessException):\n    def __init__(self, message):\n        super(ProcessError, self).__init__(\n            \"Failed to execute process: %s\" % message)\n\nclass ProcessTimeout(ProcessException):\n    def __init__(self, timeout):\n        super(ProcessTimeout, self).__init__(\n            \"Process timed out after %d seconds\" % timeout)\n\nclass ProcessFailure(ProcessException):\n    def __init__(self, returncode, stderr):\n        super(ProcessFailure, self).__init__(\n            \"Process returned non-zero exit status %d\" % returncode)\n        self.returncode = returncode\n        self.stderr = stderr\n\ndef executeProcess(db, manifest, 
role_name, script, function, extension_id,\n                   user_id, argv, timeout, stdin=None, rlimit_rss=256):\n    # If |user_id| is not the same as |db.user|, then one user's access of the\n    # system is triggering an extension on behalf of another user.  This will\n    # for instance happen when one user is adding changes to a review,\n    # triggering an extension filter hook set up by another user.\n    #\n    # In this case, we need to check that the other user can access the\n    # extension.\n    #\n    # If |user_id| is the same as |db.user|, we need to use |db.profiles|, which\n    # may contain a profile associated with an access token that was used to\n    # authenticate the user.\n    if user_id != db.user.id:\n        user = dbutils.User.fromId(db, user_id)\n        authentication_labels = auth.DATABASE.getAuthenticationLabels(user)\n        profiles = [auth.AccessControlProfile.forUser(\n            db, user, authentication_labels)]\n    else:\n        authentication_labels = db.authentication_labels\n        profiles = db.profiles\n\n    extension = Extension.fromId(db, extension_id)\n    if not auth.AccessControlProfile.isAllowedExtension(\n            profiles, \"execute\", extension):\n        raise auth.AccessDenied(\"Access denied to extension: execute %s\"\n                                % extension.getKey())\n\n    flavor = manifest.flavor\n\n    if manifest.flavor not in configuration.extensions.FLAVORS:\n        flavor = configuration.extensions.DEFAULT_FLAVOR\n\n    stdin_data = \"%s\\n\" % json_encode({\n            \"library_path\": configuration.extensions.FLAVORS[flavor][\"library\"],\n            \"rlimit\": { \"rss\": rlimit_rss },\n            \"hostname\": configuration.base.HOSTNAME,\n            \"dbname\": configuration.database.PARAMETERS[\"database\"],\n            \"dbuser\": configuration.database.PARAMETERS[\"user\"],\n            \"git\": configuration.executables.GIT,\n            \"python\": 
configuration.executables.PYTHON,\n            \"python_path\": \"%s:%s\" % (configuration.paths.CONFIG_DIR,\n                                      configuration.paths.INSTALL_DIR),\n            \"repository_work_copy_path\": configuration.extensions.WORKCOPY_DIR,\n            \"changeset_address\": configuration.services.CHANGESET[\"address\"],\n            \"branchtracker_pid_path\": configuration.services.BRANCHTRACKER[\"pidfile_path\"],\n            \"maildelivery_pid_path\": configuration.services.MAILDELIVERY[\"pidfile_path\"],\n            \"is_development\": configuration.debug.IS_DEVELOPMENT,\n            \"extension_path\": manifest.path,\n            \"extension_id\": extension_id,\n            \"user_id\": user_id,\n            \"authentication_labels\": list(authentication_labels),\n            \"role\": role_name,\n            \"script_path\": script,\n            \"fn\": function,\n            \"argv\": argv })\n\n    if stdin is not None:\n        stdin_data += stdin\n\n    # Double the timeout. Timeouts are primarily handled by the extension runner\n    # service, which returns an error response on timeout. 
This deadline here is\n    # thus mostly to catch the extension runner service itself timing out.\n    deadline = time.time() + timeout * 2\n\n    try:\n        connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n        connection.settimeout(max(0, deadline - time.time()))\n        connection.connect(configuration.services.EXTENSIONRUNNER[\"address\"])\n        connection.sendall(json_encode({\n            \"stdin\": stdin_data,\n            \"flavor\": flavor,\n            \"timeout\": timeout\n        }))\n        connection.shutdown(socket.SHUT_WR)\n\n        data = \"\"\n\n        while True:\n            connection.settimeout(max(0, deadline - time.time()))\n            try:\n                received = connection.recv(4096)\n            except socket.error as error:\n                if error.errno == errno.EINTR:\n                    continue\n                raise\n            if not received:\n                break\n            data += received\n\n        connection.close()\n    except socket.timeout as error:\n        raise ProcessTimeout(timeout)\n    except socket.error as error:\n        raise ProcessError(\"failed to read response: %s\" % error)\n\n    try:\n        data = json_decode(data)\n    except ValueError as error:\n        raise ProcessError(\"failed to decode response: %s\" % error)\n\n    if data[\"status\"] == \"timeout\":\n        raise ProcessTimeout(timeout)\n\n    if data[\"status\"] == \"error\":\n        raise ProcessError(data[\"error\"])\n\n    if data[\"returncode\"] != 0:\n        raise ProcessFailure(data[\"returncode\"], data[\"stderr\"])\n\n    return data[\"stdout\"]\n"
  },
  {
    "path": "src/extensions/extension.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport subprocess\nimport pwd\n\nimport dbutils\nimport htmlutils\n\nfrom extensions.manifest import Manifest, ManifestError\nfrom extensions import getExtensionInstallPath\n\nclass ExtensionError(Exception):\n    def __init__(self, message, extension=None):\n        super(ExtensionError, self).__init__(message)\n        self.extension = extension\n\nclass Extension(object):\n    def __init__(self, author_name, extension_name):\n        import configuration\n\n        if os.path.sep in extension_name:\n            raise ExtensionError(\n                \"Invalid extension name: %s\" % extension_name)\n\n        self.__author_name = author_name\n        self.__extension_name = extension_name\n        self.__manifest = {}\n\n        if author_name:\n            try:\n                user_home_dir = pwd.getpwnam(author_name).pw_dir\n            except KeyError:\n                raise ExtensionError(\n                    \"No such system user: %s\" % author_name,\n                    extension=self)\n\n            self.__path = os.path.join(\n                user_home_dir,\n                configuration.extensions.USER_EXTENSIONS_DIR,\n                extension_name)\n        else:\n            self.__path = os.path.join(\n                configuration.extensions.SYSTEM_EXTENSIONS_DIR,\n        
        extension_name)\n\n        if not (os.path.isdir(self.__path) and\n                os.access(self.__path, os.R_OK | os.X_OK)):\n            raise ExtensionError(\n                \"Invalid or inaccessible extension dir: %s\" % self.__path,\n                extension=self)\n\n    def isSystemExtension(self):\n        return self.__author_name is None\n\n    def getAuthorName(self):\n        if self.isSystemExtension():\n            return None\n        return self.__author_name\n\n    def getName(self):\n        return self.__extension_name\n\n    def getTitle(self, db, html=False):\n        if html:\n            title = \"<b>%s</b>\" % htmlutils.htmlify(self.getName())\n        else:\n            title = self.getName()\n\n        if not self.isSystemExtension():\n            author = self.getAuthor(db)\n\n            try:\n                manifest = self.getManifest()\n            except ManifestError:\n                # Can't access information from the manifest, so assume \"yes\".\n                is_author = True\n            else:\n                is_author = manifest.isAuthor(db, author)\n\n            if is_author:\n                title += \" by \"\n            else:\n                title += \" hosted by \"\n\n            if html:\n                title += htmlutils.htmlify(author.fullname)\n            else:\n                title += author.fullname\n\n        return title\n\n    def getKey(self):\n        if self.isSystemExtension():\n            return self.__extension_name\n        else:\n            return \"%s/%s\" % (self.__author_name, self.__extension_name)\n\n    def getPath(self):\n        return self.__path\n\n    def getVersions(self):\n        import configuration\n\n        try:\n            output = subprocess.check_output(\n                [configuration.executables.GIT, \"for-each-ref\",\n                 \"--format=%(refname)\", \"refs/heads/version/\"],\n                stderr=subprocess.STDOUT, cwd=self.__path)\n        except 
subprocess.CalledProcessError:\n            # Not a git repository => no versions (except \"Live\").\n            return []\n\n        versions = []\n        for ref in output.splitlines():\n            if ref.startswith(\"refs/heads/version/\"):\n                versions.append(ref[len(\"refs/heads/version/\"):])\n        return versions\n\n    def getManifest(self, version=None, sha1=None):\n        import configuration\n\n        path = self.__path\n        source = None\n\n        if sha1 is not None:\n            if sha1 in self.__manifest:\n                return self.__manifest[sha1]\n\n            install_path = getExtensionInstallPath(sha1)\n            with open(os.path.join(install_path, \"MANIFEST\")) as manifest_file:\n                source = manifest_file.read()\n\n            path = \"<snapshot of commit %s>\" % sha1[:8]\n        elif version in self.__manifest:\n            return self.__manifest[version]\n\n        if source is None and version is not None:\n            source = subprocess.check_output(\n                [configuration.executables.GIT, \"cat-file\", \"blob\",\n                 \"version/%s:MANIFEST\" % version],\n                cwd=self.__path)\n\n        manifest = Manifest(path, source)\n        manifest.read()\n\n        if sha1 is not None:\n            self.__manifest[sha1] = manifest\n        else:\n            self.__manifest[version] = manifest\n\n        return manifest\n\n    def getCurrentSHA1(self, version):\n        import configuration\n\n        return subprocess.check_output(\n            [configuration.executables.GIT, \"rev-parse\", \"--verify\",\n             \"version/%s\" % version],\n            cwd=self.__path).strip()\n\n    def prepareVersionSnapshot(self, version):\n        import configuration\n\n        sha1 = self.getCurrentSHA1(version)\n\n        if not os.path.isdir(getExtensionInstallPath(sha1)):\n            git_archive = subprocess.Popen(\n                [configuration.executables.GIT, 
\"archive\", \"--format=tar\",\n                 \"--prefix=%s/\" % sha1, sha1],\n                stdout=subprocess.PIPE, cwd=self.__path)\n            subprocess.check_call(\n                [configuration.executables.TAR, \"x\"],\n                stdin=git_archive.stdout,\n                cwd=configuration.extensions.INSTALL_DIR)\n\n        return sha1\n\n    def getAuthor(self, db):\n        if self.isSystemExtension():\n            return None\n        return dbutils.User.fromName(db, self.getAuthorName())\n\n    def getExtensionID(self, db, create=False):\n        cursor = db.cursor()\n\n        if self.isSystemExtension():\n            author_id = None\n            cursor.execute(\"\"\"SELECT extensions.id\n                                FROM extensions\n                               WHERE extensions.author IS NULL\n                                 AND extensions.name=%s\"\"\",\n                           (self.__extension_name,))\n        else:\n            author_id = self.getAuthor(db).id\n            cursor.execute(\"\"\"SELECT extensions.id\n                                FROM extensions\n                               WHERE extensions.author=%s\n                                 AND extensions.name=%s\"\"\",\n                           (author_id, self.__extension_name))\n\n        row = cursor.fetchone()\n\n        if row:\n            return row[0]\n        elif create:\n            cursor.execute(\"\"\"INSERT INTO extensions (author, name)\n                              VALUES (%s, %s)\n                           RETURNING id\"\"\",\n                           (author_id, self.__extension_name))\n            return cursor.fetchone()[0]\n        else:\n            return None\n\n    def getInstalledVersion(self, db, user):\n        \"\"\"Return (sha1, name) of the version currently installed by the user.\n\n        If the user doesn't have the extension installed, return (False, False).\n        If the user has the \"live\" version installed, return 
(None, None).\n        \"\"\"\n\n        extension_id = self.getExtensionID(db)\n\n        if extension_id is None:\n            # An extension is recorded in the database and assigned an ID the\n            # first time it's installed.  If it doesn't have an ID, then no user\n            # can have any version of it installed.\n            return (False, False)\n\n        cursor = db.cursor()\n\n        if user is None:\n            cursor.execute(\"\"\"SELECT extensionversions.sha1, extensionversions.name\n                                FROM extensioninstalls\n                     LEFT OUTER JOIN extensionversions ON (extensionversions.id=extensioninstalls.version)\n                               WHERE extensioninstalls.uid IS NULL\n                                 AND extensioninstalls.extension=%s\"\"\",\n                           (extension_id,))\n        else:\n            cursor.execute(\"\"\"SELECT extensionversions.sha1, extensionversions.name\n                                FROM extensioninstalls\n                     LEFT OUTER JOIN extensionversions ON (extensionversions.id=extensioninstalls.version)\n                               WHERE extensioninstalls.uid=%s\n                                 AND extensioninstalls.extension=%s\"\"\",\n                           (user.id, extension_id))\n\n        row = cursor.fetchone()\n\n        if row:\n            return row\n        else:\n            return (False, False)\n\n    @staticmethod\n    def fromId(db, extension_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT users.name, extensions.name\n                            FROM extensions\n                 LEFT OUTER JOIN users ON (users.id=extensions.author)\n                           WHERE extensions.id=%s\"\"\",\n                       (extension_id,))\n        row = cursor.fetchone()\n        if not row:\n            raise ExtensionError(\"Invalid extension id: %d\" % extension_id)\n        author_name, extension_name = row\n   
     return Extension(author_name, extension_name)\n\n    @staticmethod\n    def getInstalls(db, user):\n        \"\"\"\n        Return a list of extension installs in effect for the specified user\n\n        If 'user' is None, all universal extension installs are listed.\n\n        Each install is returned as a tuple containing four elements, the\n        extension id, the version id, the version SHA-1 and a boolean which is\n        true if the install is universal.  For a LIVE version, the version id\n        and the version SHA-1 are None.\n\n        The list of installs is ordered by precedence; most significant install\n        first, least significant install last.\n        \"\"\"\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT extensioninstalls.id, extensioninstalls.extension,\n                                 extensionversions.id, extensionversions.sha1,\n                                 extensioninstalls.uid IS NULL\n                            FROM extensioninstalls\n                 LEFT OUTER JOIN extensionversions ON (extensionversions.id=extensioninstalls.version)\n                           WHERE uid=%s OR uid IS NULL\n                        ORDER BY uid NULLS FIRST\"\"\",\n                       (user.id if user else None,))\n\n        install_per_extension = {}\n\n        # Since we ordered by 'uid' with nulls (\"universal installs\") first,\n        # we'll overwrite universal installs with per-user installs, as intended.\n        for install_id, extension_id, version_id, version_sha1, is_universal in cursor:\n            install_per_extension[extension_id] = (install_id, version_id,\n                                                   version_sha1, is_universal)\n\n        installs = [(install_id, extension_id, version_id, version_sha1, is_universal)\n                    for extension_id, (install_id, version_id, version_sha1, is_universal)\n                    in install_per_extension.items()]\n\n        # Sort installs by 
install id, higher first.  This means a later install\n        # takes precedence over an earlier, if they both handle the same path.\n        installs.sort(reverse=True)\n\n        # Drop the install_id; it is not relevant past this point.\n        return [(extension_id, version_id, version_sha1, is_universal)\n                for _, extension_id, version_id, version_sha1, is_universal\n                in installs]\n\n    @staticmethod\n    def getUpdatedExtensions(db, user):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT users.name, users.fullname, extensions.name,\n                                 extensionversions.name, extensionversions.sha1\n                            FROM users\n                            JOIN extensions ON (extensions.author=users.id)\n                            JOIN extensionversions ON (extensionversions.extension=extensions.id)\n                            JOIN extensioninstalls ON (extensioninstalls.version=extensionversions.id)\n                           WHERE extensioninstalls.uid=%s\"\"\",\n                       (user.id,))\n\n        updated = []\n        for author_name, author_fullname, extension_name, version_name, version_sha1 in cursor:\n            extension = Extension(author_name, extension_name)\n            if extension.getCurrentSHA1(version_name) != version_sha1:\n                updated.append((author_fullname, extension_name))\n        return updated\n\n    @staticmethod\n    def find(db):\n        import configuration\n\n        def search(user_name, search_dir):\n            if not (os.path.isdir(search_dir) and\n                    os.access(search_dir, os.X_OK | os.R_OK)):\n                return []\n\n            extensions = []\n\n            for extension_name in os.listdir(search_dir):\n                extension_dir = os.path.join(search_dir, extension_name)\n                manifest_path = os.path.join(extension_dir, \"MANIFEST\")\n\n                if not (os.path.isdir(extension_dir) 
and\n                        os.access(extension_dir, os.X_OK | os.R_OK) and\n                        os.access(manifest_path, os.R_OK)):\n                    continue\n\n                extensions.append(Extension(user_name, extension_name))\n\n            return extensions\n\n        extensions = search(None, configuration.extensions.SYSTEM_EXTENSIONS_DIR)\n\n        if configuration.extensions.USER_EXTENSIONS_DIR:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT name FROM users WHERE status!='retired' ORDER BY name ASC\")\n\n            for (user_name,) in cursor:\n                try:\n                    pwd_entry = pwd.getpwnam(user_name)\n                except KeyError:\n                    continue\n\n                user_dir = os.path.join(\n                    pwd_entry.pw_dir, configuration.extensions.USER_EXTENSIONS_DIR)\n\n                extensions.extend(search(user_name, user_dir))\n\n        return extensions\n"
  },
  {
    "path": "src/extensions/installation.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport auth\n\nfrom extensions.extension import Extension, ExtensionError\n\nclass InstallationError(Exception):\n    def __init__(self, title, message, is_html=False):\n        self.title = title\n        self.message = message\n        self.is_html = is_html\n\ndef doInstallExtension(db, user, extension, version):\n    auth.AccessControl.accessExtension(db, \"install\", extension)\n\n    is_universal = user is None\n    extension_id = extension.getExtensionID(db, create=True)\n    manifest = extension.getManifest(version)\n\n    # Detect conflicting extension installs.\n    current_installs = Extension.getInstalls(db, user)\n    for current_extension_id, _, _, current_is_universal in current_installs:\n        # Two installs never conflict if one is universal and one is not.\n        if is_universal != current_is_universal:\n            continue\n\n        try:\n            current_extension = Extension.fromId(db, current_extension_id)\n        except ExtensionError as error:\n            # Invalid extension => no conflict.\n            #\n            # But if there would be a conflict, should the installed extension\n            # later become valid again, then delete the installation.\n            if extension.getName() == error.extension.getName():\n                doUninstallExtension(db, user, 
error.extension)\n            continue\n\n        # Same extension => conflict\n        #\n        # The web UI will typically not let you try to do this; if the extension\n        # is already installed the UI will only let you uninstall or upgrade it.\n        # But you never know.  Also, there's a UNIQUE constraint in the database\n        # that would prevent this, but with a significantly worse error message,\n        # of course.\n        if extension_id == current_extension_id:\n            raise InstallationError(\n                title=\"Conflicting install\",\n                message=(\"The extension <code>%s</code> is already \"\n                         \"%sinstalled.\"\n                         % (current_extension.getTitle(db),\n                            \"universally \" if is_universal else \"\")),\n                is_html=True)\n\n        # Different extensions, same name => also conflict\n        #\n        # Two extensions with the same name are probably simply two forks of the\n        # same extension, and are very likely to have overlapping and\n        # conflicting functionality.  
Also, extension resource paths only\n        # contain the extension name as an identifier, and thus will conflict\n        # between the two extensions, even if they are actually completely\n        # unrelated.\n        if extension.getName() == current_extension.getName():\n            raise InstallationError(\n                title=\"Conflicting install\",\n                message=(\"The extension <code>%s</code> is already \"\n                         \"%sinstalled, and conflicts with the extension \"\n                         \"<code>%s</code> since they have the same name.\"\n                         % (current_extension.getTitle(db),\n                            \"universally \" if is_universal else \"\",\n                            extension.getTitle(db))),\n                is_html=True)\n\n    cursor = db.cursor()\n\n    if is_universal:\n        user_id = None\n    else:\n        user_id = user.id\n\n    if version is not None:\n        sha1 = extension.prepareVersionSnapshot(version)\n\n        cursor.execute(\"\"\"SELECT id\n                            FROM extensionversions\n                           WHERE extension=%s\n                             AND name=%s\n                             AND sha1=%s\"\"\",\n                       (extension_id, version, sha1))\n        row = cursor.fetchone()\n\n        if row:\n            (version_id,) = row\n        else:\n            cursor.execute(\"\"\"INSERT INTO extensionversions (extension, name, sha1)\n                                   VALUES (%s, %s, %s)\n                                RETURNING id\"\"\",\n                           (extension_id, version, sha1))\n            (version_id,) = cursor.fetchone()\n\n            for role in manifest.roles:\n                role.install(db, version_id)\n    else:\n        version_id = None\n\n    cursor.execute(\"\"\"INSERT INTO extensioninstalls (uid, extension, version)\n                           VALUES (%s, %s, %s)\"\"\",\n                   (user_id, 
extension_id, version_id))\n\ndef doUninstallExtension(db, user, extension):\n    extension_id = extension.getExtensionID(db)\n\n    if extension_id is None:\n        return\n\n    cursor = db.cursor()\n\n    if user is not None:\n        cursor.execute(\"\"\"DELETE FROM extensioninstalls\n                                WHERE uid=%s\n                                  AND extension=%s\"\"\",\n                       (user.id, extension_id))\n    else:\n        cursor.execute(\"\"\"DELETE FROM extensioninstalls\n                                WHERE uid IS NULL\n                                  AND extension=%s\"\"\",\n                       (extension_id,))\n\ndef getExtension(author_name, extension_name):\n    \"\"\"Create an Extension object ignoring whether it is valid\"\"\"\n    try:\n        return Extension(author_name, extension_name)\n    except ExtensionError as error:\n        if error.extension is None:\n            raise error\n        return error.extension\n\ndef installExtension(db, user, author_name, extension_name, version):\n    doInstallExtension(db, user, Extension(author_name, extension_name), version)\n    db.commit()\n\ndef uninstallExtension(db, user, author_name, extension_name, version):\n    doUninstallExtension(db, user, getExtension(author_name, extension_name))\n    db.commit()\n\ndef reinstallExtension(db, user, author_name, extension_name, version):\n    doUninstallExtension(db, user, getExtension(author_name, extension_name))\n    doInstallExtension(db, user, Extension(author_name, extension_name), version)\n    db.commit()\n"
  },
  {
    "path": "src/extensions/manifest.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport re\n\nfrom textutils import json_decode\n\nRE_ROLE_Page = re.compile(r\"^\\[Page\\s+(.*?)\\s*\\]$\", re.IGNORECASE)\nRE_ROLE_Inject = re.compile(r\"^\\[Inject\\s+(.*?)\\s*\\]$\", re.IGNORECASE)\nRE_ROLE_ProcessCommits = re.compile(r\"^\\[ProcessCommits\\]$\", re.IGNORECASE)\nRE_ROLE_FilterHook = re.compile(r\"^\\[FilterHook\\s+(.*?)\\s*\\]$\", re.IGNORECASE)\nRE_ROLE_Scheduled = re.compile(r\"^\\[Scheduled\\]$\", re.IGNORECASE)\n\nclass ManifestError(Exception):\n    pass\n\nclass Role:\n    def __init__(self, location):\n        self.script = None\n        self.function = None\n        self.description = None\n        self.location = location\n\n    def install(self, db, version_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO extensionroles (version, script, function)\n                               VALUES (%s, %s, %s)\n                            RETURNING id\"\"\",\n                       (version_id, self.script, self.function))\n        return cursor.fetchone()[0]\n\n    def process(self, name, value, location):\n        if name == \"description\":\n            self.description = value\n            return True\n        elif name == \"script\":\n            self.script = value\n            return True\n        elif name == \"function\":\n            
self.function = value\n            return True\n        else:\n            return False\n\n    def check(self):\n        if not self.description:\n            raise ManifestError(\"%s: manifest error: expected role description\" % self.location)\n        elif not self.script:\n            raise ManifestError(\"%s: manifest error: expected role script\" % self.location)\n        elif not self.function:\n            raise ManifestError(\"%s: manifest error: expected role function\" % self.location)\n\nclass URLRole(Role):\n    def __init__(self, location, pattern):\n        Role.__init__(self, location)\n        self.pattern = pattern\n        self.regexp = \"^\" + re.sub(r\"[\\|\\[\\](){}^$+]\",\n                                   lambda match: '\\\\' + match.group(0),\n                                   pattern.replace('.', '\\\\.').replace('?', '.').replace('*', '.*')) + \"$\"\n\n    def check(self):\n        Role.check(self)\n        if self.pattern.startswith(\"/\"):\n            raise ManifestError(\"%s: manifest error: path pattern should not start with a '/'\" % self.location)\n\nclass PageRole(URLRole):\n    def __init__(self, location, pattern):\n        URLRole.__init__(self, location, pattern)\n\n    def name(self):\n        return \"Page\"\n\n    def install(self, db, version_id):\n        role_id = Role.install(self, db, version_id)\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO extensionpageroles (role, path)\n                               VALUES (%s, %s)\"\"\",\n                       (role_id, self.regexp))\n        return role_id\n\nclass InjectRole(URLRole):\n    def __init__(self, location, pattern):\n        URLRole.__init__(self, location, pattern)\n\n    def name(self):\n        return \"Inject\"\n\n    def process(self, name, value, location):\n        if Role.process(self, name, value, location):\n            return True\n        if name == \"cached\":\n            # Ignored for compatibility with extensions that use 
it.\n            return True\n        return False\n\n    def install(self, db, version_id):\n        role_id = Role.install(self, db, version_id)\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO extensioninjectroles (role, path)\n                               VALUES (%s, %s)\"\"\",\n                       (role_id, self.regexp))\n        return role_id\n\nclass ProcessCommitsRole(Role):\n    def __init__(self, location):\n        Role.__init__(self, location)\n\n    def name(self):\n        return \"ProcessCommits\"\n\n    def install(self, db, version_id):\n        role_id = Role.install(self, db, version_id)\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO extensionprocesscommitsroles (role)\n                               VALUES (%s)\"\"\",\n                       (role_id,))\n        return role_id\n\nclass FilterHookRole(Role):\n    def __init__(self, location, name):\n        Role.__init__(self, location)\n        self.name = name\n        self.title = name\n        self.data_description = None\n\n    def name(self):\n        return \"FilterHook\"\n\n    def process(self, name, value, location):\n        if Role.process(self, name, value, location):\n            return True\n        if name == \"title\":\n            self.title = value\n            return True\n        if name == \"datadescription\":\n            self.data_description = value\n            return True\n        return False\n\n    def check(self):\n        Role.check(self)\n        if not re.match(\"^[a-z0-9_]+$\", self.name, re.IGNORECASE):\n            raise ManifestError(\"%s: manifest error: invalid filter hook name: \"\n                                \"should contain only ASCII letters and numbers \"\n                                \"and underscores\" % self.location)\n\n    def install(self, db, version_id):\n        role_id = Role.install(self, db, version_id)\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO 
extensionfilterhookroles\n                                        (role, name, title, role_description,\n                                         data_description)\n                               VALUES (%s, %s, %s, %s, %s)\"\"\",\n                       (role_id, self.name, self.title, self.description,\n                        self.data_description))\n        return role_id\n\nclass ScheduledRole(Role):\n    def __init__(self, location):\n        Role.__init__(self, location)\n        self.frequency = None\n        self.at = None\n\n    def name(self):\n        return \"Scheduled\"\n\n    def install(self, db, version_id):\n        role_id = Role.install(self, db, version_id)\n        cursor = db.cursor()\n        cursor.execute(\"\"\"INSERT INTO extensionscheduledroles (role, frequency, at)\n                               VALUES (%s, %s, %s)\"\"\",\n                       (role_id, self.frequency, self.at))\n        return role_id\n\n    def process(self, name, value, location):\n        if Role.process(self, name, value, location):\n            return True\n        elif name == \"frequency\":\n            if value in (\"monthly\", \"weekly\", \"daily\", \"hourly\"):\n                self.frequency = value.lower()\n            else:\n                raise ManifestError(\"%s: invalid frequency: must be one of 'monthly', 'weekly', 'daily' and 'hourly'\" % location)\n        elif name == \"at\":\n            self.at = value.lower()\n        else:\n            return False\n        return True\n\n    def check(self):\n        Role.check(self)\n\n        if not self.frequency:\n            raise ManifestError(\"%s: manifest error: expected role parameter 'frequency'\" % self.location)\n        if not self.at:\n            raise ManifestError(\"%s: manifest error: expected role parameter 'at'\" % self.location)\n\n        if self.frequency == \"monthly\":\n            match = re.match(\"(\\d+) (\\d{2}):(\\d{2})$\", self.at)\n            if match:\n                date 
= int(match.group(1))\n                hour = int(match.group(2))\n                minute = int(match.group(3))\n                if (1 <= date <= 31) and (0 <= hour <= 23) and (0 <= minute <= 59):\n                    return\n            raise ManifestError(\"invalid at specification for monthly trigger, must be 'D HH:MM' (D = day in month; 1-31), is '%s'\" % self.at)\n        elif self.frequency == \"weekly\":\n            match = re.match(\"(?:mon(?:day)?|tue(?:sday)?|wed(?:nesday)?|thu(?:rsday)?|fri(?:day)?|sat(?:urday)?|sun(?:day)?) (\\d{2}):(\\d{2})$\", self.at)\n            if match:\n                hour = int(match.group(1))\n                minute = int(match.group(2))\n                if (0 <= hour <= 23) and (0 <= minute <= 59):\n                    return\n            raise ManifestError(\"invalid at specification for weekly trigger, must be 'WEEKDAY HH:MM' (WEEKDAY = mon|tue|wed|thu|fri|sat|sun), is '%s'\" % self.at)\n        elif self.frequency == \"daily\":\n            match = re.match(\"(\\d{2}):(\\d{2})$\", self.at)\n            if match:\n                hour = int(match.group(1))\n                minute = int(match.group(2))\n                if (0 <= hour <= 23) and (0 <= minute <= 59):\n                    return\n            raise ManifestError(\"invalid at specification for daily trigger, must be 'HH:MM'\")\n        elif self.frequency == \"hourly\":\n            match = re.match(\"(\\d{2})$\", self.at)\n            if match:\n                minute = int(match.group(1))\n                if (0 <= minute <= 59):\n                    return\n            raise ManifestError(\"invalid at specification for hourly trigger, must be 'MM'\")\n\nclass Author(object):\n    def __init__(self, value):\n        match = re.match(r\"\\s*(.*?)\\s+<(.+?)>\\s*$\", value)\n        if match:\n            self.name, self.email = match.groups()\n        
else:\n            self.name = value.strip()\n            self.email = None\n\nclass Manifest(object):\n    def __init__(self, path, source=None):\n        import configuration\n\n        self.path = path\n        self.source = source\n        self.authors = []\n        self.description = None\n        self.flavor = configuration.extensions.DEFAULT_FLAVOR\n        self.roles = []\n        self.status = None\n        self.hidden = False\n\n    def isAuthor(self, db, user):\n        for author in self.authors:\n            if author.name in (user.name, user.fullname) \\\n                    or user.hasGitEmail(db, author.email):\n                return True\n        return False\n\n    def getAuthors(self):\n        return self.authors\n\n    def read(self):\n        import configuration\n\n        path = os.path.join(self.path, \"MANIFEST\")\n\n        if self.source:\n            lines = self.source.splitlines()\n        else:\n            try:\n                lines = open(path).readlines()\n            except IOError:\n                raise ManifestError(\"%s: file not found\" % path)\n\n        lines = map(str.strip, lines)\n\n        def process(value):\n            value = value.strip()\n            if value and value[0] == '\"' == value[-1]:\n                return json_decode(value)\n            else:\n                return value\n\n        role = None\n\n        for index, line in enumerate(lines):\n            if not line or line.lstrip().startswith(\"#\"): continue\n\n            location = \"%s:%d\" % (path, index + 1)\n\n            if not role:\n                try:\n                    name, value = line.split(\"=\", 1)\n                    if name.strip().lower() == \"author\":\n                        for value in process(value).split(\",\"):\n                            self.authors.append(Author(value))\n                        continue\n                    elif name.strip().lower() == \"description\":\n                        self.description = 
process(value)\n                        continue\n                    elif name.strip().lower() == \"flavor\":\n                        self.flavor = process(value)\n                        if self.flavor not in configuration.extensions.FLAVORS:\n                            raise ManifestError(\"%s: manifest error: unsupported 'flavor', supported values are: %s\"\n                                                % (location, \", \".join(map(repr, configuration.extensions.FLAVORS.keys()))))\n                        continue\n                    elif name.strip().lower() == \"hidden\":\n                        value = process(value).lower()\n                        if value in (\"true\", \"yes\"):\n                            self.hidden = True\n                        elif value not in (\"false\", \"no\"):\n                            raise ManifestError(\"%s: manifest error: valid values for 'hidden' are 'true'/'yes' and 'false'/'no'\" % location)\n                        continue\n                except ValueError:\n                    # Not a \"name = value\" line; fall through to role parsing.\n                    pass\n\n                if not self.authors:\n                    raise ManifestError(\"%s: manifest error: expected extension author\" % location)\n                elif not self.description:\n                    raise ManifestError(\"%s: manifest error: expected extension description\" % location)\n\n            if role:\n                if \"=\" in line:\n                    name, value = line.split(\"=\", 1)\n                    if role.process(name.strip().lower(), process(value), location):\n                        continue\n\n                role.check()\n\n                self.roles.append(role)\n\n            match = RE_ROLE_Page.match(line)\n            if match:\n                role = PageRole(location, match.group(1))\n                continue\n\n            match = RE_ROLE_Inject.match(line)\n            if match:\n                role = InjectRole(location, match.group(1))\n                continue\n\n            match = 
RE_ROLE_ProcessCommits.match(line)\n            if match:\n                role = ProcessCommitsRole(location)\n                continue\n\n            match = RE_ROLE_FilterHook.match(line)\n            if match:\n                role = FilterHookRole(location, match.group(1))\n                continue\n\n            match = RE_ROLE_Scheduled.match(line)\n            if match:\n                role = ScheduledRole(location)\n                continue\n\n            raise ManifestError(\"%s: manifest error: unexpected line: %r\" % (location, line))\n\n        if not self.authors:\n            raise ManifestError(\"%s: manifest error: expected extension author\" % path)\n        elif not self.description:\n            raise ManifestError(\"%s: manifest error: expected extension description\" % path)\n\n        if role:\n            role.check()\n            self.roles.append(role)\n\n        if not self.roles:\n            raise ManifestError(\"%s: manifest error: no roles defined\" % path)\n\n    @staticmethod\n    def load(extension_path):\n        manifest = Manifest(extension_path)\n        manifest.read()\n        return manifest\n"
  },
  {
    "path": "src/extensions/resource.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport errno\n\nfrom extensions import getExtensionPath, getExtensionInstallPath\n\ndef get(req, db, user, path):\n    import configuration\n\n    extension_name, resource_path = path.split(\"/\", 1)\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT users.name, extensionversions.sha1\n                        FROM extensions\n                        JOIN extensioninstalls ON (extensioninstalls.extension=extensions.id)\n             LEFT OUTER JOIN extensionversions ON (extensionversions.id=extensioninstalls.version)\n             LEFT OUTER JOIN users ON (users.id=extensions.author)\n                       WHERE extensions.name=%s\n                         AND (extensioninstalls.uid=%s OR extensioninstalls.uid IS NULL)\n                    ORDER BY extensioninstalls.uid ASC NULLS LAST\n                       LIMIT 1\"\"\",\n                   (extension_name, user.id))\n\n    row = cursor.fetchone()\n\n    if not row:\n        return None, None\n\n    author_name, version_sha1 = row\n\n    if version_sha1 is None:\n        extension_path = getExtensionPath(author_name, extension_name)\n    else:\n        extension_path = getExtensionInstallPath(version_sha1)\n\n    resource_path = os.path.join(extension_path, \"resources\", resource_path)\n\n    def guessContentType(name):\n       
 extension_parts = name.split(\".\")[1:]\n        while extension_parts:\n            extension = \".\".join(extension_parts)\n            mimetype = configuration.mimetypes.MIMETYPES.get(extension)\n            if mimetype:\n                return mimetype\n            extension_parts = extension_parts[1:]\n        return \"application/octet-stream\"\n\n    try:\n        with open(resource_path) as resource_file:\n            resource = resource_file.read()\n    except IOError as error:\n        if error.errno in (errno.ENOENT, errno.EACCES):\n            return None, None\n        raise\n    else:\n        return guessContentType(os.path.basename(resource_path)), resource\n"
  },
  {
    "path": "src/extensions/role/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page\nimport inject\nimport filterhook\nimport processcommits\n"
  },
  {
    "path": "src/extensions/role/filterhook.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport os\nimport signal\nimport traceback\n\nimport configuration\nimport dbutils\nimport gitutils\nimport mailutils\nimport htmlutils\n\nfrom extensions.extension import Extension, ExtensionError\nfrom extensions.manifest import Manifest, ManifestError, FilterHookRole\nfrom extensions.execute import ProcessException, ProcessFailure, executeProcess\n\ndef signalExtensionTasksService():\n    try:\n        with open(configuration.services.EXTENSIONTASKS[\"pidfile_path\"]) as pidfile:\n            pid = int(pidfile.read().strip())\n        os.kill(pid, signal.SIGHUP)\n    except Exception:\n        # Print traceback to stderr.  
Might end up in web server's error log,\n        # where it has a chance to be noticed.\n        traceback.print_exc()\n\ndef listFilterHooks(db, user):\n    cursor = db.cursor()\n\n    installs = Extension.getInstalls(db, user)\n    filterhooks = []\n\n    for extension_id, version_id, version_sha1, is_universal in installs:\n        if version_id is not None:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM extensionroles\n                                JOIN extensionfilterhookroles ON (role=id)\n                               WHERE version=%s\"\"\",\n                           (version_id,))\n\n            if not cursor.fetchone():\n                continue\n\n            extension = Extension.fromId(db, extension_id)\n            manifest = extension.getManifest(sha1=version_sha1)\n        else:\n            try:\n                extension = Extension.fromId(db, extension_id)\n            except ExtensionError:\n                # If the author/hosting user no longer exists, or the extension\n                # directory no longer exists or is inaccessible, ignore the\n                # extension.\n                continue\n\n            try:\n                manifest = Manifest.load(extension.getPath())\n            except ManifestError:\n                # If the MANIFEST is missing or invalid, we can't know whether\n                # the extension has any filter hook roles, so assume it doesn't\n                # and ignore it.\n                continue\n\n            if not any(isinstance(role, FilterHookRole)\n                       for role in manifest.roles):\n                continue\n\n        filterhooks.append((extension, manifest, sorted(\n                    (role for role in manifest.roles\n                     if isinstance(role, FilterHookRole)),\n                    key=lambda role: role.title)))\n\n    return sorted(filterhooks,\n                  key=lambda (extension, manifest, roles): extension.getKey())\n\ndef 
getFilterHookRole(db, filter_id):\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT extension, uid, name\n                        FROM extensionhookfilters\n                       WHERE id=%s\"\"\",\n                   (filter_id,))\n\n    extension_id, user_id, filterhook_name = cursor.fetchone()\n\n    extension = Extension.fromId(db, extension_id)\n    user = dbutils.User.fromId(db, user_id)\n\n    installed_sha1, _ = extension.getInstalledVersion(db, user)\n\n    if installed_sha1 is False:\n        return\n\n    manifest = extension.getManifest(sha1=installed_sha1)\n\n    for role in manifest.roles:\n        if isinstance(role, FilterHookRole) and role.name == filterhook_name:\n            return role\n\ndef queueFilterHookEvent(db, filter_id, review, user, commits, file_ids):\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT data\n                        FROM extensionhookfilters\n                       WHERE id=%s\"\"\",\n                   (filter_id,))\n\n    data, = cursor.fetchone()\n\n    cursor.execute(\"\"\"INSERT INTO extensionfilterhookevents\n                                    (filter, review, uid, data)\n                           VALUES (%s, %s, %s, %s)\n                        RETURNING id\"\"\",\n                   (filter_id, review.id, user.id, data))\n\n    event_id, = cursor.fetchone()\n\n    cursor.executemany(\"\"\"INSERT INTO extensionfilterhookcommits\n                                        (event, commit)\n                               VALUES (%s, %s)\"\"\",\n                       [(event_id, commit.getId(db)) for commit in commits])\n\n    cursor.executemany(\"\"\"INSERT INTO extensionfilterhookfiles\n                                        (event, file)\n                               VALUES (%s, %s)\"\"\",\n                       [(event_id, file_id) for file_id in file_ids])\n\n    def transactionCallback(event):\n        if event == \"commit\":\n            signalExtensionTasksService()\n\n    
db.registerTransactionCallback(transactionCallback)\n\ndef processFilterHookEvent(db, event_id, logfn):\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT filters.extension, filters.uid, filters.path,\n                             filters.name, events.review, events.uid, events.data\n                        FROM extensionfilterhookevents AS events\n                        JOIN extensionhookfilters AS filters ON (filters.id=events.filter)\n                       WHERE events.id=%s\"\"\",\n                   (event_id,))\n\n    # Note:\n    # - filter_user_id / filter_user represent the user whose filter was\n    #   triggered.\n    # - user_id /user represent the user that added commits and thereby\n    #   triggered the filter.\n\n    (extension_id, filter_user_id, filter_path,\n     filterhook_name, review_id, user_id, filter_data) = cursor.fetchone()\n\n    extension = Extension.fromId(db, extension_id)\n    filter_user = dbutils.User.fromId(db, filter_user_id)\n\n    installed_sha1, _ = extension.getInstalledVersion(db, filter_user)\n\n    if installed_sha1 is False:\n        # Invalid event (user doesn't have extension installed); do nothing.\n        # The event will be deleted by the caller.\n        return\n\n    manifest = extension.getManifest(sha1=installed_sha1)\n\n    for role in manifest.roles:\n        if isinstance(role, FilterHookRole) and role.name == filterhook_name:\n            break\n    else:\n        # Invalid event (installed version of extension doesn't have the named\n        # filter hook role); do nothing.  
The event will be deleted by the\n        # caller.\n        return\n\n    cursor.execute(\"\"\"SELECT commit\n                        FROM extensionfilterhookcommits\n                       WHERE event=%s\"\"\",\n                   (event_id,))\n    commit_ids = [commit_id for (commit_id,) in cursor]\n\n    cursor.execute(\"\"\"SELECT file\n                        FROM extensionfilterhookfiles\n                       WHERE event=%s\"\"\",\n                   (event_id,))\n    file_ids = [file_id for (file_id,) in cursor]\n\n    argv = \"\"\"\n\n(function () {\n   var review = new critic.Review(%(review_id)d);\n   var user = new critic.User(%(user_id)d);\n   var repository = review.repository;\n   var commits = new critic.CommitSet(\n     %(commit_ids)r.map(\n       function (commit_id) {\n         return repository.getCommit(commit_id);\n       }));\n   var files = %(file_ids)r.map(\n     function (file_id) {\n       return critic.File.find(file_id);\n     });\n   return [%(filter_data)s, review, user, commits, files];\n })()\n\n\"\"\" % { \"filter_data\": htmlutils.jsify(filter_data),\n        \"review_id\": review_id,\n        \"user_id\": user_id,\n        \"commit_ids\": commit_ids,\n        \"file_ids\": file_ids }\n\n    argv = re.sub(\"[ \\n]+\", \" \", argv.strip())\n\n    logfn(\"argv=%r\" % argv)\n    logfn(\"script=%r\" % role.script)\n    logfn(\"function=%r\" % role.function)\n\n    try:\n        executeProcess(\n            db, manifest, \"filterhook\", role.script, role.function, extension_id,\n            filter_user_id, argv, configuration.extensions.LONG_TIMEOUT)\n    except ProcessException as error:\n        review = dbutils.Review.fromId(db, review_id)\n\n        recipients = set([filter_user])\n\n        author = extension.getAuthor(db)\n        if author is None:\n            recipients.update(dbutils.User.withRole(db, \"administrator\"))\n        else:\n            recipients.add(author)\n\n        body = \"\"\"\\\nAn error occurred while 
processing an extension hook filter event!\n\nFilter details:\n\n  Extension:   %(extension.title)s\n  Filter hook: %(role.title)s\n  Repository:  %(repository.name)s\n  Path:        %(filter.path)s\n  Data:        %(filter.data)s\n\nEvent details:\n\n  Review:  r/%(review.id)d \"%(review.summary)s\"\n  Commits: %(commits)s\n\nError details:\n\n  Error:  %(error.message)s\n  Output:%(error.output)s\n\n-- critic\"\"\"\n\n        commits = (gitutils.Commit.fromId(db, review.repository, commit_id)\n                   for commit_id in commit_ids)\n        commits_text = \"\\n           \".join(\n            ('%s \"%s\"' % (commit.sha1[:8], commit.niceSummary())\n             for commit in commits))\n\n        if isinstance(error, ProcessFailure):\n            error_output = \"\\n\\n    \" + \"\\n    \".join(error.stderr.splitlines())\n        else:\n            error_output = \" %s\" % error.message\n\n        body = body % { \"extension.title\": extension.getTitle(db),\n                        \"role.title\": role.title,\n                        \"repository.name\": review.repository.name,\n                        \"filter.path\": filter_path,\n                        \"filter.data\": htmlutils.jsify(filter_data),\n                        \"review.id\": review.id,\n                        \"review.summary\": review.summary,\n                        \"commits\": commits_text,\n                        \"error.message\": error.message,\n                        \"error.output\": error_output }\n\n        mailutils.sendMessage(\n            recipients=list(recipients),\n            subject=\"Failed: \" + role.title,\n            body=body)\n"
  },
  {
    "path": "src/extensions/role/inject.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport urlparse\n\nimport configuration\n\nfrom auth import AccessDenied\nfrom htmlutils import jsify\nfrom request import decodeURIComponent\nfrom textutils import json_decode, json_encode\n\nfrom extensions import getExtensionInstallPath\nfrom extensions.extension import Extension, ExtensionError\nfrom extensions.execute import ProcessTimeout, ProcessFailure, executeProcess\nfrom extensions.manifest import Manifest, ManifestError, InjectRole\n\nclass InjectError(Exception):\n    pass\nclass InjectIgnored(Exception):\n    pass\n\ndef processLine(paths, line):\n    try:\n        command, value = line.split(\" \", 1)\n    except ValueError:\n        raise InjectError(\"Invalid line in output: %r\" % line)\n\n    if command not in (\"link\", \"script\", \"stylesheet\", \"preference\"):\n        raise InjectError(\"Invalid command: %r\" % command)\n\n    value = value.strip()\n\n    try:\n        value = json_decode(value)\n    except ValueError:\n        raise InjectError(\"Invalid JSON: %r\" % value)\n\n    def is_string(value):\n        return isinstance(value, basestring)\n\n    if command in (\"script\", \"stylesheet\") and not is_string(value):\n        raise InjectError(\"Invalid value for %r: %r (expected string)\"\n                          % (command, value))\n    elif command == 
\"link\":\n        if isinstance(value, dict):\n            if \"label\" not in value or not is_string(value[\"label\"]):\n                raise InjectError(\"Invalid value for %r: %r (expected attribute 'label' of type string)\"\n                                  % (command, value))\n            elif \"url\" not in value or not is_string(value[\"url\"]) or value[\"url\"] is None:\n                raise InjectError(\"Invalid value for %r: %r (expected attribute 'url' of type string or null)\"\n                                  % (command, value))\n        # Alternatively support [label, url] (backwards compatibility).\n        elif not isinstance(value, list) or len(value) != 2:\n            raise InjectError(\"Invalid value for %r: %r (expected object { \\\"label\\\": LABEL, \\\"url\\\": URL })\"\n                              % (command, value))\n        elif not is_string(value[0]):\n            raise InjectError(\"Invalid value for %r: %r (expected string at array[0])\"\n                              % (command, value))\n        elif not (is_string(value[1]) or value[1] is None):\n            raise InjectError(\"Invalid value for %r: %r (expected string or null at array[1])\"\n                              % (command, value))\n        else:\n            value = { \"label\": value[0], \"url\": value[1] }\n    elif command == \"preference\":\n        if \"config\" not in paths:\n            raise InjectError(\"Invalid command: %r only valid on /config page\"\n                              % command)\n        elif not isinstance(value, dict):\n            raise InjectError(\"Invalid value for %r: %r (expected object)\"\n                              % (command, value))\n\n        for name in (\"url\", \"name\", \"type\", \"value\", \"default\", \"description\"):\n            if name not in value:\n                raise InjectError(\"Invalid value for %r: %r (missing attribute %r)\"\n                                  % (command, value, name))\n\n        
preference_url = value[\"url\"]\n        preference_name = value[\"name\"]\n        preference_type = value[\"type\"]\n        preference_value = value[\"value\"]\n        preference_default = value[\"default\"]\n        preference_description = value[\"description\"]\n\n        if not is_string(preference_url):\n            raise InjectError(\"Invalid value for %r: %r (expected attribute 'url' of type string)\"\n                              % (command, value))\n        elif not is_string(preference_name):\n            raise InjectError(\"Invalid value for %r: %r (expected attribute 'name' of type string)\"\n                              % (command, value))\n        elif not is_string(preference_description):\n            raise InjectError(\"Invalid value for %r: %r (expected attribute 'description' of type string)\"\n                              % (command, value))\n\n        if is_string(preference_type):\n            if preference_type not in (\"boolean\", \"integer\", \"string\"):\n                raise InjectError(\"Invalid value for %r: %r (unsupported preference type)\"\n                                  % (command, value))\n\n            if preference_type == \"boolean\":\n                type_check = lambda value: isinstance(value, bool)\n            elif preference_type == \"integer\":\n                type_check = lambda value: isinstance(value, int)\n            else:\n                type_check = is_string\n\n            if not type_check(preference_value):\n                raise InjectError(\"Invalid value for %r: %r (type mismatch between 'value' and 'type')\"\n                                  % (command, value))\n\n            if not type_check(preference_default):\n                raise InjectError(\"Invalid value for %r: %r (type mismatch between 'default' and 'type')\"\n                                  % (command, value))\n        else:\n            if not isinstance(preference_type, list):\n                raise InjectError(\"Invalid value 
for %r: %r (invalid 'type', expected string or array)\"\n                                  % (command, value))\n\n            for index, choice in enumerate(preference_type):\n                if not isinstance(choice, dict) \\\n                        or not isinstance(choice.get(\"value\"), basestring) \\\n                        or not isinstance(choice.get(\"title\"), basestring):\n                    raise InjectError(\"Invalid value for %r: %r (invalid preference choice: %r)\"\n                                      % (command, value, choice))\n\n            choices = set([choice[\"value\"] for choice in preference_type])\n\n            if not is_string(preference_value) or preference_value not in choices:\n                raise InjectError(\"Invalid value for %r: %r ('value' not among valid choices)\"\n                                  % (command, value))\n\n            if not is_string(preference_default) or preference_default not in choices:\n                raise InjectError(\"Invalid value for %r: %r ('default' not among valid choices)\"\n                                  % (command, value))\n\n    return (command, value)\n\ndef execute(db, req, user, document, links, injected, profiler=None):\n    cursor = db.cursor()\n\n    installs = Extension.getInstalls(db, user)\n\n    def get_matching_path(path_regexp):\n        if re.match(path_regexp, req.path):\n            return (req.path, req.query)\n        elif re.match(path_regexp, req.original_path):\n            return (req.original_path, req.original_query)\n        else:\n            return None, None\n\n    query = None\n\n    for extension_id, version_id, version_sha1, is_universal in installs:\n        handlers = []\n\n        try:\n            if version_id is not None:\n                cursor.execute(\"\"\"SELECT script, function, path\n                                    FROM extensionroles\n                                    JOIN extensioninjectroles ON (role=id)\n                                
   WHERE version=%s\n                                ORDER BY id ASC\"\"\",\n                               (version_id,))\n\n                for script, function, path_regexp in cursor:\n                    path, query = get_matching_path(path_regexp)\n                    if path is not None:\n                        handlers.append((path, query, script, function))\n\n                if not handlers:\n                    continue\n\n                extension = Extension.fromId(db, extension_id)\n                manifest = Manifest.load(getExtensionInstallPath(version_sha1))\n            else:\n                extension = Extension.fromId(db, extension_id)\n                manifest = Manifest.load(extension.getPath())\n\n                for role in manifest.roles:\n                    if isinstance(role, InjectRole):\n                        path, query = get_matching_path(role.regexp)\n                        if path is not None:\n                            handlers.append((path, query, role.script, role.function))\n\n                if not handlers:\n                    continue\n\n            def construct_query(query):\n                if not query:\n                    return \"null\"\n\n                params = urlparse.parse_qs(query, keep_blank_values=True)\n\n                for key in params:\n                    values = params[key]\n                    if len(values) == 1:\n                        if not values[0]:\n                            params[key] = None\n                        else:\n                            params[key] = values[0]\n\n                return (\"Object.freeze({ raw: %s, params: Object.freeze(%s) })\"\n                        % (json_encode(query), json_encode(params)))\n\n            preferences = None\n            commands = []\n\n            for path, query, script, function in handlers:\n                argv = \"[%s, %s]\" % (jsify(path), construct_query(query))\n\n                try:\n                    stdout_data = 
executeProcess(\n                        db, manifest, \"inject\", script, function, extension_id, user.id, argv,\n                        configuration.extensions.SHORT_TIMEOUT)\n                except ProcessTimeout as error:\n                    raise InjectError(error.message)\n                except ProcessFailure as error:\n                    if error.returncode < 0:\n                        raise InjectError(\"Process terminated by signal %d.\" % -error.returncode)\n                    else:\n                        raise InjectError(\"Process returned %d.\\n%s\" % (error.returncode, error.stderr))\n                except AccessDenied:\n                    raise InjectIgnored()\n\n                for line in stdout_data.splitlines():\n                    if line.strip():\n                        commands.append(processLine(path, line.strip()))\n\n            for command, value in commands:\n                if command == \"script\":\n                    document.addExternalScript(value, use_static=False, order=1)\n                elif command == \"stylesheet\":\n                    document.addExternalStylesheet(value, use_static=False, order=1)\n                elif command == \"link\":\n                    for index, (_, label, _, _) in enumerate(links):\n                        if label == value[\"label\"]:\n                            if value[\"url\"] is None:\n                                del links[index]\n                            else:\n                                links[index][0] = value[\"url\"]\n                            break\n                    else:\n                        if value[\"url\"] is not None:\n                            links.append([value[\"url\"], value[\"label\"], None, None])\n                elif command == \"preference\":\n                    if not preferences:\n                        preferences = []\n                        injected.setdefault(\"preferences\", []).append(\n                            
(extension.getName(), extension.getAuthor(db), preferences))\n                    preferences.append(value)\n\n            if profiler:\n                profiler.check(\"inject: %s\" % extension.getKey())\n        except ExtensionError as error:\n            document.comment(\"\\n\\n[%s] Extension error:\\nInvalid extension:\\n%s\\n\\n\"\n                             % (error.extension.getKey(), error.message))\n        except ManifestError as error:\n            document.comment(\"\\n\\n[%s] Extension error:\\nInvalid MANIFEST:\\n%s\\n\\n\"\n                             % (extension.getKey(), error.message))\n        except InjectError as error:\n            document.comment(\"\\n\\n[%s] Extension error:\\n%s\\n\\n\"\n                             % (extension.getKey(), error.message))\n        except InjectIgnored:\n            pass\n"
  },
  {
    "path": "src/extensions/role/page.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\nimport re\n\nimport configuration\n\nfrom htmlutils import jsify\nfrom request import decodeURIComponent\n\nfrom extensions import getExtensionInstallPath\nfrom extensions.extension import Extension, ExtensionError\nfrom extensions.execute import ProcessTimeout, ProcessFailure, executeProcess\nfrom extensions.manifest import Manifest, ManifestError, PageRole\nfrom extensions.utils import renderTutorial\n\ndef execute(db, req, user):\n    cursor = db.cursor()\n\n    installs = Extension.getInstalls(db, user)\n\n    argv = None\n    stdin_data = None\n\n    for extension_id, version_id, version_sha1, is_universal in installs:\n        handlers = []\n\n        if version_id is not None:\n            cursor.execute(\"\"\"SELECT script, function, path\n                                FROM extensionroles\n                                JOIN extensionpageroles ON (role=id)\n                               WHERE version=%s\n                            ORDER BY id ASC\"\"\",\n                           (version_id,))\n\n            for script, function, path_regexp in cursor:\n                if re.match(path_regexp, req.path):\n                    handlers.append((script, function))\n\n            if not handlers:\n                continue\n\n            extension_path = 
getExtensionInstallPath(version_sha1)\n            manifest = Manifest.load(extension_path)\n        else:\n            try:\n                extension = Extension.fromId(db, extension_id)\n            except ExtensionError:\n                # If the author/hosting user no longer exists, or the extension\n                # directory no longer exists or is inaccessible, ignore the\n                # extension.\n                continue\n\n            try:\n                manifest = Manifest.load(extension.getPath())\n            except ManifestError:\n                # If the MANIFEST is missing or invalid, we can't know whether\n                # the extension has a page role handling the path, so assume it\n                # doesn't and ignore it.\n                continue\n\n            for role in manifest.roles:\n                if isinstance(role, PageRole) and re.match(role.regexp, req.path):\n                    handlers.append((role.script, role.function))\n\n            if not handlers:\n                continue\n\n        if argv is None:\n            def param(raw):\n                parts = raw.split(\"=\", 1)\n                if len(parts) == 1:\n                    return \"%s: null\" % jsify(decodeURIComponent(raw))\n                else:\n                    return \"%s: %s\" % (jsify(decodeURIComponent(parts[0])),\n                                       jsify(decodeURIComponent(parts[1])))\n\n            if req.query:\n                query = (\"Object.freeze({ raw: %s, params: Object.freeze({ %s }) })\"\n                         % (jsify(req.query),\n                            \", \".join(map(param, req.query.split(\"&\")))))\n            else:\n                query = \"null\"\n\n            headers = (\"Object.freeze({ %s })\"\n                       % \", \".join((\"%s: %s\" % (jsify(name), jsify(value)))\n                                   for name, value in req.getRequestHeaders().items()))\n\n            argv = (\"[%(method)s, %(path)s, 
%(query)s, %(headers)s]\"\n                    % { 'method': jsify(req.method),\n                        'path': jsify(req.path),\n                        'query': query,\n                        'headers': headers })\n\n        if req.method == \"POST\":\n            if stdin_data is None:\n                stdin_data = req.read()\n\n        for script, function in handlers:\n            before = time.time()\n\n            try:\n                stdout_data = executeProcess(\n                    db, manifest, \"page\", script, function, extension_id, user.id,\n                    argv, configuration.extensions.LONG_TIMEOUT, stdin=stdin_data)\n            except ProcessTimeout as error:\n                req.setStatus(500, \"Extension Timeout\")\n                return error.message\n            except ProcessFailure as error:\n                req.setStatus(500, \"Extension Failure\")\n                if error.returncode < 0:\n                    return (\"Extension failure: terminated by signal %d\\n\"\n                            % -error.returncode)\n                else:\n                    return (\"Extension failure: returned %d\\n%s\"\n                            % (error.returncode, error.stderr))\n\n            after = time.time()\n\n            status = None\n            headers = {}\n\n            if not stdout_data:\n                return False\n\n            while True:\n                try: line, stdout_data = stdout_data.split(\"\\n\", 1)\n                except:\n                    req.setStatus(500, \"Extension Error\")\n                    return \"Extension error: output format error.\\n%r\\n\" % stdout_data\n\n                if status is None:\n                    try: status = int(line.strip())\n                    except:\n                        req.setStatus(500, \"Extension Error\")\n                        return \"Extension error: first line should contain only a numeric HTTP status code.\\n%r\\n\" % line\n                elif not 
line:\n                    break\n                else:\n                    try: name, value = line.split(\":\", 1)\n                    except:\n                        req.setStatus(500, \"Extension Error\")\n                        return \"Extension error: header line should be in 'name: value' format.\\n%r\\n\" % line\n                    headers[name.strip()] = value.strip()\n\n            if status is None:\n                req.setStatus(500, \"Extension Error\")\n                return \"Extension error: first line should contain only a numeric HTTP status code.\\n\"\n\n            content_type = \"text/plain\"\n\n            for name, value in headers.items():\n                if name.lower() == \"content-type\":\n                    content_type = value\n                    del headers[name]\n                else:\n                    headers[name] = value\n\n            req.setStatus(status)\n            req.setContentType(content_type)\n\n            for name, value in headers.items():\n                req.addResponseHeader(name, value)\n\n            if content_type == \"text/tutorial\":\n                req.setContentType(\"text/html\")\n                return renderTutorial(db, user, stdout_data)\n\n            if content_type.startswith(\"text/html\"):\n                stdout_data += \"\\n\\n<!-- extension execution time: %.2f seconds -->\\n\" % (after - before)\n\n            return stdout_data\n\n    return False\n"
  },
  {
    "path": "src/extensions/role/processcommits.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport configuration\nimport gitutils\n\nimport log.commitset\nimport changeset.utils\n\nfrom extensions import getExtensionInstallPath\nfrom extensions.extension import Extension\nfrom extensions.execute import ProcessTimeout, ProcessFailure, executeProcess\nfrom extensions.manifest import Manifest, ManifestError, ProcessCommitsRole\n\ndef execute(db, user, review, all_commits, old_head, new_head, output):\n    cursor = db.cursor()\n\n    installs = Extension.getInstalls(db, user)\n\n    data = None\n\n    for extension_id, version_id, version_sha1, is_universal in installs:\n        handlers = []\n\n        extension = Extension.fromId(db, extension_id)\n\n        if version_id is not None:\n            cursor.execute(\"\"\"SELECT script, function\n                                FROM extensionroles\n                                JOIN extensionprocesscommitsroles ON (role=id)\n                               WHERE version=%s\n                            ORDER BY id ASC\"\"\",\n                           (version_id,))\n\n            handlers.extend(cursor)\n\n            if not handlers:\n                continue\n\n            extension_path = getExtensionInstallPath(version_sha1)\n            manifest = Manifest.load(extension_path)\n        else:\n            manifest = 
Manifest.load(extension.getPath())\n\n            for role in manifest.roles:\n                if isinstance(role, ProcessCommitsRole):\n                    handlers.append((role.script, role.function))\n\n            if not handlers:\n                continue\n\n        if data is None:\n            commitset = log.commitset.CommitSet(all_commits)\n\n            assert old_head is None or old_head in commitset.getTails()\n            assert new_head in commitset.getHeads()\n            assert len(commitset.getHeads()) == 1\n\n            tails = commitset.getFilteredTails(review.repository)\n            if len(tails) == 1:\n                tail = gitutils.Commit.fromSHA1(db, review.repository, tails.pop())\n                changeset_id = changeset.utils.createChangeset(\n                    db, user, review.repository, from_commit=tail, to_commit=new_head)[0].id\n                changeset_arg = \"repository.getChangeset(%d)\" % changeset_id\n            else:\n                changeset_arg = \"null\"\n\n            commits_arg = \"[%s]\" % \",\".join(\n                [(\"repository.getCommit(%d)\" % commit.getId(db))\n                 for commit in all_commits])\n\n            data = { \"review_id\": review.id,\n                     \"changeset\": changeset_arg,\n                     \"commits\": commits_arg }\n\n        for script, function in handlers:\n            class Error(Exception):\n                pass\n\n            def print_header():\n                header = \"%s::%s()\" % (script, function)\n                print >>output, (\"\\n[%s] %s\\n[%s] %s\"\n                                 % (extension.getName(), header,\n                                    extension.getName(), \"=\" * len(header)))\n\n            try:\n                argv = \"\"\"\n\n(function ()\n {\n   var review = new critic.Review(%(review_id)d);\n   var repository = review.repository;\n   var changeset = %(changeset)s;\n   var commitset = new critic.CommitSet(%(commits)s);\n\n   
return [review, changeset, commitset];\n })()\n\n\"\"\" % data\n                argv = re.sub(\"[ \\n]+\", \" \", argv.strip())\n\n                try:\n                    stdout_data = executeProcess(\n                        db, manifest, \"processcommits\", script, function, extension_id, user.id,\n                        argv, configuration.extensions.SHORT_TIMEOUT)\n                except ProcessTimeout as error:\n                    raise Error(error.message)\n                except ProcessFailure as error:\n                    if error.returncode < 0:\n                        raise Error(\"Process terminated by signal %d.\" % -error.returncode)\n                    else:\n                        raise Error(\"Process returned %d.\\n%s\" % (error.returncode, error.stderr))\n\n                if stdout_data.strip():\n                    print_header()\n                    for line in stdout_data.splitlines():\n                        print >>output, \"[%s] %s\" % (extension.getName(), line)\n            except Error as error:\n                print_header()\n                print >>output, \"[%s] Extension error: %s\" % (extension.getName(), error.message)\n"
  },
  {
    "path": "src/extensions/unittest.py",
    "content": "def independence():\n    # Simply check that extensions can be imported.  This is run in a test\n    # flagged as \"local\" since we want extensions to be possible to import in\n    # standalone unit tests.\n    #\n    # Nothing in extensions can actually be used, of course, but that's not a\n    # problem; the unit tests simply need to make sure not to depend on that.\n\n    import extensions\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/extensions/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport htmlutils\nimport page.utils\nimport textformatting\n\ndef renderTutorial(db, user, source):\n    document = htmlutils.Document()\n\n    document.addExternalStylesheet(\"resource/tutorial.css\")\n    document.addExternalScript(\"resource/tutorial.js\")\n    document.addInternalStylesheet(\"div.main table td.text { %s }\" % user.getPreference(db, \"style.tutorialFont\"))\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user)\n\n    table = body.div(\"main\").table(\"paleyellow\", align=\"center\")\n    textformatting.renderFormatted(db, user, table, source.splitlines(), toc=True)\n\n    return str(document)\n"
  },
  {
    "path": "src/gitutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport subprocess\nimport re\nimport time\nimport atexit\nimport os\nimport traceback\nimport threading\nimport tempfile\nimport shutil\nimport stat\nimport contextlib\nimport base64\n\nimport base\nimport configuration\nimport textutils\nimport htmlutils\nimport communicate\nimport diff.parse\n\nre_author_committer = re.compile(\"(.*) <(.*)> ([0-9]+ [-+][0-9]+)\")\nre_sha1 = re.compile(\"^[A-Za-z0-9]{40}$\")\n\nREPOSITORY_RELAYCOPY_DIR = os.path.join(configuration.paths.DATA_DIR, \"relay\")\nREPOSITORY_WORKCOPY_DIR = os.path.join(configuration.paths.DATA_DIR, \"temporary\")\n\n# Reference used to keep various commits alive.\nKEEPALIVE_REF_CHAIN = \"refs/internal/keepalive-chain\"\nKEEPALIVE_REF_PREFIX = \"refs/keepalive/\"\n\n# This is what an empty tree object hashes to.\nEMPTY_TREE_SHA1 = \"4b825dc642cb6eb9a060e54bf8d69288fbee4904\"\n\ndef same_filesystem(pathA, pathB):\n    return os.stat(pathA).st_dev == os.stat(pathB).st_dev\n\ndef getGitEnvironment(author=True, committer=True):\n    env = {}\n    def name(parameter):\n        if parameter is True:\n            return \"Critic System\"\n        elif isinstance(parameter, CommitUserTime):\n            return parameter.name\n        else:\n            return parameter.fullname\n    def email(parameter):\n        if parameter is True or not 
parameter.email:\n            return configuration.base.SYSTEM_USER_EMAIL\n        else:\n            return parameter.email\n    if author:\n        env[\"GIT_AUTHOR_NAME\"] = name(author)\n        env[\"GIT_AUTHOR_EMAIL\"] = email(author)\n    if committer:\n        env[\"GIT_COMMITTER_NAME\"] = name(committer)\n        env[\"GIT_COMMITTER_EMAIL\"] = email(committer)\n    return env\n\nclass GitError(base.Error):\n    pass\n\nclass GitReferenceError(GitError):\n    \"\"\"Exception raised on an invalid SHA-1s or refs.\"\"\"\n\n    def __init__(self, message, sha1=None, ref=None, repository=None):\n        super(GitReferenceError, self).__init__(message)\n        self.sha1 = sha1\n        self.ref = ref\n        self.repository = repository\n\nclass GitCommandError(GitError):\n    \"\"\"Exception raised when a Git command fails.\"\"\"\n\n    def __init__(self, cmdline, output, cwd):\n        super(GitCommandError, self).__init__(\"'%s' failed: %s (in %s)\" % (cmdline, output, cwd))\n        self.cmdline = cmdline\n        self.output = output\n        self.cwd = cwd\n\nclass GitObject:\n    def __init__(self, sha1, type, size, data):\n        self.sha1 = sha1\n        self.type = type\n        self.size = size\n        self.data = data\n\n    def __getitem__(self, index):\n        if index == 0: return self.type\n        elif index == 1: return self.size\n        elif index == 2: return self.data\n        raise IndexError(\"GitObject index out of range: %d\" % index)\n\nclass GitHttpBackendError(GitError):\n    def __init__(self, returncode, stderr):\n        message = \"Git failed!\"\n        if returncode < 0:\n            message = \"Git terminated by signal %d!\" % -returncode\n        elif returncode > 0:\n            message = \"Git exited with status %d!\" % returncode\n        if stderr.strip():\n            message += \"\\n\" + stderr\n        super(GitHttpBackendError, self).__init__(message)\n        self.returncode = returncode\n        self.stderr = 
stderr\n\nclass GitHttpBackendNeedsUser(GitError):\n    pass\n\nclass NoSuchRepository(base.Error):\n    \"\"\"Exception raised by Repository.fromName() for invalid names.\"\"\"\n\n    def __init__(self, value):\n        super(NoSuchRepository, self).__init__(\"No such repository: %s\" % str(value))\n        self.value = value\n\nclass Repository:\n    class FromParameter:\n        def __init__(self, db): self.db = db\n        def __call__(self, value): return Repository.fromParameter(self.db, value)\n\n    def __init__(self, db=None, repository_id=None, parent=None, name=None, path=None):\n        assert path\n\n        self.id = repository_id\n        self.name = name\n        self.path = path\n        self.parent = parent\n\n        self.__batch = None\n        self.__batchCheck = None\n        self.__cacheBlobs = False\n        self.__cacheDisabled = False\n\n        if db:\n            self.__db = db\n            db.atexit(self.__terminate)\n        else:\n            self.__db = None\n            atexit.register(self.__terminate)\n\n    def __str__(self):\n        return self.path\n\n    def getURL(self, db, user):\n        return Repository.constructURL(db, user, self.path)\n\n    @staticmethod\n    def constructURL(db, user, path):\n        path = os.path.relpath(path, configuration.paths.GIT_DIR)\n        url_type = user.getPreference(db, \"repository.urlType\")\n\n        if url_type == \"git\":\n            url_format = \"git://%s/%s\"\n        elif url_type in (\"ssh\", \"host\"):\n            if url_type == \"ssh\":\n                prefix = \"ssh://%s\"\n            else:\n                prefix = \"%s:\"\n            url_format = prefix + os.path.join(configuration.paths.GIT_DIR, \"%s\")\n        else:\n            import dbutils\n            url_prefix = dbutils.getURLPrefix(db, user)\n            return \"%s/%s\" % (url_prefix, path)\n\n        return url_format % (configuration.base.HOSTNAME, path)\n\n    def enableBlobCache(self):\n        assert 
self.__db\n        self.__cacheBlobs = True\n\n    def disableCache(self):\n        self.__cacheDisabled = True\n\n    def checkAccess(self, db, access_type):\n        import auth\n        assert access_type in (\"read\", \"modify\")\n        auth.AccessControl.accessRepository(db, access_type, self)\n\n    @staticmethod\n    def fromId(db, repository_id, for_modify=False):\n        if repository_id in db.storage[\"Repository\"]:\n            repository = db.storage[\"Repository\"][repository_id]\n        else:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT parent, name, path FROM repositories WHERE id=%s\", (repository_id,))\n\n            parent_id, name, path = cursor.fetchone()\n            parent = None if parent_id is None else Repository.fromId(db, parent_id)\n            repository = Repository(db, repository_id=repository_id, parent=parent, name=name, path=path)\n\n        # Raises auth.AccessDenied if access should not be allowed.\n        repository.checkAccess(db, \"modify\" if for_modify else \"read\")\n\n        db.storage[\"Repository\"][repository_id] = repository\n        db.storage[\"Repository\"][repository.name] = repository\n\n        return repository\n\n    @staticmethod\n    def fromName(db, name, for_modify=False):\n        if name in db.storage[\"Repository\"]:\n            return db.storage[\"Repository\"][name]\n        else:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT id FROM repositories WHERE name=%s\", (name,))\n            row = cursor.fetchone()\n            if not row:\n                return None\n            repository_id, = row\n            return Repository.fromId(db, repository_id, for_modify)\n\n    @staticmethod\n    def fromParameter(db, parameter):\n        try: repository = Repository.fromId(db, int(parameter))\n        except: repository = Repository.fromName(db, parameter)\n        if repository: return repository\n        else: raise NoSuchRepository(parameter)\n\n    
@staticmethod\n    def fromSHA1(db, sha1):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT id FROM repositories ORDER BY id ASC\")\n        for (repository_id,) in cursor:\n            repository = Repository.fromId(db, repository_id)\n            if repository.iscommit(sha1): return repository\n        raise GitReferenceError(\n            \"Couldn't find commit %s in any repository.\" % sha1,\n            sha1=sha1)\n\n    @staticmethod\n    def fromPath(db, path, for_modify=False):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT id FROM repositories WHERE path=%s\", (path,))\n        for (repository_id,) in cursor:\n            return Repository.fromId(db, repository_id, for_modify=for_modify)\n        raise NoSuchRepository(path)\n\n    @staticmethod\n    def fromAPI(api_repository):\n        return api_repository._impl.getInternal(api_repository.critic)\n\n    def __terminate(self, db=None):\n        self.stopBatch()\n\n    def __startBatch(self):\n        if self.__batch is None:\n            self.__batch = subprocess.Popen(\n                [configuration.executables.GIT, 'cat-file', '--batch'],\n                stdin=subprocess.PIPE, stdout=subprocess.PIPE,\n                stderr=subprocess.STDOUT, cwd=self.path)\n\n    def __startBatchCheck(self):\n        if self.__batchCheck is None:\n            self.__batchCheck = subprocess.Popen(\n                [configuration.executables.GIT, 'cat-file', '--batch-check'],\n                stdin=subprocess.PIPE, stdout=subprocess.PIPE,\n                stderr=subprocess.STDOUT, cwd=self.path)\n\n    def stopBatch(self):\n        if self.__batch:\n            try: os.kill(self.__batch.pid, 9)\n            except: pass\n            try: self.__batch.wait()\n            except: pass\n            self.__batch = None\n        if self.__batchCheck:\n            try: os.kill(self.__batchCheck.pid, 9)\n            except: pass\n            try: self.__batchCheck.wait()\n            except: 
pass\n            self.__batchCheck = None\n\n    @staticmethod\n    def forEach(db, fn):\n        for key, repository in db.storage[\"Repository\"].items():\n            if isinstance(key, int):\n                fn(db, repository)\n\n    def getJS(self):\n        return \"var repository = critic.repository = new Repository(%d, %s, %s);\" % (self.id, htmlutils.jsify(self.name), htmlutils.jsify(self.path))\n\n    def getModuleRepository(self, db, commit, path):\n        tree = Tree.fromPath(commit, \"/\")\n        source = self.fetch(tree[\".gitmodules\"].sha1).data\n        lines = iter(source.splitlines())\n\n        for line in lines:\n            if line == ('[submodule \"%s\"]' % path): break\n        else: return None\n\n        for line in lines:\n            line = line.strip()\n\n            if not line or line[0] == \"#\": continue\n            elif line[0] == \"[\": return None\n\n            key, value = map(str.strip, line.split(\"=\"))\n\n            if key == \"url\":\n                path = os.path.abspath(os.path.join(self.path, value))\n\n                cursor = db.cursor()\n                cursor.execute(\"SELECT id FROM repositories WHERE path=%s\", (path,))\n\n                row = cursor.fetchone()\n                if row:\n                    return Repository.fromId(db, row[0])\n                else:\n                    return None\n        else:\n            return None\n\n    def fetch(self, sha1, fetchData=True):\n        if self.__db:\n            cache = self.__db.storage[\"Repository\"]\n            cached_object = cache.get(\"object:\" + sha1)\n            if cached_object:\n                self.__db.recordProfiling(\"fetch: \" + cached_object.type + \" (cached)\", 0)\n                return cached_object\n\n        before = time.time()\n\n        if fetchData:\n            self.__startBatch()\n            stdin, stdout = self.__batch.stdin, self.__batch.stdout\n        else:\n            self.__startBatchCheck()\n            stdin, 
stdout = self.__batchCheck.stdin, self.__batchCheck.stdout\n\n        try: stdin.write(sha1 + '\\n')\n        except: raise GitError(\"failed when writing to 'git cat-file' stdin: %s\" % stdout.read())\n\n        line = stdout.readline()\n\n        if line == (\"%s missing\\n\" % sha1):\n            raise GitReferenceError(\"%s missing from %s\" % (sha1[:8], self.path), sha1=sha1, repository=self)\n\n        try: sha1, type, size = line.split()\n        except: raise GitError(\"unexpected output from 'git cat-file --batch': %s\" % line)\n\n        size = int(size)\n\n        if fetchData:\n            data = stdout.read(size)\n            stdout.read(1)\n        else:\n            data = None\n\n        git_object = GitObject(sha1, type, size, data)\n\n        after = time.time()\n\n        # Only cache when we have a database session; \"cache\" is bound above\n        # only in that case.\n        if self.__db and not self.__cacheDisabled and (type != \"blob\" or self.__cacheBlobs):\n            cache[\"object:\" + sha1] = git_object\n\n        if self.__db:\n            self.__db.recordProfiling(\"fetch: \" + type, after - before)\n\n        return git_object\n\n    def run(self, command, *arguments, **kwargs):\n        return self.runCustom(self.path, command, *arguments, **kwargs)\n\n    def runCustom(self, cwd, command, *arguments, **kwargs):\n        argv = [configuration.executables.GIT, command]\n        argv.extend(arguments)\n        stdin_data = kwargs.get(\"input\")\n        if stdin_data is None: stdin = None\n        else: stdin = subprocess.PIPE\n        env = {}\n        env.update(os.environ)\n        env.update(configuration.executables.GIT_ENV)\n        env.update(kwargs.get(\"env\", {}))\n        if \"GIT_DIR\" in env: del env[\"GIT_DIR\"]\n        git = subprocess.Popen(argv, stdin=stdin, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE, cwd=cwd, env=env)\n        stdout, stderr = git.communicate(stdin_data)\n        if kwargs.get(\"check_errors\", True):\n            if git.returncode == 0:\n                if 
kwargs.get(\"include_stderr\", False):\n                    return stdout + stderr\n                else:\n                    return stdout\n            else:\n                cmdline = \" \".join(argv)\n                output = stderr.strip()\n                raise GitCommandError(cmdline, output, cwd)\n        else:\n            return git.returncode, stdout, stderr\n\n    def createBranch(self, name, startpoint):\n        argv = [configuration.executables.GIT, 'branch', name, startpoint]\n        git = subprocess.Popen(argv, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE, cwd=self.path)\n        stdout, stderr = git.communicate()\n        if git.returncode != 0:\n            cmdline = \" \".join(argv)\n            output = stderr.strip()\n            raise GitCommandError(cmdline, output, self.path)\n\n    def deleteBranch(self, name):\n        argv = [configuration.executables.GIT, 'branch', '-D', name]\n        git = subprocess.Popen(argv, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE, cwd=self.path)\n        stdout, stderr = git.communicate()\n        if git.returncode != 0:\n            cmdline = \" \".join(argv)\n            output = stderr.strip()\n            raise GitCommandError(cmdline, output, self.path)\n\n    def mergebase(self, commit_or_commits, db=None):\n        if db and isinstance(commit_or_commits, Commit):\n            cursor = db.cursor()\n            cursor.execute(\"SELECT mergebase FROM mergebases WHERE commit=%s\", (commit_or_commits.getId(db),))\n            try:\n                return cursor.fetchone()[0]\n            except:\n                result = self.mergebase(commit_or_commits)\n                cursor.execute(\"INSERT INTO mergebases (commit, mergebase) VALUES (%s, %s)\", (commit_or_commits.getId(db), result))\n                return result\n\n        try: sha1s = commit_or_commits.parents\n        except: sha1s = map(str, commit_or_commits)\n\n        assert 
len(sha1s) >= 2\n\n        argv = [configuration.executables.GIT, 'merge-base'] + sha1s\n        git = subprocess.Popen(argv, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE, cwd=self.path)\n        stdout, stderr = git.communicate()\n        if git.returncode == 0: return stdout.strip()\n        else:\n            cmdline = \" \".join(argv)\n            output = stderr.strip()\n            raise GitCommandError(cmdline, output, self.path)\n\n    def getCommonAncestor(self, commit_or_commits):\n        try: sha1s = commit_or_commits.parents\n        except: sha1s = list(commit_or_commits)\n\n        assert len(sha1s) >= 2\n\n        mergebases = [self.mergebase([sha1s[0], sha1]) for sha1 in sha1s[1:]]\n\n        if len(mergebases) == 1: return mergebases[0]\n        else: return self.getCommonAncestor(mergebases)\n\n    def revparse(self, name):\n        git = subprocess.Popen(\n            [configuration.executables.GIT, 'rev-parse', '--verify', '--quiet', name],\n            stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=self.path)\n        stdout, stderr = git.communicate()\n        if git.returncode == 0: return stdout.strip()\n        else: raise GitReferenceError(\"'git rev-parse' failed: %s\" % stderr.strip(), ref=name, repository=self)\n\n    def revlist(self, included, excluded, *args, **kwargs):\n        args = list(args)\n        args.extend([str(ref) for ref in included])\n        args.extend(['^' + str(ref) for ref in excluded])\n        if \"paths\" in kwargs:\n            args.append(\"--\")\n            args.extend(kwargs[\"paths\"])\n        return self.run('rev-list', *args).splitlines()\n\n    def iscommit(self, name):\n        git = subprocess.Popen(\n            [configuration.executables.GIT, 'cat-file', '-t', name],\n            stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=self.path)\n        stdout, stderr = git.communicate()\n        if git.returncode == 0: return stdout.strip() == \"commit\"\n     
   else: return False\n\n    def createref(self, name, value):\n        assert name.startswith(\"refs/\")\n        self.run(\"update-ref\", name, str(value), \"0\" * 40)\n\n    def updateref(self, name, new_value, old_value=None):\n        assert name.startswith(\"refs/\")\n        args = [\"update-ref\", name, str(new_value)]\n        if old_value is not None:\n            args.append(str(old_value))\n        self.run(*args)\n\n    def deleteref(self, name, value=None):\n        assert name.startswith(\"refs/\")\n        args = [\"update-ref\", \"-d\", name]\n        if value is not None:\n            args.append(str(value))\n        self.run(*args)\n\n    def keepalive(self, commit):\n        sha1 = str(commit)\n        self.run(\"update-ref\", KEEPALIVE_REF_PREFIX + sha1, sha1)\n        return sha1\n\n    def packKeepaliveRefs(self):\n        \"\"\"\n        Pack the repository's keepalive refs into a single chain\n        \"\"\"\n\n        def splitRefs(output):\n            return [(int(timestamp.split()[0]), sha1, timestamp)\n                    for sha1, _, timestamp in\n                    (line.partition(\":\")\n                     for line in\n                     output.splitlines())\n                    # Skip the root commit, which has summary \"Root\".\n                    if len(sha1) == 40]\n\n        loose_keepalive_refs = splitRefs(\n            self.run(\"for-each-ref\",\n                     \"--sort=committerdate\",\n                     \"--format=%(objectname):%(committerdate:raw)\",\n                     KEEPALIVE_REF_PREFIX))\n\n        if not loose_keepalive_refs:\n            # No loose refs => no need to (re)pack.\n            return\n\n        try:\n            old_value = self.revparse(KEEPALIVE_REF_CHAIN)\n        except GitReferenceError:\n            old_value = \"0\" * 40\n            packed_keepalive_refs = []\n        else:\n            packed_keepalive_refs = splitRefs(\n                self.run(\"log\",\n                       
  \"--first-parent\",\n                         \"--date=raw\",\n                         \"--format=%s:%cd\",\n                         old_value))\n\n        keepalive_refs = sorted(packed_keepalive_refs + loose_keepalive_refs)\n\n        env = getGitEnvironment()\n\n        def withDates(env, timestamp):\n            env[\"GIT_AUTHOR_DATE\"] = timestamp\n            env[\"GIT_COMMITTER_DATE\"] = timestamp\n            return env\n\n        # Note: we don't keep the generated commits alive by updating refs while\n        # doing this.  Since commit-tree itself produces unreferenced objects,\n        # it seems unlikely it will ever run an automatic GC, and if someone\n        # else triggers a GC while we're working, and it prunes our objects,\n        # then we'll fail, which is no big deal (we'd just leave the existing\n        # keepalive refs unmodified.)\n        #\n        # Also note: in most cases, the repacked keepalive chain will end up\n        # reusing the commit objects from the existing keepalive chain, since\n        # all meta-data in the generated commits come from the commits that we\n        # keep alive, and the order stable.\n\n        try:\n            processed = set()\n\n            new_value = self.run(\n                \"commit-tree\", EMPTY_TREE_SHA1, input=\"Root\",\n                env=withDates(env, keepalive_refs[0][2])).strip()\n\n            for _, sha1, timestamp in keepalive_refs:\n                if sha1 in processed:\n                    continue\n                processed.add(sha1)\n\n                new_value = self.run(\n                    \"commit-tree\", EMPTY_TREE_SHA1, \"-p\", new_value, \"-p\", sha1,\n                    input=sha1, env=withDates(env, timestamp)).strip()\n\n            self.updateref(KEEPALIVE_REF_CHAIN, new_value, old_value)\n        except GitCommandError:\n            # No big deal if this fails here; this is just a maintenance\n            # operation.  
We'll try again another day.\n            return False\n\n        for _, sha1, _ in loose_keepalive_refs:\n            try:\n                self.deleteref(KEEPALIVE_REF_PREFIX + sha1, sha1)\n            except GitCommandError:\n                # Ignore failures to delete loose keepalive refs.\n                pass\n\n        return True\n\n    @contextlib.contextmanager\n    def temporaryref(self, commit=None):\n        if commit:\n            sha1 = self.revparse(str(commit))\n            name = \"refs/temporary/%s\" % sha1\n            self.createref(name, sha1)\n        else:\n            sha1 = None\n            name = \"refs/temporary/%s-%s\" % (time.strftime(\"%Y%m%d%H%M%S\"),\n                                             base64.b32encode(os.urandom(10)))\n        try:\n            yield name\n        finally:\n            self.deleteref(name, sha1)\n\n    def __copy(self, identifier, flavor):\n        base_args = [\"clone\", \"--quiet\"]\n\n        if flavor == \"relay\":\n            base_args.append(\"--bare\")\n            base_dir = REPOSITORY_RELAYCOPY_DIR\n        else:\n            assert flavor == \"work\"\n            base_dir = REPOSITORY_WORKCOPY_DIR\n\n        class Copy(object):\n            def __init__(self, origin, path, name):\n                self.origin = origin\n                self.path = path\n                self.name = name\n            def run(self, *args, **kwargs):\n                return self.origin.runCustom(\n                    os.path.join(self.path, self.name), *args, **kwargs)\n            def __enter__(self):\n                return self\n            def __exit__(self, *args):\n                shutil.rmtree(self.path)\n                return False\n\n        path = tempfile.mkdtemp(prefix=\"%s_%s_\" % (self.name, identifier),\n                                dir=base_dir)\n        name = os.path.basename(self.path)\n\n        local_args = base_args[:]\n        if not same_filesystem(self.path, path):\n            
local_args.append(\"--shared\")\n        local_args.extend([self.path, name])\n\n        fallback_args = base_args[:]\n        fallback_args.extend([\"file://\" + os.path.abspath(self.path), name])\n\n        try:\n            # Try cloning with --local (implied by using a plain path as the\n            # repository URL.)  This may fail due to inaccessible pack-*.keep\n            # files in the repository.\n            self.runCustom(path, *local_args)\n        except GitCommandError:\n            try:\n                # Try cloning without --local (implied by using a file://\n                # repository URL.)  This is slower and uses more disk space, but\n                # is immune to the problems with inaccessible pack-*.keep files.\n                self.runCustom(path, *fallback_args)\n            except GitCommandError:\n                shutil.rmtree(path)\n                raise\n\n        return Copy(self, path, name)\n\n    def relaycopy(self, identifier):\n        return self.__copy(identifier, \"relay\")\n\n    def workcopy(self, identifier):\n        return self.__copy(identifier, \"work\")\n\n    def replaymerge(self, db, user, commit):\n        with self.workcopy(commit.sha1) as workcopy:\n            with self.temporaryref(commit) as ref_name:\n                # Fetch the merge to replay from the main repository into the work copy.\n                workcopy.run('fetch', 'origin', ref_name)\n\n            parent_sha1s = commit.parents\n\n            # Create and check out a branch at first parent.\n            workcopy.run('checkout', '-b', 'replay', parent_sha1s[0])\n\n            # Then perform the merge with the other parents.\n            returncode, stdout, stderr = workcopy.run(\"merge\", *parent_sha1s[1:],\n                env=getGitEnvironment(author=commit.author),\n                check_errors=False)\n\n            # If the merge produced conflicts, just stage and commit them:\n            if returncode != 0:\n                # Reset any 
submodule gitlinks with conflicts: since we don't\n                # have the submodules checked out, \"git commit --all\" below\n                # may fail to index them.\n                for line in stdout.splitlines():\n                    if line.startswith(\"CONFLICT (submodule):\"):\n                        submodule_path = line.split()[-1]\n                        workcopy.run(\"reset\", \"--\", submodule_path, check_errors=False)\n\n                # Then stage and commit the result, with conflict markers and all.\n                workcopy.run(\"commit\", \"--all\", \"--message=replay of merge that produced %s\" % commit.sha1,\n                             env=getGitEnvironment(author=commit.author))\n\n            sha1 = workcopy.run(\"rev-parse\", \"HEAD\").strip()\n\n            # Then push the commit to the main repository.\n            workcopy.run('push', 'origin', 'HEAD:refs/keepalive/' + sha1)\n\n            commit = Commit.fromSHA1(db, self, sha1)\n\n            # Finally, return the resulting commit.\n            return commit\n\n    def getSignificantBranches(self, db):\n        \"\"\"Return an iterator of \"significant\" branches\n\n           A branch is considered significant if it is referenced by the\n           repository's HEAD (if that's a symbolic ref) or if it is set up to\n           track a remote branch.\"\"\"\n        import dbutils\n        head_branch = self.getHeadBranch(db)\n        if head_branch:\n            yield head_branch\n        cursor = db.cursor()\n        cursor.execute(\n            \"\"\"SELECT local_name\n                 FROM trackedbranches\n                 JOIN branches ON (branches.repository=trackedbranches.repository\n                               AND branches.name=trackedbranches.local_name)\n                WHERE branches.repository=%s\n                  AND branches.type='normal'\n             ORDER BY trackedbranches.id ASC\"\"\",\n                       (self.id,))\n        for (branch_name,) in 
cursor:\n            if head_branch and head_branch.name == branch_name:\n                continue\n            yield dbutils.Branch.fromName(db, self, branch_name)\n\n    def getDefaultRemote(self, db):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT remote\n                            FROM trackedbranches\n                           WHERE repository=%s\n                             AND local_name IN ('*', 'master')\n                        ORDER BY local_name\n                           LIMIT 1\"\"\",\n                       (self.id,))\n        row = cursor.fetchone()\n        return row[0] if row else None\n\n    def updateBranchFromRemote(self, db, remote, branch_name):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM trackedbranches\n                           WHERE repository=%s\n                             AND local_name=%s\n                             AND NOT disabled\"\"\",\n                       (self.id, branch_name))\n        if cursor.fetchone():\n            # Don't update a branch that the branch tracker service owns;\n            # instead just assume it's already up-to-date.\n            return\n\n        if not branch_name.startswith(\"refs/\"):\n            branch_name = \"refs/heads/%s\" % branch_name\n\n        with self.relaycopy(\"updateBranchFromRemote\") as relay:\n            try:\n                relay.run(\"fetch\", remote, branch_name)\n            except GitCommandError as error:\n                if error.output.startswith(\"fatal: Couldn't find remote ref \"):\n                    raise GitReferenceError(\"Couldn't find ref %s in %s.\" % (branch_name, remote),\n                                            ref=branch_name, repository=remote)\n                raise\n\n            relay.run(\"push\", \"-f\", \"origin\", \"FETCH_HEAD:%s\" % branch_name)\n\n    @contextlib.contextmanager\n    def fetchTemporaryFromRemote(self, db, remote, ref):\n        import 
index\n\n        with self.temporaryref() as temporary_ref:\n            try:\n                self.run(\"fetch\", remote, \"%s:%s\" % (ref, temporary_ref))\n            except GitCommandError as error:\n                if error.output.startswith(\"fatal: Couldn't find remote ref \"):\n                    raise GitReferenceError(\"Couldn't find ref %s in %s.\" % (ref, remote), ref=ref, repository=remote)\n                elif error.output.startswith(\"fatal: Invalid refspec \"):\n                    raise GitReferenceError(\"Invalid ref %r.\" % ref, ref=ref)\n                raise\n\n            sha1 = self.run(\"rev-parse\", \"--verify\", temporary_ref).strip()\n\n            index.processCommits(db, self, sha1)\n\n            yield sha1\n\n    @staticmethod\n    def readObject(repository_path, object_type, object_sha1):\n        argv = [configuration.executables.GIT, 'cat-file', object_type, object_sha1]\n        git = subprocess.Popen(argv, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE, cwd=repository_path)\n        stdout, stderr = git.communicate()\n        if git.returncode != 0:\n            raise GitCommandError(\" \".join(argv), stderr.strip(), repository_path)\n        return stdout\n\n    @staticmethod\n    def lsremote(remote, include_heads=False, include_tags=False, pattern=None, regexp=None):\n        if regexp: name_check = lambda item: bool(regexp.match(item[1]))\n        else: name_check = lambda item: True\n\n        argv = [configuration.executables.GIT, 'ls-remote']\n\n        if include_heads: argv.append(\"--heads\")\n        if include_tags: argv.append(\"--tags\")\n\n        argv.append(remote)\n\n        if pattern: argv.append(pattern)\n\n        git = subprocess.Popen(argv, stdout=subprocess.PIPE,\n                               stderr=subprocess.PIPE)\n        stdout, stderr = git.communicate()\n\n        if git.returncode == 0:\n            return filter(name_check, (line.split() for line in 
stdout.splitlines()))\n        else:\n            cmdline = \" \".join(argv)\n            output = stderr.strip()\n            cwd = os.getcwd()\n            raise GitCommandError(cmdline, output, cwd)\n\n    def findInterestingTag(self, db, sha1):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT name FROM tags WHERE repository=%s AND sha1=%s\",\n                       (self.id, sha1))\n\n        tags = [tag for (tag,) in cursor]\n\n        try:\n            from customization.filtertags import filterTags\n            tags = filterTags(self, tags)\n        except ImportError:\n            pass\n\n        if tags: return tags[0]\n        else: return None\n\n    def getHead(self, db):\n        return Commit.fromSHA1(db, self, self.revparse(\"HEAD\"))\n\n    def getHeadBranch(self, db):\n        \"\"\"Return the branch that HEAD references\n\n           None is returned if HEAD is not a symbolic ref or if it references a\n           ref not under refs/heads/.\"\"\"\n        import dbutils\n        try:\n            ref_name = self.run(\"symbolic-ref\", \"--quiet\", \"HEAD\").strip()\n        except GitCommandError:\n            # HEAD is not a symbolic ref.\n            pass\n        else:\n            if ref_name.startswith(\"refs/heads/\"):\n                branch_name = ref_name[len(\"refs/heads/\"):]\n                return dbutils.Branch.fromName(db, self, branch_name)\n\n    def isEmpty(self):\n        try:\n            self.revparse(\"HEAD\")\n            return False\n        except GitError:\n            return True\n\n    def invokeGitHttpBackend(self, req, user, path):\n        request_environ = req.getEnvironment()\n\n        environ = { \"GIT_HTTP_EXPORT_ALL\": \"true\",\n                    \"REMOTE_ADDR\": request_environ.get(\"REMOTE_ADDR\", \"unknown\"),\n                    \"PATH_TRANSLATED\": os.path.join(self.path, path),\n                    \"REQUEST_METHOD\": req.method,\n                    \"QUERY_STRING\": req.query }\n\n      
  if \"CONTENT_TYPE\" in request_environ:\n            environ[\"CONTENT_TYPE\"] = request_environ[\"CONTENT_TYPE\"]\n\n        for name, value in req.getEnvironment().items():\n            if name.startswith(\"HTTP_\"):\n                environ[name] = value\n\n        if not user.isAnonymous():\n            environ[\"REMOTE_USER\"] = user.name\n        elif not configuration.base.ALLOW_ANONYMOUS_USER \\\n                or path == \"git-receive-pack\" \\\n                or req.getParameter(\"service\", None) == \"git-receive-pack\":\n            # The git-receive-pack service fails without a user, so request\n            # authorization.\n            raise GitHttpBackendNeedsUser\n\n        git_http_backend = communicate.Communicate(subprocess.Popen(\n            [configuration.executables.GIT, \"http-backend\"],\n            stdin=subprocess.PIPE, stdout=subprocess.PIPE,\n            stderr=subprocess.PIPE, env=environ))\n\n        def produceInput():\n            if req.method not in (\"POST\", \"PUT\"):\n                return None\n            else:\n                data = req.read(65536)\n                if not data:\n                    return None\n                return data\n\n        def handleHeaderLine(line):\n            line = line.strip()\n\n            if not line:\n                req.start()\n                git_http_backend.setCallbacks(stdout=handleOutput)\n                return\n\n            name, _, value = line.partition(\":\")\n            name = name.strip()\n            value = value.strip()\n\n            if name.lower() == \"status\":\n                status_code, _, status_text = value.partition(\" \")\n                req.setStatus(int(status_code), status_text.strip())\n            elif name.lower() == \"content-type\":\n                req.setContentType(value)\n            else:\n                req.addResponseHeader(name, value)\n\n        def handleOutput(data):\n            req.write(data)\n\n        
git_http_backend.setInput(produceInput)\n        git_http_backend.setCallbacks(stdout_line=handleHeaderLine)\n\n        try:\n            _, stderr = git_http_backend.run()\n        except communicate.ProcessError as error:\n            raise GitHttpBackendError(error.process.returncode, error.stderr)\n\n    def describe(self, db, sha1):\n        tag = self.findInterestingTag(db, sha1)\n        if tag:\n            return tag\n\n        commit = Commit.fromSHA1(db, self, sha1)\n\n        for branch in self.getSignificantBranches(db):\n            if commit == branch.head_sha1:\n                return \"tip of \" + branch.name\n            elif commit.isAncestorOf(branch.head_sha1):\n                return branch.name\n\n        return None\n\nclass CommitUserTime(object):\n    def __init__(self, name, email, time):\n        self.name = name\n        self.email = email\n        self.time = time\n\n    def __getIds(self, db):\n        cache = db.storage[\"CommitUserTime\"]\n        cache_key = (self.name, self.email)\n\n        if cache_key not in cache:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT id\n                                FROM gitusers\n                               WHERE fullname=%s\n                                 AND email=%s\"\"\",\n                           (self.name, self.email))\n            row = cursor.fetchone()\n            if not row:\n                cursor.execute(\"\"\"INSERT INTO gitusers (fullname, email)\n                                       VALUES (%s, %s)\n                                    RETURNING id\"\"\",\n                               (self.name, self.email))\n                row = cursor.fetchone()\n            gituser_id, = row\n\n            cursor.execute(\"\"\"SELECT uid\n                                FROM usergitemails\n                               WHERE email=%s\"\"\",\n                           (self.email,))\n            user_ids = frozenset(user_id for user_id, in cursor)\n\n   
         cache[cache_key] = user_ids, gituser_id\n\n        return cache.get(cache_key)\n\n    def getUserIds(self, db):\n        return self.__getIds(db)[0]\n\n    def getGitUserId(self, db):\n        return self.__getIds(db)[1]\n\n    def getFullname(self, db):\n        user_ids = self.getUserIds(db)\n        if len(user_ids) == 1:\n            import dbutils\n            return dbutils.User.fromId(db, tuple(user_ids)[0]).fullname\n        else:\n            return self.name\n\n    def __str__(self):\n        timestamp = time.strftime(\"%Y-%m-%d %H:%M:%S\", self.time)\n        return \"%s <%s> at %s\" % (self.name, self.email, timestamp)\n\n    @staticmethod\n    def fromValue(value):\n        match = re_author_committer.match(value)\n        return CommitUserTime(textutils.decode(match.group(1)).encode(\"utf-8\"),\n                              textutils.decode(match.group(2)).encode(\"utf-8\"),\n                              time.gmtime(int(match.group(3).split(\" \")[0])))\n\nclass Commit:\n    def __init__(self, repository, id, sha1, parents, author, committer, message, tree):\n        self.repository = repository\n        self.id = id\n        self.sha1 = sha1\n        self.parents = parents\n        self.author = author\n        self.committer = committer\n        self.message = message\n        self.tree = tree\n        self.__treeCache = {}\n\n    def __cache(self, db):\n        cache = db.storage[\"Commit\"]\n        if self.id: cache[self.id] = self\n        cache[self.sha1] = self\n\n    @staticmethod\n    def fromGitObject(db, repository, gitobject, commit_id=None):\n        assert gitobject.type == \"commit\"\n\n        data = gitobject.data\n        parents = []\n\n        while True:\n            line, data = data.split('\\n', 1)\n\n            if not line:\n                break\n\n            key, value = line.split(' ', 1)\n\n            if key == 'tree': tree = value\n            elif key == 'parent': parents.append(value)\n            elif key 
== 'author': author = CommitUserTime.fromValue(value)\n            elif key == 'committer': committer = CommitUserTime.fromValue(value)\n\n        message = textutils.decode(data).encode(\"utf-8\")\n\n        commit = Commit(repository, commit_id, gitobject.sha1, parents, author,\n                        committer, message, tree)\n        commit.__cache(db)\n        return commit\n\n    @staticmethod\n    def fromSHA1(db, repository, sha1, commit_id=None):\n        return Commit.fromGitObject(db, repository, repository.fetch(sha1), commit_id)\n\n    @staticmethod\n    def fromId(db, repository, commit_id):\n        commit = db.storage[\"Commit\"].get(commit_id)\n        if not commit:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT sha1 FROM commits WHERE id=%s\", (commit_id,))\n            sha1 = cursor.fetchone()[0]\n            commit = Commit.fromSHA1(db, repository, sha1, commit_id)\n        return commit\n\n    @staticmethod\n    def fromAPI(api_commit):\n        return Commit.fromSHA1(api_commit.critic.database,\n                               Repository.fromAPI(api_commit.repository),\n                               api_commit.sha1, api_commit.id)\n\n    def __hash__(self): return hash(self.sha1)\n    def __eq__(self, other): return self.sha1 == str(other)\n    def __ne__(self, other): return self.sha1 != str(other)\n    def __str__(self): return self.sha1\n    def __repr__(self):\n        if self.id is None: return \"Commit(sha1=%r)\" % self.sha1\n        else: return \"Commit(sha1=%r, id=%d)\" % (self.sha1, self.id)\n\n    def summary(self, maxlen=None):\n        summary = self.message.split(\"\\n\", 1)[0].strip()\n        if maxlen and len(summary) > maxlen:\n            summary = summary[:maxlen - 3].strip() + \"...\"\n        return summary\n\n    def niceSummary(self, include_tag=True):\n        try:\n            summary, _, rest = self.message.partition(\"\\n\")\n\n            if summary.startswith(\"fixup! 
\") or summary.startswith(\"squash! \"):\n                fixup_summary = rest.strip().partition(\"\\n\")[0].strip()\n                if fixup_summary:\n                    what = summary[:summary.index(\"!\")]\n                    if include_tag:\n                        return \"[%s] %s\" % (what, fixup_summary)\n                    else:\n                        return fixup_summary\n\n            return summary\n        except:\n            return self.summary()\n\n    def getId(self, db):\n        if self.id is None:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT id FROM commits WHERE sha1=%s\", (self.sha1,))\n            self.id = cursor.fetchone()[0]\n            self.__cache(db)\n        return self.id\n\n    def findInterestingTag(self, db):\n        return self.repository.findInterestingTag(db, self.sha1)\n\n    def describe(self, db):\n        if db:\n            tag = self.findInterestingTag(db)\n            if tag: return tag\n        return self.sha1[:8]\n\n    def oneline(self, db, decorate=False):\n        line = \"%s %s\" % (self.sha1[:8], self.niceSummary())\n        if decorate:\n            decorations = []\n            if self == self.repository.getHead(db):\n                decorations.append(\"HEAD\")\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT branches.name\n                                FROM branches\n                                JOIN reachable ON (reachable.branch=branches.id)\n                                JOIN commits ON (commits.id=reachable.commit)\n                               WHERE commits.sha1=%s\"\"\",\n                           (self.sha1,))\n            decorations.extend(branch for (branch,) in cursor)\n            if decorations:\n                line += \" (%s)\" % \", \".join(decorations)\n        return line\n\n    def isAncestorOf(self, other):\n        if isinstance(other, Commit):\n            if self.repository != other.repository: return False\n            
other_sha1 = other.sha1\n        else:\n            other_sha1 = str(other)\n\n        try:\n            mergebase_sha1 = self.repository.mergebase([self.sha1, other_sha1])\n        except GitCommandError:\n            # Merge-base fails if there is no common ancestor.  And if two\n            # commits have no common ancestor, neither can be an ancestor of the\n            # other, obviously.\n            return False\n        else:\n            return mergebase_sha1 == self.sha1\n\n    def getTree(self, path):\n        path = \"/\" + path.lstrip(\"/\")\n        if path not in self.__treeCache:\n            self.__treeCache[path] = Tree.fromPath(self, path)\n        return self.__treeCache[path]\n\n    def getFileEntry(self, path):\n        tree = self.getTree(os.path.dirname(path))\n        if tree is None:\n            return None\n        return tree.get(os.path.basename(path))\n\n    def getFileSHA1(self, path):\n        entry = self.getFileEntry(path)\n        if entry is None:\n            return None\n        return entry.sha1\n\n    def isDirectory(self, path):\n        return self.getTree(path) is not None\n\nRE_LSTREE_LINE = re.compile(\n    \"(?P<mode>[0-9]{6}) (?P<type>blob|tree|commit) (?P<sha1>[0-9a-f]{40}) +\"\n    \"(?P<size>[0-9]+|-)\\t(?P<quote>[\\\"']?)(?P<name>.*)(?P=quote)$\")\n\nclass Tree:\n    class Entry:\n        class Mode(int):\n            def __new__(cls, value):\n                return super(Tree.Entry.Mode, cls).__new__(cls, int(value, 8))\n\n            def __str__(self):\n                if stat.S_ISDIR(self):\n                    return \"d---------\"\n                elif self == 0160000:\n                    return \"m---------\"\n                else:\n                    if stat.S_ISLNK(self): string = \"l\"\n                    else: string = \"-\"\n\n                    flags = [\"---\", \"--x\", \"-w-\", \"-wx\", \"r--\", \"r-x\", \"rw-\", \"rwx\"]\n                    return string + flags[(self & 0700) >> 6] + 
flags[(self & 070) >> 3] + flags[self & 07]\n\n        def __init__(self, name, mode, type, sha1, size):\n            if len(name) > 2 and name[0] in ('\"', \"'\") and name[-1] == name[0]:\n                name = diff.parse.demunge(name[1:-1])\n\n            self.name = name\n            self.mode = Tree.Entry.Mode(mode)\n            self.type = type\n            self.sha1 = sha1\n            self.size = size\n\n        def __str__(self):\n            return self.name\n\n        def __repr__(self):\n            return \"[%s %s %s %s%s]\" % (self.mode, self.type, self.name, self.sha1[:8], \" %d\" % self.size if self.size else \"\")\n\n    def __init__(self, entries, commit=None):\n        self.__entries_list = entries\n        self.__entries_dict = dict([(entry.name, entry) for entry in entries])\n\n    def __getitem__(self, item):\n        if type(item) == int:\n            return self.__entries_list[item]\n        else:\n            return self.__entries_dict[str(item)]\n\n    def __len__(self):\n        return len(self.__entries_list)\n    def __iter__(self):\n        return iter(self.__entries_list)\n\n    def keys(self):\n        return self.__entries_dict.keys()\n    def items(self):\n        return self.__entries_dict.items()\n    def values(self):\n        return self.__entries_dict.values()\n    def get(self, key, default=None):\n        return self.__entries_dict.get(key, default)\n\n    @staticmethod\n    def fromPath(commit, path):\n        assert path[0] == \"/\"\n\n        if path == \"/\":\n            what = commit.sha1\n        else:\n            if path[-1] != \"/\": path += \"/\"\n            what = \"%s:%s\" % (commit.sha1, path[1:])\n\n        entries = []\n\n        try:\n            lstree_output = commit.repository.run(\"ls-tree\", \"-l\", what)\n        except GitCommandError as error:\n            if error.output == \"fatal: Not a valid object name %s\" % what:\n                return None\n            raise\n\n        for line in 
lstree_output.splitlines():\n            match = RE_LSTREE_LINE.match(line)\n\n            assert match, \"Unexpected output from 'git ls-tree': %r\" % line\n\n            name = match.group(\"name\")\n            if match.group(\"quote\"):\n                name = diff.parse.demunge(name)\n\n            if match.group(\"type\") == \"blob\":\n                size = int(match.group(\"size\"))\n            else:\n                size = None\n\n            entries.append(Tree.Entry(name=name,\n                                      mode=match.group(\"mode\"),\n                                      type=match.group(\"type\"),\n                                      sha1=match.group(\"sha1\"),\n                                      size=size))\n\n        return Tree(entries)\n\n    @staticmethod\n    def fromSHA1(repository, sha1):\n        data = repository.fetch(sha1).data\n        entries = []\n\n        while len(data):\n            space = data.index(\" \")\n            null = data.index(\"\\0\", space + 1)\n\n            mode = data[:space]\n            name = data[space + 1:null]\n\n            sha1_binary = data[null + 1:null + 21]\n            sha1 = \"\".join([(\"%02x\" % ord(c)) for c in sha1_binary])\n\n            entry_object = repository.fetch(sha1, fetchData=False)\n\n            entries.append(Tree.Entry(name, mode, entry_object.type, sha1, entry_object.size))\n\n            data = data[null + 21:]\n\n        return Tree(entries)\n\ndef getTaggedCommit(repository, sha1):\n    \"\"\"Returns the SHA-1 of the tagged commit.\n\n       If the supplied SHA-1 sum is a commit object, then it is returned,\n       otherwise it must be a tag object, which is parsed to retrieve the\n       tagged object SHA-1 sum.\"\"\"\n\n    while True:\n        git_object = repository.fetch(sha1)\n\n        if git_object.type == \"commit\":\n            return sha1\n        elif git_object.type != \"tag\":\n            return\n\n        sha1 = git_object.data.split(\"\\n\", 
1)[0].split(\" \", 1)[-1]\n\nclass Blame:\n    def __init__(self, from_commit, to_commit):\n        assert from_commit.repository == to_commit.repository\n\n        self.repository = from_commit.repository\n        self.from_commit = from_commit\n        self.to_commit = to_commit\n        self.commits = []\n        self.__commit_ids = {}\n\n    def blame(self, db, path, first_line, last_line):\n        output = self.repository.run(\"blame\",\n                                     \"--porcelain\",\n                                     \"-L\", \"%d,%d\" % (first_line, last_line),\n                                     \"%s..%s\" % (self.from_commit.sha1, self.to_commit.sha1),\n                                     \"--\", path)\n\n        inlines = iter(output.splitlines())\n        lines = []\n\n        try:\n            while True:\n                sha1, original_line, current_line = inlines.next().split(\" \")[:3]\n\n                original_line = int(original_line)\n                current_line = int(current_line)\n\n                author = None\n                author_email = None\n\n                line = inlines.next()\n                while not line.startswith(\"\\t\"):\n                    if line.startswith(\"author \"): author = line[7:]\n                    elif line.startswith(\"author-mail \"): author_email = line[13:-1]\n                    elif line.startswith(\"summary \"): pass\n                    line = inlines.next()\n\n                if sha1 not in self.__commit_ids:\n                    commit = Commit.fromSHA1(db, self.repository, sha1)\n\n                    self.__commit_ids[sha1] = len(self.commits)\n                    self.commits.append({ \"sha1\": sha1,\n                                          \"author_name\": author,\n                                          \"author_email\": author_email,\n                                          \"summary\": commit.niceSummary(),\n                                          \"message\": 
commit.message,\n                                          \"original\": sha1 == self.from_commit.sha1,\n                                          \"current\": sha1 == self.to_commit.sha1 })\n\n                lines.append({ \"offset\": current_line,\n                               \"commit\": self.__commit_ids[sha1] })\n        except StopIteration:\n            pass\n\n        return lines\n\nclass FetchCommits(threading.Thread):\n    def __init__(self, repository, sha1s):\n        super(FetchCommits, self).__init__()\n\n        self.repository = repository\n        self.sha1s = sha1s\n        self.gitobjects = []\n        self.commits = None\n        self.error = None\n        self.joined = False\n\n        self.start()\n\n    def run(self):\n        try:\n            batch = subprocess.Popen(\n                [configuration.executables.GIT, 'cat-file', '--batch'],\n                stdin=subprocess.PIPE, stdout=subprocess.PIPE,\n                stderr=subprocess.STDOUT, cwd=self.repository.path)\n\n            stdout, stderr = batch.communicate(\"\\n\".join(self.sha1s.keys()) + \"\\n\")\n\n            gitobjects = []\n\n            for sha1, commit_id in self.sha1s.items():\n                line, stdout = stdout.split(\"\\n\", 1)\n\n                try:\n                    object_sha1, object_type, object_size = line.split(\" \")\n                except ValueError:\n                    raise SyntaxError(\"unexpected line: %r\" % line)\n\n                assert object_sha1 == sha1, \"%s != %s\" % (object_sha1, sha1)\n                assert object_type == \"commit\"\n\n                object_size = int(object_size)\n\n                object_data = stdout[:object_size]\n                stdout = stdout[object_size + 1:]\n\n                gitobjects.append((GitObject(object_sha1, object_type, object_size, object_data), commit_id))\n\n            self.gitobjects = gitobjects\n        except Exception:\n            self.error = traceback.format_exc()\n\n    def 
getCommits(self, db):\n        self.join()\n\n        for gitobject, commit_id in self.gitobjects:\n            Commit.fromGitObject(db, self.repository, gitobject, commit_id)\n"
  },
  {
    "path": "src/gitutils_unittest.py",
    "content": "def keepalives():\n    # Run Repository.packKeepaliveRefs() and make sure it seems to do its job\n    # correctly.  Since it's run as a nightly maintenance task, it would\n    # otherwise not be exercised by testing.\n\n    import api\n    import gitutils\n\n    critic = api.critic.startSession(for_testing=True)\n\n    for repository in api.repository.fetchAll(critic):\n        # Fetch the \"internal\" repository object.  This is a bit ugly, but we\n        # can live with it in a test case.\n        repository = repository._impl.getInternal(critic)\n\n        # Make sure there's at least one loose keepalive ref.\n        repository.keepalive(repository.revparse(\"HEAD\"))\n\n        loose_keepalive_refs_before = repository.run(\n            \"for-each-ref\",\n            \"--format=%(objectname)\",\n            gitutils.KEEPALIVE_REF_PREFIX).splitlines()\n\n        assert len(loose_keepalive_refs_before) > 0\n\n        repository.packKeepaliveRefs()\n\n        loose_keepalive_refs_after = repository.run(\n            \"for-each-ref\",\n            \"--format=%(objectname)\",\n            gitutils.KEEPALIVE_REF_PREFIX).splitlines()\n\n        assert len(loose_keepalive_refs_after) == 0\n\n        chain_before = repository.revparse(gitutils.KEEPALIVE_REF_CHAIN)\n\n        # Check that all previous loose keepalive refs are now ancestors of the\n        # keepalive chain ref (IOW, are being kept alive by it.)\n        for sha1 in loose_keepalive_refs_before:\n            mergebase = repository.mergebase([sha1, chain_before])\n            assert mergebase == sha1\n\n        # Make sure there's a loose keepalive ref again.\n        repository.keepalive(repository.revparse(\"HEAD\"))\n        repository.packKeepaliveRefs()\n\n        chain_after = repository.revparse(gitutils.KEEPALIVE_REF_CHAIN)\n\n        # Make sure the chain didn't change.\n        assert chain_before == chain_after, (\"%s != %s\"\n                                             % 
(chain_before, chain_after))\n\n    print \"keepalives: ok\"\n"
  },
  {
    "path": "src/hooks/pre-receive",
    "content": "#!/usr/bin/env python\n# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom sys import stdin, stdout, path, exit\nfrom os import getcwd, getuid, environ\nfrom pwd import getpwuid\nfrom socket import socket, AF_UNIX, SOCK_STREAM, SHUT_WR\nfrom subprocess import Popen, PIPE\n\ndef gitconfig(name):\n    process = Popen([\"git\", \"config\", name], stdout=PIPE)\n    stdout, stderr = process.communicate()\n    if process.returncode == 0: return stdout.strip()\n    else: return None\n\nsocket_path = gitconfig(\"critic.socket\")\nrepository_name = gitconfig(\"critic.name\")\n\nif not socket_path or not repository_name:\n    print \"\"\"Repository is not configured properly!  Please add\n\n[critic]\n\\tsocket = <socket path>\n\\tname = <repository name>\n\nto the repository's configuration file.\"\"\"\n    exit(1)\n\nserver_socket = socket(AF_UNIX, SOCK_STREAM)\n\ntry:\n    server_socket.connect(socket_path)\nexcept:\n    print \"Failed to connect to Critic's githook service!\"\n    exit(1)\n\n# Line 1: user name.\ndata = getpwuid(getuid()).pw_name + \"\\n\"\n\n# Line 2: $REMOTE_USER or empty string if undefined.  
This will only be used\n#         if the actual user (line 1) is the Critic system user.\ndata += environ.get(\"REMOTE_USER\", \"\") + \"\\n\"\n\n# Line 3: repository name.\ndata += repository_name + \"\\n\"\n\n# Line 4: flags from $CRITIC_FLAGS or empty string if undefined.\ndata += environ.get(\"CRITIC_FLAGS\", \"\") + \"\\n\"\n\n# Line 5-N: input to the git hook.\ndata += stdin.read()\n\ntry:\n    server_socket.sendall(data)\n    server_socket.shutdown(SHUT_WR)\nexcept:\n    print \"Failed to send command to Critic!\"\n    exit(1)\n\ndata = \"\"\n\ntry:\n    while True:\n        received = server_socket.recv(4096)\n        if not received: break\n        data += received\n\n        while \"\\n\" in data and data != \"ok\\n\":\n            line_length = data.index(\"\\n\") + 1\n            line = data[:line_length]\n            data = data[line_length:]\n            stdout.write(line)\n\n        stdout.flush()\n\n    server_socket.close()\nexcept:\n    print \"Failed to read result from Critic!\"\n    exit(1)\n\nif data == \"ok\\n\": exit(0)\nelse: exit(1)\n"
  },
  {
    "path": "src/htmlutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport time\nimport os\nimport json\nimport urllib\n\nimport textutils\n\nfrom cStringIO import StringIO\n\nfrom linkify import ALL_LINKTYPES, Context\n\nfragments = []\nfor linktype in ALL_LINKTYPES:\n    if linktype.fragment:\n        fragments.append(linktype.fragment)\nre_linkify = re.compile(\"(?:^|\\\\b|(?=\\\\W))(\" + \"|\".join(fragments) + \")([.,:;!?)]*(?:\\\\s|\\\\b|$))\")\n\nre_simple = re.compile(\"^[^ \\t\\r\\n&<>/=`'\\\"]+$\")\nre_nonascii = re.compile(\"[^\\t\\n\\r -\\x7f]\")\nre_control = re.compile(\"[\\x01-\\x1f\\x7f]\")\n\ndef htmlify(text, attributeValue=False, pretty=False):\n    if isinstance(text, unicode): text = re_nonascii.sub(lambda x: \"&#%d;\" % ord(x.group()), text.replace('&', '&amp;').replace('<', '&lt;').replace('>', '&gt;'))\n    else: text = str(text).replace('&', '&amp;').replace('<', '&lt;').replace('>', '&gt;')\n    if attributeValue:\n        if not pretty and re_simple.match(text): return text\n        elif \"'\" in text:\n            if '\"' not in text: text = '\"' + text + '\"'\n            else: text = \"'\" + text.replace(\"'\", '&apos;') + \"'\"\n        else: text = \"'\" + text + \"'\"\n        text = re_control.sub(lambda match: \"&#%d;\" % ord(match.group()), text)\n    return text\n\ndef jsify(what, as_json=False):\n    if what is None: 
return \"null\"\n    elif isinstance(what, bool): return \"true\" if what else \"false\"\n    elif isinstance(what, int) or isinstance(what, long): return str(what)\n    else:\n        what = textutils.decode(what)\n        result = json.dumps(what)\n        if not as_json:\n            quote = result[0]\n            return result.replace(\"</\", \"<%s+%s/\" % (quote, quote)).replace(\"<!\", \"<%s+%s!\" % (quote, quote))\n        else:\n            return result\n\nre_tag = re.compile(\"<[^>]+>\")\n\ndef tabify(line, tabwidth=8, indenttabsmode=True):\n    index = 0\n    length = len(line)\n    column = 0\n    result = \"\"\n\n    try:\n        leading = True\n        while index < length:\n            tabindex = line.index(\"\\t\", index)\n            nontabbed = line[index:tabindex]\n            nontabbed_length = len(re_tag.sub(\"\", nontabbed))\n            illegal = \"\"\n            if leading:\n                if nontabbed_length != 0:\n                    leading = False\n                elif not indenttabsmode:\n                    illegal = \" ill\"\n            width = tabwidth - (column + nontabbed_length) % tabwidth\n            result += nontabbed + \"<b class='t w%d%s'></b>\" % (width, illegal)\n            index = tabindex + 1\n            column = column + nontabbed_length + width\n    except ValueError:\n        # line.index() found no more tabs; append the remainder unchanged.\n        result += line[index:]\n\n    return result\n\nBLOCK_ELEMENTS = set([\"html\", \"head\", \"body\", \"section\", \"table\", \"thead\", \"tbody\", \"tfoot\", \"tr\", \"td\", \"th\", \"div\", \"p\", \"ol\", \"li\", \"label\", \"select\", \"option\", \"link\", \"script\"])\nEMPTY_ELEMENTS = set([\"br\", \"hr\", \"input\", \"link\", \"base\", \"col\"])\n\ndef isBlockElement(name):\n    return name in BLOCK_ELEMENTS\n\ndef isEmptyElement(name):\n    return name in EMPTY_ELEMENTS\n\ndef mtime(path):\n    try: return long(os.stat(path).st_mtime)\n    except OSError: return None\n\ndef base36(n):\n    s = \"\"\n    while n:\n        s = 
\"0123456789abcdefghijklmnopqrstuvwxyz\"[n % 36] + s\n        n = n // 36\n    return s\n\ndef getStaticResourceURI(name):\n    import configuration\n    uri = \"/static-resource/\" + name\n    ts = mtime(os.path.join(configuration.paths.INSTALL_DIR, \"resources\", name))\n    if ts: uri += \"?\" + base36(ts)\n    return uri\n\nclass URL(object):\n    def __init__(self, path, fragment=None, **query):\n        assert path.startswith(\"/\")\n        assert \"?\" not in path\n        assert \"#\" not in path\n        self.value = path\n        if query:\n            self.value += \"?\" + urllib.urlencode(\n                [(name, str(value)) for name, value in query.items()])\n        if fragment:\n            self.value += \"#\" + fragment.lstrip(\"#\")\n    def __str__(self):\n        return self.value\n\nclass MetaInformation(object):\n    def __init__(self):\n        self.__orderIndices = set()\n        self.__externalStylesheetList = []\n        self.__externalStylesheetSet = set()\n        self.__internalStylesheetList = []\n        self.__internalStylesheetSet = set()\n        self.__externalScriptList = []\n        self.__externalScriptSet = set()\n        self.__internalScriptList = []\n        self.__links = {}\n        self.__title = None\n        self.__finished = False\n        self.__request = None\n        self.__base = \"/\"\n\n    def addExternalStylesheet(self, uri, use_static=True, order=0):\n        if use_static:\n            uri = getStaticResourceURI(uri.split(\"/\", 1)[1])\n        if uri not in self.__externalStylesheetSet:\n            self.__orderIndices.add(order)\n            self.__externalStylesheetList.append((order, uri))\n            self.__externalStylesheetSet.add(uri)\n\n    def addInternalStylesheet(self, text, order=0):\n        if text not in self.__internalStylesheetSet:\n            self.__orderIndices.add(order)\n            self.__internalStylesheetList.append((order, text))\n            
self.__internalStylesheetSet.add(text)\n\n    def addExternalScript(self, uri, use_static=True, order=0):\n        if use_static:\n            uri = getStaticResourceURI(uri.split(\"/\", 1)[1])\n        if uri not in self.__externalScriptSet:\n            self.__orderIndices.add(order)\n            self.__externalScriptList.append((order, uri))\n            self.__externalScriptSet.add(uri)\n\n    def addInternalScript(self, text, order=0):\n        self.__orderIndices.add(order)\n        self.__internalScriptList.append((order, text))\n\n    def hasTitle(self):\n        return self.__title is not None\n\n    def setTitle(self, title):\n        self.__title = title\n\n    def setLink(self, rel, href):\n        return self.__links.setdefault(rel, href)\n\n    def setBase(self, base):\n        self.__base = base\n\n    def setRequest(self, req):\n        self.__request = req\n\n    def getRequest(self):\n        return self.__request\n\n    def render(self, target):\n        import configuration\n\n        if not self.__finished:\n            if self.__title:\n                target.title().text(self.__title)\n            if self.__base:\n                target.base(href=self.__base)\n            for rel, href in self.__links.items():\n                target.link(rel=rel, href=href)\n\n            for index in sorted(self.__orderIndices):\n                def filtered(items): return [data for order, data in items if order==index]\n\n                for uri in filtered(self.__externalStylesheetList):\n                    target.link(rel=\"stylesheet\", type=\"text/css\", href=uri)\n                for uri in filtered(self.__externalScriptList):\n                    target.script(type=\"text/javascript\", src=uri)\n                for text in filtered(self.__internalStylesheetList):\n                    target.style(type=\"text/css\").text(text.strip(), cdata=True)\n                for text in filtered(self.__internalScriptList):\n                    
target.script(type=\"text/javascript\").text(text.strip(), cdata=True)\n\n            if configuration.debug.IS_DEVELOPMENT:\n                favicon = \"/static-resource/favicon-dev.png\"\n            else:\n                favicon = \"/static-resource/favicon.png\"\n\n            target.link(rel=\"icon\", type=\"image/png\", href=favicon)\n\n            self.__finished = True\n\nclass PausedRendering: pass\n\nclass Fragment(object):\n    def __init__(self, is_element=False, req=None):\n        self.__children = []\n        self.__metaInformation = not is_element and MetaInformation() or None\n\n    def appendChild(self, child):\n        self.__children.append(child)\n        return child\n\n    def insertChild(self, child, offset=0):\n        self.__children.insert(offset, child)\n        return child\n\n    def removeChild(self, child):\n        assert child in self.__children\n        self.__children.remove(child)\n\n    def metaInformation(self):\n        return self.__metaInformation\n\n    def __len__(self): return len(self.__children)\n    def __getitem__(self, index): return self.__children[index]\n    def __str__(self): return \"\".join(map(str, self.__children))\n\n    def render(self, output, level=0, indent_before=True, stop=None, pretty=True):\n        for child in self.__children:\n            child.render(output, level, indent_before, stop=stop, pretty=pretty)\n            if pretty: output.write(\"\\n\")\n\n    def deleteChildren(self, count=None):\n        if count is None: self.__children = []\n        else: del self.__children[:count]\n\n    def hasChildren(self):\n        return bool(self.__children)\n\nclass Element(Fragment):\n    def __init__(self, name):\n        super(Element, self).__init__(True)\n        self.__name = name\n        self.__attributes = {}\n        self.__empty = isEmptyElement(name)\n        self.__preformatted = False\n        self.__metaInformation = None\n        self.__rendered = False\n        self.__disabled = 
False\n\n    def setAttribute(self, name, value):\n        self.__attributes[name] = value\n\n    def addClass(self, *names):\n        classes = set(self.__attributes.get(\"class\", \"\").split())\n        classes.update(names)\n        self.setAttribute(\"class\", \" \".join(classes))\n\n    def setPreFormatted(self):\n        self.__preformatted = True\n\n    def setMetaInformation(self, metaInformation):\n        self.__metaInformation = metaInformation\n\n    def appendChild(self, child):\n        assert not self.__empty\n        return Fragment.appendChild(self, child)\n\n    def remove(self):\n        self.__disabled = True\n    def removeIfEmpty(self):\n        self.__disabled = not self.hasChildren()\n\n    def __str__(self):\n        attributes = \"\".join([(\" %s=%s\" % (name, htmlify(value, True))) for name, value in self.__attributes.items()])\n        if isEmptyElement(self.__name):\n            return \"<%s%s>\" % (self.__name, attributes)\n        else:\n            return \"<%s%s>%s</%s>\" % (self.__name, attributes, Fragment.__str__(self), self.__name)\n\n    def render(self, output, level=0, indent_before=True, stop=None, pretty=True):\n        if self.__disabled: return\n\n        if self.__metaInformation: self.__metaInformation.render(Generator(self, None))\n\n        if pretty: indent = \"  \" * level\n        else: indent = \"\"\n\n        if indent_before: startindent = indent\n        else: startindent = \"\"\n\n        for child in self:\n            if isinstance(child, Element) and isBlockElement(child.__name) or (isinstance(child, Text) or isinstance(child, Comment)) and '\\n' in str(child):\n                linebreak = \"\\n\"\n                endindent = indent\n                break\n        else:\n            indent_before = False\n            linebreak = \"\"\n            endindent = \"\"\n\n        if not pretty or self.__preformatted:\n            child_level = 0\n            linebreak = \"\"\n            endindent = \"\"\n        
else: child_level = level + 1\n\n        attributes = \"\".join([(\" %s=%s\" % (name, htmlify(value, True, pretty))) for name, value in self.__attributes.items()])\n\n        if self.__empty:\n            if not self.__rendered:\n                output.write(\"%s<%s%s>\" % (startindent, self.__name, attributes))\n            self.__rendered = True\n        else:\n            if not self.__rendered:\n                output.write(\"%s<%s%s>%s\" % (startindent, self.__name, attributes, linebreak))\n            self.__rendered = True\n\n            children_rendered = 0\n            for child in self:\n                if self.__preformatted: child.setPreFormatted()\n                try:\n                    child.render(output, child_level, indent_before, stop, pretty)\n                    output.write(linebreak)\n                    children_rendered += 1\n                except PausedRendering:\n                    self.deleteChildren(children_rendered)\n                    raise\n\n            self.deleteChildren()\n\n            if self == stop: raise PausedRendering\n            else: output.write(\"%s</%s>\" % (endindent, self.__name))\n\n    def empty(self):\n        self.__empty = True\n\nclass Text(object):\n    def __init__(self, value, preformatted=False, cdata=False):\n        if cdata: self.__value = value\n        elif value is None: self.__value = \"&nbsp;\"\n        else: self.__value = htmlify(value)\n        self.__preformatted = preformatted\n\n    def setPreFormatted(self):\n        self.__preformatted = True\n\n    def render(self, output, level=0, indent_before=True, stop=None, pretty=True):\n        if pretty and level and not self.__preformatted and '\\n' in self.__value:\n            indent = \"  \" * level\n            if indent_before: startindent = indent\n            else: startindent = \"\"\n            output.write(startindent + ('\\n' + indent).join([line for line in self.__value.strip().splitlines()]))\n        else:\n            
output.write(self.__value)\n\n    def __str__(self):\n        return self.__value\n\nclass Comment(object):\n    def __init__(self, value):\n        self.__value = value.replace(\"--\", \"- -\")\n\n    def setPreFormatted(self):\n        pass\n\n    def render(self, output, level=0, indent_before=True, stop=None, pretty=True):\n        if pretty and level and '\\n' in self.__value:\n            indent = \"  \" * level\n            if indent_before: startindent = indent\n            else: startindent = \"\"\n            output.write(startindent + \"<!-- \" + ('\\n' + indent + \"     \").join(htmlify(self.__value).splitlines()) + \" -->\")\n        else:\n            output.write(\"<!-- \" + self.__value + \" -->\")\n\n    def __str__(self):\n        return self.__value\n\nclass HTML(object):\n    def __init__(self, value):\n        self.__value = value\n\n    def setPreFormatted(self):\n        pass\n\n    def render(self, output, level=0, indent_before=True, stop=None, pretty=True):\n        output.write(self.__value)\n\n    def __str__(self):\n        return self.__value\n\ndef safestr(value):\n    try: return str(value)\n    except: return unicode(value)\n\nclass Generator(object):\n    def __init__(self, target, metaInformation):\n        self.__target = target\n        self.__metaInformation = metaInformation\n\n    def __enter__(self):\n        return self\n    def __exit__(self, *args):\n        return False\n\n    def __eq__(self, other):\n        return other == self.__target\n\n    def __open(self, __name, **attributes):\n        target = self.__target.appendChild(Element(__name))\n        if \"__generator__\" in attributes:\n            del attributes[\"__generator__\"]\n            generator = True\n        else:\n            generator = __name not in EMPTY_ELEMENTS\n        for name, value in attributes.items():\n            if value is not None:\n                target.setAttribute(name.strip(\"_\").replace(\"_\", \"-\"), safestr(value))\n        if 
not generator: return self\n        else: return Generator(target, self.__metaInformation)\n\n    def __getattr__(self, name):\n        def open(*className, **attributes):\n            assert len(className) == 0 or len(className) == 1\n            if className: return self.__open(name, _class=className[0], **attributes)\n            else: return self.__open(name, **attributes)\n        return open\n\n    def head(self, **attributes):\n        target = self.__target.appendChild(Element(\"head\"))\n        for name, value in attributes.items():\n            if value is not None:\n                target.setAttribute(name.strip(\"_\").replace(\"_\", \"-\"), safestr(value))\n        target.setMetaInformation(self.__metaInformation)\n        return Generator(target, self.__metaInformation)\n\n    def append(self, fragment):\n        if fragment is not None:\n            if isinstance(fragment, Generator): self.__target.appendChild(fragment.__target)\n            else: self.__target.appendChild(fragment)\n\n    def remove(self):\n        self.__target.remove()\n    def removeIfEmpty(self):\n        self.__target.removeIfEmpty()\n\n    def text(self, value=None, preformatted=False, cdata=False, linkify=False, repository=None, escape=False):\n        if linkify:\n            assert not cdata\n\n            if isinstance(linkify, Context):\n                context = linkify\n            else:\n                context = Context(repository=repository)\n\n            for linktype in ALL_LINKTYPES:\n                if linktype.match(value):\n                    url = linktype.linkify(value, context)\n                    if url:\n                        self.a(href=url).text(value, escape=escape)\n                        break\n            else:\n                for word in re_linkify.split(value):\n                    if word:\n                        for linktype in ALL_LINKTYPES:\n                            if linktype.match(word):\n                                url = 
linktype.linkify(word, context)\n                                if url:\n                                    self.a(href=url).text(word, escape=escape)\n                                    break\n                        else:\n                            self.text(word, preformatted, escape=escape)\n        else:\n            if escape:\n                value = textutils.escape(value)\n            self.__target.appendChild(Text(value, preformatted, cdata))\n        return self\n\n    def comment(self, value):\n        self.__target.appendChild(Comment(safestr(value)))\n        return self\n\n    def commentFirst(self, value):\n        self.__target.insertChild(Comment(safestr(value)), offset=0)\n        return self\n\n    def innerHTML(self, value=\"&nbsp;\"):\n        self.__target.appendChild(HTML(safestr(value)))\n        return self\n\n    def setAttribute(self, name, value):\n        self.__target.setAttribute(name, value)\n        return self\n\n    def addClass(self, *names):\n        self.__target.addClass(*names)\n        return self\n\n    def render(self, output, level=0, stop=None, pretty=True):\n        self.__target.render(output, level, stop=stop, pretty=pretty)\n\n    def empty(self):\n        self.__target.empty()\n        return self\n\n    def preformatted(self):\n        self.__target.setPreFormatted()\n        return self\n\n    def addExternalStylesheet(self, uri, use_static=True, order=0):\n        self.__metaInformation.addExternalStylesheet(uri, use_static, order=order)\n\n    def addInternalStylesheet(self, text, order=0):\n        self.__metaInformation.addInternalStylesheet(text, order=order)\n\n    def addExternalScript(self, uri, use_static=True, order=0):\n        self.__metaInformation.addExternalScript(uri, use_static, order=order)\n\n    def addInternalScript(self, text, here=False, order=0):\n        if here:\n            self.script(type=\"text/javascript\").text(text.strip().replace(\"</\", \"<\\/\"), cdata=True)\n        
else:\n            self.__metaInformation.addInternalScript(text, order=order)\n\n    def hasTitle(self):\n        return self.__metaInformation.hasTitle()\n\n    def setTitle(self, title):\n        self.__metaInformation.setTitle(title)\n\n    def setLink(self, rel, href):\n        self.__metaInformation.setLink(rel, href)\n\n    def setBase(self, base):\n        self.__metaInformation.setBase(base)\n\n    def getRequest(self):\n        return self.__metaInformation.getRequest()\n\nclass Document(Generator):\n    def __init__(self, req=None):\n        self.__fragment = Fragment()\n        Generator.__init__(self, self.__fragment, self.__fragment.metaInformation())\n        self.__start = time.time()\n        self.__generation = 0.0\n        self.__rendering = 0.0\n        self.__doctype = True\n\n        if req: self.__fragment.metaInformation().setRequest(req)\n\n    def render(self, plain=False, stop=None, pretty=True):\n        self.__generation += time.time() - self.__start\n\n        output = StringIO()\n        if not plain and self.__doctype:\n            output.write(\"<!DOCTYPE html>\")\n            self.__doctype = False\n\n        before = time.time()\n        try:\n            Generator.render(self, output, stop=stop, pretty=pretty)\n            finished = True\n        except PausedRendering:\n            finished = False\n        after = time.time()\n\n        self.__rendering += after - before\n\n        if not plain and finished:\n            output.write(\"\\n<!-- generation: %.2f ms, rendering: %.2f ms -->\" % (self.__generation * 1000, self.__rendering * 1000))\n\n        self.__start = time.time()\n        return output.getvalue()\n\n    def __str__(self):\n        return self.render()\n\ndef stripStylesheet(text, compact):\n    if compact:\n        text = re.sub(r\"/\\*(?:[^*]|\\*[^/])*\\*/\", \"\", text)\n        text = re.sub(r\"\\s*([,:;{}])\\s*\", lambda m: m.group(1), text)\n        text = re.sub(r\"\\s+\", \" \", text)\n    return 
text\n\nif __name__ == \"__main__\":\n    generator = Document()\n    row = generator.html().body().table(border=1).tbody().tr()\n    row.td(_class=\"left\").div(id=\"foo\").text(\"Column 1\\nMore text\")\n    row.td(_class=\"right\").text(\"text\").comment(\"comment\")\n    print generator.render()\n"
  },
  {
    "path": "src/htmlutils_unittest.py",
    "content": "def independence():\n    # Simply check that htmlutils can be imported.\n\n    import htmlutils\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/index.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\n\nfrom subprocess import Popen as process, PIPE\nfrom re import compile, split\nfrom time import gmtime, strftime\nfrom pwd import getpwuid\nfrom os import getuid\n\nimport gitutils\nfrom log.commitset import CommitSet\n\nimport dbutils\nimport reviewing.utils\nimport reviewing.mail\nimport reviewing.rebase\nimport configuration\nimport log.commitset\nimport textutils\n\nif configuration.extensions.ENABLED:\n    import extensions.role.processcommits\n\ntry:\n    from customization.githook import Reject, update\nexcept ImportError:\n    class Reject(Exception):\n        pass\n    def update(_repository, _ref, _old, _new):\n        pass\n\ndef reflow(message):\n    return textutils.reflow(message, line_length=80 - len(\"remote: \"))\n\ndef timestamp(time):\n    return strftime(\"%Y-%m-%d %H:%M:%S\", time)\n\nclass IndexException(Exception):\n    pass\n\ndef processCommits(db, repository, sha1):\n    sha1 = repository.run(\"rev-parse\", \"--verify\", \"--quiet\", sha1 + \"^{commit}\").strip()\n\n    stack = []\n    edges_values = []\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT 1 FROM commits LIMIT 1\")\n    emptydb = cursor.fetchone() is None\n\n    cursor.execute(\"\"\"SELECT commits.sha1\n                        FROM commits\n                        JOIN branches ON 
(branches.head=commits.id)\n                       WHERE branches.repository=%s\n                         AND branches.type='normal'\n                         AND branches.base IS NULL\n                    ORDER BY branches.id ASC\"\"\",\n                   (repository.id,))\n\n    try:\n        base_sha1 = cursor.fetchone()[0]\n        count = int(repository.run(\"rev-list\", \"--count\", \"%s..%s\" % (base_sha1, sha1)).strip())\n    except:\n        count = 0\n\n    if count > configuration.limits.PUSH_COMMIT_LIMIT:\n        raise IndexException(\"\"\"\\\nYou're trying to add %d new commits to this repository.  Are you\nperhaps pushing to the wrong repository?\"\"\" % count)\n\n    commits_values = []\n    commits = set()\n\n    while True:\n        if sha1 not in commits:\n            commit = gitutils.Commit.fromSHA1(db, repository, sha1)\n\n            if commit.author.email: author_id = commit.author.getGitUserId(db)\n            else: author_id = 0\n\n            if commit.committer.email: committer_id = commit.committer.getGitUserId(db)\n            else: committer_id = 0\n\n            if emptydb: row = None\n            else:\n                cursor.execute(\"SELECT id FROM commits WHERE sha1=%s\", (commit.sha1,))\n                row = cursor.fetchone()\n                new_commit = False\n\n            if not row:\n                commits_values.append((commit.sha1, author_id, committer_id, timestamp(commit.author.time), timestamp(commit.committer.time)))\n                new_commit = True\n\n            commits.add(sha1)\n\n            if new_commit:\n                edges_values.extend([(parent_sha1, commit.sha1) for parent_sha1 in set(commit.parents)])\n                stack.extend(set(commit.parents))\n\n        if not stack: break\n\n        sha1 = stack.pop(0)\n\n    cursor.executemany(\"\"\"INSERT INTO commits (sha1, author_gituser, commit_gituser, author_time, commit_time)\n                               VALUES (%s, %s, %s, %s, %s)\"\"\",\n      
                 commits_values)\n\n    cursor.executemany(\"\"\"INSERT INTO edges (parent, child)\n                               SELECT parents.id, children.id\n                                 FROM commits AS parents,\n                                      commits AS children\n                                WHERE parents.sha1=%s AND children.sha1=%s\"\"\",\n                       edges_values)\n\n    db.commit()\n\ndef createBranches(db, user, repository, branches, flags):\n    if len(branches) > 1:\n        try:\n            from customization.branches import compareBranchNames\n        except ImportError:\n            def compareBranchNames(name1, name2):\n                name1 = split(\"([-_+])\", name1)\n                name2 = split(\"([-_+])\", name2)\n\n                for name1, name2 in map(None, name1, name2):\n                    if name1 is None: return -1\n                    elif name2 is None: return 1\n                    elif name1 != name2:\n                        try: return cmp(int(name1), int(name2))\n                        except: return cmp(name1, name2)\n                else: return 0\n\n        def compareBranches(branch1, branch2):\n            name1, head1 = branch1\n            name2, head2 = branch2\n\n            # Same name ought not occur twice, but just to be on the safe side.\n            if name1 == name2: return 0\n\n            # Special case for master.  
Mostly redundant, because it's quite\n            # unlikely that master would be created along with other branches.\n            elif name1 == \"master\": return -1\n            elif name2 == \"master\": return 1\n\n            # Try a natural ordering based on the relationships of the head\n            # commits of the two branches, unless the heads are the same:\n            if head1 != head2:\n                base = repository.mergebase([head1, head2])\n\n                # If either head is an ancestor of the other head, merge-base\n                # between them will be the ancestor head, and in that case,\n                # process that branch first.  Otherwise, that branch would be\n                # guaranteed to show up as empty, and that's probably not the\n                # intention.\n                if base == head1: return -1\n                elif base == head2: return 1\n\n            # Two non-taskbranch branches that seem \"unrelated\".  Process them\n            # ordered by name, mostly so that this comparison function is well-\n            # behaved.\n            return compareBranchNames(name1, name2)\n\n        branches.sort(cmp=compareBranches)\n\n    multiple = len(branches) > 1\n\n    for name, head in branches:\n        createBranch(db, user, repository, name, head, multiple, flags)\n\ndef createBranch(db, user, repository, name, head, multiple, flags):\n    try:\n        update(repository.path, \"refs/heads/\" + name, None, head)\n    except Reject as rejected:\n        raise IndexException(str(rejected))\n    except Exception:\n        pass\n\n    cursor = db.cursor()\n\n    # Check if a branch with this name already \"exists\".\n    branch = dbutils.Branch.fromName(db, repository, name, load_review=True)\n    if branch is not None:\n        if branch.archived:\n            # This is a (review) branch that has been archived.  
It's expected\n            # that Git thinks the user is creating a new branch.\n            message = \"\"\"\\\nThis repository already contains a branch named '%s', but it has been archived,\nmeaning it has been hidden from view to reduce the number of visible refs in\nthis repository.\"\"\" % name\n\n            if branch.review:\n                message += \"\"\"\n\nTo continue working on this branch, you need to first reopen the review that is\nassociated with the branch.  You can do this from the review's front-page:\n\n%s\"\"\" % branch.review.getURL(db, user, indent=2)\n\n            raise IndexException(reflow(message))\n        else:\n            # This is a branch that's not supposed to have been archived,\n            # meaning it appears to have just gone missing from the repository.\n            # Handle this the same way we handle updates where Git's idea of the\n            # branch's current value doesn't match what we think it should be.\n            #\n            # We can trigger that handling by calling updateBranch() with any\n            # \"wrong\" old value.\n            updateBranch(db, user, repository, name, \"0\" * 40, head, multiple, flags)\n            return\n\n    def commit_id(sha1):\n        cursor.execute(\"SELECT id FROM commits WHERE sha1=%s\", [sha1])\n        return cursor.fetchone()[0]\n\n    components = name.split(\"/\")\n    for index in range(1, len(components)):\n        try: repository.revparse(\"refs/heads/%s\" % \"/\".join(components[:index]))\n        except: continue\n\n        message = (\"Cannot create branch with name '%s' since there is already a branch named '%s' in the repository.\" %\n                   (name, \"/\".join(components[:index])))\n        raise IndexException(reflow(message))\n\n    if name.startswith(\"r/\"):\n        try:\n            review_id = int(name[2:])\n\n            cursor.execute(\"SELECT branches.name FROM reviews JOIN branches ON (branches.id=reviews.branch) WHERE reviews.id=%s\", 
(review_id,))\n            row = cursor.fetchone()\n\n            message = \"Refusing to create review named as a number.\"\n\n            if row:\n                message += \"\\nDid you mean to push to the branch '%s', perhaps?\" % row[0]\n\n            raise IndexException(message)\n        except ValueError:\n            pass\n\n        if user.isSystem():\n            raise IndexException(\"Refusing to create review this way.\")\n        elif user.getPreference(db, \"review.createViaPush\"):\n            the_commit = gitutils.Commit.fromSHA1(db, repository, head, commit_id(head))\n            all_commits = [the_commit]\n\n            review = reviewing.utils.createReview(\n                db, user, repository, all_commits, name,\n                the_commit.niceSummary(include_tag=False), None, via_push=True)\n\n            print \"Submitted review:\"\n            print review.getURL(db, user, indent=2)\n\n            if review.reviewers:\n                print \"  Reviewers:\"\n                for reviewer in review.reviewers:\n                    print \"    %s <%s>\" % (reviewer.fullname, reviewer.email)\n\n            if review.watchers:\n                print \"  Watchers:\"\n                for watcher in review.watchers:\n                    print \"    %s <%s>\" % (watcher.fullname, watcher.email)\n\n            if configuration.extensions.ENABLED:\n                if extensions.role.processcommits.execute(db, user, review, all_commits, None, the_commit, sys.stdout):\n                    print\n\n            print \"Thank you!\"\n            return True\n        else:\n            raise IndexException(\"Refusing to create review; user preference 'review.createViaPush' is not enabled.\")\n\n    sha1 = head\n    base = None\n    tail = None\n\n    cursor.execute(\"\"\"SELECT 1\n                        FROM reachable\n                        JOIN branches ON (branches.id=reachable.branch)\n                        JOIN repositories ON 
(repositories.id=branches.repository)\n                       WHERE repositories.id=%s\n                       LIMIT 1\"\"\",\n                   (repository.id,))\n\n    if cursor.fetchone():\n        def reachable(sha1):\n            cursor.execute(\"\"\"SELECT branches.id\n                                FROM branches\n                                JOIN reachable ON (reachable.branch=branches.id)\n                                JOIN commits ON (commits.id=reachable.commit)\n                               WHERE branches.repository=%s\n                                 AND branches.type='normal'\n                                 AND commits.sha1=%s\n                            ORDER BY reachable.branch ASC\n                               LIMIT 1\"\"\",\n                           (repository.id, sha1))\n            return cursor.fetchone()\n    else:\n        def reachable(sha1):\n            return None\n\n    commit_map = {}\n    commit_list = []\n\n    row = reachable(sha1)\n    if row:\n        # Head of branch is reachable from an existing branch.  Could be because\n        # this branch is actually empty (just created with no \"own\" commits) or\n        # it could have been merged into some other already existing branch.  
We\n        # can't tell, so we just record it as empty.\n\n        base = row[0]\n        tail = sha1\n    else:\n        stack = []\n\n        while True:\n            if sha1 not in commit_map:\n                commit = gitutils.Commit.fromSHA1(db, repository, sha1)\n                commit_map[sha1] = commit\n                commit_list.append(commit)\n\n                for sha1 in commit.parents:\n                    if sha1 not in commit_map:\n                        row = reachable(sha1)\n                        if not row:\n                            stack.append(sha1)\n                        elif base is None:\n                            base = row[0]\n                            tail = sha1\n\n                            base_chain = [base]\n\n                            while True:\n                                cursor.execute(\"SELECT base FROM branches WHERE id=%s\", (base_chain[-1],))\n                                next = cursor.fetchone()[0]\n                                if next is None: break\n                                else: base_chain.append(next)\n\n                            def reachable(sha1):\n                                cursor.execute(\"\"\"SELECT 1\n                                                    FROM reachable\n                                                    JOIN commits ON (commits.id=reachable.commit)\n                                                   WHERE reachable.branch=ANY (%s)\n                                                     AND commits.sha1=%s\"\"\",\n                                               (base_chain, sha1))\n                                return cursor.fetchone()\n\n            if stack: sha1 = stack.pop(0)\n            else: break\n\n    if not base:\n        cursor.execute(\"INSERT INTO branches (repository, name, head) VALUES (%s, %s, %s) RETURNING id\", (repository.id, name, commit_id(head)))\n        branch_id = cursor.fetchone()[0]\n    else:\n        cursor.execute(\"INSERT INTO 
branches (repository, name, head, base, tail) VALUES (%s, %s, %s, %s, %s) RETURNING id\", (repository.id, name, commit_id(head), base, commit_id(tail)))\n        branch_id = cursor.fetchone()[0]\n\n        # Suppress the \"user friendly\" feedback if the push is performed by the\n        # Critic system user, since there wouldn't be a human being reading it.\n        if not user.isSystem():\n            cursor.execute(\"SELECT name FROM branches WHERE id=%s\", [base])\n\n            print \"Added branch based on %s containing %d commit%s:\" % (cursor.fetchone()[0], len(commit_list), \"s\" if len(commit_list) > 1 else \"\")\n            for url_prefix in user.getCriticURLs(db):\n                print \"  %s/log?repository=%d&branch=%s\" % (url_prefix, repository.id, name)\n            if len(commit_list) > 1:\n                print \"To create a review of all %d commits:\" % len(commit_list)\n            else:\n                print \"To create a review of the commit:\"\n            for url_prefix in user.getCriticURLs(db):\n                print \"  %s/createreview?repository=%d&branch=%s\" % (url_prefix, repository.id, name)\n\n    reachable_values = [(branch_id, commit.sha1) for commit in commit_list]\n    cursor.executemany(\"INSERT INTO reachable (branch, commit) SELECT %s, id FROM commits WHERE sha1=%s\", reachable_values)\n\ndef updateBranch(db, user, repository, name, old, new, multiple, flags):\n    try:\n        update(repository.path, \"refs/heads/\" + name, old, new)\n    except Reject as rejected:\n        raise IndexException(str(rejected))\n    except Exception:\n        pass\n\n    try:\n        branch = dbutils.Branch.fromName(db, repository, name, for_update=dbutils.NOWAIT)\n    except dbutils.FailedToLock:\n        raise IndexException(reflow(\n                \"The branch '%s' is currently locked since it is being updated \"\n                \"by another push.  
Please fetch and try again.\" % name))\n    else:\n        if not branch:\n            # FIXME: We should handle this better.  Maybe just redirect to\n            # createBranch()?\n            raise IndexException(\"The branch '%s' is not in the database!\" % name)\n        base_branch_id = branch.base.id if branch.base else None\n\n    if branch.head_sha1 != old:\n        if new == branch.head_sha1:\n            # This is what we think the ref ought to be already.  Do nothing,\n            # and let the repository \"catch up.\"\n            return\n        else:\n            data = { \"name\": name,\n                     \"old\": old[:8],\n                     \"new\": new[:8],\n                     \"current\": branch.head_sha1[:8] }\n\n            message = \"\"\"CONFUSED!  Git thinks %(name)s points to %(old)s, but Critic thinks it points to %(current)s.  Rejecting push since it would only make matters worse.  To resolve this problem, use\n\n  git push -f critic %(current)s:%(name)s\n\nto resynchronize the Git repository with Critic's database.  
Note that 'critic' above must be replaced by the actual name of your Critic remote, if not 'critic'.\"\"\" % data\n\n            raise IndexException(textutils.reflow(message, line_length=80 - len(\"remote: \")))\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT id, remote, remote_name, forced, updating\n                        FROM trackedbranches\n                       WHERE repository=%s\n                         AND local_name=%s\n                         AND NOT disabled\"\"\",\n                   (repository.id, name))\n    row = cursor.fetchone()\n\n    if row:\n        trackedbranch_id, remote, remote_name, forced, updating = row\n        tracked_branch = \"%s in %s\" % (remote_name, remote)\n\n        assert not forced or not name.startswith(\"r/\")\n\n        if not user.isSystem() \\\n                or flags.get(\"trackedbranch_id\") != str(trackedbranch_id):\n            raise IndexException(\"\"\"\\\nThe branch '%s' is set up to track '%s' in\n  %s\nPlease don't push it manually to this repository.\"\"\" % (name, remote_name, remote))\n\n        assert updating\n\n        if not name.startswith(\"r/\"):\n            conflicting = repository.revlist([branch.head_sha1], [new])\n            added = repository.revlist([new], [branch.head_sha1])\n\n            if conflicting:\n                if forced:\n                    if branch.base is None:\n                        cursor.executemany(\"\"\"DELETE FROM reachable\n                                                    WHERE branch=%s\n                                                      AND commit IN (SELECT id\n                                                                       FROM commits\n                                                                      WHERE sha1=%s)\"\"\",\n                                           [(branch.id, sha1) for sha1 in conflicting])\n                    else:\n                        print \"Non-fast-forward update detected; deleting and recreating 
branch.\"\n\n                        deleteBranch(db, user, repository, branch.name, old)\n                        createBranches(db, user, repository, [(branch.name, new)], flags)\n\n                        return\n                else:\n                    raise IndexException(\"\"\"\\\nRejecting non-fast-forward update of branch.  To perform the update, you\ncan delete the branch using\n  git push critic :%s\nfirst, and then repeat this push.\"\"\" % name)\n\n            cursor.executemany(\"\"\"INSERT INTO reachable (branch, commit)\n                                       SELECT %s, commits.id\n                                         FROM commits\n                                        WHERE sha1=%s\"\"\",\n                               [(branch.id, sha1) for sha1 in added])\n\n            new_head = gitutils.Commit.fromSHA1(db, repository, new)\n\n            cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\",\n                           (new_head.getId(db), branch.id))\n\n            output = []\n\n            if conflicting:\n                output.append(\"Pruned %d conflicting commits.\" % len(conflicting))\n            if added:\n                output.append(\"Added %d new commits.\" % len(added))\n\n            if output:\n                print \"\\n\".join(output)\n\n            return\n    else:\n        tracked_branch = False\n\n    cursor.execute(\"SELECT id FROM reviews WHERE branch=%s\", (branch.id,))\n    row = cursor.fetchone()\n\n    is_review = bool(row)\n\n    if is_review:\n        if multiple:\n            raise IndexException(\"\"\"\\\nRefusing to update review in push of multiple refs.  
Please push one\nreview branch at a time.\"\"\")\n\n        review_id = row[0]\n\n        cursor.execute(\"\"\"SELECT id, old_head, old_upstream, new_upstream, uid, branch\n                            FROM reviewrebases\n                           WHERE review=%s AND new_head IS NULL\"\"\",\n                       (review_id,))\n        row = cursor.fetchone()\n\n        if row:\n            if tracked_branch:\n                raise IndexException(\"Refusing to perform a review rebase via an automatic update.\")\n\n            rebase_id, old_head_id, old_upstream_id, new_upstream_id, rebaser_id, onto_branch = row\n\n            review = dbutils.Review.fromId(db, review_id)\n            rebaser = dbutils.User.fromId(db, rebaser_id)\n\n            if rebaser.id != user.id:\n                if user.isSystem():\n                    user = rebaser\n                else:\n                    raise IndexException(\"\"\"\\\nThis review is currently being rebased by\n  %s <%s>\nand can't be otherwise updated right now.\"\"\" % (rebaser.fullname, rebaser.email))\n\n            old_head = gitutils.Commit.fromId(db, repository, old_head_id)\n            old_commitset = log.commitset.CommitSet(review.branch.getCommits(db))\n\n            if old_head.sha1 != old:\n                raise IndexException(\"\"\"\\\nUnexpected error.  The branch appears to have been updated since your\nrebase was prepared.  
You need to cancel the rebase via the review\nfront-page and then try again, and/or report a bug about this error.\"\"\")\n\n            if old_upstream_id is not None:\n                new_head = gitutils.Commit.fromSHA1(db, repository, new)\n\n                old_upstream = gitutils.Commit.fromId(db, repository, old_upstream_id)\n\n                if new_upstream_id is not None:\n                    new_upstream = gitutils.Commit.fromId(db, repository, new_upstream_id)\n                else:\n                    if len(new_head.parents) != 1:\n                        raise IndexException(\"Invalid rebase: New head can't be a merge commit.\")\n\n                    new_upstream = gitutils.Commit.fromSHA1(db, repository, new_head.parents[0])\n\n                    if new_upstream in old_commitset.getTails():\n                        old_upstream = new_upstream = None\n            else:\n                old_upstream = None\n\n            if old_upstream:\n                unrelated_move = False\n\n                if not new_upstream.isAncestorOf(new):\n                    raise IndexException(\"\"\"\\\nInvalid rebase: The new upstream commit you specified when the rebase\nwas prepared is not an ancestor of the commit now pushed.  
You may want\nto cancel the rebase via the review front-page, and prepare another one\nspecifying the correct new upstream commit; or rebase the branch onto\nthe new upstream specified and then push that instead.\"\"\")\n\n                if not old_upstream.isAncestorOf(new_upstream):\n                    unrelated_move = True\n\n                equivalent_merge = replayed_rebase = None\n\n                if unrelated_move:\n                    replayed_rebase = reviewing.rebase.replayRebase(\n                        db, review, user, old_head, old_upstream, new_head,\n                        new_upstream, onto_branch)\n                else:\n                    equivalent_merge = reviewing.rebase.createEquivalentMergeCommit(\n                        db, review, user, old_head, old_upstream, new_head,\n                        new_upstream, onto_branch)\n\n                new_sha1s = repository.revlist([new_head.sha1], [new_upstream.sha1], '--topo-order')\n                rebased_commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in new_sha1s]\n                reachable_values = [(review.branch.id, sha1) for sha1 in new_sha1s]\n\n                pending_mails = []\n\n                recipients = review.getRecipients(db)\n                for to_user in recipients:\n                    pending_mails.extend(reviewing.mail.sendReviewRebased(\n                            db, user, to_user, recipients, review,\n                            new_upstream, rebased_commits, onto_branch))\n\n                print \"Rebase performed.\"\n\n                review.setPerformedRebase(old_head, new_head, old_upstream, new_upstream, user,\n                                          equivalent_merge, replayed_rebase)\n\n                if unrelated_move:\n                    reviewing.utils.addCommitsToReview(\n                        db, user, review, [replayed_rebase],\n                        pending_mails=pending_mails,\n                        
silent_if_empty=set([replayed_rebase]),\n                        replayed_rebases={ replayed_rebase: new_head })\n\n                    repository.keepalive(old_head)\n                    repository.keepalive(replayed_rebase)\n\n                    cursor.execute(\"\"\"UPDATE reviewrebases\n                                         SET replayed_rebase=%s\n                                       WHERE id=%s\"\"\",\n                                   (replayed_rebase.getId(db), rebase_id))\n                else:\n                    reviewing.utils.addCommitsToReview(\n                        db, user, review, [equivalent_merge], pending_mails=pending_mails,\n                        silent_if_empty=set([equivalent_merge]), full_merges=set([equivalent_merge]))\n\n                    repository.keepalive(equivalent_merge)\n\n                    cursor.execute(\"\"\"UPDATE reviewrebases\n                                         SET equivalent_merge=%s\n                                       WHERE id=%s\"\"\",\n                                   (equivalent_merge.getId(db), rebase_id))\n\n                cursor.execute(\"\"\"UPDATE reviewrebases\n                                     SET new_head=%s,\n                                         new_upstream=%s\n                                   WHERE id=%s\"\"\",\n                               (new_head.getId(db), new_upstream.getId(db), rebase_id))\n\n                cursor.execute(\"\"\"INSERT INTO previousreachable (rebase, commit)\n                                       SELECT %s, commit\n                                         FROM reachable\n                                        WHERE branch=%s\"\"\",\n                               (rebase_id, review.branch.id))\n                cursor.execute(\"DELETE FROM reachable WHERE branch=%s\",\n                               (review.branch.id,))\n                cursor.executemany(\"\"\"INSERT INTO reachable (branch, commit)\n                                           
SELECT %s, commits.id\n                                             FROM commits\n                                            WHERE commits.sha1=%s\"\"\",\n                                   reachable_values)\n                cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\",\n                               (new_head.getId(db), review.branch.id))\n            else:\n                old_commitset = log.commitset.CommitSet(review.branch.getCommits(db))\n                new_sha1s = repository.revlist([new], old_commitset.getTails(), '--topo-order')\n\n                if old_head.sha1 in new_sha1s:\n                    raise IndexException(\"\"\"\\\nInvalid history rewrite: Old head of the branch is reachable from the\npushed ref; no history rewrite performed.  (Cancel the rebase via\nthe review front-page if you've changed your mind.)\"\"\")\n\n                for new_sha1 in new_sha1s:\n                    new_head = gitutils.Commit.fromSHA1(db, repository, new_sha1)\n                    if new_head.tree == old_head.tree: break\n                else:\n                    raise IndexException(\"\"\"\\\nInvalid history rewrite: The rebase introduced unexpected code changes.\nUse git diff between the review branch in Critic's repository and\nthe rebased local branch to see what those changes are.\"\"\")\n\n                rebased_commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in repository.revlist([new_head], old_commitset.getTails(), '--topo-order')]\n                new_commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in repository.revlist([new], [new_head], '--topo-order')]\n                reachable_values = [(review.branch.id, sha1) for sha1 in new_sha1s]\n\n                pending_mails = []\n\n                recipients = review.getRecipients(db)\n                for to_user in recipients:\n                    pending_mails.extend(reviewing.mail.sendReviewRebased(db, user, to_user, recipients, review, None, 
rebased_commits))\n\n                print \"History rewrite performed.\"\n\n                if new_commits:\n                    reviewing.utils.addCommitsToReview(db, user, review, new_commits, pending_mails=pending_mails)\n                else:\n                    reviewing.mail.sendPendingMails(pending_mails)\n\n                cursor.execute(\"\"\"UPDATE reviewrebases\n                                     SET new_head=%s\n                                   WHERE id=%s\"\"\",\n                               (new_head.getId(db), rebase_id))\n\n                cursor.execute(\"\"\"INSERT INTO previousreachable (rebase, commit)\n                                       SELECT %s, commit\n                                         FROM reachable\n                                        WHERE branch=%s\"\"\",\n                               (rebase_id, review.branch.id))\n                cursor.execute(\"DELETE FROM reachable WHERE branch=%s\",\n                               (review.branch.id,))\n                cursor.executemany(\"\"\"INSERT INTO reachable (branch, commit)\n                                           SELECT %s, commits.id\n                                             FROM commits\n                                            WHERE commits.sha1=%s\"\"\",\n                                   reachable_values)\n                cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\",\n                               (gitutils.Commit.fromSHA1(db, repository, new).getId(db),\n                                review.branch.id))\n\n                repository.keepalive(old)\n\n            review.incrementSerial(db)\n\n            return True\n        elif old != repository.mergebase([old, new]):\n            raise IndexException(\"Rejecting non-fast-forward update of review branch.\")\n    elif old != repository.mergebase([old, new]):\n        raise IndexException(\"\"\"\\\nRejecting non-fast-forward update of branch.  
To perform the update, you\ncan delete the branch using\n  git push critic :%s\nfirst, and then repeat this push.\"\"\" % name)\n\n    cursor.execute(\"SELECT id FROM branches WHERE repository=%s AND base IS NULL ORDER BY id ASC LIMIT 1\", (repository.id,))\n    root_branch_id = cursor.fetchone()[0]\n\n    def isreachable(sha1):\n        if is_review and sha1 == branch.tail_sha1:\n            return True\n        if base_branch_id:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commits\n                                JOIN reachable ON (reachable.commit=commits.id)\n                               WHERE commits.sha1=%s\n                                 AND reachable.branch IN (%s, %s, %s)\"\"\",\n                           (sha1, branch.id, base_branch_id, root_branch_id))\n        else:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commits\n                                JOIN reachable ON (reachable.commit=commits.id)\n                               WHERE commits.sha1=%s\n                                 AND reachable.branch IN (%s, %s)\"\"\",\n                           (sha1, branch.id, root_branch_id))\n        return cursor.fetchone() is not None\n\n    stack = [new]\n    commits = set()\n    commit_list = []\n    processed = set()\n\n    while stack:\n        sha1 = stack.pop()\n\n        if sha1 not in commits and not isreachable(sha1):\n            commits.add(sha1)\n            commit_list.append(sha1)\n\n            stack.extend([parent_sha1 for parent_sha1 in gitutils.Commit.fromSHA1(db, repository, sha1).parents if parent_sha1 not in processed])\n\n        processed.add(sha1)\n\n    branch = dbutils.Branch.fromName(db, repository, name)\n    review = dbutils.Review.fromBranch(db, branch)\n\n    if review:\n        if review.state != \"open\":\n            raise IndexException(\"\"\"\\\nThe review is closed and can't be extended.  
You need to reopen it at\n%s\nbefore you can add commits to it.\"\"\" % review.getURL(db, user, 2))\n\n        all_commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in reversed(commit_list)]\n\n        tails = CommitSet(all_commits).getTails()\n\n        if old not in tails:\n            raise IndexException(\"\"\"\\\nPush rejected; would break the review.\n\nIt looks like some of the pushed commits are reachable from the\nrepository's main branch, and consequently the commits currently\nincluded in the review are too.\n\nPerhaps you should request a new review of the follow-up commits?\"\"\")\n\n        reviewing.utils.addCommitsToReview(db, user, review, all_commits, commitset=commits, tracked_branch=tracked_branch)\n\n    reachable_values = [(branch.id, sha1) for sha1 in reversed(commit_list) if sha1 in commits]\n\n    cursor.executemany(\"INSERT INTO reachable (branch, commit) SELECT %s, commits.id FROM commits WHERE commits.sha1=%s\", reachable_values)\n    cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\", (gitutils.Commit.fromSHA1(db, repository, new).getId(db), branch.id))\n\n    db.commit()\n\n    if configuration.extensions.ENABLED and review:\n        extensions.role.processcommits.execute(db, user, review, all_commits,\n                                               gitutils.Commit.fromSHA1(db, repository, old),\n                                               gitutils.Commit.fromSHA1(db, repository, new),\n                                               sys.stdout)\n\ndef deleteBranch(db, user, repository, name, old):\n    try:\n        update(repository.path, \"refs/heads/\" + name, old, None)\n    except Reject as rejected:\n        raise IndexException(str(rejected))\n    except Exception:\n        pass\n\n    branch = dbutils.Branch.fromName(db, repository, name)\n\n    if branch:\n        review = dbutils.Review.fromBranch(db, branch)\n\n        if review:\n            raise IndexException(\"This is Critic refusing to 
delete a branch that belongs to a review.\")\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT COUNT(*) FROM reachable WHERE branch=%s\", (branch.id,))\n\n        ncommits = cursor.fetchone()[0]\n\n        if branch.base:\n            cursor.execute(\"UPDATE branches SET base=%s WHERE base=%s\", (branch.base.id, branch.id))\n\n        cursor.execute(\"DELETE FROM branches WHERE id=%s\", (branch.id,))\n\n        # Suppress the \"user friendly\" feedback if the push is performed by the\n        # Critic system user, since there wouldn't be a human being reading it.\n        if not user.isSystem():\n            print \"Deleted branch containing %d commit%s.\" % (ncommits, \"s\" if ncommits != 1 else \"\")\n\ndef createTag(db, user, repository, name, sha1):\n    sha1 = gitutils.getTaggedCommit(repository, sha1)\n\n    if sha1:\n        cursor = db.cursor()\n        cursor.execute(\"INSERT INTO tags (name, repository, sha1) VALUES (%s, %s, %s)\",\n                       (name, repository.id, sha1))\n\ndef updateTag(db, user, repository, name, old_sha1, new_sha1):\n    sha1 = gitutils.getTaggedCommit(repository, new_sha1)\n    cursor = db.cursor()\n\n    if sha1:\n        cursor.execute(\"UPDATE tags SET sha1=%s WHERE name=%s AND repository=%s\",\n                       (sha1, name, repository.id))\n    else:\n        cursor.execute(\"DELETE FROM tags WHERE name=%s AND repository=%s\",\n                       (name, repository.id))\n"
  },
  {
    "path": "src/inpututils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\n\n# Try to import the readline module to augment raw_input(), used below, which\n# automatically uses readline for line editing if it has been loaded.  We don't\n# really care if it fails; that just means raw_input() is a bit dumber.\ntry: import readline\nexcept: pass\n\n__doc__ = \"Helper functions for prompting for and reading input.\"\n\ndef apply_check(check, input):\n    result = check(input)\n    if result is None:\n        return True\n    elif result is True:\n        print \"Invalid input.\"\n        print\n    else:\n        print \"Invalid input: %s.\" % result\n        print\n    return False\n\n\ndef yes_or_no(prompt, default=None):\n    prompt = \"%s [%s/%s] \" % (prompt, \"Y\" if default is True else \"y\", \"N\" if default is False else \"n\")\n\n    while True:\n        try: input = raw_input(prompt)\n        except KeyboardInterrupt:\n            print\n            raise\n\n        if input.lower() in (\"y\", \"yes\"):\n            return True\n        elif input.lower() in (\"n\", \"no\"):\n            return False\n        elif input or default is None:\n            print \"Please answer 'y'/'yes' or 'n'/'no'.\"\n            print\n        else:\n            return default\n\ndef string(prompt, default=None, check=None):\n    prompt = \"%s%s \" % (prompt, 
(\" [%s]\" % default) if default is not None else \"\")\n\n    while True:\n        try: input = raw_input(prompt)\n        except KeyboardInterrupt:\n            print\n            raise\n\n        if default and not input:\n            input = default\n\n        if check:\n            if apply_check(check, input):\n                return input\n        elif not input:\n            print \"Invalid input: empty.\"\n        else:\n            return input\n\ndef password(prompt, default=None, twice=True):\n    import termios\n\n    prompt = \"%s%s \" % (prompt, \" [****]\" if default is not None else \"\")\n\n    def internal(prompt):\n        if os.isatty(sys.stdin.fileno()):\n            old = termios.tcgetattr(sys.stdin)\n            new = old[:]\n            new[3] = new[3] & ~termios.ECHO\n            try:\n                termios.tcsetattr(sys.stdin, termios.TCSADRAIN, new)\n                try: password = raw_input(prompt)\n                except KeyboardInterrupt:\n                    print\n                    raise\n            finally:\n                termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old)\n        else:\n            password = sys.stdin.readline().rstrip(\"\\n\")\n        print\n        if default and not password: return default\n        else: return password\n\n    while True:\n        password = internal(prompt)\n\n        if twice:\n            andagain = internal(\"And again: \")\n\n            if password == andagain:\n                return password\n            else:\n                print\n                print \"Passwords differ.  Please try again.\"\n                print\n        else:\n            return password\n"
  },
  {
    "path": "src/jsonapi/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport contextlib\nimport itertools\nimport re\n\nimport api\nimport auth\nimport request\nimport textutils\n\nclass Error(Exception):\n    pass\n\nclass PathError(Error):\n    \"\"\"Raised for valid paths that don't match a resource\n\n       Results in a 404 \"Not Found\" response.\n\n       Note: A \"valid\" path is one that could have returned a resource, had the\n             system's dynamic state (database + repositories) been different.\"\"\"\n\n    http_status = 404\n    title = \"No such resource\"\n\nclass UsageError(Error):\n    \"\"\"Raised for invalid paths and/or query parameters\n\n       Results in a 400 \"Bad Request\" response.\n\n       Note: An \"invalid\" path is one that could never (in this version of\n             Critic) return any other response, regardless of the system's\n             dynamic state (database + repositories.)\"\"\"\n\n    http_status = 400\n    title = \"Invalid API request\"\n\nclass InputError(Error):\n    http_status = 400\n    title = \"Invalid API input\"\n\nclass PermissionDenied(Error):\n    http_status = 403\n    title = \"Permission denied\"\n\nclass ResultDelayed(Error):\n    http_status = 202\n    title = \"Resource temporarily unavailable\"\n\nclass InternalRedirect(Exception):\n    def __init__(self, resource_path, 
subresource_path=None,\n                 value=None, values=None):\n        self.resource_path = resource_path\n        self.subresource_path = subresource_path or []\n        self.value = value\n        self.values = values\n\nclass ResourceSkipped(Exception):\n    \"\"\"Raised by a resource class's json() to skip the resource\n\n       The message should explain why it was skipped, which may be\n       sent to the client in a \"404 Not Found\" response.\"\"\"\n    pass\n\nSPECIAL_QUERY_PARAMETERS = frozenset([\"fields\", \"include\", \"debug\"])\n\ndef _process_fields(value):\n    fields = set()\n    for field in value.split(\",\"):\n        fields.add(field)\n        # For fields of the form 'a.b.c', add the prefixes 'a' and 'a.b' as\n        # well, so that one can request inclusion of fields in sub-objects\n        # without having to explicitly request the sub-object be included.\n        while True:\n            field, _, _ = field.rpartition(\".\")\n            if not field:\n                break\n            fields.add(field)\n    return fields\n\nclass Parameters(object):\n    def __init__(self, critic, req):\n        self.critic = critic\n        self.req = req\n        self.debug = req.getParameter(\n            \"debug\", set(), filter=lambda value: set(value.split(\",\")))\n        self.fields = req.getParameter(\n            \"fields\", set(), filter=_process_fields)\n        self.fields_per_type = {}\n        self.__query_parameters = {\n            name: value\n            for name, value in req.getParameters().items()\n            if name not in SPECIAL_QUERY_PARAMETERS }\n        self.__resource_name = None\n        self.range_accessed = False\n        self.context = {}\n        self.subresource_path = []\n        self.output_format = self.__query_parameters.get(\n            \"output_format\", \"default\")\n\n    def __prepareType(self, resource_type):\n        if resource_type not in self.fields_per_type:\n            
self.fields_per_type[resource_type] = self.req.getParameter(\n                \"fields[%s]\" % resource_type, self.fields,\n                filter=_process_fields)\n        return self.fields_per_type[resource_type]\n\n    def hasField(self, resource_type, key):\n        fields = self.__prepareType(resource_type)\n        return not fields or key in fields\n\n    def filtered(self, resource_type, resource_json):\n        fields = self.__prepareType(resource_type)\n        if fields:\n            def filter_json(prefix, key, json):\n                if isinstance(json, dict):\n                    if key:\n                        prefix += key + \".\"\n                    return { key: filter_json(prefix, key, value)\n                             for key, value in json.items()\n                             if prefix + key in fields }\n                else:\n                    return json\n            return filter_json(\"\", \"\", resource_json)\n        return resource_json\n\n    @contextlib.contextmanager\n    def forResource(self, resource):\n        assert self.__resource_name is None\n        self.__resource_name = resource.name\n        yield\n        self.__resource_name = None\n\n    def getQueryParameter(self, name, converter=None, exceptions=()):\n        if self.__resource_name:\n            value = self.__query_parameters.get(\n                \"%s[%s]\" % (name, self.__resource_name))\n        else:\n            value = None\n        if value is None:\n            value = self.__query_parameters.get(name)\n        if value is not None and converter:\n            try:\n                value = converter(value)\n            except exceptions:\n                raise UsageError(\"Invalid %s parameter: %r\" % (name, value))\n        return value\n\n    def getRange(self):\n        self.range_accessed = True\n        offset = self.getQueryParameter(\n            \"offset\", converter=int, exceptions=ValueError)\n        if offset is not None:\n            if 
offset < 0:\n                raise UsageError(\"Invalid offset parameter: %r\" % offset)\n        count = self.getQueryParameter(\n            \"count\", converter=int, exceptions=ValueError)\n        if count is not None:\n            if count < 1:\n                raise UsageError(\"Invalid count parameter: %r\" % count)\n        if offset and count:\n            return offset, offset + count\n        return offset, count\n\n    def setContext(self, key, value):\n        if key in self.context:\n            existing = self.context[key]\n            if existing is None or existing != value:\n                self.context[key] = None\n        else:\n            self.context[key] = value\n\nclass Linked(object):\n    def __init__(self, req=None):\n        if req is not None:\n            include = req.getParameter(\n                \"include\", [], filter=lambda value: value.split(\",\"))\n            self.linked_per_type = { resource_type: set()\n                                     for resource_type in include }\n\n    def __getitem__(self, resource_type):\n        return self.linked_per_type[resource_type]\n    def __setitem__(self, resource_type, value):\n        self.linked_per_type[resource_type] = value\n\n    def isEmpty(self):\n        return not any(self.linked_per_type.values())\n\n    def add(self, resource_path, *values):\n        resource_class = lookup(resource_path)\n        assert all(isinstance(value, resource_class.value_class)\n                   for value in values)\n        linked = self.linked_per_type.get(resource_class.name)\n        if linked is not None:\n            linked.update(values)\n        return resource_class\n\n    def filter_referenced(self, json):\n        if isinstance(json, dict):\n            return {\n                key: self.filter_referenced(value)\n                for key, value in json.items()\n            }\n        elif isinstance(json, list):\n            return [self.filter_referenced(value) for value in json]\n    
    elif type(json) in VALUE_CLASSES:\n            resource_path = VALUE_CLASSES[type(json)]\n            resource_class = self.add(resource_path, json)\n            return resource_class.resource_id(json)\n        else:\n            return json\n\n    def copy(self):\n        linked = Linked()\n        linked.linked_per_type = {\n            resource_type: set(linked_objects)\n            for resource_type, linked_objects in self.linked_per_type.items()\n        }\n        return linked\n\nHANDLERS = {}\nVALUE_CLASSES = {}\n\ndef registerHandler(path, resource_class):\n    HANDLERS[path] = resource_class\n    if not path.startswith(\"...\"):\n        if isinstance(resource_class.value_class, tuple):\n            for value_class in resource_class.value_class:\n                VALUE_CLASSES[value_class] = path\n        else:\n            VALUE_CLASSES[resource_class.value_class] = path\n\ndef PrimaryResource(resource_class):\n    assert hasattr(resource_class, \"name\")\n    assert hasattr(resource_class, \"value_class\")\n    for name in (\"single\", \"multiple\", \"create\", \"update\", \"delete\"):\n        if not hasattr(resource_class, name):\n            setattr(resource_class, name, None)\n    for name in (\"exceptions\", \"objects\", \"lists\", \"maps\"):\n        if not hasattr(resource_class, name):\n            setattr(resource_class, name, ())\n    for name in (\"anonymous_create\", \"anonymous_update\", \"anonymous_delete\"):\n        if not hasattr(resource_class, name):\n            setattr(resource_class, name, False)\n    if not hasattr(resource_class, \"resource_id\"):\n        resource_class.resource_id = staticmethod(lambda value: value.id)\n    contexts = getattr(resource_class, \"contexts\", (None,))\n    if None in contexts:\n        registerHandler(\"v1/\" + resource_class.name, resource_class)\n    for context in filter(None, contexts):\n        registerHandler(\".../%s/%s\" % (context, resource_class.name),\n                        
resource_class)\n    return resource_class\n\ndef lookup(resource_path):\n    if not isinstance(resource_path, list):\n        resource_path = resource_path.split(\"/\")\n    for offset in range(len(resource_path) - 1):\n        if offset:\n            resource_id = \"/\".join([\"...\"] + resource_path[offset:])\n        else:\n            resource_id = \"/\".join(resource_path)\n        try:\n            return HANDLERS[resource_id]\n        except KeyError:\n            continue\n    else:\n        raise PathError(\"Invalid resource: %r\" % \"/\".join(resource_path))\n\ndef find(resource_name):\n    suffix = \"/\" + resource_name\n    return (resource_class\n            for resource_id, resource_class in HANDLERS.items()\n            if resource_id.endswith(suffix))\n\ndef id_or_name(argument):\n    try:\n        return int(argument), None\n    except ValueError:\n        return None, argument\n\ndef numeric_id(argument):\n    try:\n        value = int(argument)\n        if value < 1:\n            raise ValueError\n        return value\n    except ValueError:\n        raise UsageError(\"Invalid numeric id: %r\" % argument)\n\ndef deduce(resource_path, parameters):\n    resource_class = lookup(resource_path)\n    try:\n        return resource_class.deduce(parameters)\n    except resource_class.exceptions as error:\n        raise PathError(\"Resource not found: %s\" % error.message)\n\ndef from_parameter(resource_path, parameter_name, parameters):\n    resource_class = lookup(resource_path)\n    parameter_value = parameters.getQueryParameter(parameter_name)\n    if parameter_value is None:\n        return None\n    try:\n        return resource_class.fromParameter(parameter_value, parameters)\n    except resource_class.exceptions as error:\n        raise PathError(\"Invalid parameter: %s=%s: %s\"\n                        % (parameter_name, parameter_value, error.message))\n\ndef sorted_by_id(items):\n    return sorted(items, key=lambda item: item.id)\n\nimport 
check\n\nfrom check import convert, ensure\n\nimport v1\nimport documentation\n\ndef getAPIVersion(req):\n    path = req.path.split(\"/\")\n\n    assert len(path) >= 1 and path[0] == \"api\"\n\n    if len(path) < 2:\n        return None\n\n    api_version = path[1]\n\n    if api_version != \"v1\":\n        raise PathError(\"Unsupported API version: %r\" % api_version)\n\n    return api_version\n\ndef finishGET(critic, req, parameters, resource_class, value, values):\n    assert (value is None) != (values is None)\n\n    api_version = getAPIVersion(req)\n\n    try:\n        if values is not None:\n            values_json = []\n\n            for value in values:\n                try:\n                    values_json.append(resource_class.json(value, parameters))\n                except ResourceSkipped:\n                    pass\n\n            resource_json = {\n                resource_class.name: values_json\n            }\n        else:\n            try:\n                resource_json = resource_class.json(value, parameters)\n            except ResourceSkipped as error:\n                raise PathError(\"Resource not found: %s\" % error.message)\n            if parameters.output_format == \"static\":\n                resource_json = { resource_class.name: [resource_json] }\n    except resource_class.exceptions as error:\n        raise PathError(\"Resource not found: %s\" % error.message)\n    except IndexError:\n        raise PathError(\"List index out of range\")\n\n    if req.method != \"DELETE\" and parameters.subresource_path:\n        subresource_json = resource_json\n        for component in parameters.subresource_path:\n            subresource_json = subresource_json[component]\n        resource_json = {\n            \"/\".join(parameters.subresource_path): subresource_json\n        }\n\n    linked = Linked(req)\n\n    resource_json = linked.filter_referenced(resource_json)\n\n    if linked.linked_per_type:\n        all_linked = linked.copy()\n\n        
linked_json = resource_json[\"linked\"] = {\n            resource_type: []\n            for resource_type in linked.linked_per_type\n        }\n\n        while not linked.isEmpty():\n            additional_linked = Linked(req)\n\n            for resource_type, linked_values in linked.linked_per_type.items():\n                resource_class = lookup([api_version, resource_type])\n\n                for linked_value in linked_values:\n                    try:\n                        linked_value_json = resource_class.json(linked_value,\n                                                                parameters)\n                    except ResourceSkipped:\n                        continue\n                    linked_json[resource_type].append(\n                        additional_linked.filter_referenced(linked_value_json))\n\n            for resource_type in linked.linked_per_type.keys():\n                additional_linked[resource_type] -= all_linked[resource_type]\n                all_linked[resource_type] |= linked[resource_type]\n\n            linked = additional_linked\n\n            for linked_items in linked_json.values():\n                if linked_items and \"id\" in linked_items[0]:\n                    linked_items.sort(key=lambda item: item[\"id\"])\n\n    if critic.database.profiling and \"dbqueries\" in parameters.debug:\n        import profiling\n        # Sort items by accumulated time.\n        items = sorted(critic.database.profiling.items(),\n                       key=lambda item: item[1][1],\n                       reverse=True)\n        resource_json.setdefault(\"debug\", {})[\"dbqueries\"] = {\n            \"formatted\": profiling.formatDBProfiling(critic.database),\n            \"items\": [\n                {\n                    \"query\": re.sub(r\"\\s+\", \" \", query),\n                    \"count\": count,\n                    \"accumulated\": {\n                        \"time\": accumulated_ms,\n                        \"rows\": 
accumulated_rows\n                    },\n                    \"maximum\": {\n                        \"time\": maximum_ms,\n                        \"rows\": maximum_rows\n                    }\n                }\n                for query, (count,\n                            accumulated_ms, maximum_ms,\n                            accumulated_rows, maximum_rows) in items\n            ]\n        }\n\n    return resource_json\n\ndef requireSignIn(critic):\n    if critic.actual_user is None:\n        raise UsageError(\"Sign-in required\")\n\ndef finishPOST(critic, req, parameters, resource_class, value, values, data):\n    if not resource_class.anonymous_create:\n        requireSignIn(critic)\n\n    if (value or values) and not parameters.subresource_path:\n        raise UsageError(\"Invalid POST request\")\n\n    if not resource_class.create:\n        raise UsageError(\"Resource class does not support creating: %s\"\n                         % resource_class.name)\n\n    while True:\n        try:\n            value, values = resource_class.create(\n                parameters, value, values, data)\n        except resource_class.exceptions as error:\n            raise UsageError(error.message)\n        except InternalRedirect as redirect:\n            resource_class = lookup(redirect.resource_path)\n            parameters.subresource_path = redirect.subresource_path\n            value = redirect.value\n            values = redirect.values\n        else:\n            break\n\n    return finishGET(critic, req, parameters, resource_class, value, values)\n\ndef finishPUT(critic, req, parameters, resource_class, value, values, data):\n    if not resource_class.anonymous_update:\n        requireSignIn(critic)\n\n    if not resource_class.update:\n        raise UsageError(\"Resource class does not support updating: %s\"\n                         % resource_class.name)\n\n    if value or values:\n        try:\n            resource_class.update(parameters, value, values, data)\n 
       except resource_class.exceptions as error:\n            raise UsageError(error.message)\n\n    return finishGET(critic, req, parameters, resource_class, value, values)\n\ndef finishDELETE(critic, req, parameters, resource_class, value, values):\n    if not resource_class.anonymous_delete:\n        requireSignIn(critic)\n\n    if not resource_class.delete:\n        raise UsageError(\"Resource class does not support deleting: %s\"\n                         % resource_class.name)\n\n    if parameters.output_format == \"static\":\n        if value is not None:\n            resource_ids = [value.id]\n        else:\n            resource_ids = [resource.id for resource in values]\n\n    try:\n        return_value = resource_class.delete(parameters, value, values)\n    except resource_class.exceptions as error:\n        raise UsageError(error.message)\n\n    if return_value is None:\n        if parameters.output_format == \"static\":\n            return {\n                \"deleted\": {\n                    resource_class.name: resource_ids\n                },\n            }\n\n        raise request.NoContent()\n\n    value, values = return_value\n\n    return finishGET(critic, req, parameters, resource_class, value, values)\n\ndef handleRequestInternal(critic, req):\n    api_version = getAPIVersion(req)\n\n    if not api_version:\n        if req.method == \"GET\":\n            documentation.describeRoot()\n        else:\n            raise UsageError(\"Invalid %s request\" % req.method)\n\n    prefix = [api_version]\n    parameters = Parameters(critic, req)\n\n    path = req.path.rstrip(\"/\").split(\"/\")[2:]\n\n    if not path:\n        if req.method == \"GET\":\n            describe_parameter = parameters.getQueryParameter(\"describe\")\n            if describe_parameter:\n                v1.documentation.describeResource(describe_parameter)\n            v1.documentation.describeVersion()\n        else:\n            raise UsageError(\"Invalid %s request\" % 
req.method)\n\n    if req.method in (\"POST\", \"PUT\"):\n        try:\n            data = textutils.json_decode(req.read())\n        except ValueError:\n            raise UsageError(\"Invalid %s request body\" % req.method)\n\n    context = None\n    resource_class = None\n\n    while True:\n        next_component = path.pop(0)\n\n        if resource_class and (next_component in resource_class.objects or\n                               next_component in resource_class.lists or\n                               next_component in resource_class.maps):\n            subresource_id = []\n            subresource_path = []\n\n            while True:\n                subresource_id.append(next_component)\n                subresource_path.append(next_component)\n\n                if \"/\".join(subresource_id) in resource_class.objects:\n                    pass\n                elif \"/\".join(subresource_id) in resource_class.lists:\n                    if path:\n                        try:\n                            subresource_path.append(int(path[0]))\n                        except ValueError:\n                            raise UsageError(\n                                \"Item identifier must be an integer: %r\"\n                                % path[0])\n                        else:\n                            del path[0]\n                elif \"/\".join(subresource_id) in resource_class.maps:\n                    if path:\n                        subresource_path.append(path[0])\n                else:\n                    raise PathError(\"Invalid resource: %r / %r\"\n                                    % (\"/\".join(resource_path),\n                                       \"/\".join(subresource_id)))\n\n                if not path:\n                    break\n\n                next_component = path.pop(0)\n\n            parameters.subresource_path = subresource_path\n            break\n\n        resource_path = prefix + [next_component]\n        resource_class 
= lookup(resource_path)\n\n        prefix.append(resource_class.name)\n\n        value = None\n        values = None\n\n        resource_id = \"/\".join(resource_path)\n\n        try:\n            if path and resource_class.single:\n                arguments = filter(None, path.pop(0).split(\",\"))\n                if len(arguments) == 0 or (len(arguments) > 1 and path):\n                    raise UsageError(\"Invalid resource path: %s\" % req.path)\n                if len(arguments) == 1:\n                    with parameters.forResource(resource_class):\n                        value = resource_class.single(parameters, arguments[0])\n                    assert isinstance(value, resource_class.value_class)\n                    if not path:\n                        break\n                else:\n                    with parameters.forResource(resource_class):\n                        values = [resource_class.single(parameters, argument)\n                                  for argument in arguments]\n                    assert all(isinstance(value, resource_class.value_class)\n                               for value in values)\n                    break\n            elif not path:\n                if req.method == \"POST\":\n                    break\n                if not resource_class.multiple:\n                    raise UsageError(\"Resource requires an argument: %s\"\n                                     % resource_id)\n                with parameters.forResource(resource_class):\n                    values = resource_class.multiple(parameters)\n                if isinstance(values, resource_class.value_class):\n                    value, values = values, None\n                elif not parameters.range_accessed:\n                    begin, end = parameters.getRange()\n                    values = itertools.islice(values, begin, end)\n                break\n        except resource_class.exceptions as error:\n            raise PathError(\"Resource not found: %s\" 
% error.message)\n\n    if values and not isinstance(values, list):\n        values = list(values)\n\n    if req.method == \"GET\":\n        return finishGET(critic, req, parameters, resource_class, value, values)\n    elif req.method == \"POST\":\n        return finishPOST(\n            critic, req, parameters, resource_class, value, values, data)\n    elif req.method == \"PUT\":\n        return finishPUT(\n            critic, req, parameters, resource_class, value, values, data)\n    elif req.method == \"DELETE\":\n        return finishDELETE(\n            critic, req, parameters, resource_class, value, values)\n\ndef handleRequest(critic, req):\n    try:\n        return handleRequestInternal(critic, req)\n    except (api.PermissionDenied, auth.AccessDenied) as error:\n        raise PermissionDenied(error.message)\n    except api.ResultDelayedError:\n        raise ResultDelayed(\"Please try again later\")\n"
  },
  {
    "path": "src/jsonapi/check.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport contextlib\nimport re\n\nimport api\nimport jsonapi\n\ndef ishashable(value):\n    try:\n        hash(value)\n    except TypeError:\n        return False\n    else:\n        return True\n\nclass TypeCheckerContext(object):\n    def __init__(self, parameters):\n        self.critic = parameters.critic\n        self.__repository = parameters.context.get(\"repositories\")\n        self.__review = parameters.context.get(\"reviews\")\n        self.__path = [\"data\"]\n\n    @contextlib.contextmanager\n    def push(self, element):\n        if isinstance(element, int):\n            self.__path.append(\"[%d]\" % element)\n        else:\n            self.__path.append(\".\" + str(element))\n        yield\n        self.__path.pop()\n\n    def __str__(self):\n        return \"\".join(self.__path)\n\n    @property\n    def review(self):\n        return self.__review\n\n    @review.setter\n    def review(self, review):\n        assert self.__review is None or self.__review == review\n        self.__review = review\n        if review is not None:\n            self.repository = review.repository\n\n    @property\n    def repository(self):\n        return self.__repository\n\n    @repository.setter\n    def repository(self, repository):\n        assert self.__repository is None or self.__repository == 
repository\n        self.__repository = repository\n\nclass TypeChecker(object):\n    convert_exception = ()\n\n    def check_compatibility(self, context, value):\n        if hasattr(self, \"required_isinstance\"):\n            return isinstance(value, self.required_isinstance)\n        return True\n\n    def __call__(self, context, value):\n        if not self.check_compatibility(context, value):\n            raise jsonapi.InputError(\"%s: expected %s\" % (context, self))\n        message = self.check(context, value)\n        if message is not None:\n            raise jsonapi.InputError(\"%s: %s\" % (context, message))\n        if hasattr(self, \"convert\"):\n            try:\n                value = self.convert(context, value)\n            except self.convert_exception as error:\n                raise jsonapi.InputError(\"%s: %s\" % (context, error.message))\n        if hasattr(self, \"process\"):\n            self.process(context, value)\n        return value\n\n    def __str__(self):\n        return self.expected_type\n\n    def check(self, context, value):\n        pass\n\n    @staticmethod\n    def make(value):\n        if ishashable(value) and value in CHECKER_MAP:\n            value = CHECKER_MAP[value]\n        if isinstance(value, TypeChecker):\n            return value\n        if isinstance(value, type) and issubclass(value, TypeChecker):\n            return value()\n        if isinstance(value, list):\n            assert(len(value) == 1)\n            return ListChecker(value[0])\n        if isinstance(value, (set, frozenset, tuple)):\n            if all(isinstance(item, str) for item in value):\n                return EnumerationChecker(*value)\n            return VariantChecker(value)\n        if isinstance(value, dict):\n            return ObjectChecker(value)\n        raise Exception(\"invalid checked type: %r\" % value)\n\nclass ListChecker(TypeChecker):\n    required_isinstance = list\n\n    def __init__(self, checker):\n        self.checker = 
TypeChecker.make(checker)\n        self.expected_type = \"list of %s\" % self.checker.expected_type\n\n    def convert(self, context, value):\n        result = []\n        for index, element in enumerate(value):\n            with context.push(index):\n                result.append(self.checker(context, element))\n        return result\n\nclass VariantChecker(TypeChecker):\n    def __init__(self, checkers=None):\n        if checkers is None:\n            checkers = self.types\n        self.checkers = map(TypeChecker.make, checkers)\n        self.matched = None\n        self.expected_type = \"%s or %s\" % (\", \".join(map(str, self.checkers[:-1])),\n                                           self.checkers[-1])\n\n    def check_compatibility(self, context, value):\n        for checker in self.checkers:\n            if checker.check_compatibility(context, value):\n                self.matched = checker\n                self.convert_exception = checker.convert_exception\n                return True\n        return False\n\n    def convert(self, context, value):\n        try:\n            return self.matched(context, value)\n        finally:\n            self.matched = None\n\nclass ObjectChecker(TypeChecker):\n    required_isinstance = dict\n    expected_type = \"object\"\n\n    def __init__(self, attributes):\n        self.attributes = {}\n        self.prioritized = set()\n        for attribute_name, attribute_type in attributes.items():\n            required = False\n            default = False\n            if attribute_name.endswith(\"=null\"):\n                default = True\n                attribute_name = attribute_name[:-5]\n            elif attribute_name.endswith(\"?\"):\n                attribute_name = attribute_name[:-1]\n            else:\n                required = True\n            if attribute_name.endswith(\"!\"):\n                attribute_name = attribute_name[:-1]\n                self.prioritized.add(attribute_name)\n            
self.attributes[attribute_name] = (required, default,\n                                               TypeChecker.make(attribute_type))\n\n    def convert(self, context, value):\n        result = {}\n        def convert_attributes(attributes):\n            for attribute_name, attribute_value in attributes:\n                with context.push(attribute_name):\n                    if attribute_name not in self.attributes:\n                        raise jsonapi.InputError(\n                            \"%s: unexpected attribute\" % context)\n                    result[attribute_name] = self.attributes[attribute_name][2](\n                        context, attribute_value)\n        convert_attributes((attribute_name, attribute_value)\n                           for attribute_name, attribute_value in value.items()\n                           if attribute_name in self.prioritized)\n        convert_attributes((attribute_name, attribute_value)\n                           for attribute_name, attribute_value in value.items()\n                           if attribute_name not in self.prioritized)\n        for attribute_name, (required, default, _) in self.attributes.items():\n            if attribute_name not in result:\n                if required:\n                    with context.push(attribute_name):\n                        raise jsonapi.InputError(\"%s: missing attribute\"\n                                                 % context)\n                elif default:\n                    result[attribute_name] = None\n        return result\n\nclass IntegerChecker(TypeChecker):\n    required_isinstance = int\n    expected_type = \"integer\"\n\nclass RestrictedInteger(IntegerChecker):\n    def __init__(self, minvalue=None, maxvalue=None):\n        self.minvalue = minvalue\n        self.maxvalue = maxvalue\n\n    def check(self, context, value):\n        if self.minvalue is not None and value < self.minvalue:\n            return \"must be at least %d\" % self.minvalue\n        if 
self.maxvalue is not None and value > self.maxvalue:\n            return \"can be at most %d\" % self.maxvalue\n        return super(RestrictedInteger, self).check(context, value)\n\nclass NonNegativeInteger(RestrictedInteger):\n    def __init__(self):\n        super(NonNegativeInteger, self).__init__(minvalue=0)\n\nclass PositiveInteger(RestrictedInteger):\n    def __init__(self):\n        super(PositiveInteger, self).__init__(minvalue=1)\n\nclass StringChecker(TypeChecker):\n    required_isinstance = basestring\n    expected_type = \"string\"\n\nclass RestrictedString(StringChecker):\n    def __init__(self, minlength=None, maxlength=None, regexp=None):\n        self.minlength = minlength\n        self.maxlength = maxlength\n        self.regexp = re.compile(regexp) if regexp else None\n\n    def check(self, context, value):\n        if self.minlength is not None and len(value) < self.minlength:\n            return \"must be at least %d characters long\" % self.minlength\n        if self.maxlength is not None and len(value) > self.maxlength:\n            return \"can be at most %d characters long\" % self.maxlength\n        if self.regexp is not None and not self.regexp.match(value):\n            return \"must match '%s'\" % self.regexp.pattern\n        return super(RestrictedString, self).check(context, value)\n\nclass RegularExpression(StringChecker):\n    def check(self, context, value):\n        try:\n            re.compile(value)\n        except re.error:\n            return \"must be a valid Python regular expression\"\n        return super(RegularExpression, self).check(context, value)\n\nclass EnumerationChecker(StringChecker):\n    def __init__(self, *values):\n        self.values = frozenset(values)\n\n    def check(self, context, value):\n        if value not in self.values:\n            values = sorted(self.values)\n            return (\"must be one of %s and %s\"\n                    % (\", \".join(values[:-1]), values[-1]))\n        return 
super(EnumerationChecker, self).check(context, value)\n\nclass BooleanChecker(TypeChecker):\n    required_isinstance = bool\n    expected_type = \"boolean\"\n\nclass UserId(PositiveInteger):\n    convert_exception = api.user.InvalidUserId\n    def convert(self, context, value):\n        return api.user.fetch(context.critic, user_id=value)\n\nclass UserName(StringChecker):\n    convert_exception = api.user.InvalidUserName\n    def convert(self, context, value):\n        return api.user.fetch(context.critic, name=value)\n\nclass User(VariantChecker):\n    types = (UserId, UserName)\n\nclass RepositoryId(PositiveInteger):\n    convert_exception = api.repository.InvalidRepositoryId\n    def convert(self, context, value):\n        return api.repository.fetch(context.critic, repository_id=value)\n\nclass RepositoryName(StringChecker):\n    convert_exception = api.repository.InvalidRepositoryName\n    def convert(self, context, value):\n        return api.repository.fetch(context.critic, name=value)\n\nclass Repository(VariantChecker):\n    types = (RepositoryId, RepositoryName)\n    def process(self, context, repository):\n        context.repository = repository\n\n    class Required(TypeChecker):\n        def check(self, context, value):\n            if context.repository is None:\n                return \"no repository set in context\"\n            return super(Repository.Required, self).check(context, value)\n\nclass Review(PositiveInteger):\n    convert_exception = api.review.InvalidReviewId\n    def convert(self, context, value):\n        return api.review.fetch(context.critic, review_id=value)\n    def process(self, context, review):\n        context.review = review\n\nclass Comment(PositiveInteger):\n    convert_exception = api.comment.InvalidCommentId\n    def convert(self, context, value):\n        return api.comment.fetch(context.critic, comment_id=value)\n    def process(self, context, comment):\n        context.review = comment.review\n\nclass 
Reply(PositiveInteger):\n    convert_exception = api.reply.InvalidReplyId\n    def convert(self, context, value):\n        return api.reply.fetch(context.critic, reply_id=value)\n\nclass CommitId(PositiveInteger):\n    convert_exception = api.commit.InvalidCommitId\n    def convert(self, context, value):\n        return api.commit.fetch(context.repository, commit_id=value)\n\nclass CommitReference(StringChecker):\n    convert_exception = api.repository.InvalidRef\n    def convert(self, context, value):\n        return api.commit.fetch(context.repository, ref=value)\n\nclass Commit(VariantChecker, Repository.Required):\n    types = (CommitId, CommitReference)\n\nclass FileId(PositiveInteger):\n    convert_exception = api.file.InvalidFileId\n    def convert(self, context, value):\n        return api.file.fetch(context.critic, file_id=value)\n\nclass FilePath(StringChecker):\n    convert_exception = api.file.InvalidPath\n    def convert(self, context, value):\n        return api.file.fetch(context.critic, path=value)\n\nclass File(VariantChecker):\n    types = (FileId, FilePath)\n\nclass Changeset(PositiveInteger, Repository.Required):\n    convert_exception = api.changeset.InvalidChangesetId\n    def convert(self, context, value):\n        assert context.repository\n        return api.changeset.fetch(\n            context.critic, context.repository, changeset_id=value)\n\nclass ExtensionId(PositiveInteger):\n    convert_exception = api.extension.InvalidExtensionId\n    def convert(self, context, value):\n        return api.extension.fetch(context.critic, extension_id=value)\n\nclass ExtensionKey(StringChecker):\n    convert_exception = api.extension.InvalidExtensionKey\n    def convert(self, context, value):\n        return api.extension.fetch(context.critic, key=value)\n\nclass Extension(VariantChecker):\n    types = (ExtensionId, ExtensionKey)\n\nclass AccessControlProfile(PositiveInteger):\n    convert_exception = 
api.accesscontrolprofile.InvalidAccessControlProfileId\n    def convert(self, context, value):\n        return api.accesscontrolprofile.fetch(context.critic, profile_id=value)\n\nCHECKER_MAP = { int: IntegerChecker(),\n                str: StringChecker(),\n                bool: BooleanChecker(),\n                api.user.User: User,\n                api.repository.Repository: Repository,\n                api.review.Review: Review,\n                api.comment.Comment: Comment,\n                api.reply.Reply: Reply,\n                api.commit.Commit: Commit,\n                api.file.File: File,\n                api.changeset.Changeset: Changeset,\n                api.extension.Extension: Extension,\n                api.accesscontrolprofile.AccessControlProfile:\n                    AccessControlProfile }\n\ndef convert(parameters, checker, value):\n    context = TypeCheckerContext(parameters)\n    return TypeChecker.make(checker)(context, value)\n\ndef ensure(data, path, ensured_value):\n    if isinstance(path, (tuple, list)):\n        for key in path[:-1]:\n            data = data[key]\n        key = path[-1]\n    else:\n        key = path\n\n    if key not in data:\n        data[key] = ensured_value\n    elif data[key] != ensured_value:\n        path_string = \"data\"\n        for key in path:\n            if isinstance(key, str):\n                path_string += \".\" + key\n            else:\n                path_string += \"[%d]\" % key\n        raise jsonapi.InputError(\"%s: must be %r or omitted\"\n                                 % (path_string, ensured_value))\n"
  },
  {
    "path": "src/jsonapi/documentation.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\n\ndef describeRoot():\n    # Since there is only one supported API version; simply redirect\n    # to its documentation.\n    raise page.utils.MovedTemporarily(\"/api/v1\")\n"
  },
  {
    "path": "src/jsonapi/v1/README.txt",
    "content": "API design\n==========\n\nPrimary vs secondary resources\n------------------------------\n\nAccessed resources (objects) are classified as either \"primary\" or \"secondary\".\n\nPrimary resources are those that are directly addressable via some path, such as\nusers (/api/v1/users/1) or reviews (/api/v1/reviews/1), but also a users\nindividual registered email addresses (/api/v1/users/1/emails/1).  A primary\nresource has a resource type that is simply the path component, i.e. \"users\",\n\"reviews\" or \"emails\" for the resources mentioned in this paragraph.\n\nA secondary resource is not directly addressable via a path and is only ever\nreturned as part of a primary resource.\n\nA primary resource referenced by another primary resource is always included as\nan id reference only, never expanded.  Such a referenced resource may, if\nrequested, still be included in the response, but then as a separate linked\nresource.\n\nExample:\n\n  Users are primary resources, and are referenced by various fields in the a\n  review resource, but only by id:\n\n    /api/v1/reviews/1 => {\n\n      \"id\": 1,\n      \"owners: [2, 3],\n      \"reviewers\": [4, 5, 6],\n      ...\n\n    }\n\n  The name+email+timestamp objects that represent the author and committer\n  metadata in commits, OTOH, are secondary resources, and thus directly included\n  in the commit resource that reference them:\n\n    /api/v1/repository/1/commits/1 => {\n\n      \"id\": 1,\n      \"author\": { \"name\": \"...\",\n                  \"email\": \"...\",\n                  \"timestamp\": ... },\n      \"committer\": { \"name\": \"...\",\n                     \"email\": \"...\",\n                     \"timestamp\": ... 
},\n\n    }\n\nThis rule exists mostly to define a consistent answer to the question: \"Should\nresource X be included in resource Y?\"\n\nReferenced primary resources are not directly included because A) they are\ngenerally convenient enough to access on their own, given an id, and B) can\nalways be included in the response anyway, as a linked resource.\n\nLinked resources\n----------------\n\nTypically, any primary resource whose id is included as part of another primary\nresource is recorded as a linked resource.  This includes the case of the other\nprimary resource also being \"just\" a linked resource, as long as that other\nprimary resource was actually going to be included in the final response.\n\nLinked resources are included in the final response if requested via the\n'include' query parameter.  Its value is a comma-separated list of resource\ntypes.\n\nExample:\n\n  /api/v1/reviews/1?include=users => {\n\n    \"id\": 1,\n    \"owners\": [2, 3],\n    \"reviewers\": [3, 4],\n    ...\n\n    \"linked\": { \"users\": [{ \"id\": 2,\n                            \"name\": \"...\",\n                            ... },\n                          { \"id\": 3,\n                            \"name\": \"...\",\n                            ... },\n                          { \"id\": 4,\n                            \"name\": \"...\",\n                            ... }] },\n\n  }\n\nSome references to primary resources do not cause the referenced resource to be\nrecorded as a linked resource.  One notable such exception is the references to\na commit's parent commits, since the recursive processing of linked resources\ncould easily cause a repository's entire history to be included.\n\nExample:\n\n  /api/v1/repositories/1/commits/1?include=commits => {\n\n    \"id\": 1,\n    \"parents\": [2, 3],\n    ...\n\n    \"linked\": {}\n\n  }\n\nNote that the exception here is the commit resource's 'parents' field, not\ncommits in general.  
Other resources that include lists of commit references do\nrecord them as linked resources.\n\nExample:\n\n  /api/v1/reviews/1?include=commits => {\n\n    \"id\": 1,\n    \"commits\": [1, 2, 3],\n    ...\n\n    \"linked\": { \"commits\": [{ \"id\": 1,\n                              \"parents\": [2],\n                              ... },\n                            { \"id\": 2,\n                              \"parents\": [3],\n                              ... },\n                            { \"id\": 3,\n                              \"parents\": [4],\n                              ... }] }\n  }\n\n  Note: In this scenario, commits 1-3 are included since they are directly\n        referenced from the review resource, but commit 4, which is referenced\n        from commit 3's 'parents' field, is not included.\n\nCollections\n-----------\n\nPrimary resources can also typically be accessed as collections, normally via a\npath that doesn't include the final component that identifies the specific\nresource.\n\nA collection of primary resources is returned as an object with a single key,\nthe resource type, mapped to an array of resources.\n\nExample:\n\n  /api/v1/users => {\n\n    \"users\": [{ \"id\": 1,\n                \"name\": \"...\",\n                ... },\n              { \"id\": 2,\n                \"name\": \"...\",\n                ... 
},\n              ...]\n\n  }\n\nThe top-level { resource_type: collection } structure is there to make it\npossible to also include linked resources as part of the top-level structure.\nThis would not be possible if the top-level structure were an array, for\ninstance.\n\n\nImplementation\n==============\n\nA primary resource is implemented by decorating a class with the decorator\n|jsonapi.PrimaryResource|, as such:\n\n  class User(object):\n    \"\"\"Internal representation\"\"\"\n    def __init__(self, name):\n      self.name = name\n\n  @jsonapi.PrimaryResource\n  class Users(object):\n    name = \"users\"\n    value_class = User\n\n    @staticmethod\n    def json(value, parameters, linked):\n      return { \"name\": value.name }\n\n    @staticmethod\n    def single(parameters, argument):\n      return User(argument)\n\n    @staticmethod\n    def multiple(parameters):\n      return [User(\"alice\"), User(\"bob\")]\n\nA resource class is never instantiated; it is only expected to have class\nattributes and static (or class) methods.  Two attributes are required: |name|\nand |value_class|.\n\nIn addition, these attributes are used if present: |contexts| and |exceptions|.\n\nname\n----\nThe resource name (typically plural) as it appears in the path.  This defines\nthe paths that this resource class handles.\n\nIf the |name| attribute is \"users\", the resource class handles the path\n/api/v1/users/, unless the |contexts| attribute is present and overrides this\n(see below).\n\nvalue_class\n-----------\nThe internal type of the values being \"wrapped\".\n\ncontexts\n--------\nThe optional |contexts| attribute should be a tuple containing strings or the\nspecial value None.  If it contains None, the resource can appear without\ncontext, meaning at the beginning of a path.  
If it contains strings, those\nstrings should match the name of other primary resources, and the meaning is\nthat this resource can occur following that other resource on a path.\n\nexceptions\n----------\nThe optional |exceptions| attribute should be a tuple containing exception\ntypes that the resource class's methods can raise and have converted into\nPathError exceptions.\n\njson()\n------\nThe json() method is called to convert an instance of the resource class's\ninternal value class to a simple data structure (typically a dictionary) that\ncan then be converted into a JSON string.  It must be implemented.\n\nThe |value| parameter is the value being converted.  It is guaranteed to be an\ninstance of the resource class's internal value class.\n\nThe |parameters| parameter gives access to query string parameters, and to\ncontext objects introduced by earlier path segments (all but the last).\n\nThe |linked| parameter holds an object that can be used to register other\nprimary resources referenced by this resource.\n\nsingle()\n--------\nThe single() method is called when processing a path\n\n  .../<resource name>/<argument>\n\nwhere <resource name> is the resource class's |name| attribute and <argument> is\nthe |argument| parameter to the method.  If the single() method is not\nimplemented, this type of path is invalid.\n\nThe |argument| parameter is the next path component, as described above.\n\nThe |parameters| parameter is the same as to json().\n\nThe return value must be an instance of the resource class's internal value\nclass.\n\nmultiple()\n----------\nThe multiple() method is called when processing a path\n\n  .../<resource name>\n\nand would normally return \"all resources of this type.\"  It can also filter its\nreturn value using query parameters.  
If the multiple() method is not\nimplemented, this type of path is invalid.\n\nThe |parameters| parameter is the same as to single().\n\nThe return value must be an iterable of the resource class's internal value\nclass, or an instance of it.  The return value can be an iterator or generator.\n"
  },
  {
    "path": "src/jsonapi/v1/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport configuration\nimport datetime\n\nEPOCH = datetime.datetime.utcfromtimestamp(0)\n\ndef timestamp(timestamp):\n    if timestamp is None:\n        return None\n    return (timestamp - EPOCH).total_seconds()\n\nimport users\nimport sessions\nimport repositories\nimport commits\nimport branches\nimport reviews\nimport reviewsummaries\nimport rebases\nimport changesets\nimport filechanges\nimport files\nimport comments\nimport replies\nimport batches\nimport reviewablefilechanges\nimport filediffs\nimport filecontents\n\nif configuration.auth.ENABLE_ACCESS_TOKENS:\n    import accesstokens\n    import accesscontrolprofiles\n    import labeledaccesscontrolprofiles\n\nif configuration.extensions.ENABLED:\n    import extensions\n\nimport documentation\n"
  },
  {
    "path": "src/jsonapi/v1/accesscontrolprofiles.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport itertools\n\nimport api\nimport jsonapi\n\nRULE = api.accesscontrolprofile.AccessControlProfile.RULE_VALUES\nCATEGORIES = frozenset([\"http\", \"repositories\", \"extensions\"])\nREQUEST_METHOD = api.accesscontrolprofile \\\n        .AccessControlProfile.HTTPException.REQUEST_METHODS\nREPOSITORY_ACCESS_TYPE = api.accesscontrolprofile \\\n        .AccessControlProfile.RepositoryException.ACCESS_TYPES\nEXTENSION_ACCESS_TYPE = api.accesscontrolprofile \\\n        .AccessControlProfile.ExtensionException.ACCESS_TYPES\n\nHTTP_EXCEPTION = {\n    \"request_method=null\": REQUEST_METHOD,\n    \"path_pattern=null\": jsonapi.check.RegularExpression\n}\nREPOSITORIES_EXCEPTION = {\n    \"access_type=null\": REPOSITORY_ACCESS_TYPE,\n    \"repository=null\": api.repository.Repository\n}\nEXTENSION_EXCEPTION = {\n    \"access_type=null\": EXTENSION_ACCESS_TYPE,\n    \"extension=null\": api.extension.Extension\n}\n\nPROFILE = {\n    \"http?\": {\n        \"rule\": RULE,\n        \"exceptions?\": [HTTP_EXCEPTION]\n    },\n    \"repositories?\": {\n        \"rule\": RULE,\n        \"exceptions?\": [REPOSITORIES_EXCEPTION]\n    },\n    \"extensions?\": {\n        \"rule\": RULE,\n        \"exceptions?\": [EXTENSION_EXCEPTION]\n    }\n}\n\nPROFILE_WITH_TITLE = 
PROFILE.copy()\nPROFILE_WITH_TITLE[\"title?\"] = str\n\ndef updateProfile(profile_modifier, converted):\n    def updateExceptions(exceptions_modifier, exceptions):\n        exceptions_modifier.deleteAll()\n        for exception in exceptions:\n            exceptions_modifier.add(**exception)\n\n    def updateCategory(category):\n        if category not in converted:\n            return\n        if \"rule\" in converted[category]:\n            profile_modifier.setRule(category, converted[category][\"rule\"])\n        if \"exceptions\" in converted[category]:\n            updateExceptions(profile_modifier.modifyExceptions(category),\n                             converted[category][\"exceptions\"])\n\n    if \"title\" in converted:\n        profile_modifier.setTitle(converted[\"title\"])\n\n    updateCategory(\"http\")\n    updateCategory(\"repositories\")\n    updateCategory(\"extensions\")\n\n@jsonapi.PrimaryResource\nclass AccessControlProfiles(object):\n    \"\"\"The access control profiles of this system.\"\"\"\n\n    name = \"accesscontrolprofiles\"\n    value_class = api.accesscontrolprofile.AccessControlProfile\n    exceptions = (api.accesscontrolprofile.AccessControlProfileError,)\n    objects = (\"http\",\n               \"repositories\",\n               \"extensions\")\n    lists = (\"http/exceptions\",\n             \"repositories/exceptions\",\n             \"extensions/exceptions\")\n\n    @staticmethod\n    def json(value, parameters, linked):\n        \"\"\"AccessControlProfile {\n             \"id\": integer,\n             \"title\": string or null,\n             \"http\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions\": [{\n                 \"id\": integer,\n                 \"request_method\": string or null,\n                 \"path_pattern\": string or null\n               }]\n             },\n             \"repositories\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions\": [{\n        
         \"id\": integer,\n                 \"access_type\": \"read\" or \"modify\",\n                 \"repository\": integer\n               }]\n             },\n             \"extensions\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions: [{\n                 \"id\": integer,\n                 \"access_type\": \"install\" or \"execute\",\n                 \"extension\": string,\n               }]\n             }\n           }\"\"\"\n\n        # Make sure that only administrator users can access profiles that are\n        # not connected to access tokens, and that only administrator users and\n        # the user that owns the access token can access other profiles.\n        if not value.access_token \\\n                or value.access_token.access_type != \"user\" \\\n                or parameters.critic.actual_user != value.access_token.user:\n            api.PermissionDenied.raiseUnlessAdministrator(parameters.critic)\n\n        for exception in value.repositories.exceptions:\n            if exception.repository:\n                linked.add(\"v1/repositories\", exception.repository)\n        for exception in value.extensions.exceptions:\n            if exception.extension:\n                linked.add(\"v1/extensions\", exception.extension)\n\n        return parameters.filtered(\"accesscontrolprofiles\", {\n            \"id\": value.id,\n            \"title\": value.title,\n            \"http\": {\n                \"rule\": value.http.rule,\n                \"exceptions\": [{\n                    \"id\": exception.id,\n                    \"request_method\": exception.request_method,\n                    \"path_pattern\": exception.path_pattern\n                } for exception in value.http.exceptions]\n            },\n            \"repositories\": {\n                \"rule\": value.repositories.rule,\n                \"exceptions\": [{\n                    \"id\": exception.id,\n                    \"access_type\": 
exception.access_type,\n                    \"repository\": (exception.repository.id\n                                   if exception.repository else None)\n                } for exception in value.repositories.exceptions]\n            },\n            \"extensions\": {\n                \"rule\": value.extensions.rule,\n                \"exceptions\": [{\n                    \"id\": exception.id,\n                    \"access_type\": exception.access_type,\n                    \"extension\": (exception.extension.id\n                                  if exception.extension else None)\n                } for exception in value.extensions.exceptions]\n            },\n        })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) access control profiles.\n\n           PROFILE_ID : integer\n\n           Retrieve an access control profile identified by the profile's unique\n           numeric id.\"\"\"\n\n        return api.accesscontrolprofile.fetch(\n            parameters.critic, profile_id=jsonapi.numeric_id(argument))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all primary access control profiles in the system.\n\n           title : TITLE : string\n\n           Retrieve only access control profiles with a matching title.\"\"\"\n        title_parameter = parameters.getQueryParameter(\"title\")\n        return api.accesscontrolprofile.fetchAll(\n            parameters.critic, title=title_parameter)\n\n    @staticmethod\n    def deduce(parameters):\n        profile = parameters.context.get(\"accesscontrolprofiles\")\n        profile_parameter = parameters.getQueryParameter(\"profile\")\n        if profile_parameter is not None:\n            if profile is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: profile=%s\" % profile_parameter)\n            profile_id = jsonapi.numeric_id(profile_parameter)\n            profile = 
api.accesscontrolprofile.fetch(\n                parameters.critic, profile_id=profile_id)\n        return profile\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n        profiles = [value] if value else values\n        path = parameters.subresource_path\n\n        if 0 < len(path) < 2:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        if len(path) == 2 \\\n                and path[0] in CATEGORIES \\\n                and path[1] == \"exceptions\":\n            # Create a rule exception.\n\n            if path[0] == \"http\":\n                exception_type = HTTP_EXCEPTION\n            elif path[0] == \"repositories\":\n                exception_type = REPOSITORIES_EXCEPTION\n            else:\n                exception_type = EXTENSION_EXCEPTION\n\n            converted = jsonapi.convert(\n                parameters,\n                exception_type,\n                data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for profile in profiles:\n                    modifier = transaction.modifyAccessControlProfile(profile) \\\n                        .modifyExceptions(path[0]) \\\n                        .add(**converted)\n\n            return value, values\n\n        # Create an access control profile.\n        assert not profiles\n\n        converted = jsonapi.convert(parameters, PROFILE_WITH_TITLE, data)\n\n        result = []\n\n        def collectAccessControlProfile(profile):\n            assert isinstance(\n                profile, api.accesscontrolprofile.AccessControlProfile)\n            result.append(profile)\n\n        with api.transaction.Transaction(critic) as transaction:\n            modifier = transaction.createAccessControlProfile(\n                callback=collectAccessControlProfile)\n            updateProfile(modifier, converted)\n\n        assert 
len(result) == 1\n        return result[0], None\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n\n        if value:\n            profiles = [value]\n        else:\n            profiles = values\n\n        path = parameters.subresource_path\n\n        if len(path) == 1 and path[0] in CATEGORIES:\n            if path[0] == \"http\":\n                exception_type = HTTP_EXCEPTION\n            elif path[0] == \"repositories\":\n                exception_type = REPOSITORIES_EXCEPTION\n            else:\n                exception_type = EXTENSION_EXCEPTION\n\n            converted = jsonapi.convert(\n                parameters,\n                {\n                    \"rule?\": RULE,\n                    \"exceptions?\": [exception_type]\n                },\n                data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for profile in profiles:\n                    modifier = transaction.modifyAccessControlProfile(profile)\n                    updateProfile(modifier, { path[0]: converted })\n\n            return\n\n        converted = jsonapi.convert(parameters, PROFILE_WITH_TITLE, data)\n\n        with api.transaction.Transaction(critic) as transaction:\n            for profile in profiles:\n                modifier = transaction.modifyAccessControlProfile(profile)\n                updateProfile(modifier, converted)\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            profiles = [value]\n        else:\n            profiles = values\n\n        if len(path) in (2, 3) \\\n                and path[0] in CATEGORIES \\\n                and path[1] == \"exceptions\":\n            exception_id = path[2] if len(path) == 3 else None\n\n            with api.transaction.Transaction(critic) as transaction:\n                for profile in profiles:\n  
                  modifier = transaction.modifyAccessControlProfile(profile) \\\n                        .modifyExceptions(path[0])\n\n                    if exception_id is None:\n                        modifier.deleteAll()\n                    else:\n                        modifier.delete(exception_id)\n\n            return value, values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid DELETE request\")\n\n        with api.transaction.Transaction(critic) as transaction:\n            for profile in profiles:\n                transaction.modifyAccessControlProfile(profile) \\\n                    .delete()\n"
  },
  {
    "path": "src/jsonapi/v1/accesstokens.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\nACCESS_TYPE = frozenset([\"user\", \"anonymous\", \"system\"])\n\nfrom accesscontrolprofiles import (RULE, CATEGORIES, HTTP_EXCEPTION,\n                                   REPOSITORIES_EXCEPTION, EXTENSION_EXCEPTION,\n                                   PROFILE, updateProfile)\n\ndef modifyAccessToken(transaction, access_token):\n    if access_token.user:\n        return transaction \\\n            .modifyUser(access_token.user) \\\n            .modifyAccessToken(access_token)\n    return transaction \\\n        .modifyAccessToken(access_token)\n\n@jsonapi.PrimaryResource\nclass AccessTokens(object):\n    \"\"\"Access tokens.\"\"\"\n\n    name = \"accesstokens\"\n    contexts = (None, \"users\")\n    value_class = api.accesstoken.AccessToken\n    exceptions = (api.accesstoken.AccessTokenError,)\n    objects = (\"profile\",\n               \"profile/http\",\n               \"profile/repositories\",\n               \"profile/extensions\")\n    lists = (\"profile/http/exceptions\",\n             \"profile/repositories/exceptions\",\n             \"profile/extensions/exceptions\")\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"AccessToken {\n             \"id\": integer,\n             \"access_type\": \"user\", \"anonymous\" or \"system\",\n      
       \"user\": integer or null,\n             \"part1\": string,\n             \"part2\": string,\n             \"title\": string or null,\n             \"profile\": null or AccessControlProfile\n           }\n\n           AccessControlProfile {\n             \"http\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions: [{\n                 \"id\": integer,\n                 \"request_method\": string or null,\n                 \"path_pattern\": string or null\n               }]\n             },\n             \"repositories\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions: [{\n                 \"id\": integer,\n                 \"access_type\": \"read\" or \"modify\",\n                 \"repository\": integer\n               }]\n             },\n             \"extensions\": {\n               \"rule\": \"allow\" or \"deny\",\n               \"exceptions: [{\n                 \"id\": integer,\n                 \"access_type\": \"install\" or \"execute\",\n                 \"extension\": string,\n               }]\n             }\n           }\"\"\"\n\n        # Make sure that only administrator users can access other user's access\n        # tokens or access tokens that do not belong to any user.\n        if value.access_type != \"user\" \\\n                or parameters.critic.actual_user != value.user:\n            api.PermissionDenied.raiseUnlessAdministrator(parameters.critic)\n\n        data = { \"id\": value.id,\n                 \"access_type\": value.access_type,\n                 \"user\": value.user,\n                 \"part1\": value.part1,\n                 \"part2\": value.part2,\n                 \"title\": value.title }\n\n        if value.profile:\n            data[\"profile\"] = {\n                \"http\": {\n                    \"rule\": value.profile.http.rule,\n                    \"exceptions\": [{\n                        \"id\": exception.id,\n                        
\"request_method\": exception.request_method,\n                        \"path_pattern\": exception.path_pattern\n                    } for exception in value.profile.http.exceptions]\n                },\n                \"repositories\": {\n                    \"rule\": value.profile.repositories.rule,\n                    \"exceptions\": [{\n                        \"id\": exception.id,\n                        \"access_type\": exception.access_type,\n                        \"repository\": exception.repository\n                    } for exception in value.profile.repositories.exceptions]\n                },\n                \"extensions\": {\n                    \"rule\": value.profile.extensions.rule,\n                    \"exceptions\": [{\n                        \"id\": exception.id,\n                        \"access_type\": exception.access_type,\n                        \"extension\": exception.extension\n                    } for exception in value.profile.extensions.exceptions]\n                },\n            }\n        else:\n            data[\"profile\"] = None\n\n        return parameters.filtered(\"accesstokens\", data)\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) access tokens.\n\n           TOKEN_ID : integer\n\n           Retrieve an access token identified by its unique numeric id.\"\"\"\n\n        value = api.accesstoken.fetch(parameters.critic,\n                                      jsonapi.numeric_id(argument))\n\n        if \"users\" in parameters.context:\n            if value.user != parameters.context[\"users\"]:\n                raise InvalidAccessTokenId(jsonapi.numeric_id(argument))\n\n        return value\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"All access tokens.\"\"\"\n\n        user = jsonapi.deduce(\"v1/users\", parameters)\n\n        # Only administrators are allowed to access all access tokens in the\n        # system.\n        if user is None:\n       
     api.PermissionDenied.raiseUnlessAdministrator(parameters.critic)\n\n        return api.accesstoken.fetchAll(parameters.critic, user=user)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n        access_tokens = [value] if value else values\n        path = parameters.subresource_path\n\n        if 0 < len(path) < 3:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        if len(path) == 3 \\\n                and path[0] == \"profile\" \\\n                and path[1] in CATEGORIES \\\n                and path[2] == \"exceptions\":\n            # Create a rule exception.\n\n            if path[1] == \"http\":\n                exception_type = HTTP_EXCEPTION\n            elif path[1] == \"repositories\":\n                exception_type = REPOSITORIES_EXCEPTION\n            else:\n                exception_type = EXTENSION_EXCEPTION\n\n            converted = jsonapi.convert(\n                parameters,\n                exception_type,\n                data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for access_token in access_tokens:\n                    modifier = modifyAccessToken(transaction, access_token) \\\n                        .modifyProfile() \\\n                        .modifyExceptions(path[1]) \\\n                        .add(**converted)\n\n            return value, values\n\n        # Create an access token.\n        assert not access_tokens\n\n        converted = jsonapi.convert(\n            parameters,\n            { \"access_type?\": ACCESS_TYPE,\n              \"title?\": str,\n              \"profile?\": PROFILE },\n            data)\n\n        result = []\n\n        def collectAccessToken(token):\n            assert isinstance(token, api.accesstoken.AccessToken)\n            result.append(token)\n\n        with api.transaction.Transaction(critic) as 
transaction:\n            modifier = transaction \\\n                .modifyUser(user)\n\n            access_type = converted.get(\"access_type\", \"user\")\n\n            access_token = modifier.createAccessToken(\n                access_type=access_type,\n                title=converted.get(\"title\"),\n                callback=collectAccessToken)\n\n            if \"profile\" in converted:\n                modifier = transaction\n                if access_type == \"user\":\n                    modifier = modifier.modifyUser(user)\n                modifier = modifier \\\n                     .modifyAccessToken(access_token) \\\n                     .modifyProfile()\n                updateProfile(modifier, converted[\"profile\"])\n\n        assert len(result) == 1\n        return result[0], None\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n\n        if value:\n            access_tokens = [value]\n        else:\n            access_tokens = values\n\n        path = parameters.subresource_path\n\n        if path == [\"profile\"]:\n            converted = jsonapi.convert(parameters, PROFILE, data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for access_token in access_tokens:\n                    modifier = modifyAccessToken(transaction, access_token) \\\n                        .modifyProfile()\n\n                    updateProfile(modifier, converted)\n\n            return\n\n        if len(path) == 2 \\\n                and path[0] == \"profile\" \\\n                and path[1] in CATEGORIES:\n            if path[1] == \"http\":\n                exception_type = HTTP_EXCEPTION\n            elif path[1] == \"repositories\":\n                exception_type = REPOSITORIES_EXCEPTION\n            else:\n                exception_type = EXTENSION_EXCEPTION\n\n            converted = jsonapi.convert(\n                parameters,\n                {\n                    
\"rule?\": RULE,\n                    \"exceptions?\": [exception_type]\n                },\n                data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for access_token in access_tokens:\n                    modifier = modifyAccessToken(transaction, access_token) \\\n                        .modifyProfile()\n\n                    updateProfile(modifier, { path[1]: converted })\n\n            return\n\n        converted = jsonapi.check.convert(\n            parameters,\n            {\n                \"title?\": str,\n                \"profile?\": PROFILE,\n            },\n            data)\n\n        with api.transaction.Transaction(critic) as transaction:\n            for access_token in access_tokens:\n                modifier = modifyAccessToken(transaction, access_token)\n\n                if \"title\" in converted:\n                    modifier.setTitle(converted[\"title\"])\n\n                if \"profile\" in converted:\n                    updateProfile(modifier.modifyProfile(),\n                                  converted[\"profile\"])\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            access_tokens = [value]\n        else:\n            access_tokens = values\n\n        if 3 <= len(path) <= 4 \\\n                and path[0] == \"profile\" \\\n                and path[1] in CATEGORIES \\\n                and path[2] == \"exceptions\":\n            exception_id = None\n            if len(path) == 4:\n                exception_id = path[3]\n\n            with api.transaction.Transaction(critic) as transaction:\n                for access_token in access_tokens:\n                    modifier = modifyAccessToken(transaction, access_token) \\\n                        .modifyProfile() \\\n                        .modifyExceptions(path[1])\n\n                    if exception_id is 
None:\n                        modifier.deleteAll()\n                    else:\n                        modifier.delete(exception_id)\n\n            return value, values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid DELETE request\")\n\n        with api.transaction.Transaction(critic) as transaction:\n            for access_token in access_tokens:\n                modifyAccessToken(transaction, access_token) \\\n                    .delete()\n"
  },
  {
    "path": "src/jsonapi/v1/batches.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Batches(object):\n    \"\"\"Batches of changes in reviews.\"\"\"\n\n    name = \"batches\"\n    contexts = (None, \"reviews\")\n    value_class = api.batch.Batch\n    exceptions = api.batch.BatchError\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer or null,\n             \"is_empty\": boolean,\n             \"review\": integer,\n             \"author\": integer,\n             \"comment\": integer or null,\n             \"timestamp\": float or null,\n             \"created_comments\": integer[],\n             \"written_replies\": integer[],\n             \"resolved_issues\": integer[],\n             \"reopened_issues\": integer[],\n             \"morphed_comments\": MorphedComment[],\n             \"reviewed_changes\": integer[],\n             \"unreviewed_changes\": integer[],\n           }\n\n           MorphedComment {\n             \"comment\": integer,\n             \"new_type\": \"issue\" or \"note\",\n           }\"\"\"\n\n        morphed_comments = sorted([\n            { \"comment\": comment, \"new_type\": new_type }\n            for comment, new_type in value.morphed_comments.items()\n        ], key=lambda morphed_comment: morphed_comment[\"comment\"].id)\n\n        
timestamp = jsonapi.v1.timestamp(value.timestamp)\n        return parameters.filtered(\n            \"batches\", { \"id\": value.id,\n                         \"is_empty\": value.is_empty,\n                         \"review\": value.review,\n                         \"author\": value.author,\n                         \"timestamp\": timestamp,\n                         \"comment\": value.comment,\n                         \"created_comments\": jsonapi.sorted_by_id(\n                             value.created_comments),\n                         \"written_replies\": jsonapi.sorted_by_id(\n                             value.written_replies),\n                         \"resolved_issues\": jsonapi.sorted_by_id(\n                             value.resolved_issues),\n                         \"reopened_issues\": jsonapi.sorted_by_id(\n                             value.reopened_issues),\n                         \"morphed_comments\": morphed_comments,\n                         \"reviewed_changes\": jsonapi.sorted_by_id(\n                             value.reviewed_file_changes),\n                         \"unreviewed_changes\": jsonapi.sorted_by_id(\n                             value.unreviewed_file_changes) })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) batches in reviews.\n\n           BATCH_ID : integer\n\n           Retrieve a batch identified by its unique numeric id.\"\"\"\n\n        batch = api.batch.fetch(\n            parameters.critic, batch_id=jsonapi.numeric_id(argument))\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        if review and review != batch.review:\n            raise jsonapi.PathError(\n                \"Batch does not belong to specified review\")\n\n        return Batches.setAsContext(parameters, batch)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all batches in the system (or review.)\n\n           review : REVIEW_ID : integer\n\n           
Retrieve only batches in the specified review.  Can only be used if a\n           review is not specified in the resource path.\n\n           author : AUTHOR : integer or string\n\n           Retrieve only batches authored by the specified user, identified by\n           the user's unique numeric id or user name.\n\n           unpublished : UNPUBLISHED : 'yes'\n\n           Retrieve a single batch representing the current user's unpublished\n           changes to a review. Must be combined with `review` and cannot be\n           combined with `author`.\"\"\"\n\n        critic = parameters.critic\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        author = jsonapi.from_parameter(\"v1/users\", \"author\", parameters)\n\n        unpublished_parameter = parameters.getQueryParameter(\"unpublished\")\n        if unpublished_parameter is not None:\n            if unpublished_parameter == \"yes\":\n                if author is not None:\n                    raise jsonapi.UsageError(\n                        \"Parameters 'author' and 'unpublished' cannot be \"\n                        \"combined\")\n                if review is None:\n                    raise jsonapi.UsageError(\n                        \"Parameter 'unpublished' must be combined with \"\n                        \"'review'\")\n                return api.batch.fetchUnpublished(critic, review)\n            else:\n                raise jsonapi.UsageError(\n                    \"Invalid 'unpublished' parameter: %r (must be 'yes')\"\n                    % unpublished_parameter)\n\n        return api.batch.fetchAll(critic, review=review, author=author)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n\n        if value or values:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"review?\": api.review.Review,\n                \"comment?\": str,\n            },\n            data)\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n\n        
if not review:\n            if \"review\" not in converted:\n                raise jsonapi.UsageError(\"No review specified\")\n            review = converted[\"review\"]\n        elif \"review\" in converted and review != converted[\"review\"]:\n            raise jsonapi.UsageError(\"Conflicting reviews specified\")\n\n        if \"comment\" in converted:\n            comment_text = converted[\"comment\"].strip()\n            if not comment_text:\n                raise jsonapi.UsageError(\"Empty comment specified\")\n        else:\n            comment_text = None\n\n        result = []\n\n        def collectBatch(batch):\n            assert isinstance(batch, api.batch.Batch)\n            result.append(batch)\n\n        with api.transaction.Transaction(critic) as transaction:\n            modifier = transaction.modifyReview(review)\n\n            if comment_text:\n                note = modifier.createComment(comment_type=\"note\",\n                                              author=critic.actual_user,\n                                              text=comment_text)\n            else:\n                note = None\n\n            modifier.submitChanges(note, callback=collectBatch)\n\n        assert len(result) == 1\n        return result[0], None\n\n    @staticmethod\n    def deduce(parameters):\n        batch = parameters.context.get(\"batches\")\n        batch_parameter = parameters.getQueryParameter(\"batch\")\n        if batch_parameter is not None:\n            if batch is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: batch=%s\" % batch_parameter)\n            batch = api.batch.fetch(\n                parameters.critic, jsonapi.numeric_id(batch_parameter))\n        return batch\n\n    @staticmethod\n    def setAsContext(parameters, batch):\n        parameters.setContext(Batches.name, batch)\n        return batch\n"
  },
  {
    "path": "src/jsonapi/v1/branches.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Branches(object):\n    \"\"\"Branches in the Git repositories.\"\"\"\n\n    name = \"branches\"\n    contexts = (None, \"repositories\")\n    value_class = api.branch.Branch\n    exceptions = (api.branch.BranchError, api.repository.RepositoryError)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Branch {\n             \"id\": integer, // the branch's id\n             \"name\": string, // the branch's name\n             \"repository\": integer, // the branch's repository's id\n             \"head\": integer, // the branch's head commit's id\n           }\"\"\"\n\n        return parameters.filtered(\n            \"branches\", { \"id\": value.id,\n                          \"name\": value.name,\n                          \"repository\": value.repository,\n                          \"head\": value.head })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) branches in the Git repositories.\n\n           BRANCH_ID : integer\n\n           Retrieve a branch identified by its unique numeric id.\"\"\"\n\n        return Branches.setAsContext(parameters, api.branch.fetch(\n            parameters.critic, branch_id=jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def 
multiple(parameters):\n        \"\"\"Retrieve all branches in the Git repositories.\n\n           repository : REPOSITORY : -\n\n           Include only branches in one repository, identified by the\n           repository's unique numeric id or short-name.\n\n           name : NAME : string\n\n           Retrieve only the branch with the specified name.  The name should\n           <em>not</em> include the \"refs/heads/\" prefix.  When this parameter\n           is specified a repository must be specified as well, either in the\n           resource path or using the <code>repository</code> parameter.\"\"\"\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        name_parameter = parameters.getQueryParameter(\"name\")\n        if name_parameter:\n            if repository is None:\n                raise jsonapi.UsageError(\n                    \"Named branch access must have repository specified.\")\n            return api.branch.fetch(\n                parameters.critic, repository=repository, name=name_parameter)\n        return api.branch.fetchAll(parameters.critic, repository=repository)\n\n    @staticmethod\n    def setAsContext(parameters, branch):\n        parameters.setContext(Branches.name, branch)\n        return branch\n\nimport commits\n\n@jsonapi.PrimaryResource\nclass BranchCommits(object):\n    \"\"\"Commits associated with a branch.\n\n       This is typically not all commits reachable from a branch.  When a branch\n       is first pushed to a repository, all commits reachable only from the\n       branch are associated with it.  
After that, as the branch is updated, all\n       new commits are also associated with the branch.\"\"\"\n\n    name = \"commits\"\n    contexts = (\"branches\", \"reviews\")\n    value_class = api.commit.Commit\n    exceptions = (api.commit.CommitError,)\n\n    json = staticmethod(commits.Commits.json)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all commits associated with the branch.\n\n           sort : SORT_KEY : -\n\n           Sort the commits in <code>topological</code> or <code>date</code>\n           order.  In either case, a child commit is always sorted before all of\n           its parent commits, but whenever more than one commit could be\n           emitted without violating this rule, topological order prefers the\n           first parent and its ancestors, while date order prefers the commit\n           with the most recent commit date.  Topological order is the\n           default.\"\"\"\n\n        branch = parameters.context[\"branches\"]\n        sort_parameter = parameters.getQueryParameter(\"sort\")\n        if sort_parameter is None or sort_parameter == \"topological\":\n            return branch.commits.topo_ordered\n        elif sort_parameter != \"date\":\n            raise jsonapi.UsageError(\"Invalid commits sort parameter: %r\"\n                                     % sort_parameter)\n        return branch.commits.date_ordered\n"
  },
  {
    "path": "src/jsonapi/v1/changesets.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport api\nimport jsonapi\n\n\n@jsonapi.PrimaryResource\nclass Changesets(object):\n    \"\"\"Changesets in the git repositories\"\"\"\n\n    name = \"changesets\"\n    contexts = (None, \"repositories\", \"reviews\")\n    value_class = api.changeset.Changeset\n    exceptions = (api.changeset.ChangesetError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Changeset {\n             \"id\": integer, // the changeset's id\n             \"type\": string, // the changeset type (direct, custom, merge, conflict)\n             \"from_commit\": integer, // commit id for changesets from_commit\n             \"to_commit\": integer, // commit id for changesets to_commit\n             \"files\": integer[], // a list of all files changed in this changeset\n             \"review_state\": ReviewState or null,\n           }\n\n           ReviewState {\n             \"review\": integer,\n             \"comments\": integer[],\n           }\"\"\"\n\n        def review_state(review):\n            if not review:\n                return None\n\n            comments = api.comment.fetchAll(\n                parameters.critic, review=review, changeset=value)\n\n            try:\n                reviewablefilechanges = api.reviewablefilechange.fetchAll(\n                    
parameters.critic, review=review, changeset=value)\n            except api.reviewablefilechange.InvalidChangeset:\n                reviewablefilechanges = None\n\n            return {\n                \"review\": review,\n                \"comments\": comments,\n                \"reviewablefilechanges\": reviewablefilechanges,\n            }\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n\n        contributing_commits = value.contributing_commits\n        if contributing_commits:\n            contributing_commits = list(contributing_commits.topo_ordered)\n\n        return parameters.filtered(\n            \"changesets\", {\n                \"id\": value.id,\n                \"type\": value.type,\n                \"from_commit\": value.from_commit,\n                \"to_commit\": value.to_commit,\n                \"files\": value.files,\n                \"contributing_commits\": contributing_commits,\n                \"review_state\": review_state(review)\n            })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) changesets.\n\n           CHANGESET_ID : integer\n\n           Retrieve a changeset identified by its unique numeric id.\n\n           repository : REPOSITORY : -\n\n           Specify repository to access, identified by its unique numeric id or\n           short-name.  Required unless a repository is specified in the\n           resource path.\"\"\"\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"repository needs to be specified, ex. 
&repository=<id>\")\n        return Changesets.setAsContext(parameters,\n                                       api.changeset.fetch(\n                                           parameters.critic,\n                                           repository,\n                                           jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve (and create if it doesn't exist) a changeset identified by\n           a single commit (changeset type: direct) or any two commits in the\n           same repository (changeset type: custom).\n\n           from : COMMIT_SHA1 : string\n\n           Retrieve a changeset with a commit (identified by its SHA-1 sum) as\n           its from_commit. The SHA-1 sum can be abbreviated, but must be at\n           least 4 characters long, and must be unambiguous in the repository.\n           Must be used together with parameter 'to'.\n\n           to : COMMIT_SHA1 : string\n\n           Retrieve a changeset with a commit (identified by its SHA-1 sum) as\n           its to_commit. The SHA-1 sum can be abbreviated, but must be at least\n           4 characters long, and must be unambiguous in the repository. Must be\n           used together with parameter 'from'.\n\n           commit : COMMIT_SHA1 : string\n\n           Retrieve a changeset with a commit (identified by its SHA-1 sum) as\n           its to_commit, and the commit's parent as its from_commit. The SHA-1\n           sum can be abbreviated, but must be at least 4 characters long, and\n           must be unambiguous in the repository. Cannot be combined with 'from'\n           or 'to'. Currently does not support merge commits.\n\n           repository : REPOSITORY : -\n\n           Specify repository to access, identified by its unique numeric id or\n           short-name.  
Required unless a repository is specified in the\n           resource path.\n\n           review : REVIEW_ID : -\n\n           Specify a review to calculate an \"automatic\" changeset for.\n\n           automatic : MODE : string\n\n           Calculate the changeset commit range automatically based on a review\n           and a mode, which must be \"everything\", \"reviewable\" (changes\n           assigned to current user), \"relevant\" (changes assigned to or files\n           watched by current user) or \"pending\" (unreviewed changes assigned to\n           current user.)\n\n           A review must be specified in this case, and none of the 'from', 'to'\n           or 'commit' parameters can be used.\"\"\"\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"repository needs to be specified, ex. &repository=<id>\")\n\n        def get_commit(name):\n            return jsonapi.from_parameter(\"v1/commits\", name, parameters)\n\n        from_commit = get_commit(\"from\")\n        to_commit = get_commit(\"to\")\n        single_commit = get_commit(\"commit\")\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        automatic = parameters.getQueryParameter(\"automatic\")\n\n        if automatic is not None:\n            if automatic not in api.changeset.Changeset.AUTOMATIC_MODES:\n                raise jsonapi.UsageError(\"Invalid automatic mode: %r (must be \"\n                                         \"one of 'everything', 'reviewable', \"\n                                         \"'relevant' or 'pending').\"\n                                         % automatic)\n            if review is None:\n                raise jsonapi.UsageError(\"A review must be specified when \"\n                                         \"an automatic mode is used\")\n            if from_commit or to_commit or single_commit:\n                raise 
jsonapi.UsageError(\"Explicit commit range cannot be \"\n                                         \"specified when an automatic mode is \"\n                                         \"used\")\n        else:\n            if not (from_commit or to_commit or single_commit):\n                raise jsonapi.UsageError(\n                    \"Missing required parameters from and to, or commit\")\n\n            if (from_commit is None) != (to_commit is None):\n                raise jsonapi.UsageError(\"Missing required parameters from and \"\n                                         \"to, only one supplied\")\n\n            if from_commit == to_commit and from_commit is not None:\n                raise jsonapi.UsageError(\"from and to can't be the same commit\")\n\n        return Changesets.setAsContext(\n            parameters, api.changeset.fetch(\n                parameters.critic,\n                repository,\n                from_commit=from_commit,\n                to_commit=to_commit,\n                single_commit=single_commit,\n                review=review,\n                automatic=automatic))\n\n    @staticmethod\n    def deduce(parameters):\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        changeset = parameters.context.get(Changesets.name)\n        changeset_parameter = parameters.getQueryParameter(\"changeset\")\n        if changeset_parameter is not None:\n            if changeset is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: changeset=%s\"\n                    % changeset_parameter)\n            if repository is None:\n                raise jsonapi.UsageError(\n                    \"repository needs to be specified, ex. 
&repository=<id>\")\n            changeset_id = jsonapi.numeric_id(changeset_parameter)\n            changeset = api.changeset.fetch(\n                parameters.critic, repository, changeset_id=changeset_id)\n        return changeset\n\n\n    @staticmethod\n    def setAsContext(parameters, changeset):\n        parameters.setContext(Changesets.name, changeset)\n        return changeset\n"
  },
  {
    "path": "src/jsonapi/v1/comments.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Comments(object):\n    \"\"\"Issues and notes in reviews.\"\"\"\n\n    name = \"comments\"\n    contexts = (None, \"reviews\")\n    value_class = (api.comment.Comment, api.comment.Issue, api.comment.Note)\n    exceptions = (api.comment.CommentError, api.reply.ReplyError)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer,\n             \"type\": \"issue\" or \"note\",\n             \"is_draft\": boolean,\n             \"state\": \"open\", \"addressed\" or \"resolved\" (null for notes),\n             \"review\": integer,\n             \"author\": integer,\n             \"location\": Location or null,\n             \"resolved_by\": integer, // user that resolved the issue\n             \"addressed_by\": integer, // commit that addressed the issue\n             \"timestamp\": float,\n             \"text\": string,\n             \"replies\": integer[],\n             \"draft_changes\": DraftChanges or null,\n           }\n\n           Location {\n             \"type\": \"commit-message\" or \"file-version\",\n             \"first_line\": integer, // first commented line (one-based, inclusive)\n             \"last_line\": integer, // last commented line (one-based, inclusive)\n           
}\n\n           CommitMessageLocation : Location {\n             \"commit\": integer, // commented commit\n           }\n\n           FileVersionLocation : Location {\n             \"file\": integer, // commented file\n             \"changeset\": integer or null, // commented changeset\n             \"side\": \"old\" or \"new\", // commented side of the changeset\n             \"commit\": integer, // commented commit\n             \"is_translated\": boolean, // true if the location was translated\n           }\n\n           DraftChanges {\n             \"author\": integer, // author of these draft changes\n             \"is_draft\": boolean, // true if comment itself is unpublished\n             \"reply\": integer or null, // unpublished reply\n             \"new_type\": \"issue\" or \"note\", // unpublished comment type change\n             \"new_state\": \"open\", \"addressed\" or \"resolved\" (null for notes),\n             \"new_location\": FileVersionLocation or null,\n           }\"\"\"\n\n        if isinstance(value, api.comment.Issue):\n            state = value.state\n            resolved_by = value.resolved_by\n            addressed_by = value.addressed_by\n        else:\n            state = None\n            resolved_by = None\n            addressed_by = None\n\n        def location_json(location):\n            if not location:\n                return None\n\n            if location.type == \"file-version\":\n                changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n\n                if not changeset:\n                    # FileVersionLocation.translateTo() only allows one, so let\n                    # a deduced changeset win over a deduced commit.\n                    commit = jsonapi.deduce(\"v1/commits\", parameters)\n                else:\n                    commit = None\n\n                if changeset or commit:\n                    location = location.translateTo(changeset=changeset,\n                                                    commit=commit)\n\n                    if not location:\n                        raise jsonapi.ResourceSkipped(\n                            \"Comment not 
present in changeset/commit\")\n\n            result = {\n                \"type\": location.type,\n                \"first_line\": location.first_line,\n                \"last_line\": location.last_line\n            }\n\n            if location.type == \"commit-message\":\n                result.update({\n                    \"commit\": location.commit\n                })\n            else:\n                result.update({\n                    \"file\": location.file,\n                    \"changeset\": location.changeset,\n                    \"side\": location.side,\n                    \"commit\": location.commit,\n                    \"is_translated\": location.is_translated\n                })\n\n            return result\n\n        draft_changes = value.draft_changes\n        if draft_changes:\n            draft_changes_json = {\n                \"author\": draft_changes.author,\n                \"is_draft\": draft_changes.is_draft,\n                \"reply\": draft_changes.reply,\n                \"new_type\": draft_changes.new_type,\n                \"new_state\": None,\n                \"new_location\": None,\n            }\n\n            if isinstance(draft_changes, api.comment.Issue.DraftChanges):\n                draft_changes_json.update({\n                    \"new_state\": draft_changes.new_state,\n                    \"new_location\": location_json(draft_changes.new_location),\n                })\n        else:\n            draft_changes_json = None\n\n        timestamp = jsonapi.v1.timestamp(value.timestamp)\n        return parameters.filtered(\n            \"comments\", {\n                \"id\": value.id,\n                \"type\": value.type,\n                \"is_draft\": value.is_draft,\n                \"state\": state,\n                \"review\": value.review,\n                \"author\": value.author,\n                \"location\": location_json(value.location),\n                \"resolved_by\": resolved_by,\n                
\"addressed_by\": addressed_by,\n                \"timestamp\": timestamp,\n                \"text\": value.text,\n                \"replies\": value.replies,\n                \"draft_changes\": draft_changes_json,\n            })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) comments in reviews.\n\n           COMMENT_ID : integer\n\n           Retrieve a comment identified by its unique numeric id.\"\"\"\n\n        comment = api.comment.fetch(\n            parameters.critic, comment_id=jsonapi.numeric_id(argument))\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        if review and review != comment.review:\n            raise jsonapi.PathError(\n                \"Comment does not belong to specified review\")\n\n        return Comments.setAsContext(parameters, comment)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all comments in the system (or review.)\n\n           with_reply : REPLY_ID : integer\n\n           Retrieve only the comment to which the specified reply is a reply.\n           This is equivalent to accessing /api/v1/comments/COMMENT_ID with that\n           comment's numeric id.  When used, any other parameters are ignored.\n\n           review : REVIEW_ID : integer\n\n           Retrieve only comments in the specified review.  Can only be used if\n           a review is not specified in the resource path.\n\n           author : AUTHOR : integer or string\n\n           Retrieve only comments authored by the specified user, identified by\n           the user's unique numeric id or user name.\n\n           comment_type : TYPE : -\n\n           Retrieve only comments of the specified type.  Valid values are:\n           <code>issue</code> and <code>note</code>.\n\n           state : STATE : -\n\n           Retrieve only issues in the specified state.  
Valid values are:\n           <code>open</code>, <code>addressed</code> and <code>resolved</code>.\n\n           location_type : LOCATION : -\n\n           Retrieve only comments in the specified type of location.  Valid\n           values are: <code>general</code>, <code>commit-message</code> and\n           <code>file-version</code>.\n\n           changeset : CHANGESET_ID : integer\n\n           Retrieve only comments visible in the specified changeset. Can not be\n           combined with the <code>commit</code> parameter.\n\n           commit : COMMIT : integer or string\n\n           Retrieve only comments visible in the specified commit, either in its\n           commit message or in the commit's version of a file. Combine with the\n           <code>location_type</code> parameter to select only one of those\n           possibilities. Can not be combined with the <code>changeset</code>\n           parameter.\"\"\"\n\n        critic = parameters.critic\n\n        reply = jsonapi.from_parameter(\"v1/replies\", \"with_reply\", parameters)\n        if reply:\n            return reply.comment\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        author = jsonapi.from_parameter(\"v1/users\", \"author\", parameters)\n\n        comment_type_parameter = parameters.getQueryParameter(\"comment_type\")\n        if comment_type_parameter:\n            if comment_type_parameter not in api.comment.Comment.TYPE_VALUES:\n                raise jsonapi.UsageError(\"Invalid comment-type parameter: %r\"\n                                         % comment_type_parameter)\n            comment_type = comment_type_parameter\n        else:\n            comment_type = None\n\n        state_parameter = parameters.getQueryParameter(\"state\")\n        if state_parameter:\n            if state_parameter not in api.comment.Issue.STATE_VALUES:\n                raise jsonapi.UsageError(\n                    \"Invalid state parameter: %r\" % state_parameter)\n            
state = state_parameter\n        else:\n            state = None\n\n        location_type_parameter = parameters.getQueryParameter(\"location_type\")\n        if location_type_parameter:\n            if location_type_parameter not in api.comment.Location.TYPE_VALUES:\n                raise jsonapi.UsageError(\"Invalid location-type parameter: %r\"\n                                         % location_type_parameter)\n            location_type = location_type_parameter\n        else:\n            location_type = None\n\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        commit = jsonapi.deduce(\"v1/commits\", parameters)\n\n        if changeset and commit:\n            raise jsonapi.UsageError(\n                \"Incompatible parameters: changeset and commit\")\n\n        return api.comment.fetchAll(critic, review=review, author=author,\n                                    comment_type=comment_type, state=state,\n                                    location_type=location_type,\n                                    changeset=changeset, commit=commit)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n        path = parameters.subresource_path\n\n        if value and path == [\"replies\"]:\n            assert isinstance(value, api.comment.Comment)\n            Comments.setAsContext(parameters, value)\n            raise jsonapi.InternalRedirect(\"v1/replies\")\n\n        if value or values or path:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"type\": api.comment.Comment.TYPE_VALUES,\n                \"review!?\": api.review.Review,\n                \"author?\": api.user.User,\n                \"location?\": {\n                    # Note: \"general\" not included here; |location| should be\n                  
  #       omitted instead.\n                    \"type\": frozenset([\"commit-message\", \"file-version\"]),\n                    \"first_line\": int,\n                    \"last_line\": int,\n                    \"commit?\": api.commit.Commit,\n                    \"file?\": api.file.File,\n                    \"changeset?\": api.changeset.Changeset,\n                    \"side?\": frozenset([\"old\", \"new\"]),\n                },\n                \"text\": str\n            },\n            data)\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n\n        if not review:\n            if \"review\" not in converted:\n                raise jsonapi.UsageError(\"No review specified\")\n            review = converted[\"review\"]\n        elif \"review\" in converted and review != converted[\"review\"]:\n            raise jsonapi.UsageError(\"Conflicting reviews specified\")\n\n        if \"author\" in converted:\n            author = converted[\"author\"]\n        else:\n            author = critic.actual_user\n\n        if converted[\"type\"] == \"issue\":\n            expected_class = api.comment.Issue\n        else:\n            expected_class = api.comment.Note\n\n        converted_location = converted.get(\"location\")\n        if converted_location:\n            location_type = converted_location.pop(\"type\")\n            if location_type == \"commit-message\":\n                required_fields = set((\"first_line\", \"last_line\", \"commit\"))\n                optional_fields = set()\n            else:\n                required_fields = set((\"first_line\", \"last_line\", \"file\"))\n                optional_fields = set((\"commit\", \"changeset\", \"side\"))\n            accepted_fields = required_fields | optional_fields\n\n            for required_field in required_fields:\n                if required_field not in converted_location:\n                    raise jsonapi.InputError(\n                        \"data.location.%s: missing attribute\" % 
required_field)\n            for actual_field in converted_location.keys():\n                if actual_field not in accepted_fields:\n                    raise jsonapi.InputError(\n                        \"data.location.%s: unexpected attribute\" % actual_field)\n\n            if location_type == \"commit-message\":\n                max_line = len(\n                    converted_location[\"commit\"].message.splitlines())\n            else:\n                if \"commit\" in converted_location:\n                    if \"changeset\" in converted_location:\n                        raise jsonapi.InputError(\n                            \"data.location: only one of commit and changeset \"\n                            \"can be specified\")\n                    changeset = None\n                    side = None\n                    commit = converted_location[\"commit\"]\n                elif \"changeset\" not in converted_location:\n                    raise jsonapi.InputError(\n                        \"data.location: one of commit and changeset must be \"\n                        \"specified\")\n                elif \"side\" not in converted_location:\n                    raise jsonapi.InputError(\n                        \"data.location.side: missing attribute (required when \"\n                        \"changeset is specified)\")\n                else:\n                    changeset = converted_location[\"changeset\"]\n                    side = converted_location[\"side\"]\n                    commit = None\n\n            first_line = converted_location[\"first_line\"]\n            last_line = converted_location[\"last_line\"]\n\n            if location_type == \"commit-message\":\n                location = api.comment.CommitMessageLocation.make(\n                    critic, first_line, last_line, converted_location[\"commit\"])\n            else:\n                location = api.comment.FileVersionLocation.make(\n                    critic, first_line, last_line, 
converted_location[\"file\"],\n                    changeset, side, commit)\n        else:\n            location = None\n\n        result = []\n\n        def collectComment(comment):\n            assert isinstance(comment, expected_class), repr(comment)\n            result.append(comment)\n\n        with api.transaction.Transaction(critic) as transaction:\n            transaction \\\n                .modifyReview(review) \\\n                .createComment(\n                    comment_type=converted[\"type\"],\n                    author=author,\n                    text=converted[\"text\"],\n                    location=location,\n                    callback=collectComment)\n\n        assert len(result) == 1, repr(result)\n        return result[0], None\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            comments = [value]\n        else:\n            comments = values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid PUT request\")\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"text?\": str,\n                \"draft_changes?\": {\n                    \"new_state?\": frozenset([\"open\", \"resolved\"]),\n                },\n            },\n            data)\n\n        with api.transaction.Transaction(critic) as transaction:\n            for comment in comments:\n                modifier = transaction \\\n                    .modifyReview(comment.review) \\\n                    .modifyComment(comment)\n\n                if \"text\" in converted:\n                    modifier.setText(converted[\"text\"])\n\n                draft_changes = converted.get(\"draft_changes\")\n                if draft_changes:\n                    if draft_changes.get(\"new_state\") == \"resolved\":\n                        modifier.resolveIssue()\n                    elif 
draft_changes.get(\"new_state\") == \"open\":\n                        modifier.reopenIssue()\n\n        return value, values\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            comments = [value]\n        else:\n            comments = values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid DELETE request\")\n\n        with api.transaction.Transaction(critic) as transaction:\n            for comment in comments:\n                transaction \\\n                    .modifyReview(comment.review) \\\n                    .modifyComment(comment) \\\n                    .delete()\n\n    @staticmethod\n    def deduce(parameters):\n        comment = parameters.context.get(\"comments\")\n        comment_parameter = parameters.getQueryParameter(\"comment\")\n        if comment_parameter is not None:\n            if comment is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: comment=%s\" % comment_parameter)\n            comment = api.comment.fetch(\n                parameters.critic,\n                comment_id=jsonapi.numeric_id(comment_parameter))\n        return comment\n\n    @staticmethod\n    def setAsContext(parameters, comment):\n        parameters.setContext(Comments.name, comment)\n        return comment\n"
  },
  {
    "path": "src/jsonapi/v1/commits.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport datetime\nimport re\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Commits(object):\n    \"\"\"Commits in the Git repositories.\"\"\"\n\n    name = \"commits\"\n    contexts = (None, \"repositories\", \"changesets\")\n    value_class = api.commit.Commit\n    exceptions = (api.commit.CommitError, api.repository.InvalidRef)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Commit {\n             \"id\": integer, // the commit's id\n             \"sha1\": string, // the commit's SHA-1 sum\n             \"summary\": string, // (processed) commit summary\n             \"message\": string, // full / raw commit message\n             \"parents\": [integer], // list of commit ids\n             \"author\": {\n               \"name\": string, // author (full)name\n               \"email\": string, // author email\n               \"timestamp\": float, // seconds since epoch\n             },\n             \"committer\": {\n               \"name\": string, // committer (full)name\n               \"email\": string, // committer email\n               \"timestamp\": float, // seconds since epoch\n             },\n           }\"\"\"\n\n        parents_ids = [parent.id for parent in value.parents]\n\n        # Important:\n        #\n        # We're returning parents 
as integers instead of as api.commit.Commit\n        # objects here, to disable expansion of them as linked objects.  Not\n        # doing this would typically lead to recursively dumping all commits in\n        # a repository a lot of the time, which wouldn't generally be useful.\n        #\n        # Limited sets of commits are returned as api.commit.Commit objects from\n        # other resources, like reviews, which does enable expansion of them as\n        # linked objects, just not recursively.\n\n        def userAndTimestamp(user_and_timestamp):\n            timestamp = jsonapi.v1.timestamp(user_and_timestamp.timestamp)\n            return { \"name\": user_and_timestamp.name,\n                     \"email\": user_and_timestamp.email,\n                     \"timestamp\": timestamp }\n\n        return parameters.filtered(\n            \"commits\", { \"id\": value.id,\n                         \"sha1\": value.sha1,\n                         \"summary\": value.summary,\n                         \"message\": value.message,\n                         \"parents\": parents_ids,\n                         \"author\": userAndTimestamp(value.author),\n                         \"committer\": userAndTimestamp(value.committer) })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) commits from a Git repository.\n\n           COMMIT_ID : integer\n\n           Retrieve a commit identified by its unique numeric id.\n\n           repository : REPOSITORY : -\n\n           Specify repository to access, identified by its unique numeric id or\n           short-name.  
Required unless a repository is specified in the\n           resource path.\"\"\"\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"Commit reference must have repository specified.\")\n        return Commits.setAsContext(parameters, api.commit.fetch(\n            repository, commit_id=jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve a single commit identified by its SHA-1 sum.\n\n           sha1 : COMMIT_SHA1 : string\n\n           Retrieve a commit identified by its SHA-1 sum.  The SHA-1 sum can be\n           abbreviated, but must be at least 4 characters long, and must be\n           unambiguous in the repository.\n\n           repository : REPOSITORY : -\n\n           Specify repository to access, identified by its unique numeric id or\n           short-name.  Required unless a repository is specified in the\n           resource path.\"\"\"\n\n        sha1_parameter = parameters.getQueryParameter(\"sha1\")\n        if sha1_parameter is None:\n            raise jsonapi.UsageError(\"Missing required SHA-1 parameter.\")\n        if not re.match(\"[0-9A-Fa-f]{4,40}$\", sha1_parameter):\n            raise jsonapi.UsageError(\n                \"Invalid SHA-1 parameter: %r\" % sha1_parameter)\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"Commit reference must have repository specified.\")\n        return api.commit.fetch(repository, sha1=sha1_parameter)\n\n    @staticmethod\n    def deduce(parameters):\n        commit = parameters.context.get(Commits.name)\n        commit_parameter = parameters.getQueryParameter(\"commit\")\n        if commit_parameter is not None:\n            if commit is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query 
parameter: commit=%s\"\n                    % commit_parameter)\n            commit = Commits.fromParameter(commit_parameter, parameters)\n        return commit\n\n    @staticmethod\n    def fromParameter(value, parameters):\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        commit_id, ref = jsonapi.id_or_name(value)\n        return api.commit.fetch(repository, commit_id, ref=ref)\n\n    @staticmethod\n    def setAsContext(parameters, commit):\n        parameters.setContext(Commits.name, commit)\n        return commit\n"
  },
  {
    "path": "src/jsonapi/v1/documentation.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport itertools\n\nimport jsonapi\nimport page.utils\n\ndef splitAndDeindentDocstring(item, level, default=None):\n    if item.__doc__ is None:\n        if default:\n            return [default]\n        return\n    def inner():\n        lines = item.__doc__.splitlines()\n        yield lines[0]\n        indentation_level = level * 4 + 3\n        expected_indentation = \" \" * indentation_level\n        for line in lines[1:]:\n            assert not line or line.startswith(expected_indentation)\n            yield line[indentation_level:]\n    return list(inner())\n\ndef extractResourceSummary(resource_class):\n    lines = list(splitAndDeindentDocstring(\n        resource_class, level=1, default=\"Undocumented\"))\n    try:\n        return \" \".join(lines[:lines.index(\"\")])\n    except ValueError:\n        return \" \".join(lines)\n\ndef popParagraph(lines, include_empty=False):\n    try:\n        index = lines.index(\"\")\n    except ValueError:\n        index = len(lines)\n    paragraph = lines[:index]\n    if len(paragraph) == 1 and \" : \" in paragraph[0]:\n        return None\n    if include_empty:\n        paragraph.append(\"\")\n    del lines[:index + 1]\n    return paragraph\n\ndef copyParagraph(destination, source, as_definition=False, include_empty=None):\n    if 
include_empty is None:\n        include_empty = not as_definition\n    paragraph = popParagraph(source, include_empty=False)\n    if paragraph is None:\n        paragraph = [\"Undocumented!\"]\n    if as_definition:\n        paragraph = ([\"= \" + paragraph[0]] +\n                     [\"  \" + line for line in paragraph[1:]])\n    if include_empty:\n        paragraph.append(\"\")\n    destination.extend(paragraph)\n    return True\n\ndef describeVersion():\n    lines = [\"Critic JSON API: Version 1\",\n             \"==========================\",\n             \"\",\n             \"Path\",\n             \"----\",\n             \"<code>/api/v1</code>\",\n             \"\",\n             \"Top-level resources\",\n             \"-------------------\"]\n\n    supported_resources = []\n\n    for path, resource_class in jsonapi.HANDLERS.items():\n        if path.startswith(\"v1/\"):\n            supported_resources.append(\n                (path[3:], extractResourceSummary(resource_class)))\n\n    supported_resources.sort()\n\n    for path, summary in supported_resources:\n        lines.extend([\"? 
<code>/api/v1/[%s][%s]</code>\" % (path, path),\n                      \"= \" + summary])\n\n    lines.append(\"\")\n\n    for path, _ in supported_resources:\n        lines.append(\"[%s]: /api/v1?describe=%s\" % (path, path))\n\n    raise page.utils.DisplayFormattedText(lines)\n\ndef listAlternativePaths(resource_class):\n    for context in getattr(resource_class, \"contexts\", (None,)):\n        if context is None:\n            yield resource_class.name\n            continue\n        for context_class in jsonapi.find(context):\n            for context_path in listAlternativePaths(context_class):\n                yield \"%s/%s\" % (context_path, resource_class.name)\n\ndef describeResource(resource_path):\n    resource_class = jsonapi.lookup(\"v1/\" + resource_path)\n\n    # foo/bar/fie -> foo/A/bar/B/fie\n    path_with_arguments = resource_path.split(\"/\")\n    placeholders = (\"A\", \"B\", \"C\", \"D\", \"E\", \"F\")\n    for index in reversed(range(1, len(path_with_arguments))):\n        path_with_arguments.insert(\n            index, \"<i>%s</i>\" % placeholders[index - 1])\n    path_with_arguments = \"/\".join(path_with_arguments)\n\n    lines = [resource_class.name.capitalize(),\n             \"=\" * len(resource_class.name),\n             \"\"]\n\n    lines.extend([\"Description\",\n                  \"-----------\",\n                  \"\"] +\n                 splitAndDeindentDocstring(resource_class, level=1) +\n                 [\"\"])\n\n    def search_and_filter_parameters(description, path=path_with_arguments):\n        nparameters = 0\n        while description:\n            try:\n                key, name, expected_type = description[0].split(\" : \")\n            except ValueError:\n                break\n            else:\n                del description[0]\n                if not nparameters:\n                    lines.extend([\"Search/filter parameters\",\n                                  \"------------------------\"])\n                
nparameters += 1\n                if expected_type == \"-\":\n                    expected_type = \"\"\n                else:\n                    expected_type = \" (%s)\" % expected_type\n                lines.append(\"? <code>api/v1/%s?<b>%s=%s</b></code>%s\"\n                             % (path, key, name, expected_type))\n                assert not description.pop(0)\n                copyParagraph(lines, description, as_definition=True)\n        if nparameters > 1:\n            lines.extend(\n                [\"\",\n                 \"Note: Unless noted otherwise, search/filter parameters\",\n                 \"      can be combined.\"])\n        if nparameters > 0:\n            lines.append(\"\")\n            return True\n        return False\n\n    if resource_class.single:\n        lines.extend([\"Single-resource access\",\n                      \"----------------------\",\n                      \"\"])\n\n        description = splitAndDeindentDocstring(resource_class.single, 2)\n\n        if description is None:\n            lines.append(\"Undocumented!\")\n        else:\n            lines.append(\"? <code>api/v1/%s/ARGUMENT[,ARGUMENT,...]</code>\"\n                         % path_with_arguments)\n            copyParagraph(\n                lines, description, as_definition=True, include_empty=True)\n\n            first_variant = True\n            while description:\n                try:\n                    name, expected_type = description[0].split(\" : \")\n                except ValueError:\n                    break\n                else:\n                    del description[0]\n                    if first_variant:\n                        lines.extend([\"Resource argument\",\n                                      \"-----------------\"])\n                        first_variant = False\n                    lines.append(\"? 
<code>api/v1/%s/<b>%s</b></code> (%s)\"\n                                 % (path_with_arguments, name, expected_type))\n                    assert not description.pop(0)\n                    copyParagraph(lines, description, as_definition=True)\n\n            lines.append(\"\")\n\n            search_and_filter_parameters(\n                description, path=\"%s/ARGUMENT\" % path_with_arguments)\n\n            lines.extend(description)\n\n        lines.append(\"\")\n\n    if resource_class.multiple:\n        lines.extend([\"Multiple-resource access\",\n                      \"------------------------\",\n                      \"\"])\n\n        description = splitAndDeindentDocstring(resource_class.multiple, 2)\n\n        if description is None:\n            lines.extend([\"Undocumented!\",\n                          \"\"])\n        else:\n            lines.append(\"? <code>api/v1/%s</code>\" % path_with_arguments)\n            copyParagraph(lines, description, as_definition=True)\n            lines.append(\"\")\n\n            while description and not search_and_filter_parameters(description):\n                copyParagraph(lines, description)\n\n    lines.extend([\"Resource structure\",\n                  \"------------------\",\n                  \"\"])\n\n    structure_lines = splitAndDeindentDocstring(resource_class.json, level=2)\n    if structure_lines:\n        def massage_line(line):\n            if line.strip().startswith(\"// \"):\n                return \"  // <b>\" + line.strip()[3:] + \"</b>\"\n            line, _, description = line.partition(\" // \")\n            if description:\n                line += \" <i style='float: right'>%s</i>\" % description\n            return line\n\n        lines.extend(\"|| \" + massage_line(line) for line in structure_lines)\n        lines.append(\"\")\n    else:\n        lines.extend([\"Undocumented!\",\n                      \"\"])\n\n    alternative_paths = sorted(\n        
set(listAlternativePaths(resource_class)) - set([resource_path]))\n\n    if alternative_paths:\n        lines.extend([\"Alternative paths\",\n                      \"-----------------\",\n                      \"\",\n                      \"This class of resources is also accessible as:\",\n                      \"\"])\n\n        for alternative_path in alternative_paths:\n            lines.extend([\"* <code>api/v1/[%s][%s]</code>\"\n                          % (alternative_path, alternative_path),\n                          \"\"])\n\n        for alternative_path in alternative_paths:\n            lines.append(\"[%s]: /api/v1?describe=%s\"\n                         % (alternative_path, alternative_path))\n        lines.append(\"\")\n\n    subresources = []\n    prefix = \".../\" + resource_class.name + \"/\"\n\n    for path, subresource_class in jsonapi.HANDLERS.items():\n        if path.startswith(prefix):\n            subresources.append(\n                (path[len(prefix):], extractResourceSummary(subresource_class)))\n\n    if subresources:\n        lines.extend([\"Sub-resources\",\n                      \"-------------\",\n                      \"\"])\n\n        for path, summary in sorted(subresources):\n            lines.extend([\"? <code>/api/v1/%s/[%s][%s]</code>\"\n                          % (path_with_arguments, path, path),\n                          \"= \" + summary])\n\n        lines.append(\"\")\n\n        for path, _ in subresources:\n            lines.append(\"[%s]: /api/v1?describe=%s/%s\"\n                         % (path, resource_path, path))\n\n    raise page.utils.DisplayFormattedText(lines)\n"
  },
  {
    "path": "src/jsonapi/v1/extensions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Extensions(object):\n    \"\"\"Extensions.\"\"\"\n\n    name = \"extensions\"\n    contexts = (None, \"users\")\n    value_class = api.extension.Extension\n    exceptions = (api.extension.ExtensionError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Extension {\n             \"id\": integer,\n             \"name\": string,\n             \"key\": string,\n             \"publisher\": integer or null,\n           }\"\"\"\n\n        data = { \"id\": value.id,\n                 \"name\": value.name,\n                 \"key\": value.key,\n                 \"publisher\": value.publisher }\n\n        return parameters.filtered(\"extensions\", data)\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) extensions by id.\n\n           EXTENSION_ID : integer\n\n           Retrieve an extension identified by its unique numeric id.\"\"\"\n\n        value = api.extension.fetch(parameters.critic,\n                                    jsonapi.numeric_id(argument))\n\n        if \"users\" in parameters.context:\n            if value.publisher != parameters.context[\"users\"]:\n                raise api.extension.InvalidExtensionId(\n                    jsonapi.numeric_id(argument))\n\n        return value\n\n    
@staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve a single extension by key or all extensions.\n\n           key : KEY : string\n\n           Retrieve only the extension with the given key.  This is equivalent\n           to accessing /api/v1/extensions/EXTENSION_ID with that extension's\n           numeric id.  When used, other parameters are ignored.\n\n           installed_by : INSTALLED_BY : integer or string\n\n           Retrieve only extensions installed by the specified user.  The user\n           can be identified by numeric id or username.\"\"\"\n\n        key_parameter = parameters.getQueryParameter(\"key\")\n        if key_parameter:\n            return api.extension.fetch(parameters.critic, key=key_parameter)\n\n        installed_by = jsonapi.from_parameter(\n            \"v1/users\", \"installed_by\", parameters)\n\n        return api.extension.fetchAll(\n            parameters.critic,\n            publisher=jsonapi.deduce(\"v1/users\", parameters),\n            installed_by=installed_by)\n"
  },
  {
    "path": "src/jsonapi/v1/filechanges.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass FileChanges(object):\n    \"\"\"File changes for a changeset\"\"\"\n\n    name = \"filechanges\"\n    contexts = (None, \"repositories\", \"changesets\")\n    value_class = api.filechange.FileChange\n    exceptions = (api.filechange.FileChangeError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer, // the file's id\n             \"path\": string, // the file's path\n             \"changeset\": integer, // the changeset's id\n             \"old_sha1\": string, // the sha1 identifying the file's old blob\n             \"old_mode\": string, // the old file permissions\n             \"new_sha1\": string, // the sha1 identifying the file's new blob\n             \"new_mode\": string, // the new file permissions\n           }\"\"\"\n\n        return parameters.filtered(\n            \"filechanges\", {\n                \"file\": value.file,\n                \"changeset\": value.changeset,\n                \"old_sha1\": value.old_sha1,\n                \"old_mode\": value.old_mode,\n                \"new_sha1\": value.new_sha1,\n                \"new_mode\": value.new_mode\n            })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve 
one (or more) filechanges (changed files).\n\n           FILE_ID : integer\n\n           Retrieve the changes to a file identified by its unique numeric id.\n\n           changeset : CHANGESET : -\n\n           Retrieve the changes from a changeset identified by its unique numeric id.\n\n           repository : REPOSITORY : -\n\n           The repository in which the files exist.\"\"\"\n\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        file = api.file.fetch(parameters.critic, jsonapi.numeric_id(argument))\n\n        return FileChanges.setAsContext(\n            parameters, api.filechange.fetch(\n                parameters.critic, changeset, file))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all filechanges (changed files) from a changeset.\n\n           changeset : CHANGESET : -\n\n           Retrieve the changes from a changeset identified by its unique numeric id.\n\n           repository : REPOSITORY : -\n\n           The repository in which the files exist.\"\"\"\n\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        return api.filechange.fetchAll(parameters.critic, changeset)\n\n    @staticmethod\n    def deduce(parameters):\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        if changeset is None:\n            raise jsonapi.UsageError(\n                \"changeset needs to be specified, ex. 
&changeset=<id>\")\n        filechange = parameters.context.get(FileChanges.name)\n        filechange_parameter = parameters.getQueryParameter(\"filechange\")\n        if filechange_parameter is not None:\n            if filechange is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: filechange=%s\"\n                    % filechange_parameter)\n            filechange_id = jsonapi.numeric_id(filechange_parameter)\n            filechange = api.filechange.fetch(\n                parameters.critic, changeset, filechange_id)\n        return filechange\n\n    @staticmethod\n    def setAsContext(parameters, filechange):\n        parameters.setContext(FileChanges.name, filechange)\n        return filechange\n\n    @staticmethod\n    def resource_id(value):\n        return value.file.id\n"
  },
  {
    "path": "src/jsonapi/v1/filecontents.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Filecontents(object):\n    \"\"\"Context lines for a file in a commit\"\"\"\n\n    name = \"filecontents\"\n    contexts = (None, \"repositories\")\n    value_class = api.filecontent.Filecontent\n    exceptions = (api.filecontent.FilecontentError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"TODO: add documentation\"\"\"\n\n        def part_as_dict(part):\n            dict_part = {\n                \"type\": part.type,\n                \"content\": part.content\n                }\n            return dict_part\n\n        def line_as_dict(line):\n            return {\n                \"parts\": [part_as_dict(part) for part in line.parts],\n                \"offset\": line.offset\n            }\n\n        first = parameters.getQueryParameter(\"first\", int, ValueError)\n        last = parameters.getQueryParameter(\"last\", int, ValueError)\n\n        dict_lines = [line_as_dict(line) for line in value.getLines(first, last)]\n\n        return parameters.filtered(\n            \"filecontents\", {\"lines\": dict_lines})\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"TODO: add documentation\"\"\"\n\n        commit = jsonapi.deduce(\"v1/commits\", parameters)\n        if commit is None:\n        
    raise jsonapi.UsageError(\n                \"commit must be specified, ex. &commit=<sha1>\")\n\n        file_obj = jsonapi.deduce(\"v1/files\", parameters)\n\n        blob_sha1 = commit.getFileInformation(file_obj).sha1\n\n        return api.filecontent.fetch(\n            parameters.critic, commit.repository, blob_sha1, file_obj)\n"
  },
  {
    "path": "src/jsonapi/v1/filediffs.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Filediffs(object):\n    \"\"\"Source code for a filechange\"\"\"\n\n    name = \"filediffs\"\n    contexts = (None, \"changesets\")\n    value_class = api.filediff.Filediff\n    exceptions = (api.filediff.FilediffError, api.filechange.FileChangeError)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"TODO: add documentation\"\"\"\n\n        def part_as_dict(part):\n            if not part.type and not part.state:\n                return part.content\n            dict_part = {\n                \"content\": part.content\n            }\n            if part.type:\n                dict_part[\"type\"] = part.type\n            if part.state:\n                dict_part[\"state\"] = part.state\n            return dict_part\n\n        def line_as_dict(line):\n            dict_line = {\n                \"type\": line.type_string,\n                \"old_offset\": line.old_offset,\n                \"new_offset\": line.new_offset,\n            }\n            dict_line[\"content\"] = [part_as_dict(part) for\n                                    part in line.content]\n            return dict_line\n\n        def chunk_as_dict(chunk):\n            return {\n                \"content\": [line_as_dict(line) for line in 
chunk.lines],\n                \"old_offset\": chunk.old_offset,\n                \"old_count\": chunk.old_count,\n                \"new_offset\": chunk.new_offset,\n                \"new_count\": chunk.new_count\n            }\n\n        context_lines = parameters.getQueryParameter(\n            \"context_lines\", int, ValueError)\n        if context_lines is not None:\n            if context_lines < 0:\n                raise jsonapi.UsageError(\n                    \"Negative number of context lines not supported\")\n        else:\n            # TODO: load this from the user's config (or make it mandatory and\n            # let the client handle config loading).\n            context_lines = 3\n\n        comment = jsonapi.deduce(\"v1/comments\", parameters)\n        if comment is not None:\n            comments = [comment]\n            ignore_chunks = True\n        else:\n            review = jsonapi.deduce(\"v1/reviews\", parameters)\n            if review is not None:\n                comments = api.comment.fetchAll(\n                    parameters.critic, review=review,\n                    changeset=value.filechange.changeset)\n            else:\n                comments = None\n            ignore_chunks = False\n\n        macro_chunks = value.getMacroChunks(\n            context_lines, comments, ignore_chunks)\n\n        dict_chunks = [chunk_as_dict(chunk) for chunk in macro_chunks]\n        return parameters.filtered(\n            \"filediffs\", {\n                \"file\": value.filechange,\n                \"changeset\": value.filechange.changeset,\n                \"macro_chunks\": dict_chunks,\n                \"old_count\": value.old_count,\n                \"new_count\": value.new_count\n            })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"TODO: add documentation\"\"\"\n\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        if changeset is None:\n            raise jsonapi.UsageError(\n          
      \"changeset needs to be specified, ex. &changeset=<id>\")\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"repository needs to be specified, \"\n                \"ex. &repository=<id or name>\")\n\n        file = api.file.fetch(parameters.critic, jsonapi.numeric_id(argument))\n        filechange = api.filechange.fetch(parameters.critic, changeset, file)\n\n        return api.filediff.fetch(parameters.critic, filechange)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"TODO: add documentation\"\"\"\n\n        changeset = jsonapi.deduce(\"v1/changesets\", parameters)\n        if changeset is None:\n            raise jsonapi.UsageError(\n                \"changeset needs to be specified, ex. &changeset=<id>\")\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository is None:\n            raise jsonapi.UsageError(\n                \"repository needs to be specified, \"\n                \"ex. &repository=<id or name>\")\n\n        return api.filediff.fetchAll(parameters.critic, changeset)\n\n    @staticmethod\n    def resource_id(value):\n        return value.filechange.file.id\n"
  },
  {
    "path": "src/jsonapi/v1/files.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Files(object):\n    \"\"\"Files (path <=> id mappings) in the system.\"\"\"\n\n    name = \"files\"\n    value_class = api.file.File\n    exceptions = api.file.FileError\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer,\n             \"path\": string\n           }\"\"\"\n\n        return parameters.filtered(\n            \"files\", { \"id\": value.id,\n                       \"path\": value.path })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) files.\n\n           FILE_ID : integer\n\n           Retrieve a file identified by its unique numeric id.\"\"\"\n\n        return api.file.fetch(\n            parameters.critic, file_id=jsonapi.numeric_id(argument))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve a file by its path.\n\n           path : PATH : string\n\n           Retrieve the file with the specified path. 
(Required)\"\"\"\n\n        path = parameters.getQueryParameter(\"path\")\n        if path is None:\n            raise jsonapi.UsageError(\"No path parameter specified\")\n\n        return api.file.fetch(parameters.critic, path=path)\n\n    @staticmethod\n    def fromParameter(value, parameters):\n        file_id, path = jsonapi.id_or_name(value)\n        return api.file.fetch(parameters.critic, file_id, path=path)\n\n    @staticmethod\n    def deduce(parameters):\n        file_obj = parameters.context.get(Files.name)\n        file_parameter = parameters.getQueryParameter(\"file\")\n        if file_parameter is not None:\n            if file_obj is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: file=%s\"\n                    % file_parameter)\n            # fromParameter() already fetches and returns the File object.\n            file_obj = Files.fromParameter(file_parameter, parameters)\n        return file_obj\n\n    @staticmethod\n    def setAsContext(parameters, file_obj):\n        parameters.setContext(Files.name, file_obj)\n        return file_obj\n"
  },
  {
    "path": "src/jsonapi/v1/labeledaccesscontrolprofiles.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\napi_module = api.labeledaccesscontrolprofile\n\n@jsonapi.PrimaryResource\nclass LabeledAccessControlProfiles(object):\n    \"\"\"The labeled access control profile selectors of this system.\"\"\"\n\n    name = \"labeledaccesscontrolprofiles\"\n    contexts = (None, \"accesscontrolprofiles\")\n    value_class = api_module.LabeledAccessControlProfile\n    exceptions = api_module.LabeledAccessControlProfileError\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"LabeledAccessControlProfile {\n             \"labels\": [string],\n             \"profile\": integer\n           }\"\"\"\n\n        # Make sure that only administrator users can access.\n        api.PermissionDenied.raiseUnlessAdministrator(parameters.critic)\n\n        return parameters.filtered(\"labeledaccesscontrolprofiles\", {\n            \"labels\": value.labels,\n            \"profile\": value.profile.id,\n        })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) access control profiles.\n\n           LABELS : string\n\n           Retrieve an access control profile identified by the profile\n           selector's set of labels. 
Separate labels with pipe ('|')\n           characters.\"\"\"\n\n        return api.accesscontrolprofile.fetch(\n            parameters.critic, labels=argument.split(\"|\"))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all labeled access control profile selectors in the system.\n\n           profile : PROFILE_ID : integer\n\n           Include only selectors selecting the given profile, identified by its\n           unique numeric id.\"\"\"\n        profile = jsonapi.deduce(\"v1/accesscontrolprofiles\", parameters)\n        return api.labeledaccesscontrolprofile.fetchAll(\n            parameters.critic, profile=profile)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n\n        if parameters.subresource_path:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        # Create a labeled access control profile selector.\n        assert not (value or values)\n\n        converted = jsonapi.convert(parameters, {\n            \"labels\": [str],\n            \"profile\": api.accesscontrolprofile.AccessControlProfile\n        }, data)\n\n        result = []\n\n        def collectLabeledAccessControlProfile(labeled_profile):\n            assert isinstance(\n                labeled_profile,\n                api.labeledaccesscontrolprofile.LabeledAccessControlProfile)\n            result.append(labeled_profile)\n\n        with api.transaction.Transaction(critic) as transaction:\n            transaction.createLabeledAccessControlProfile(\n                converted[\"labels\"], converted[\"profile\"],\n                callback=collectLabeledAccessControlProfile)\n\n        assert len(result) == 1\n        return result[0], None\n"
  },
  {
    "path": "src/jsonapi/v1/rebases.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Rebases(object):\n    \"\"\"The review rebases in this system.\"\"\"\n\n    name = \"rebases\"\n    contexts = (None, \"reviews\")\n    value_class = (api.log.rebase.MoveRebase, api.log.rebase.HistoryRewrite)\n    exceptions = api.log.rebase.RebaseError\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer,\n             \"review\": integer,\n             \"creator\": integer,\n             \"type\": \"history-rewrite\" or \"move\"\n             \"old_head\": integer,\n             \"new_head\": integer,\n             // if |type| is \"move\":\n             \"old_upstream\": integer,\n             \"new_upstream\": integer,\n             \"equivalent_merge\": integer or null,\n             \"replayed_rebase\": integer or null,\n           }\"\"\"\n\n        old_head = value.old_head\n        new_head = value.new_head\n\n        data = { \"id\": value.id,\n                 \"review\": value.review,\n                 \"creator\": value.creator,\n                 \"old_head\": old_head,\n                 \"new_head\": new_head }\n\n        if isinstance(value, api.log.rebase.HistoryRewrite):\n            data.update({ \"type\": \"history-rewrite\" })\n        else:\n            
data.update({ \"type\": \"move\",\n                          \"old_upstream\": value.old_upstream,\n                          \"new_upstream\": value.new_upstream,\n                          \"equivalent_merge\": value.equivalent_merge,\n                          \"replayed_rebase\": value.replayed_rebase })\n\n        return parameters.filtered(\"rebases\", data)\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) rebases in this system.\n\n           REBASE_ID : integer\n\n           Retrieve a rebase identified by its unique numeric id.\"\"\"\n\n        return Rebases.setAsContext(parameters, api.log.rebase.fetch(\n            parameters.critic, rebase_id=jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all rebases in this system.\n\n           review : REVIEW_ID : -\n\n           Include only rebases of one review, identified by the review's unique\n           numeric id.\"\"\"\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        return api.log.rebase.fetchAll(parameters.critic, review=review)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = critic.actual_user\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"new_upstream?\": str,\n                \"history_rewrite?\": bool\n            },\n            data)\n\n        new_upstream = converted.get(\"new_upstream\")\n        history_rewrite = converted.get(\"history_rewrite\")\n\n        if (new_upstream is None) == (history_rewrite is None):\n            raise jsonapi.UsageError(\n                \"Exactly one of the arguments new_upstream and history_rewrite \"\n                \"must be specified.\")\n\n        if history_rewrite == False:\n            raise jsonapi.UsageError(\n                \"history_rewrite must be true, or omitted.\")\n\n        review = 
jsonapi.deduce(\"v1/reviews\", parameters)\n\n        if review is None:\n            raise jsonapi.UsageError(\n                \"review must be specified when preparing a rebase\")\n\n        if history_rewrite is not None:\n            expected_type = api.log.rebase.HistoryRewrite\n        else:\n            expected_type = api.log.rebase.MoveRebase\n\n        result = []\n\n        def collectRebase(rebase):\n            assert isinstance(rebase, expected_type), repr(rebase)\n            result.append(rebase)\n\n        with api.transaction.Transaction(critic) as transaction:\n            transaction \\\n                .modifyReview(review) \\\n                .prepareRebase(\n                    user, new_upstream, history_rewrite,\n                    callback=collectRebase)\n\n        assert len(result) == 1, repr(result)\n        return result[0], None\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n\n        if value is None:\n            raise jsonapi.UsageError(\n                \"Only one rebase can currently be deleted per request\")\n        rebase = value\n\n        with api.transaction.Transaction(critic) as transaction:\n            transaction \\\n                .modifyReview(rebase.review) \\\n                .cancelRebase(rebase)\n\n    @staticmethod\n    def setAsContext(parameters, rebase):\n        parameters.setContext(Rebases.name, rebase)\n        # Also set the rebase's review (and repository and branch) as context.\n        jsonapi.v1.reviews.Reviews.setAsContext(parameters, rebase.review)\n        return rebase\n"
  },
  {
    "path": "src/jsonapi/v1/replies.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Replies(object):\n    \"\"\"Replies to comments in reviews.\"\"\"\n\n    name = \"replies\"\n    contexts = (None, \"comments\")\n    value_class = api.reply.Reply\n    exceptions = (api.comment.CommentError, api.reply.ReplyError)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer,\n             \"is_draft\": boolean,\n             \"author\": integer,\n             \"timestamp\": float,\n             \"text\": string,\n           }\"\"\"\n\n        timestamp = jsonapi.v1.timestamp(value.timestamp)\n        return parameters.filtered(\n            \"replies\", { \"id\": value.id,\n                         \"is_draft\": value.is_draft,\n                         \"author\": value.author,\n                         \"timestamp\": timestamp,\n                         \"text\": value.text })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) replies to comments.\n\n           REPLY_ID : integer\n\n           Retrieve a reply identified by its unique numeric id.\"\"\"\n\n        reply = api.reply.fetch(\n            parameters.critic, reply_id=jsonapi.numeric_id(argument))\n\n        comment = jsonapi.deduce(\"v1/comments\", parameters)\n     
   if comment and comment != reply.comment:\n            raise jsonapi.PathError(\n                \"Reply does not belong to specified comment\")\n\n        return reply\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve replies to a comment.\n\n           comment : COMMENT_ID : integer\n\n           Retrieve all replies to the specified comment.\"\"\"\n\n        comment = jsonapi.deduce(\"v1/comments\", parameters)\n\n        if not comment:\n            raise jsonapi.UsageError(\"A comment must be identified.\")\n\n        return comment.replies\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        user = parameters.context.get(\"users\", critic.actual_user)\n\n        if value or values:\n            raise jsonapi.UsageError(\"Invalid POST request\")\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"comment?\": api.comment.Comment,\n                \"author?\": api.user.User,\n                \"text\": str\n            },\n            data)\n\n        comment = jsonapi.deduce(\"v1/comments\", parameters)\n\n        if not comment:\n            if \"comment\" not in converted:\n                raise jsonapi.UsageError(\"No comment specified\")\n            comment = converted[\"comment\"]\n        elif \"comment\" in converted and comment != converted[\"comment\"]:\n            raise jsonapi.UsageError(\"Conflicting comments specified\")\n\n        if \"author\" in converted:\n            author = converted[\"author\"]\n        else:\n            author = critic.actual_user\n\n        if not converted[\"text\"].strip():\n            raise jsonapi.UsageError(\"Empty reply\")\n\n        result = []\n\n        def collectReply(reply):\n            assert isinstance(reply, api.reply.Reply)\n            result.append(reply)\n\n        with api.transaction.Transaction(critic) as transaction:\n            transaction \\\n               
 .modifyReview(comment.review) \\\n                .modifyComment(comment) \\\n                .addReply(\n                    author=author,\n                    text=converted[\"text\"],\n                    callback=collectReply)\n\n        assert len(result) == 1\n        return result[0], None\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            replies = [value]\n        else:\n            replies = values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid PUT request\")\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"text\": str\n            },\n            data)\n\n        with api.transaction.Transaction(critic) as transaction:\n            for reply in replies:\n                transaction \\\n                    .modifyReview(reply.comment.review) \\\n                    .modifyComment(reply.comment) \\\n                    .modifyReply(reply) \\\n                    .setText(converted[\"text\"])\n\n        return value, values\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n        path = parameters.subresource_path\n\n        if value:\n            replies = [value]\n        else:\n            replies = values\n\n        if path:\n            raise jsonapi.UsageError(\"Invalid DELETE request\")\n\n        with api.transaction.Transaction(critic) as transaction:\n            for reply in replies:\n                transaction \\\n                    .modifyReview(reply.comment.review) \\\n                    .modifyComment(reply.comment) \\\n                    .modifyReply(reply) \\\n                    .delete()\n\n    @staticmethod\n    def fromParameter(value, parameters):\n        return api.reply.fetch(parameters.critic, jsonapi.numeric_id(value))\n"
  },
  {
    "path": "src/jsonapi/v1/repositories.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\ndef from_argument(parameters, argument):\n    repository_id, name = jsonapi.id_or_name(argument)\n    return api.repository.fetch(\n        parameters.critic, repository_id=repository_id, name=name)\n\n@jsonapi.PrimaryResource\nclass Repositories(object):\n    \"\"\"The Git repositories on this system.\"\"\"\n\n    name = \"repositories\"\n    value_class = api.repository.Repository\n    exceptions = (api.repository.RepositoryError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Repository {\n             \"id\": integer, // the repository's id\n             \"name\": string, // the repository's (unique) short name\n             \"path\": string, // absolute file-system path\n             \"relative_path\": string, // relative file-system path\n             \"url\": string, // the repository's URL\n           }\"\"\"\n\n        return parameters.filtered(\n            \"repositories\", { \"id\": value.id,\n                              \"name\": value.name,\n                              \"path\": value.path,\n                              \"relative_path\": value.relative_path,\n                              \"url\": value.url })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) repositories 
on this system.\n\n           REPOSITORY_ID : integer\n\n           Retrieve a repository identified by its unique numeric id.\"\"\"\n\n        return Repositories.setAsContext(parameters, api.repository.fetch(\n            parameters.critic, repository_id=jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve a single named repository or all repositories on this\n           system.\n\n           name : SHORT_NAME : string\n\n           Retrieve a repository identified by its unique short-name.  This is\n           equivalent to accessing /api/v1/repositories/REPOSITORY_ID with that\n           repository's numeric id.  When used, any other parameters are\n           ignored.\n\n           filter : highlighted : -\n\n           If specified, retrieve only \"highlighted\" repositories.  These are\n           repositories that are deemed of particular interest for the signed-in\n           user.  (If no user is signed in, no repositories are highlighted.)\"\"\"\n\n        name_parameter = parameters.getQueryParameter(\"name\")\n        if name_parameter:\n            return api.repository.fetch(parameters.critic, name=name_parameter)\n        filter_parameter = parameters.getQueryParameter(\"filter\")\n        if filter_parameter is not None:\n            if filter_parameter == \"highlighted\":\n                repositories = api.repository.fetchHighlighted(\n                    parameters.critic)\n            else:\n                raise jsonapi.UsageError(\n                    \"Invalid repository filter parameter: %r\"\n                    % filter_parameter)\n        else:\n            repositories = api.repository.fetchAll(parameters.critic)\n        return repositories\n\n    @staticmethod\n    def deduce(parameters):\n        repository = parameters.context.get(\"repositories\")\n        repository_parameter = parameters.getQueryParameter(\"repository\")\n        if repository_parameter is not None:\n         
   if repository is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: repository=%s\"\n                    % repository_parameter)\n            repository = from_argument(parameters, repository_parameter)\n        if repository is not None:\n            return repository\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        if review is not None:\n            return review.repository\n\n    @staticmethod\n    def fromParameter(value, parameters):\n        repository_id, name = jsonapi.id_or_name(value)\n        return api.repository.fetch(parameters.critic, repository_id, name=name)\n\n    @staticmethod\n    def setAsContext(parameters, repository):\n        parameters.setContext(Repositories.name, repository)\n        return repository\n"
  },
  {
    "path": "src/jsonapi/v1/reviewablefilechanges.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass ReviewableFileChanges(object):\n    \"\"\"Reviewable file changes\"\"\"\n\n    name = \"reviewablefilechanges\"\n    contexts = (None, \"reviews\", \"changesets\")\n    value_class = api.reviewablefilechange.ReviewableFileChange\n    exceptions = api.reviewablefilechange.ReviewableFileChangeError\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"{\n             \"id\": integer, // the object's unique id\n             \"review\": integer,\n             \"changeset\": integer,\n             \"file\": integer,\n             \"deleted_lines\": integer,\n             \"inserted_lines\": integer,\n             \"is_reviewed\": boolean,\n             \"reviewed_by\": integer,\n             \"assigned_reviewers\": integer[],\n             \"draft_changes\": DraftChanges or null,\n           }\n\n           DraftChanges {\n             \"author\": integer, // author of draft changes\n             \"new_is_reviewed\": boolean,\n             \"new_reviewed_by\": integer,\n           }\"\"\"\n\n        draft_changes = value.draft_changes\n        if draft_changes:\n            draft_changes_json = {\n                \"author\": draft_changes.author,\n                \"new_is_reviewed\": draft_changes.new_is_reviewed,\n      
          \"new_reviewed_by\": draft_changes.new_reviewed_by,\n            }\n        else:\n            draft_changes_json = None\n\n        return parameters.filtered(\n            \"reviewablefilechanges\", {\n                \"id\": value.id,\n                \"review\": value.review,\n                \"changeset\": value.changeset,\n                \"file\": value.file,\n                \"deleted_lines\": value.deleted_lines,\n                \"inserted_lines\": value.inserted_lines,\n                \"is_reviewed\": value.is_reviewed,\n                \"reviewed_by\": value.reviewed_by,\n                \"assigned_reviewers\": jsonapi.sorted_by_id(\n                    value.assigned_reviewers),\n                \"draft_changes\": draft_changes_json,\n            })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) reviewable file changes.\n\n           FILECHANGE_ID : integer\n\n           Retrieve the reviewable changes to a file in a commit identified by\n           its unique numeric id.\"\"\"\n\n        return api.reviewablefilechange.fetch(\n            parameters.critic, jsonapi.numeric_id(argument))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all reviewable file changes in a review.\n\n           review : REVIEW_ID : -\n\n           Retrieve the reviewable changes in the specified review.\n\n           changeset : CHANGESET_ID : -\n\n           Retrieve the reviewable changes in the specified changeset.\n\n           file : FILE : -\n\n           Retrieve the reviewable changes in the specified file only.\n\n           assignee : USER : -\n\n           Retrieve reviewable changes assigned to the specified user only.\n\n           state : STATE : \"pending\" or \"reviewed\"\n\n           Retrieve reviewable changes in the specified state only.\"\"\"\n\n        review = jsonapi.deduce(\"v1/reviews\", parameters)\n        changeset = jsonapi.deduce(\"v1/changesets\", 
parameters)\n\n        if not review:\n            raise jsonapi.UsageError(\"Missing required parameter: review\")\n\n        file = jsonapi.from_parameter(\"v1/files\", \"file\", parameters)\n        assignee = jsonapi.from_parameter(\"v1/users\", \"assignee\", parameters)\n        state_parameter = parameters.getQueryParameter(\"state\")\n\n        if state_parameter is None:\n            is_reviewed = None\n        else:\n            if state_parameter not in (\"pending\", \"reviewed\"):\n                raise jsonapi.UsageError(\n                    \"Invalid parameter value: state=%r \"\n                    \"(valid values are 'pending' and 'reviewed')\"\n                    % state_parameter)\n            is_reviewed = state_parameter == \"reviewed\"\n\n        return api.reviewablefilechange.fetchAll(\n            parameters.critic, review, changeset, file, assignee, is_reviewed)\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n        filechanges = [value] if value else values\n\n        reviews = set(filechange.review for filechange in filechanges)\n        if len(reviews) > 1:\n            raise jsonapi.UsageError(\"Multiple reviews updated\")\n        review = reviews.pop()\n\n        converted = jsonapi.convert(\n            parameters,\n            {\n                \"draft_changes\": {\n                    \"new_is_reviewed\": bool,\n                },\n            },\n            data)\n\n        is_reviewed = converted[\"draft_changes\"][\"new_is_reviewed\"]\n\n        with api.transaction.Transaction(critic) as transaction:\n            modifier = transaction \\\n                .modifyReview(review)\n\n            if is_reviewed:\n                for filechange in filechanges:\n                    modifier.markChangeAsReviewed(filechange)\n            else:\n                for filechange in filechanges:\n                    modifier.markChangeAsPending(filechange)\n"
  },
  {
    "path": "src/jsonapi/v1/reviews.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Reviews(object):\n    \"\"\"The reviews in this system.\"\"\"\n\n    name = \"reviews\"\n    value_class = api.review.Review\n    exceptions = (api.review.InvalidReviewId, api.repository.RepositoryError)\n    lists = (\"issues\", \"notes\")\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Review {\n             \"id\": integer,\n             \"state\": string,\n             \"summary\": string,\n             \"description\": string or null,\n             \"repository\": integer,\n             \"branch\": integer,\n             \"owners\": integer[],\n             \"active_reviewers\": integer[],\n             \"assigned_reviewers\": integer[],\n             \"watchers\": integer[],\n             \"partitions\": Partition[],\n             \"issues\": integer[],\n             \"notes\": integer[],\n             \"pending_rebase\": integer or null,\n             \"progress\": float,\n             \"progress_per_commit\": CommitChangeCount[],\n           }\n\n           Partition {\n             \"commits\": integer[],\n             \"rebase\": integer or null,\n           }\n\n           CommitChangeCount {\n             \"commit_id\": integer,\n             \"total_changes\": integer,\n             
\"reviewed_changes\": integer,\n           }\"\"\"\n\n        def change_counts_as_dict(change_counts):\n            return [{\n                \"commit_id\": change_count.commit_id,\n                \"total_changes\": change_count.total_changes,\n                \"reviewed_changes\": change_count.reviewed_changes,\n                } for change_count in change_counts]\n\n        partitions = []\n\n        def add_partition(partition):\n            if partition.following:\n                partition_rebase = partition.following.rebase\n            else:\n                partition_rebase = None\n\n            partitions.append({ \"commits\": list(partition.commits.topo_ordered),\n                                \"rebase\": partition_rebase })\n\n            if partition.following:\n                add_partition(partition.following.partition)\n\n        add_partition(value.first_partition)\n\n        return parameters.filtered(\n            \"reviews\", { \"id\": value.id,\n                         \"state\": value.state,\n                         \"summary\": value.summary,\n                         \"description\": value.description,\n                         \"repository\": value.repository,\n                         \"branch\": value.branch,\n                         \"owners\": jsonapi.sorted_by_id(value.owners),\n                         \"active_reviewers\": jsonapi.sorted_by_id(\n                             value.active_reviewers),\n                         \"assigned_reviewers\": jsonapi.sorted_by_id(\n                             value.assigned_reviewers),\n                         \"watchers\": jsonapi.sorted_by_id(value.watchers),\n                         \"partitions\": partitions,\n                         \"issues\": jsonapi.sorted_by_id(value.issues),\n                         \"notes\": jsonapi.sorted_by_id(value.notes),\n                         \"pending_rebase\": value.pending_rebase,\n                         \"progress\": value.total_progress,\n 
                        \"progress_per_commit\":\n                             change_counts_as_dict(value.progress_per_commit)})\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) reviews in this system.\n\n           REVIEW_ID : integer\n\n           Retrieve a review identified by its unique numeric id.\"\"\"\n\n        return Reviews.setAsContext(parameters, api.review.fetch(\n            parameters.critic, review_id=jsonapi.numeric_id(argument)))\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve all reviews in this system.\n\n           repository : REPOSITORY : -\n\n           Include only reviews in one repository, identified by the\n           repository's unique numeric id or short-name.\n\n           state : STATE[,STATE,...] : -\n\n           Include only reviews in the specified state.  Valid values are:\n           <code>open</code>, <code>closed</code>, <code>dropped</code>.\"\"\"\n\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        state_parameter = parameters.getQueryParameter(\"state\")\n        if state_parameter:\n            state = set(state_parameter.split(\",\"))\n            invalid = state - api.review.Review.STATE_VALUES\n            if invalid:\n                raise jsonapi.UsageError(\n                    \"Invalid review state values: %s\"\n                    % \", \".join(map(repr, sorted(invalid))))\n        else:\n            state = None\n        return api.review.fetchAll(\n            parameters.critic, repository=repository, state=state)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        critic = parameters.critic\n        path = parameters.subresource_path\n        review = value\n\n        if review:\n            if path == [\"issues\"] or path == [\"notes\"]:\n                Reviews.setAsContext(parameters, review)\n                if path == [\"issues\"]:\n                    comment_type = 
\"issue\"\n                else:\n                    comment_type = \"note\"\n                jsonapi.ensure(data, \"type\", comment_type)\n                raise jsonapi.InternalRedirect(\"v1/comments\")\n\n        raise jsonapi.UsageError(\"Review creation not yet supported\")\n\n    @staticmethod\n    def deduce(parameters):\n        review = parameters.context.get(\"reviews\")\n        review_parameter = parameters.getQueryParameter(\"review\")\n        if review_parameter is not None:\n            if review is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: review=%s\" % review_parameter)\n            review = api.review.fetch(\n                parameters.critic,\n                review_id=jsonapi.numeric_id(review_parameter))\n        return review\n\n    @staticmethod\n    def setAsContext(parameters, review):\n        parameters.setContext(Reviews.name, review)\n        # Also set the review's repository and branch as context.\n        jsonapi.v1.repositories.Repositories.setAsContext(\n            parameters, review.repository)\n        jsonapi.v1.branches.Branches.setAsContext(parameters, review.branch)\n        return review\n"
  },
  {
    "path": "src/jsonapi/v1/reviewsummaries.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2017 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass ReviewSummaries(object):\n    \"\"\"Review summaries\"\"\"\n\n    name = \"reviewsummaries\"\n    value_class = api.reviewsummary.ReviewSummaryContainer\n    exceptions = (api.reviewsummary.ReviewSummaryError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"ReviewSummaries {\n             \"reviews\": ReviewSummary[],\n             \"more\": bool // true if there are more reviews than the ones retrieved\n           }\n\n           ReviewSummary {\n             \"review\": integer,\n             \"summary\": string, // the review's summary (text)\n             \"latest_change\": integer, // the timestamp of the latest commit or comment\n             \"progress\": float, // reviewing progress as a number between 0 and 1\n             \"issues\": integer, // the number of open issues in the review\n           }\"\"\"\n\n        def review_summary_as_dict(review_summary):\n            return { \"review\": review_summary.review,\n                     \"summary\": review_summary.review.summary,\n                     \"latest_change\": review_summary.latest_change,\n                     \"progress\": review_summary.review.total_progress,\n                     \"issues\": len(review_summary.review.open_issues)}\n\n   
     return parameters.filtered(\n            \"review summaries\", { \"reviews\": [review_summary_as_dict(review_summary) for\n                                              review_summary in value.reviews],\n                                  \"more\": value.more})\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve review summaries.\"\"\"\n\n        countParameter = parameters.getQueryParameter(\"count\")\n        offsetParameter = parameters.getQueryParameter(\"offset\")\n        count = int(countParameter) if countParameter is not None else None\n        offset = int(offsetParameter) if offsetParameter is not None else None\n        if count is not None and count < 1:\n            raise jsonapi.InputError(\"count parameter must be bigger than 0\")\n        if offset is not None and offset < 0:\n            raise jsonapi.InputError(\"offset can't be less than 0\")\n\n        search_type = parameters.getQueryParameter(\"type\")\n        user = parameters.critic.actual_user\n\n        if user is None and search_type != \"all\":\n            raise jsonapi.PermissionDenied(\n                \"You do not have the rights to access this resource\")\n\n        if search_type not in api.reviewsummary.ReviewSummary.TYPE_VALUES:\n            raise jsonapi.UsageError(\n                \"Review summary type parameter must be specified and set to \"\n                \"one of: \" +\n                \", \".join(api.reviewsummary.ReviewSummary.TYPE_VALUES))\n        return api.reviewsummary.fetchMany(\n            parameters.critic, search_type, user, count, offset)\n"
  },
  {
    "path": "src/jsonapi/v1/sessions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2016 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport api\nimport jsonapi\n\nimport auth\n\nclass Session(object):\n    def __init__(self, user, session_type):\n        self.user = user\n        self.session_type = session_type\n\nclass SessionError(jsonapi.Error):\n    http_status = 403\n    title = \"Session error\"\n\n@jsonapi.PrimaryResource\nclass Sessions(object):\n    \"\"\"The session of the accessing client.\"\"\"\n\n    name = \"sessions\"\n    value_class = Session\n\n    anonymous_create = True\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Session {\n             \"user\": integer, // the signed in user's id, or null\n             \"type\": \"normal\" or \"accesstoken\", or null,\n             \"fields\": [\n                 {\n                     \"identifier\": string, // unique field identifier\n                     \"label\": string,      // UI label\n                     \"hidden\": boolean,    // true for passwords\n                     \"description\": string or null\n                 },\n                 ...\n             ]\n           }\"\"\"\n\n        fields = []\n\n        for db_field in auth.DATABASE.getFields():\n            hidden, identifier, label = db_field[:3]\n            if len(db_field) == 4:\n                description = db_field[3]\n            else:\n                
description = None\n            fields.append({\n                \"identifier\": identifier,\n                \"label\": label,\n                \"hidden\": hidden,\n                \"description\": description\n            })\n\n        return parameters.filtered(\n            \"sessions\", { \"user\": value.user,\n                          \"type\": value.session_type,\n                          \"fields\": fields })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve the current session.\n\n           CURRENT : \"current\"\n\n           Retrieve the current session.\"\"\"\n\n        if argument != \"current\":\n            raise jsonapi.UsageError('Resource argument must be \"current\"')\n\n        user = parameters.critic.actual_user\n        if parameters.critic.access_token:\n            session_type = \"accesstoken\"\n        elif user:\n            session_type = \"normal\"\n        else:\n            session_type = None\n        return Session(user, session_type)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        fields = auth.DATABASE.getFields()\n\n        # Fields may have an optional fourth (description) element; the\n        # identifier is always the second element.\n        converted = jsonapi.convert(parameters, {\n            db_field[1]: str\n            for db_field in fields\n        }, data)\n\n        critic = parameters.critic\n\n        try:\n            auth.DATABASE.authenticate(critic.database, converted)\n        except auth.AuthenticationFailed as error:\n            raise SessionError(error.message)\n        except auth.WrongPassword:\n            raise SessionError(\"Wrong password\")\n\n        auth.createSessionId(\n            critic.database, parameters.req, critic.database.user)\n\n        return Session(critic.actual_user, \"normal\"), None\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n\n        auth.deleteSessionId(\n            critic.database, parameters.req, critic.database.user)\n"
  },
  {
    "path": "src/jsonapi/v1/users.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport itertools\n\nimport api\nimport jsonapi\n\n@jsonapi.PrimaryResource\nclass Users(object):\n    \"\"\"The users of this system.\"\"\"\n\n    name = \"users\"\n    value_class = api.user.User\n    exceptions = (api.user.UserError,)\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"User {\n             \"id\": integer, // the user's id\n             \"name\": string, // the user's unique user name\n             \"fullname\": string, // the user's full name\n             \"status\": string, // the user's status: \"current\", \"absent\" or \"retired\"\n             \"email\": string, // the user's primary email address\n           }\"\"\"\n\n        return parameters.filtered(\n            \"users\", { \"id\": value.id,\n                       \"name\": value.name,\n                       \"fullname\": value.fullname,\n                       \"status\": value.status,\n                       \"email\": value.email })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) users of this system.\n\n           USER_ID : integer or \"me\"\n\n           Retrieve a user identified by the user's unique numeric id, or the\n           identifier \"me\" to retrieve the current user.\"\"\"\n\n        if argument == \"me\":\n            user 
= parameters.critic.actual_user\n            if user is None:\n                raise api.user.UserError(\"'users/me' (not signed in)\")\n        else:\n            user = api.user.fetch(parameters.critic,\n                                  user_id=jsonapi.numeric_id(argument))\n        return Users.setAsContext(parameters, user)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"Retrieve a single named user or all users of this system.\n\n           name : NAME : string\n\n           Retrieve only the user with the given name.  This is equivalent to\n           accessing /api/v1/users/USER_ID with that user's numeric id.  When\n           used, any other parameters are ignored.\n\n           status : USER_STATUS[,USER_STATUS,...] : string\n\n           Include only users whose status is one of the specified.  Valid\n           values are: <code>current</code>, <code>absent</code>,\n           <code>retired</code>.\n\n           sort : SORT_KEY : string\n\n           Sort the returned users by the specified key.  
Valid values are:\n           <code>id</code>, <code>name</code>, <code>fullname</code>,\n           <code>email</code>.\"\"\"\n\n        name_parameter = parameters.getQueryParameter(\"name\")\n        if name_parameter:\n            return api.user.fetch(parameters.critic, name=name_parameter)\n        status_parameter = parameters.getQueryParameter(\"status\")\n        if status_parameter:\n            status = set(status_parameter.split(\",\"))\n            invalid = status - api.user.User.STATUS_VALUES\n            if invalid:\n                raise jsonapi.UsageError(\n                    \"Invalid user status values: %s\"\n                    % \", \".join(map(repr, sorted(invalid))))\n        else:\n            status = None\n        sort_parameter = parameters.getQueryParameter(\"sort\")\n        if sort_parameter:\n            if sort_parameter not in (\"id\", \"name\", \"fullname\", \"email\"):\n                raise jsonapi.UsageError(\"Invalid user sort parameter: %r\"\n                                         % sort_parameter)\n            sort_key = lambda user: getattr(user, sort_parameter)\n        else:\n            sort_key = lambda user: user.id\n        return sorted(api.user.fetchAll(parameters.critic, status=status),\n                      key=sort_key)\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        if values and len(values) != 1:\n            raise jsonapi.UsageError(\"Updating multiple users not supported\")\n\n        critic = parameters.critic\n\n        if values:\n            value = values[0]\n\n        converted = jsonapi.convert(parameters, { \"fullname?\": str }, data)\n\n        with api.transaction.Transaction(critic) as transaction:\n            if \"fullname\" in converted:\n                new_fullname = converted[\"fullname\"].strip()\n                if not new_fullname:\n                    raise jsonapi.InputError(\"Empty new fullname\")\n                
transaction.modifyUser(value).setFullname(new_fullname)\n\n    @staticmethod\n    def deduce(parameters):\n        user = parameters.context.get(\"users\")\n        user_parameter = parameters.getQueryParameter(\"user\")\n        if user_parameter is not None:\n            if user is not None:\n                raise jsonapi.UsageError(\n                    \"Redundant query parameter: user=%s\"\n                    % user_parameter)\n            user = Users.fromParameter(user_parameter, parameters)\n        return user\n\n    @staticmethod\n    def fromParameter(value, parameters):\n        user_id, name = jsonapi.id_or_name(value)\n        return api.user.fetch(parameters.critic, user_id=user_id, name=name)\n\n    @staticmethod\n    def setAsContext(parameters, user):\n        parameters.setContext(Users.name, user)\n        return user\n\n@jsonapi.PrimaryResource\nclass Emails(object):\n    \"\"\"A user's primary email addresses.\n\n       A \"primary\" email address is one that Critic would send emails to.  A\n       user can have multiple primary email addresses registered, but at most\n       one of them can be selected.  Emails are only sent to a selected primary\n       email address.\n\n       A user also has a set of \"Git\" email addresses.  
Those are only compared\n       against Git commit meta-data, and are never used when sending emails.\"\"\"\n\n    name = \"emails\"\n    contexts = (\"users\",)\n    value_class = api.user.User.PrimaryEmail\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Email {\n             \"address\": string, // the email address\n             \"selected\": boolean, // true if address is selected\n             \"verified\": boolean, // true if address is verified\n           }\"\"\"\n\n        return parameters.filtered(\n            \"emails\", { \"address\": value.address,\n                        \"selected\": value.selected,\n                        \"verified\": value.verified })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"A primary email address by index.\n\n           INDEX : integer\n\n           Retrieve a primary email address identified by its index.\"\"\"\n\n        emails = list(parameters.context[\"users\"].primary_emails)\n\n        try:\n            return emails[jsonapi.numeric_id(argument) - 1]\n        except IndexError:\n            raise jsonapi.PathError(\"List index out of range\")\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"All primary email addresses.\"\"\"\n\n        return parameters.context[\"users\"].primary_emails\n\n@jsonapi.PrimaryResource\nclass Filters(object):\n    \"\"\"A user's repository filters.\"\"\"\n\n    name = \"filters\"\n    contexts = (\"users\",)\n    value_class = api.filters.RepositoryFilter\n    exceptions = (api.repository.RepositoryError, KeyError)\n\n    lists = frozenset((\"delegates\",))\n\n    @staticmethod\n    def json(value, parameters):\n        \"\"\"Filter {\n             \"id\": integer, // the filter's id\n             \"type\": string, // \"reviewer\", \"watcher\" or \"ignored\"\n             \"path\": string, // the filtered path\n             \"repository\": integer, // the filter's repository's id\n             \"delegates\": integer[], 
// list of user ids\n           }\"\"\"\n\n        return parameters.filtered(\"filters\", {\n            \"id\": value.id,\n            \"type\": value.type,\n            \"path\": value.path,\n            \"repository\": value.repository,\n            \"delegates\": jsonapi.sorted_by_id(value.delegates)\n        })\n\n    @staticmethod\n    def single(parameters, argument):\n        \"\"\"Retrieve one (or more) of a user's repository filters.\n\n           FILTER_ID : integer\n\n           Retrieve a filter identified by the filter's unique numeric id.\"\"\"\n\n        user = parameters.context[\"users\"]\n        filter_id = jsonapi.numeric_id(argument)\n\n        for repository_filters in user.repository_filters.values():\n            for repository_filter in repository_filters:\n                if repository_filter.id == filter_id:\n                    return repository_filter\n\n        raise KeyError(\"invalid filter id: %d\" % filter_id)\n\n    @staticmethod\n    def multiple(parameters):\n        \"\"\"All repository filters.\n\n           repository : REPOSITORY : -\n\n           Include only filters for the specified repository, identified by its\n           unique numeric id or short-name.\"\"\"\n\n        user = parameters.context[\"users\"]\n        repository = jsonapi.deduce(\"v1/repositories\", parameters)\n        if repository:\n            repository_filters = user.repository_filters.get(\n                repository, [])\n        else:\n            repository_filters = itertools.chain(\n                *user.repository_filters.values())\n        return jsonapi.sorted_by_id(repository_filters)\n\n    @staticmethod\n    def create(parameters, value, values, data):\n        import reviewing.filters\n\n        class FilterPath(jsonapi.check.StringChecker):\n            def check(self, context, value):\n                path = reviewing.filters.sanitizePath(value)\n                try:\n                    reviewing.filters.validatePattern(path)\n    
            except reviewing.filters.PatternError as error:\n                    return error.message\n\n            def convert(self, context, value):\n                return reviewing.filters.sanitizePath(value)\n\n        critic = parameters.critic\n        subject = parameters.context[\"users\"]\n\n        if parameters.subresource_path:\n            if value:\n                repository_filters = [value]\n            else:\n                repository_filters = values\n\n            assert parameters.subresource_path[0] == \"delegates\"\n            assert len(parameters.subresource_path) == 1\n\n            converted = jsonapi.convert(parameters, api.user.User, data)\n\n            with api.transaction.Transaction(critic) as transaction:\n                for repository_filter in repository_filters:\n                    delegates = set(repository_filter.delegates)\n\n                    if converted not in delegates:\n                        delegates.add(converted)\n                        transaction \\\n                            .modifyUser(subject) \\\n                            .modifyFilter(repository_filter) \\\n                            .setDelegates(delegates)\n\n            return value, values\n\n        converted = jsonapi.convert(\n            parameters,\n            { \"type\": set((\"reviewer\", \"watcher\", \"ignore\")),\n              \"path\": FilterPath,\n              \"repository\": api.repository.Repository,\n              \"delegates?\": [api.user.User] },\n            data)\n\n        result = []\n\n        def collectFilter(repository_filter):\n            assert isinstance(repository_filter, api.filters.RepositoryFilter)\n            result.append(repository_filter)\n\n        with api.transaction.Transaction(critic, result) as transaction:\n            transaction \\\n                .modifyUser(subject) \\\n                .createFilter(\n                    filter_type=converted[\"type\"],\n                    
repository=converted[\"repository\"],\n                    path=converted[\"path\"],\n                    delegates=converted.get(\"delegates\", []),\n                    callback=collectFilter)\n\n        assert len(result) == 1\n        return result[0], None\n\n    @staticmethod\n    def update(parameters, value, values, data):\n        critic = parameters.critic\n\n        if parameters.subresource_path:\n            assert parameters.subresource_path[0] == \"delegates\"\n\n            if len(parameters.subresource_path) == 2:\n                raise jsonapi.UsageError(\"can't update specific delegate\")\n\n            delegates = jsonapi.convert(\n                parameters, [api.user.User], data)\n        else:\n            converted = jsonapi.convert(\n                parameters,\n                { \"delegates?\": [api.user.User] },\n                data)\n\n            delegates = converted.get(\"delegates\")\n\n        if value:\n            repository_filters = [value]\n        else:\n            repository_filters = values\n\n        with api.transaction.Transaction(critic) as transaction:\n            for repository_filter in repository_filters:\n                if delegates is not None:\n                    transaction \\\n                        .modifyUser(repository_filter.subject) \\\n                        .modifyFilter(repository_filter) \\\n                        .setDelegates(delegates)\n\n        return value, values\n\n    @staticmethod\n    def delete(parameters, value, values):\n        critic = parameters.critic\n\n        if parameters.subresource_path:\n            assert value and not values\n            assert parameters.subresource_path[0] == \"delegates\"\n\n            repository_filter = value\n            delegates = jsonapi.sorted_by_id(repository_filter.delegates)\n\n            if len(parameters.subresource_path) == 1:\n                # Delete all delegates.\n                delegates = []\n            else:\n                
del delegates[parameters.subresource_path[1] - 1]\n\n            with api.transaction.Transaction(critic) as transaction:\n                transaction \\\n                    .modifyUser(repository_filter.subject) \\\n                    .modifyFilter(repository_filter) \\\n                    .setDelegates(delegates)\n\n            # Remove the last component from the sub-resource path, since we've\n            # just deleted the specified sub-resource(s).\n            del parameters.subresource_path[-1]\n\n            return value, values\n\n        if value:\n            repository_filters = [value]\n        else:\n            repository_filters = values\n\n        with api.transaction.Transaction(critic) as transaction:\n            for repository_filter in repository_filters:\n                transaction \\\n                    .modifyUser(repository_filter.subject) \\\n                    .modifyFilter(repository_filter) \\\n                    .delete()\n"
  },
  {
    "path": "src/library/js/v8/critic-batch.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar batch_internals = {};\nvar batch_id_counter = 0;\n\nfunction CriticBatch(data)\n{\n  var self = this;\n  var review_id, review;\n  var user_id, user;\n  var chain_id, chain;\n  var issues;\n  var notes;\n  var replies;\n\n  var internal_id = batch_id_counter++;\n  var internals = batch_internals[internal_id] = {};\n\n  Object.defineProperty(this, \"__id__\", { value: internal_id });\n\n  function getReview()\n  {\n    if (!review)\n      review = new CriticReview(review_id);\n    return review;\n  }\n\n  function getUser()\n  {\n    if (!user)\n      user = new CriticUser(user_id);\n    return user;\n  }\n\n  function getCommentChain()\n  {\n    if (chain === void 0)\n    {\n      if (chain_id === null)\n        chain = null;\n      else\n        chain = new CriticCommentChain(chain_id, { batch: self, review: review });\n    }\n    return chain;\n  }\n\n  function getIssues()\n  {\n    if (!issues)\n    {\n      issues = [];\n\n      var result = db.execute((\"SELECT id, review, batch, uid, time, type, state, origin, file, \" +\n                                      \"first_commit, last_commit, closed_by, addressed_by \" +\n                                 \"FROM commentchains \" +\n                                \"WHERE batch=%d \" +\n                                  \"AND 
type='issue' \" +\n                                \"ORDER BY time ASC\"), self.id);\n\n      for (var index = 0; index < result.length; ++index)\n        issues.push(new CriticCommentChain(result[index], { batch: self }));\n\n      Object.freeze(issues);\n    }\n\n    return issues;\n  }\n\n  function getNotes()\n  {\n    if (!notes)\n    {\n      notes = [];\n\n      var result = db.execute((\"SELECT id, review, batch, uid, time, type, state, origin, file, \" +\n                                      \"first_commit, last_commit, closed_by, addressed_by \" +\n                                 \"FROM commentchains \" +\n                                \"WHERE batch=%d \" +\n                                  \"AND type='note' \" +\n                                \"ORDER BY time ASC\"), self.id);\n\n      for (var index = 0; index < result.length; ++index)\n        notes.push(new CriticCommentChain(result[index], { batch: self }));\n\n      Object.freeze(notes);\n    }\n\n    return notes;\n  }\n\n  function getReplies()\n  {\n    if (!replies)\n    {\n      replies = [];\n\n      var result = db.execute((\"SELECT id, chain, uid, time, state, comment \" +\n                                 \"FROM comments \" +\n                                \"WHERE batch=%d\"), self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        replies.push(new CriticComment(row.chain, self.id, row.id, row.uid, row.time, row.state, row.comment, { batch: self }));\n      }\n\n      Object.freeze(replies);\n    }\n\n    return replies;\n  }\n\n  if (typeof data.id == \"number\")\n  {\n    this.id = data.id;\n\n    var result = db.execute(\"SELECT review, uid, comment, time FROM batches WHERE id=%d\", this.id)[0];\n\n    if (!result)\n      throw CriticError(format(\"%d: invalid batch ID\", this.id));\n\n    this.time = result.time;\n\n    review_id = result.review;\n    review = data.review || null;\n    user_id = result.uid;\n    
user = data.user || null;\n    chain_id = result.comment;\n    /* Leave 'chain' undefined when not supplied; getCommentChain() treats\n       undefined as \"not yet looked up\". */\n    chain = data.comment;\n\n    Object.defineProperties(this, { review: { get: getReview, enumerable: true },\n                                    user: { get: getUser, enumerable: true },\n                                    commentChain: { get: getCommentChain, enumerable: true },\n                                    issues: { get: getIssues, enumerable: true },\n                                    notes: { get: getNotes, enumerable: true },\n                                    replies: { get: getReplies, enumerable: true }});\n  }\n  else if (data.internals === batch_internals)\n  {\n    this.id = null;\n    this.review = data.review;\n    this.user = data.user;\n\n    internals.filter_user_ids = {};\n    internals.filter_operations = [];\n    internals.comment_operations = [];\n    internals.assignments = { fileCount: 0 };\n    internals.added_filters = [];\n    internals.removed_filters = [];\n\n    internals.replied_to_chains = {};\n    internals.modified_issues = {};\n\n    internals.review_created = data.review_created;\n  }\n  else\n    throw CriticError(\"invalid argument\");\n\n  Object.freeze(this);\n}\n\nfunction commitFromFileVersion(file_version)\n{\n  if (file_version instanceof CriticChangesetFileVersion)\n    if (file_version == file_version.file.oldVersion)\n      return file_version.file.changeset.parent;\n    else\n      return file_version.file.changeset.child;\n  else\n    return file_version.commit;\n}\n\nfunction createCommentChain(text, data, type)\n{\n  data = data || {};\n  text = text && String(text);\n\n  if (!(this instanceof CriticBatch))\n    throw CriticError(\"invalid this object; expected batch object\");\n  if (!text)\n    throw CriticError(\"invalid use: can't add empty comment\");\n\n  var users = {};\n\n  function addUser(user)\n  {\n    users[user.id] = user;\n  }\n\n  function insertUsers(chain_id, comment_id, mark_comment_as_read)\n  {\n    for (var 
user_id in users)\n    {\n      db.execute(\"INSERT INTO commentchainusers (chain, uid) VALUES (%d, %d)\", chain_id, user_id);\n      if (!mark_comment_as_read)\n        db.execute(\"INSERT INTO commentstoread (uid, comment) VALUES (%d, %d)\", user_id, comment_id);\n    }\n  }\n\n  addUser(this.user);\n\n  this.review.owners.forEach(addUser);\n\n  var operations = batch_internals[this.__id__].comment_operations;\n\n  if (data.fileVersion)\n  {\n    if (!(data.fileVersion instanceof CriticFileVersion))\n      throw CriticError(\"data.fileVersion: invalid argument; expected file version object\");\n    if (typeof data.lineIndex != \"number\")\n      throw CriticError(\"data.lineIndex: invalid argument; expected number\");\n    if (typeof data.lineCount != \"number\")\n      throw CriticError(\"data.lineCount: invalid argument; expected number\");\n\n    var file_version = data.fileVersion;\n\n    var cli_command = {\n      name: \"propagate-comment\",\n      data: {\n        review_id: this.review.id,\n        commit_id: commitFromFileVersion(file_version).id,\n        file_id: file_version.file.id,\n        first_line: data.lineIndex + 1,\n        last_line: data.lineIndex + data.lineCount\n      }\n    };\n\n    var cli_result = JSON.parse(executeCLI([cli_command])[0]);\n\n    if (typeof cli_result == \"string\")\n      throw CriticError(format(\"comment propagation failed: %s\", cli_result));\n\n    var state, addressed_by;\n\n    if (cli_result.status == \"clean\")\n    {\n      state = \"open\";\n      addressed_by = null;\n    }\n    else if (data.allowInitiallyAddressed)\n    {\n      state = 'addressed';\n      addressed_by = cli_result.addressed_by;\n    }\n    else\n    {\n      var addressed_by_commit = this.review.repository.getCommit(cli_result.addressed_by);\n      throw CriticError(format(\"cannot raise issue; lines are modified by a later commit: %s\", addressed_by_commit.sha1));\n    }\n\n    var lines = cli_result.lines;\n\n    var origin, parent, 
child;\n\n    if (file_version instanceof CriticChangesetFileVersion)\n    {\n      origin = file_version == file_version.file.oldVersion ? \"old\" : \"new\";\n      parent = file_version.file.changeset.parent;\n      child = file_version.file.changeset.child;\n    }\n    else\n    {\n      origin = \"new\";\n      parent = child = file_version.commit;\n    }\n\n    operations.push(function (batch_id, data)\n      {\n        var chain_id = db.execute(\"INSERT INTO commentchains (review, batch, uid, type, state, origin, file, first_commit, last_commit) VALUES (%d, %d, %d, %s, %s, %s, %d, %d, %d) RETURNING id\",\n                                  this.review.id, batch_id, this.user.id, type, state, origin, file_version.id, parent.id, child.id)[0].id;\n\n        if (addressed_by)\n          db.execute(\"UPDATE commentchains SET addressed_by=%d WHERE id=%d\", addressed_by, chain_id);\n\n        var comment_id = db.execute(\"INSERT INTO comments (chain, batch, uid, state, comment) VALUES (%d, %d, %d, 'current', %s) RETURNING id\",\n                                    chain_id, batch_id, this.user.id, text)[0].id;\n\n        db.execute(\"UPDATE commentchains SET first_comment=%d WHERE id=%d\", comment_id, chain_id);\n\n        for (var index = 0; index < lines.length; ++index)\n        {\n          var line = lines[index];\n\n          db.execute(\"INSERT INTO commentchainlines (chain, uid, state, sha1, first_line, last_line) VALUES (%d, %d, 'current', %s, %d, %d)\",\n                     chain_id, this.user.id, line[0], line[1], line[2]);\n        }\n\n        insertUsers(chain_id, comment_id, data.silent);\n      });\n  }\n  else if (data.commit)\n  {\n    if (typeof data.lineIndex != \"number\")\n      throw CriticError(\"data.lineIndex: invalid argument; expected number\");\n    if (typeof data.lineCount != \"number\")\n      throw CriticError(\"data.lineCount: invalid argument; expected number\");\n\n    var lines = data.commit.message.trim().split(\"\\n\");\n\n    
if (data.lineIndex >= lines.length || data.lineIndex + data.lineCount > lines.length || data.lineCount == 0)\n      throw CriticError(\"data.lineIndex/data.lineCount: out of range or invalid\");\n\n    operations.push(function (batch_id, data)\n      {\n        var chain_id = db.execute(\"INSERT INTO commentchains (review, batch, uid, type, state, first_commit, last_commit) VALUES (%d, %d, %d, %s, 'open', %d, %d) RETURNING id\",\n                                  this.review.id, batch_id, this.user.id, type, data.commit.id, data.commit.id)[0].id;\n\n        var comment_id = db.execute(\"INSERT INTO comments (chain, batch, uid, state, comment) VALUES (%d, %d, %d, 'current', %s) RETURNING id\",\n                                    chain_id, batch_id, this.user.id, text)[0].id;\n\n        db.execute(\"UPDATE commentchains SET first_comment=%d WHERE id=%d\", comment_id, chain_id);\n\n        db.execute(\"INSERT INTO commentchainlines (chain, uid, state, commit, sha1, first_line, last_line) VALUES (%d, %d, 'current', %d, %s, %d, %d)\",\n                   chain_id, this.user.id, data.commit.id, data.commit.sha1, data.lineIndex, data.lineIndex + data.lineCount - 1);\n\n        insertUsers(chain_id, comment_id, data.silent);\n      });\n  }\n  else\n    operations.push(function (batch_id, data)\n      {\n        var chain_id = db.execute(\"INSERT INTO commentchains (review, batch, uid, type, state) VALUES (%d, %d, %d, %s, 'open') RETURNING id\",\n                                  this.review.id, batch_id, this.user.id, type)[0].id;\n\n        var comment_id = db.execute(\"INSERT INTO comments (chain, batch, uid, state, comment) VALUES (%d, %d, %d, 'current', %s) RETURNING id\",\n                                    chain_id, batch_id, this.user.id, text)[0].id;\n\n        db.execute(\"UPDATE commentchains SET first_comment=%d WHERE id=%d\", comment_id, chain_id);\n\n        insertUsers(chain_id, comment_id, data.silent);\n      });\n}\n\nCriticBatch.prototype.raiseIssue = 
function (text, data) { createCommentChain.call(this, text, data, \"issue\"); };\nCriticBatch.prototype.writeNote = function (text, data) { createCommentChain.call(this, text, data, \"note\"); };\n\nCriticBatch.prototype.addReply = function (chain, text)\n  {\n    var internals = batch_internals[this.__id__];\n    var operations = internals.comment_operations;\n\n    text = text && String(text);\n\n    if (!(chain instanceof CriticCommentChain))\n      throw CriticError(\"invalid chain argument; expected CommentChain object\");\n    if (chain.review.id != this.review.id)\n      throw CriticError(\"invalid use; chain belongs to a different review\");\n\n    if (!text)\n      throw CriticError(\"invalid use: can't add empty comment\");\n\n    if (chain.id in internals.replied_to_chains)\n      throw CriticError(\"can't add two replies to a comment chain in one batch\");\n    internals.replied_to_chains[chain.id] = true;\n\n    operations.push(function (batch_id, data)\n      {\n        var comment_id = db.execute(\"INSERT INTO comments (chain, batch, uid, state, comment) VALUES (%d, %d, %d, 'current', %s) RETURNING id\",\n                                    chain.id, batch_id, this.user.id, text)[0].id;\n\n        if (!data.silent)\n          for (var user_id in chain.users)\n            if (user_id != this.user.id)\n              db.execute(\"INSERT INTO commentstoread (uid, comment) VALUES (%d, %d)\", user_id, comment_id);\n\n        if (!db.execute(\"SELECT 1 FROM commentchainusers WHERE chain=%d AND uid=%d\", chain.id, this.user.id)[0])\n          db.execute(\"INSERT INTO commentchainusers (chain, uid) VALUES (%d, %d)\", chain.id, this.user.id);\n      });\n  };\n\nCriticBatch.prototype.resolveIssue = function (chain)\n  {\n    var internals = batch_internals[this.__id__];\n    var operations = internals.comment_operations;\n\n    if (!(chain instanceof CriticCommentChain))\n      throw CriticError(\"invalid chain argument; expected CommentChain object\");\n    
if (chain.review.id != this.review.id)\n      throw CriticError(\"invalid use; chain belongs to a different review\");\n\n    if (chain.state != CriticCommentChain.STATE_OPEN)\n      throw CriticError(\"can't resolve issue; already addressed or resolved\");\n\n    if (chain.id in internals.modified_issues)\n      throw CriticError(\"can't modify the state of an issue more than once in a single batch\");\n    internals.modified_issues[chain.id] = true;\n\n    operations.push(function (batch_id)\n      {\n        db.execute(\"UPDATE commentchains SET state='closed', closed_by=%d WHERE id=%d\", this.user.id, chain.id);\n        db.execute(\"INSERT INTO commentchainchanges (batch, uid, chain, state, from_state, to_state) VALUES (%d, %d, %d, 'performed', %s, %s)\",\n                   batch_id, this.user.id, chain.id, 'open', 'closed');\n\n        if (!db.execute(\"SELECT 1 FROM commentchainusers WHERE chain=%d AND uid=%d\", chain.id, this.user.id)[0])\n          db.execute(\"INSERT INTO commentchainusers (chain, uid) VALUES (%d, %d)\", chain.id, this.user.id);\n      });\n  };\n\nCriticBatch.prototype.reopenIssue = function (chain, data)\n  {\n    var internals = batch_internals[this.__id__];\n    var operations = internals.comment_operations;\n\n    if (!(chain instanceof CriticCommentChain))\n      throw CriticError(\"invalid chain argument; expected CommentChain object\");\n    if (chain.review.id != this.review.id)\n      throw CriticError(\"invalid use; chain belongs to a different review\");\n    if (chain.type != CriticCommentChain.TYPE_ISSUE)\n      throw CriticError(\"invalid use: cannot reopen notes\");\n\n    if (chain.id in internals.modified_issues)\n      throw CriticError(\"can't modify the state of an issue more than once in a single batch\");\n    internals.modified_issues[chain.id] = true;\n\n    var current_state;\n    switch (chain.state)\n    {\n    case CriticCommentChain.STATE_ADDRESSED: current_state = \"addressed\"; break;\n    case 
CriticCommentChain.STATE_RESOLVED: current_state = \"closed\"; break;\n    default: throw CriticError(\"can't reopen issue; not addressed or resolved\");\n    }\n\n    var lines;\n    if (chain.file)\n    {\n      if (chain.state == CriticCommentChain.STATE_ADDRESSED || data && data.fileVersion)\n      {\n        if (!data || !data.fileVersion || !(data.fileVersion instanceof CriticFileVersion))\n          throw CriticError(\"data.fileVersion: invalid argument; expected file version object\");\n        if (typeof data.lineIndex != \"number\")\n          throw CriticError(\"data.lineIndex: invalid argument; expected number\");\n        if (typeof data.lineCount != \"number\")\n          throw CriticError(\"data.lineCount: invalid argument; expected number\");\n\n        var file_version = data.fileVersion;\n\n        var cli_command = {\n          name: \"propagate-comment\",\n          data: {\n            review_id: this.review.id,\n            chain_id: chain.id,\n            commit_id: commitFromFileVersion(file_version).id,\n            file_id: file_version.file.id,\n            first_line: data.lineIndex + 1,\n            last_line: data.lineIndex + data.lineCount\n          }\n        };\n\n        var cli_result = JSON.parse(executeCLI([cli_command])[0]);\n\n        if (typeof cli_result == \"string\")\n          throw CriticError(format(\"comment propagation failed: %s\", cli_result));\n\n        if (cli_result.status == \"modified\")\n        {\n          var addressed_by_commit = this.review.repository.getCommit(cli_result.addressed_by);\n          throw CriticError(format(\"cannot reopen issue; lines are modified by a later commit: %s\", addressed_by_commit.sha1));\n        }\n\n        lines = cli_result.lines;\n      }\n    }\n    else\n      lines = null;\n\n    operations.push(function (batch_id)\n      {\n        db.execute(\"UPDATE commentchains SET state='open', closed_by=null, addressed_by=null WHERE id=%d\", chain.id);\n    
    db.execute(\"INSERT INTO commentchainchanges (batch, uid, chain, state, from_state, to_state) VALUES (%d, %d, %d, 'performed', %s, %s)\",\n                   batch_id, this.user.id, chain.id, current_state, 'open');\n\n        if (lines)\n          for (var index = 0; index < lines.length; ++index)\n          {\n            var line = lines[index];\n\n            db.execute(\"INSERT INTO commentchainlines (chain, uid, state, sha1, first_line, last_line) VALUES (%d, %d, 'current', %s, %d, %d)\",\n                       chain.id, this.user.id, line[0], line[1], line[2]);\n          }\n      });\n  };\n\nCriticBatch.prototype.markIssueAddressedBy = function (chain, commit)\n  {\n    var internals = batch_internals[this.__id__];\n    var operations = internals.comment_operations;\n\n    if (!(chain instanceof CriticCommentChain))\n      throw CriticError(\"invalid chain argument; expected CommentChain object\");\n    if (chain.review.id != this.review.id)\n      throw CriticError(\"invalid use; chain belongs to a different review\");\n\n    if (!(commit instanceof CriticCommit))\n      throw CriticError(\"invalid commit argument; expected Commit object\");\n    if (!(commit.sha1 in this.review.commits))\n      throw CriticError(\"invalid use; issues can only be addressed by commits in the review\");\n\n    if (chain.state != CriticCommentChain.STATE_OPEN)\n      throw CriticError(\"can't address issue; already addressed or resolved\");\n\n    if (chain.id in internals.modified_issues)\n      throw CriticError(\"can't modify the state of an issue more than once in a single batch\");\n    internals.modified_issues[chain.id] = true;\n\n    operations.push(function (batch_id)\n      {\n        db.execute(\"UPDATE commentchains SET state='addressed', closed_by=%d, addressed_by=%d WHERE id=%d\", this.user.id, commit.id, chain.id);\n        db.execute(\"INSERT INTO commentchainchanges (batch, uid, chain, state, from_state, to_state) VALUES (%d, %d, %d, 'performed', %s, 
%s)\",\n                   batch_id, this.user.id, chain.id, 'open', 'addressed');\n\n        if (!db.execute(\"SELECT 1 FROM commentchainusers WHERE chain=%d AND uid=%d\", chain.id, this.user.id)[0])\n          db.execute(\"INSERT INTO commentchainusers (chain, uid) VALUES (%d, %d)\", chain.id, this.user.id);\n      });\n  };\n\n/*\nfunction changeReviewFileStatus(what, new_state)\n{\n  if (what instanceof CriticChangeset)\n  {\n    if (what.review != this.review)\n      throw CriticError(\"invalid changeset; must be associated with the batch's review\");\n\n    result = db.execute(\"SELECT id, file, deleted, inserted FROM reviewfiles WHERE review=%d AND changeset=%d AND state!=%s\", this.review.id, what.id, new_state);\n  }\n  else if (what instanceof CriticChangesetFile)\n  {\n    if (what.changeset.review != this.review)\n      throw CriticError(\"invalid file; must be part of a changeset associated with the batch's review\");\n\n    result = db.execute(\"SELECT id, file, deleted, inserted FROM reviewfiles WHERE review=%d AND changeset=%d AND file=%d AND state!=%s\", this.review.id, what.changeset.id, what.id, new_state);\n  }\n  else\n  {\n    result = [];\n\n    for (var index = 0; index < what.length; ++index)\n    {\n      var rows = db.execute(\"SELECT reviewfiles.id, file, deleted, inserted FROM reviewfiles JOIN changesets ON (changeset=changesets.id) WHERE review=%d AND child=%d AND state!=%s\", this.review.id, what[index].id, new_state);\n      for (var row_index = 0; row_index < rows.length; ++row_index)\n        result.push(rows[row_index]);\n    }\n  }\n}\n*/\n\nfunction changeAssignments(user, what, assigned)\n{\n  var result;\n\n  if (what instanceof CriticChangeset)\n  {\n    if (what.review != this.review)\n      throw CriticError(\"invalid changeset; must be associated with the batch's review\");\n\n    result = db.execute(\"SELECT id, file, deleted, inserted FROM reviewfiles WHERE review=%d AND changeset=%d\", this.review.id, what.id);\n  }\n  else 
if (what instanceof CriticChangesetFile)\n  {\n    if (what.changeset.review != this.review)\n      throw CriticError(\"invalid file; must be part of a changeset associated with the batch's review\");\n\n    result = db.execute(\"SELECT id, file, deleted, inserted FROM reviewfiles WHERE review=%d AND changeset=%d AND file=%d\", this.review.id, what.changeset.id, what.id);\n  }\n  else\n  {\n    result = [];\n\n    for (var index = 0; index < what.length; ++index)\n    {\n      var rows = db.execute(\"SELECT reviewfiles.id, file, deleted, inserted FROM reviewfiles JOIN changesets ON (changeset=changesets.id) WHERE review=%d AND child=%d\", this.review.id, what[index].id);\n      for (var row_index = 0; row_index < rows.length; ++row_index)\n        result.push(rows[row_index]);\n    }\n  }\n\n  var assignments = batch_internals[this.__id__].assignments;\n  var files = assignments[user.id];\n\n  if (!files)\n  {\n    files = assignments[user.id] = {};\n    Object.defineProperty(files, \"fileCount\", { value: 0, writable: true });\n  }\n\n  for (var index = 0; index < result.length; ++index)\n  {\n    var row = result[index];\n    var file = files[row.file];\n\n    if (!file)\n      file = files[row.file] =\n        {\n          assignedFiles: {}, assignedDeleteCount: 0, assignedInsertCount: 0,\n          unassignedFiles: {}, unassignedDeleteCount: 0, unassignedInsertCount: 0\n        };\n\n    if (assigned)\n    {\n      if (row.id in file.unassignedFiles)\n      {\n        delete file.unassignedFiles[row.id];\n        file.unassignedDeleteCount -= row.deleted;\n        file.unassignedInsertCount -= row.inserted;\n      }\n      else\n      {\n        ++assignments.fileCount;\n        ++files.fileCount;\n      }\n\n      file.assignedFiles[row.id] = true;\n      file.assignedDeleteCount += row.deleted;\n      file.assignedInsertCount += row.inserted;\n    }\n    else\n    {\n      if (row.id in file.assignedFiles)\n      {\n        delete 
file.assignedFiles[row.id];\n        file.assignedDeleteCount -= row.deleted;\n        file.assignedInsertCount -= row.inserted;\n      }\n      else\n      {\n        ++assignments.fileCount;\n        ++files.fileCount;\n      }\n\n      file.unassignedFiles[row.id] = true;\n      file.unassignedDeleteCount += row.deleted;\n      file.unassignedInsertCount += row.inserted;\n    }\n  }\n}\n\nCriticBatch.prototype.assignChanges = function (user, what) { changeAssignments.call(this, user, what, true); };\nCriticBatch.prototype.unassignChanges = function (user, what) { changeAssignments.call(this, user, what, false); };\n\nCriticBatch.prototype.addReviewFilter = function (user, type, path)\n  {\n    if (!(this instanceof CriticBatch))\n      throw CriticError(\"invalid this object; expected batch object\");\n\n    type = String(type);\n    path = String(path);\n\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid user argument; expected User object\");\n    if (type != \"reviewer\" && type != \"watcher\" && type != \"ignored\")\n      throw CriticError(\"invalid type argument; expected 'reviewer', 'watcher' or 'ignored'\");\n    if (/[^\\/]\\*\\*|\\*\\*[^\\/]/.test(path))\n      throw CriticError(\"invalid wildcards in path argument\");\n\n    var internals = batch_internals[this.__id__];\n    var user_ids = internals.filter_user_ids;\n    var operations = internals.filter_operations;\n    var added_filters = internals.added_filters;\n    var removed_filters = internals.removed_filters;\n\n    for (var index = 0; index < removed_filters.length; ++index)\n    {\n      var removed_filter = removed_filters[index];\n      if (removed_filter.uid == user.id && removed_filter.path == path)\n        throw CriticError(\"can't add filter; identical or conflicting filter removed in this batch\");\n    }\n\n    var result = db.execute(\"SELECT 1 FROM reviewfilters WHERE review=%d AND uid=%d AND path=%s\", this.review.id, user.id, path);\n\n    if 
(result.length != 0)\n      throw CriticError(\"can't add filter; identical or conflicting filter already exists\");\n\n    for (var index = 0; index < added_filters.length; ++index)\n    {\n      var added_filter = added_filters[index];\n      if (added_filter.uid == user.id && added_filter.path == path)\n        throw CriticError(\"can't add filter; identical or conflicting filter added in this batch\");\n    }\n\n    user_ids[user.id] = user;\n    added_filters.push({ uid: user.id, path: path, type: type, delegate: null });\n\n    operations.push(function (transaction_id)\n      {\n        db.execute(\"INSERT INTO reviewfilters (review, uid, path, type, creator) VALUES (%d, %d, %s, %s, %d)\",\n                   this.review.id, user.id, path, type, this.user.id);\n\n        db.execute(\"INSERT INTO reviewfilterchanges (transaction, uid, path, type, created) VALUES (%d, %d, %s, %s, true)\",\n                   transaction_id, user.id, path, type);\n      });\n  };\n\nCriticBatch.prototype.removeReviewFilter = function (user, type, path)\n  {\n    if (!(this instanceof CriticBatch))\n      throw CriticError(\"invalid this object; expected batch object\");\n\n    type = String(type);\n    path = String(path);\n\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid user argument; expected User object\");\n    if (type != \"reviewer\" && type != \"watcher\" && type != \"ignored\")\n      throw CriticError(\"invalid type argument; expected 'reviewer', 'watcher' or 'ignored'\");\n\n    var internals = batch_internals[this.__id__];\n    var user_ids = internals.filter_user_ids;\n    var operations = internals.filter_operations;\n    var added_filters = internals.added_filters;\n    var removed_filters = internals.removed_filters;\n\n    for (var index = 0; index < added_filters.length; ++index)\n    {\n      var added_filter = added_filters[index];\n      if (added_filter.uid == user.id && added_filter.path == path)\n        throw 
CriticError(\"can't remove filter; identical or conflicting filter added in this batch\");\n    }\n\n    var result = db.execute(\"SELECT 1 FROM reviewfilters WHERE review=%d AND uid=%d AND path=%s AND type=%s\", this.review.id, user.id, path, type)[0];\n\n    if (!result)\n      throw CriticError(\"can't remove filter; no such filter exists\");\n\n    for (var index = 0; index < removed_filters.length; ++index)\n    {\n      var removed_filter = removed_filters[index];\n      if (removed_filter.uid == user.id && removed_filter.path == path && removed_filter.type == type)\n        /* Already being removed; ignore this call. */\n        return;\n    }\n\n    user_ids[user.id] = user;\n    removed_filters.push({ uid: user.id, path: path, type: type, delegate: null });\n\n    operations.push(\n      function (transaction_id)\n      {\n        db.execute(\"DELETE FROM reviewfilters WHERE review=%d AND uid=%d AND path=%s\", this.review.id, user.id, path);\n\n        db.execute(\"INSERT INTO reviewfilterchanges (transaction, uid, path, type, created) VALUES (%d, %d, %s, %s, false)\",\n                   transaction_id, user.id, path, type);\n      });\n  };\n\nfunction getReviewMessageId(review, to_user)\n{\n  var result = db.execute(\"SELECT messageid, hostname FROM reviewmessageids WHERE review=%d AND uid=%d\", review.id, to_user.id)[0];\n\n  if (result)\n    return format(\"<%s@%s>\", result.messageid, result.hostname.trim());\n  else\n    return null;\n}\n\nfunction getCommentMessageId(comment, to_user)\n{\n  var result = db.execute(\"SELECT messageid, hostname FROM commentmessageids WHERE comment=%d AND uid=%d\", comment.id, to_user.id)[0];\n\n  if (result)\n    return format(\"<%s@%s>\", result.messageid, result.hostname.trim());\n  else\n    return null;\n}\n\nCriticBatch.prototype.finish = function (data)\n  {\n    if (!(this instanceof CriticBatch))\n      throw CriticError(\"invalid this object; expected batch object\");\n\n    /* We commit at the end of the 
function.  To ensure we don't commit anything\n       we didn't mean to, roll back the current transaction first.  (Normally,\n       there wouldn't be anything in the current transaction to commit, but\n       better safe than sorry.) */\n    db.rollback();\n\n    try\n    {\n      data = data || {};\n\n      var text = data.text && String(data.text);\n      var internals = batch_internals[this.__id__];\n      var filter_operations = internals.filter_operations;\n      var filter_user_ids = internals.filter_user_ids;\n      var added_filters = internals.added_filters;\n      var removed_filters = internals.removed_filters;\n      var comment_operations = internals.comment_operations;\n      var assignments = internals.assignments;\n      var review_id = this.review.id;\n      var batch_id = null;\n      var transaction_id = null;\n      var progress_before = this.review.progress;\n\n      if (comment_operations.length)\n        batch_id = db.execute(\"INSERT INTO batches (review, uid) VALUES (%d, %d) RETURNING id\", this.review.id, this.user.id)[0].id;\n      if (filter_operations.length || assignments.fileCount)\n        transaction_id = db.execute(\"INSERT INTO reviewassignmentstransactions (review, assigner) VALUES (%d, %d) RETURNING id\", this.review.id, this.user.id)[0].id;\n\n      if (filter_operations.length)\n      {\n        var filters_before = new CriticFilters({ review: this.review });\n        var filters_after = new CriticFilters({ review: this.review,\n                                                added_review_filters: added_filters,\n                                                removed_review_filters: removed_filters });\n\n        for (var index = 0; index < filter_operations.length; ++index)\n        {\n          var followup = filter_operations[index].call(this, transaction_id);\n          if (followup)\n            filter_operations.push(followup);\n        }\n\n        var commits = this.review.commits;\n        var changesets = [];\n\n     
   for (var cindex = 0; cindex < commits.length; ++cindex)\n        {\n          var changeset = this.review.getChangeset(commits[cindex]);\n\n          if (changeset instanceof CriticMergeChangeset)\n            for (var index = 0; index < changeset.changesets.length; ++index)\n              changesets.push(changeset.changesets[index]);\n          else\n            changesets.push(changeset);\n        }\n\n        for (var user_id in filter_user_ids)\n        {\n          var user = filter_user_ids[user_id];\n\n          for (var csindex = 0; csindex < changesets.length; ++csindex)\n          {\n            var changeset = changesets[csindex];\n            var files = changeset.files;\n\n            for (var findex = 0; findex < files.length; ++findex)\n            {\n              var file = files[findex];\n              var reviewer_before = filters_before.isReviewer(user.id, file.id);\n              var reviewer_after = filters_after.isReviewer(user.id, file.id);\n\n              if (!reviewer_before && reviewer_after)\n              {\n                if (!user.isAuthor(changeset.child))\n                  this.assignChanges(user, file);\n              }\n              else if (reviewer_before && !reviewer_after)\n                this.unassignChanges(user, file);\n            }\n          }\n        }\n      }\n\n      if (assignments.fileCount)\n        for (var user_id in assignments)\n        {\n          var files = assignments[user_id], values;\n\n          for (var file_id in files)\n          {\n            var file = files[file_id];\n            var assignedFiles = file.assignedFiles;\n            var unassignedFiles = file.unassignedFiles;\n\n            values = Object.keys(assignedFiles);\n            if (values.length)\n            {\n              var result = db.execute(format(\"SELECT id, deleted, inserted FROM reviewfiles JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id) WHERE review=%d AND uid=%d\", review_id, user_id));\n          
    for (var index = 0; index < result.length; ++index)\n              {\n                var row = result[index];\n                if (assignedFiles[row.id])\n                {\n                  delete assignedFiles[row.id];\n                  file.assignedDeleteCount -= row.deleted;\n                  file.assignedInsertCount -= row.inserted;\n                }\n              }\n            }\n\n            values = Object.keys(assignedFiles).map(function (file_id) { return format(\"(%d,%d)\", file_id, user_id); });\n            if (values.length)\n            {\n              if (db.execute(\"SELECT 1 FROM reviewusers WHERE review=%d AND uid=%d\", review_id, user_id).length == 0)\n                db.execute(\"INSERT INTO reviewusers (review, uid) VALUES (%d, %d)\", review_id, user_id);\n\n              db.execute(\"INSERT INTO reviewuserfiles (file, uid) VALUES \" + values);\n              db.execute(\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned) VALUES \" +\n                         Object.keys(assignedFiles).map(function (file_id) { return format(\"(%d,%d,%d,true)\", transaction_id, file_id, user_id); }));\n            }\n            file.assignedFileCount = values.length;\n\n            values = Object.keys(unassignedFiles);\n            if (values.length)\n            {\n              var result = db.execute(format(\"SELECT COUNT(*) AS count, SUM(deleted) AS deleted, SUM(inserted) AS inserted FROM reviewfiles JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id) WHERE review=%%d AND uid=%%d AND id IN (%s)\", values), review_id, user_id)[0];\n\n              db.execute(format(\"DELETE FROM reviewuserfiles WHERE file IN (%s) AND uid=%d\", values, user_id));\n              db.execute(\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned) VALUES \" +\n                         Object.keys(unassignedFiles).map(function (file_id) { return format(\"(%d,%d,%d,false)\", transaction_id, file_id, user_id); }));\n\n    
          file.unassignedFileCount = result.count;\n              file.unassignedDeleteCount = result.deleted;\n              file.unassignedInsertCount = result.inserted;\n            }\n          }\n        }\n\n      if (comment_operations.length)\n        for (var index = 0; index < comment_operations.length; ++index)\n          comment_operations[index].call(this, batch_id, data);\n\n      db.execute(\"UPDATE reviews SET serial=serial+1 WHERE id=%d\", this.review.id);\n\n      var updated_review = new CriticReview(this.review.id);\n      var progress_after = updated_review.progress;\n\n      db.commit();\n\n      if (!data.silent)\n      {\n        var commands = [];\n\n        if (batch_id !== null)\n        {\n          commands.push({\n            name: \"generate-mails-for-batch\",\n            data: {\n              batch_id: batch_id,\n              was_accepted: progress_before.accepted,\n              is_accepted: progress_after.accepted\n            }\n          });\n        }\n\n        if (transaction_id !== null)\n        {\n          commands.push({\n            name: \"generate-mails-for-assignments-transaction\",\n            data: {\n              transaction_id: transaction_id\n            }\n          });\n        }\n\n        var lines = executeCLI(commands);\n\n        if (lines)\n        {\n          try {\n            for (var index = 0; index < lines.length; ++index)\n              JSON.parse(lines[index]).forEach(sendMail);\n          } catch (error) {\n            throw new Error(format(\"%r\", lines));\n          }\n\n          var pid = parseInt(IO.File.read(maildelivery_pid_path).decode().trim());\n\n          OS.Process.kill(pid, 1);\n        }\n      }\n    }\n    finally\n    {\n      /* If anything fails we don't want to have done anything at all to the\n         database, so roll the transaction back.  
If we did finish, we just\n         committed the transaction, in which case we aren't in a transaction\n         right now, and the rollback() call is a no-op. */\n      db.rollback();\n    }\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-branch.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticBranch(options)\n{\n  var repository, branch_id, name, result;\n\n  if (\"id\" in options)\n  {\n    branch_id = options.id;\n\n    result = db.execute(\"SELECT repository, name, head, base FROM branches WHERE id=%d\", branch_id)[0];\n\n    if (!result)\n      throw CriticError(format(\"%d: invalid branch ID\", branch_id));\n\n    repository = options.repository || new CriticRepository(result.repository);\n    name = result.name;\n  }\n  else if (\"repository\" in options && \"name\" in options)\n  {\n    repository = options.repository;\n    name = options.name;\n\n    result = db.execute(\"SELECT id, head, base FROM branches WHERE repository=%d AND name=%s\", repository.id, name)[0];\n\n    if (!result)\n      throw CriticError(format(\"%s: no such branch\", name));\n\n    branch_id = result.id;\n  }\n  else\n    throw CriticError(\"invalid argument; must specify either 'id' or 'repository'+'name'\");\n\n  var self = this;\n  var head_id = result.head, head = options.head || null;\n  var base_id = result.base, base = options.base || null;\n  var review = options.review || void 0;\n  var commits;\n\n  result = null;\n\n  function getHead()\n  {\n    if (!head)\n      head = self.repository.getCommit(head_id);\n\n    return head;\n  }\n\n  function getBase()\n  {\n    if 
(!base)\n      if (!base_id)\n        return null;\n      else\n        base = new CriticBranch({ id: base_id, repository: self.repository });\n\n    return base;\n  }\n\n  function getReview()\n  {\n    if (review === void 0)\n    {\n      var result = db.execute(\"SELECT id FROM reviews WHERE branch=%d\", self.id)[0];\n\n      if (result)\n        review = new CriticReview(result.id);\n      else\n        review = null;\n    }\n\n    return review;\n  }\n\n  function getCommits()\n  {\n    if (!commits)\n    {\n      var result = db.execute(\"SELECT commit FROM reachable WHERE branch=%d LIMIT %d\", branch_id, configuration.maxCommits + 1);\n\n      if (result.length > configuration.maxCommits)\n        throw CriticError(format(\"implementation limit; branch contains more than %d commits\", configuration.maxCommits));\n\n      var all_commits = [];\n\n      for (var index = 0; index < result.length; ++index)\n        all_commits.push(repository.getCommit(result[index].commit));\n\n      commits = new CriticCommitSet(all_commits);\n    }\n\n    return commits;\n  }\n\n  this.repository = repository;\n  this.id = branch_id;\n  this.name = name;\n\n  Object.defineProperties(this, { head: { get: getHead, enumerable: true },\n                                  base: { get: getBase, enumerable: true },\n                                  review: { get: getReview, enumerable: true },\n                                  commits: { get: getCommits, enumerable: true }});\n  Object.freeze(this);\n}\n\nObject.defineProperties(CriticBranch.prototype, {\n\n  getWorkCopy: { writable: true, value: function ()\n    {\n      return new CriticRepositoryWorkCopy(this.repository, this.name);\n    } },\n\n  getCheckBranch: { writable: true, value: function (upstream)\n    {\n      return new CriticCheckBranch(this, upstream);\n    } }\n\n});\n\nfunction CriticCheckBranch(branch, upstream)\n{\n  this.branch = branch;\n  this.upstream = 
upstream;\n}\n\nObject.defineProperties(CriticCheckBranch.prototype, {\n\n  addNote: {\n    writable: true,\n    value: function (commit, data)\n    {\n      var sha1, review_id, note, user;\n\n      if (commit instanceof CriticCommit)\n        sha1 = commit.sha1;\n      else\n        throw CriticError(\"invalid commit argument; expected critic.Commit object\");\n\n      if (data.review)\n      {\n        if (data.review instanceof CriticReview)\n          review_id = data.review.id;\n        else\n          throw CriticError(\"invalid data.review argument; expected critic.Review object\");\n      }\n      else\n        review_id = null;\n\n      if (data.note)\n        note = String(data.note);\n      else\n        note = null;\n\n      if (data.user)\n        user = data.user;\n      else\n        user = global.user;\n\n      db.execute(\"INSERT INTO checkbranchnotes (repository, branch, upstream, sha1, uid, review, text) VALUES (%d, %s, %s, %s, %d, %d, %s)\",\n                 this.branch.repository.id, this.branch.name, this.upstream, sha1, user.id, review_id, note);\n    }\n  }\n\n});\n"
  },
  {
    "path": "src/library/js/v8/critic-changeset.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar CriticChangesetLineConstants = { TYPE_CONTEXT:    0,\n                                     TYPE_WHITESPACE: 1,\n                                     TYPE_REPLACED:   2,\n                                     TYPE_MODIFIED:   3,\n                                     TYPE_DELETED:    4,\n                                     TYPE_INSERTED:   5,\n\n                                     OPERATION_REPLACE: 0,\n                                     OPERATION_DELETE:  1,\n                                     OPERATION_INSERT:  2 };\n\nfunction CriticChangesetLine(type, old_index, old_lines, new_index, new_lines, operations)\n{\n  this.type = type;\n  this.oldIndex = old_index;\n  this.oldText = old_lines && old_lines[old_index];\n  this.newIndex = new_index;\n  this.newText = new_lines && new_lines[new_index];\n\n  operations = operations || null;\n\n  var self = this;\n\n  function getOperations()\n  {\n    if (typeof operations == \"string\")\n    {\n      operations = operations.split(\",\");\n\n      for (var index = 0; index < operations.length; ++index)\n      {\n        var operation = operations[index], match;\n\n        if (operation == \"ws\")\n          continue;\n\n        if (match = /^r(\\d+)-(\\d+)=(\\d+)-(\\d+)$/.exec(operation))\n          operation = 
[CriticChangesetLineConstants.OPERATION_REPLACE, parseInt(match[1]), parseInt(match[2]), parseInt(match[3]), parseInt(match[4])];\n        else if (match = /^d(\\d+)-(\\d+)$/.exec(operation))\n          operation = [CriticChangesetLineConstants.OPERATION_DELETE, parseInt(match[1]), parseInt(match[2])];\n        else if (match = /^i(\\d+)-(\\d+)$/.exec(operation))\n          operation = [CriticChangesetLineConstants.OPERATION_INSERT, parseInt(match[1]), parseInt(match[2])];\n\n        Object.freeze(operation);\n\n        operations[index] = operation;\n      }\n\n      Object.freeze(operations);\n    }\n\n    return operations;\n  }\n\n  Object.defineProperty(this, \"operations\", { get: getOperations, enumerable: true });\n  Object.freeze(this);\n}\n\nfunction CriticChangesetChunk(changeset, file, delete_offset, delete_count, insert_offset, insert_count, analysis, whitespace)\n{\n  this.changeset = changeset;\n  this.file = file;\n  this.deleteOffset = delete_offset;\n  this.deleteCount = delete_count;\n  this.insertOffset = insert_offset;\n  this.insertCount = insert_count;\n\n  var self = this;\n  var lines;\n\n  function getLines()\n  {\n    function fillLines(old_stop, new_stop)\n    {\n      while (old_offset < old_stop && new_offset < new_stop)\n        lines.push(new CriticChangesetLine(CriticChangesetLineConstants.TYPE_REPLACED, self.deleteOffset + old_offset++, old_lines, self.insertOffset + new_offset++, new_lines));\n      while (old_offset < old_stop)\n        lines.push(new CriticChangesetLine(CriticChangesetLineConstants.TYPE_DELETED, self.deleteOffset + old_offset++, old_lines, self.insertOffset + new_offset, null));\n      while (new_offset < new_stop)\n        lines.push(new CriticChangesetLine(CriticChangesetLineConstants.TYPE_INSERTED, self.deleteOffset + old_offset, null, self.insertOffset + new_offset++, new_lines));\n    }\n\n    if (!lines)\n    {\n      var old_lines = file.oldVersion.lines;\n      var new_lines = file.newVersion.lines;\n    
  var old_offset = 0, new_offset = 0;\n\n      lines = [];\n\n      if (analysis)\n      {\n        var mappings = analysis.split(\";\");\n\n        for (var mapping_index = 0; mapping_index < mappings.length; ++mapping_index)\n        {\n          var match = /^(\\d+)=(\\d+)(?::(.*))?$/.exec(mappings[mapping_index]);\n          var old_mapped_offset = parseInt(match[1]);\n          var new_mapped_offset = parseInt(match[2]);\n          var operations = match[3];\n\n          fillLines(old_mapped_offset, new_mapped_offset);\n\n          lines.push(new CriticChangesetLine(CriticChangesetLineConstants.TYPE_MODIFIED, self.deleteOffset + old_offset++, old_lines, self.insertOffset + new_offset++, new_lines, operations));\n        }\n      }\n\n      fillLines(self.deleteCount, self.insertCount);\n    }\n\n    return lines;\n  }\n\n  Object.defineProperty(this, \"lines\", { get: getLines, enumerable: true });\n  Object.freeze(this);\n}\n\nCriticChangesetChunk.prototype.toString = function ()\n  {\n    var result = format(\"@@ -%d,%d +%d,%d @@\\n\", this.deleteOffset + 1, this.deleteCount, this.insertOffset + 1, this.insertCount);\n\n    for (var index = 0; index < this.lines.length; ++index)\n    {\n      var line = this.lines[index];\n      switch (line.type)\n      {\n      case CriticChangesetLineConstants.TYPE_REPLACED:\n      case CriticChangesetLineConstants.TYPE_MODIFIED:\n      case CriticChangesetLineConstants.TYPE_DELETED:\n        result += format(\"-%s%s\\n\", line.oldText, line.operations ? \" \" + JSON.stringify(line.operations) : \"\");\n      }\n    }\n\n    for (var index = 0; index < this.lines.length; ++index)\n    {\n      var line = this.lines[index];\n      switch (line.type)\n      {\n      case CriticChangesetLineConstants.TYPE_REPLACED:\n      case CriticChangesetLineConstants.TYPE_MODIFIED:\n      case CriticChangesetLineConstants.TYPE_INSERTED:\n        result += format(\"+%s%s\\n\", line.newText, line.operations ? 
\" \" + JSON.stringify(line.operations) : \"\");\n      }\n    }\n\n    return result;\n  };\n\nfunction CriticChangesetFile(changeset, file_id, old_sha1, new_sha1, old_mode, new_mode)\n{\n  CriticFile.call(this, { id: file_id });\n\n  this.changeset = changeset;\n\n  if (old_sha1 != '0000000000000000000000000000000000000000')\n    this.oldVersion = new CriticChangesetFileVersion(changeset, this, old_mode, null, old_sha1);\n  else\n    this.oldVersion = null;\n  if (new_sha1 != '0000000000000000000000000000000000000000')\n    this.newVersion = new CriticChangesetFileVersion(changeset, this, new_mode, null, new_sha1);\n  else\n    this.newVersion = null;\n\n  var self = this;\n  var chunks, deleteCount, insertCount;\n  var reviewers;\n\n  function getChunks()\n  {\n    if (chunks === void 0)\n    {\n      if (self.oldVersion === null || self.newVersion === null)\n        chunks = null;\n      else\n      {\n        chunks = [];\n\n        var result = db.execute(\"SELECT deleteOffset, deleteCount, insertOffset, insertCount, analysis, whitespace FROM chunks WHERE changeset=%d AND file=%d ORDER BY deleteOffset ASC\", self.changeset.id, self.id);\n\n        for (var index = 0; index < result.length; ++index)\n        {\n          var row = result[index];\n          chunks.push(new CriticChangesetChunk(self.changeset, self, row.deleteOffset - 1, row.deleteCount, row.insertOffset - 1, row.insertCount, row.analysis, !!row.whitespace));\n        }\n      }\n    }\n\n    return chunks;\n  }\n\n  function fetchCounts()\n  {\n    var result = db.execute(\"SELECT SUM(deletecount) AS deleteCount, SUM(insertcount) AS insertCount FROM chunks WHERE changeset=%d AND file=%d\",\n                            self.changeset.id, self.id)[0];\n\n    deleteCount = result.deleteCount;\n    insertCount = result.insertCount;\n  }\n\n  function getDeleteCount()\n  {\n    if (deleteCount === void 0)\n      fetchCounts();\n\n    return deleteCount;\n  }\n\n  function getInsertCount()\n  {\n    
if (insertCount === void 0)\n      fetchCounts();\n\n    return insertCount;\n  }\n\n  function getReviewers()\n  {\n    if (!reviewers)\n      if (!self.changeset.review)\n        return null;\n      else\n      {\n        reviewers = {};\n\n        Object.defineProperties(reviewers, { pending: { value: {} },\n                                             reviewed: { value: {} }});\n\n        var result = db.execute(\"SELECT assignee, state, SUM(deleted) AS deleted, SUM(inserted) AS inserted\" +\n                                 \" FROM fullreviewuserfiles\" +\n                                \" WHERE review=%d AND changeset=%d AND file=%d AND (state='pending' OR reviewer=assignee)\" +\n                             \" GROUP BY assignee, state\",\n                                self.changeset.review.id, self.changeset.id, self.id);\n\n        for (var index = 0; index < result.length; ++index)\n        {\n          var row = result[index];\n          var user_id = row.assignee;\n\n          if (!(user_id in reviewers))\n            reviewers[user_id] = new CriticUser(user_id);\n\n          var counts = Object.freeze(Object.create(null, { deleteCount: { value: row.deleted, enumerable: true },\n                                                           insertCount: { value: row.inserted, enumerable: true }}));\n\n          if (row.state == \"pending\")\n            reviewers.pending[user_id] = counts;\n          else\n            reviewers.reviewed[user_id] = counts;\n        }\n\n        Object.freeze(reviewers.pending);\n        Object.freeze(reviewers.reviewed);\n        Object.freeze(reviewers);\n      }\n\n    return reviewers;\n  }\n\n  var isReviewed, reviewedBy;\n\n  if (!changeset.review)\n    isReviewed = reviewedBy = null;\n\n  function fetchReviewStatus()\n  {\n    var result = db.execute(\"SELECT state, reviewer FROM reviewfiles WHERE review=%d AND changeset=%d AND file=%d\",\n                            self.changeset.review.id, self.changeset.id, 
self.id)[0];\n\n    isReviewed = result.state == \"reviewed\";\n    reviewedBy = isReviewed ? new CriticUser({ id: result.reviewer }) : null;\n  }\n\n  function getIsReviewed()\n  {\n    if (isReviewed === void 0)\n      fetchReviewStatus();\n\n    return isReviewed;\n  }\n\n  function getReviewedBy()\n  {\n    if (reviewedBy === void 0)\n      fetchReviewStatus();\n\n    return reviewedBy;\n  }\n\n  Object.defineProperties(this, { chunks: { get: getChunks, enumerable: true },\n                                  deleteCount: { get: getDeleteCount, enumerable: true },\n                                  insertCount: { get: getInsertCount, enumerable: true },\n                                  reviewers: { get: getReviewers, enumerable: true },\n                                  isReviewed: { get: getIsReviewed, enumerable: true },\n                                  reviewedBy: { get: getReviewedBy, enumerable: true }});\n  Object.freeze(this);\n}\n\nCriticChangesetFile.prototype = Object.create(CriticFile.prototype);\n\nfunction CriticChangesetFileVersion(changeset, file, mode, size, sha1)\n{\n  CriticFileVersion.call(this, changeset.repository, file.path, mode, size, sha1, { review: changeset.review });\n\n  this.changeset = changeset;\n  this.file = file;\n\n  Object.freeze(this);\n}\n\nCriticChangesetFileVersion.prototype = Object.create(CriticFileVersion.prototype);\n\nCriticChangesetFile.prototype.toString = function () { return format(\"CriticChangesetFile(path=%s)\", this.path); };\n\nfunction CriticChangeset(repository, data)\n{\n  var changeset_id, parent, child;\n\n  if (typeof data.id == \"number\")\n  {\n    changeset_id = data.id;\n\n    var result = db.execute(\"SELECT parent, child FROM changesets WHERE id=%d\", changeset_id)[0];\n\n    if (!result)\n      throw CriticError(format(\"%d: invalid changeset ID\", changeset_id));\n\n    parent = data.parent || repository.getCommit(result.parent);\n    child = data.child || 
repository.getCommit(result.child);\n\n    result = null;\n  }\n  else if (data.parent && data.child)\n  {\n    parent = data.parent;\n    child = data.child;\n\n    var result = db.execute(\"SELECT id FROM changesets WHERE parent=%d AND child=%d AND type IN ('direct', 'custom')\", parent.id, child.id)[0];\n\n    if (!result)\n      throw CriticError(format(\"%s..%s: changeset not cached\", parent.sha1, child.sha1));\n\n    changeset_id = result.id;\n\n    result = null;\n  }\n  else\n    throw CriticError(\"invalid use: either changeset ID or parent/child commits must be provided\");\n\n  this.repository = repository;\n  this.review = data.review || null;\n  this.id = changeset_id;\n  this.parent = parent;\n  this.child = child;\n  this.commits = data.commits;\n\n  var self = this;\n  var files = null, filtered_files = data.files || false;\n  var reviewers = null;\n  var actuals = null;\n\n  function getFiles()\n  {\n    if (!files)\n    {\n      files = [];\n\n      var file_filter;\n      if (filtered_files)\n        /* Use an explicit callback: passing parseInt directly to map() would\n           feed the array index in as the radix argument. */\n        file_filter = format(\" AND file IN (%s)\", filtered_files.map(function (file_id) { return parseInt(file_id, 10); }).join(\", \"));\n      else\n        file_filter = \"\";\n\n      var result = db.execute(\"SELECT file, old_sha1, new_sha1, old_mode, new_mode FROM fileversions WHERE changeset=%d\" + file_filter, self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        var file = new CriticChangesetFile(self, row.file, row.old_sha1, row.new_sha1, row.old_mode, row.new_mode);\n\n        files.push(file);\n\n        Object.defineProperty(files, file.path, { value: file });\n      }\n\n      Object.freeze(files);\n    }\n\n    return files;\n  }\n\n  function getReviewers()\n  {\n    if (!reviewers)\n      if (!self.review)\n        return null;\n      else\n      {\n        reviewers = {};\n\n        Object.defineProperties(reviewers, { pending: { value: {} },\n                                             reviewed: { value: {} 
}});\n\n        var result = db.execute(\"SELECT assignee, state, SUM(deleted) AS deleted, SUM(inserted) AS inserted\" +\n                                 \" FROM fullreviewuserfiles\" +\n                                \" WHERE review=%d AND changeset=%d AND (state='pending' OR reviewer=assignee)\" +\n                             \" GROUP BY assignee, state\", self.review.id, self.id);\n\n        for (var index = 0; index < result.length; ++index)\n        {\n          var row = result[index];\n          var user_id = row.assignee;\n\n          if (!(user_id in reviewers))\n            reviewers[user_id] = new CriticUser(user_id);\n\n          var counts = Object.freeze(Object.create(null, { deleteCount: { value: row.deleted, enumerable: true },\n                                                           insertCount: { value: row.inserted, enumerable: true }}));\n\n          if (row.state == \"pending\")\n            reviewers.pending[user_id] = counts;\n          else\n            reviewers.reviewed[user_id] = counts;\n        }\n\n        Object.freeze(reviewers.pending);\n        Object.freeze(reviewers.reviewed);\n        Object.freeze(reviewers);\n      }\n\n    return reviewers;\n  }\n\n  function getActuals()\n  {\n    if (actuals === null)\n      if (!self.review)\n        return null;\n      else\n      {\n        actuals = [];\n\n        if (db.execute(\"SELECT 1 FROM reviewchangesets WHERE review=%d AND changeset=%d\", self.review.id, self.id)[0])\n          actuals.push(self);\n        else\n        {\n          var commits = self.review.commits.restrict([self.child], [self.parent]);\n\n          for (var index = 0; index < commits.length; ++index)\n          {\n            if (commits[index].parents.length > 1)\n            {\n              var merge = self.review.repository.getMergeChangeset(commits[index], { review: self.review });\n              merge.changesets.forEach(function (changeset) { actuals.push(changeset); });\n            }\n            
else\n              actuals.push(self.review.repository.getChangeset({ commit: commits[index], review: self.review }));\n          }\n        }\n\n        Object.freeze(actuals);\n      }\n\n    return actuals;\n  }\n\n  Object.defineProperties(this, { files: { get: getFiles, enumerable: true },\n                                  reviewers: { get: getReviewers, enumerable: true },\n                                  actuals: { get: getActuals, enumerable: true }});\n\n  Object.freeze(this);\n}\n\nfunction CriticMergeChangeset(changesets)\n{\n  this.repository = changesets[0].repository;\n  this.review = changesets[0].review;\n  this.commit = changesets[0].child;\n  this.changesets = [];\n\n  for (var index = 0; index < changesets.length; ++index)\n  {\n    this.changesets.push(changesets[index]);\n    this.changesets[changesets[index].parent.sha1] = changesets[index];\n  }\n\n  Object.freeze(this.changesets);\n  Object.freeze(this);\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-cli.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2014 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction executeCLI(commands)\n{\n  var argv = [python_executable, \"-m\", \"cli\"], stdin = \"\";\n\n  if (typeof user_id === \"number\")\n    argv.push(\"-u\", user_id.toString());\n\n  authentication_labels.forEach(\n    function (authentication_label)\n    {\n      argv.push(\"-l\", authentication_label);\n    });\n\n  commands.forEach(\n    function (command)\n    {\n      argv.push(command.name);\n      stdin += format(\"%r\\n\", command.data);\n    });\n\n  var process = new OS.Process(python_executable,\n                               { argv: argv,\n                                 environ: { PYTHONPATH: python_path }});\n\n  return process.call(stdin).trim().split(\"\\n\");\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-comment.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticComment(chain_id, batch_id, comment_id, user_id, time, state, text, data)\n{\n  this.id = comment_id;\n  this.user = new CriticUser(user_id);\n  this.time = time;\n  this.text = text;\n\n  var self = this;\n  var chain = data.chain || null;\n  var batch = data.batch || null;\n\n  function getChain()\n  {\n    if (!chain)\n      chain = new CriticCommentChain(chain_id, { comments: { comment_id: self }});\n    return chain;\n  }\n\n  function getBatch()\n  {\n    if (!batch)\n      batch = new CriticBatch(batch_id);\n    return batch;\n  }\n\n  Object.defineProperties(this, { chain: { get: getChain, enumerable: true },\n                                  batch: { get: getBatch, enumerable: true }});\n  Object.freeze(this);\n}\n\nfunction CriticCommentChain(result_or_chain_id, data)\n{\n  var chain_id, result;\n\n  if (typeof result_or_chain_id == \"number\")\n  {\n    chain_id = result_or_chain_id;\n    result = db.execute(\"SELECT id, review, batch, uid, time, type, state, origin, file, first_commit, last_commit, closed_by, addressed_by FROM commentchains WHERE id=%d\", chain_id)[0];\n  }\n  else\n  {\n    result = result_or_chain_id;\n    chain_id = result.id;\n  }\n\n  if (!result)\n    throw CriticError(format(\"%d: invalid comment chain ID\", chain_id));\n\n  data = data 
|| {};\n\n  this.id = chain_id;\n  this.type = result.type == \"issue\" ? CriticCommentChain.TYPE_ISSUE : CriticCommentChain.TYPE_NOTE;\n\n  switch (result.state)\n  {\n  case \"open\":\n    this.state = CriticCommentChain.STATE_OPEN;\n    break;\n\n  case \"closed\":\n    this.state = CriticCommentChain.STATE_RESOLVED;\n    break;\n\n  case \"addressed\":\n    this.state = CriticCommentChain.STATE_ADDRESSED;\n    break;\n  }\n\n  var self = this;\n  var review_id = result.review, review = data.review || null;\n  var batch_id = result.batch, batch = data.batch || null;\n  var user_id = result.uid, user = data.user || null;\n  var users = null;\n  var closed_by_id = result.closed_by, closed_by = null;\n  var file_id, file;\n  var commit_id, first_commit_id, last_commit_id, addressed_by_id;\n  var changeset, addressed_by, commit, lines, context;\n  var comments;\n\n  function getReview()\n  {\n    if (!review)\n      review = new CriticReview(review_id);\n    return review;\n  }\n\n  function getBatch()\n  {\n    if (!batch)\n      batch = new CriticBatch(batch_id);\n    return batch;\n  }\n\n  function getUser()\n  {\n    if (!user)\n      user = new CriticUser(user_id);\n    return user;\n  }\n\n  function getUsers()\n  {\n    if (!users)\n    {\n      users = {};\n\n      var result = db.execute(\"SELECT uid FROM commentchainusers WHERE chain=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n        users[result[index].uid] = new CriticUser(result[index].uid);\n\n      Object.freeze(users);\n    }\n\n    return users;\n  }\n\n  function getClosedBy()\n  {\n    if (!closed_by)\n      if (!closed_by_id)\n        return null;\n      else\n        closed_by = new CriticUser(closed_by_id);\n\n    return closed_by;\n  }\n\n  function getChangeset()\n  {\n    if (changeset === void 0)\n      if (first_commit_id === last_commit_id)\n        changeset = null;\n      else\n      {\n        var review = getReview();\n        var first_commit = 
review.repository.getCommit(first_commit_id);\n        var last_commit = review.repository.getCommit(last_commit_id);\n\n        changeset = new CriticChangeset(review.repository, { parent: first_commit, child: last_commit, files: [file_id] });\n      }\n\n    return changeset;\n  }\n\n  function getAddressedBy()\n  {\n    if (!addressed_by)\n      if (!addressed_by_id)\n        return null;\n      else\n      {\n        var review = getReview();\n        addressed_by = review.repository.getCommit(addressed_by_id);\n      }\n\n    return addressed_by;\n  }\n\n  function getCommit()\n  {\n    if (!commit)\n    {\n      var review = getReview();\n      commit = review.repository.getCommit(commit_id);\n    }\n\n    return commit;\n  }\n\n  function getFile()\n  {\n    if (!file)\n      if (self.changeset)\n        file = self.changeset.files[0];\n      else\n        file = getReview().repository.getCommit(first_commit_id).getFile(file_id);\n\n    return file;\n  }\n\n  function getLines()\n  {\n    if (!lines)\n    {\n      lines = {};\n\n      var result = db.execute(\"SELECT sha1, first_line, last_line FROM commentchainlines WHERE chain=%d AND state='current'\", chain_id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        lines[row.sha1] = Object.freeze({ firstLine: row.first_line - 1, lastLine: row.last_line - 1 });\n      }\n\n      Object.freeze(lines);\n    }\n\n    return lines;\n  }\n\n  function getContext(minimized)\n  {\n    if (context === void 0)\n    {\n      var version;\n\n      if (self.file instanceof CriticFileVersion)\n        version = self.file;\n      else if (self.origin == \"old\")\n        version = self.file.oldVersion;\n      else\n        version = self.file.newVersion;\n\n      var position = self.lines[version.sha1];\n      var result = db.execute(\"SELECT context FROM codecontexts WHERE sha1=%s AND first_line<=%d AND last_line>=%d ORDER BY first_line DESC LIMIT 1\", version.sha1, 
position.firstLine + 1, position.lastLine + 1)[0];\n\n      if (result)\n        context = result.context;\n      else\n        context = null;\n    }\n\n    if (minimized && context)\n      return context.replace(/\\(.*(?:\\)|...$)/, \"(...)\");\n    else\n      return context;\n  }\n\n  function getComments()\n  {\n    if (!comments)\n    {\n      comments = [];\n\n      var result = db.execute(\"SELECT id, batch, uid, time, state, comment FROM comments WHERE chain=%d AND state='current' ORDER BY time ASC\", chain_id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        comments.push(data.comments && data.comments[row.id] || new CriticComment(chain_id, row.batch, row.id, row.uid, row.time, row.state, row.comment, { chain: self }));\n      }\n\n      Object.freeze(comments);\n    }\n\n    return comments;\n  }\n\n  if (result.file)\n  {\n    this.origin = result.origin;\n\n    file_id = result.file;\n    file = data.file || null;\n\n    first_commit_id = result.first_commit;\n    last_commit_id = result.last_commit;\n    addressed_by_id = result.addressed_by;\n\n    Object.defineProperties(this, { changeset: { get: getChangeset, enumerable: true },\n                                    addressedBy: { get: getAddressedBy, enumerable: true },\n                                    file: { get: getFile, enumerable: true },\n                                    lines: { get: getLines, enumerable: true },\n                                    context: { get: getContext, enumerable: true },\n                                    minimizedContext: { get: function () { return getContext(true); }, enumerable: true }});\n  }\n  else if (result.first_commit && result.last_commit)\n  {\n    commit_id = result.first_commit;\n\n    var result2 = db.execute(\"SELECT first_line, last_line FROM commentchainlines WHERE chain=%d\", chain_id)[0];\n\n    this.firstLine = result2.first_line;\n    this.lastLine = 
result2.last_line;\n\n    result2 = null;\n\n    Object.defineProperty(this, \"commit\", { get: getCommit, enumerable: true });\n  }\n\n  result = null;\n\n  Object.defineProperties(this, { review: { get: getReview, enumerable: true },\n                                  batch: { get: getBatch, enumerable: true },\n                                  user: { get: getUser, enumerable: true },\n                                  users: { get: getUsers, enumerable: true },\n                                  closedBy: { get: getClosedBy, enumerable: true },\n                                  comments: { get: getComments, enumerable: true }});\n  Object.freeze(this);\n}\n\nObject.defineProperties(CriticCommentChain, { TYPE_ISSUE:      { value: 0 },\n                                              TYPE_NOTE:       { value: 1 },\n                                              STATE_OPEN:      { value: 0 },\n                                              STATE_RESOLVED:  { value: 1 },\n                                              STATE_ADDRESSED: { value: 2 }});\n\nCriticCommentChain.find = function (data)\n  {\n    var commit;\n    if (typeof data.commit == \"object\" && data.commit instanceof CriticCommit)\n      commit = data.commit;\n    else\n    {\n      var repository = data.repository;\n      if (!repository || !(repository instanceof CriticRepository))\n        throw CriticError(\"invalid argument; data.commit must be a Commit object, or data.repository must be a Repository object\");\n      commit = repository.getCommit(data.commit);\n    }\n\n    var review = null;\n    if (\"review\" in data)\n    {\n      if (typeof data.review == \"object\" && data.review instanceof CriticReview)\n        review = data.review;\n      else\n        review = new CriticReview(data.review);\n    }\n\n    var result = [];\n\n    if (\"file\" in data)\n    {\n      var file;\n      if (typeof data.file == \"object\" && data.file instanceof CriticFile)\n        file = data.file;\n      
else\n        file = CriticFile.find(data.file);\n\n      var fileversion = commit.getFile(file.path);\n      var args = [\"SELECT DISTINCT commentchains.review, commentchainlines.chain, commentchainlines.first_line, commentchainlines.last_line \" +\n                  \"  FROM commentchains \" +\n                  \"  JOIN commentchainlines ON (commentchainlines.chain=commentchains.id) \" +\n                  \" WHERE commentchainlines.sha1=%s \" +\n                  \"   AND commentchains.state IN ('open', 'closed', 'addressed')\",\n                  fileversion.sha1];\n\n      if (review !== null)\n      {\n        args[0] += \" AND commentchains.review=%d\";\n        args.push(review.id);\n      }\n\n      result = db.execute.apply(db, args);\n    }\n    else if (review)\n    {\n      var preliminary = db.execute(\"SELECT DISTINCT commentchains.id, commentchains.file, commentchainlines.sha1, \" +\n                                   \"                commentchainlines.first_line, commentchainlines.last_line \" +\n                                   \"  FROM commentchains \" +\n                                   \"  JOIN commentchainlines ON (commentchainlines.chain=commentchains.id) \" +\n                                   \" WHERE commentchains.review=%d \" +\n                                   \"   AND commentchains.state IN ('open', 'closed', 'addressed')\",\n                                   review.id);\n\n      var files_in_commit = {};\n\n      preliminary.apply(\n        function (chain_id, file_id, sha1, first_line, last_line)\n        {\n          if (!(file_id in files_in_commit))\n            try\n            {\n              files_in_commit[file_id] = commit.getFile(CriticFile.find(file_id).path).sha1;\n            }\n            catch (error)\n            {\n              files_in_commit[file_id] = \"\";\n            }\n\n          if (sha1 == files_in_commit[file_id])\n            result.push({ review: review.id,\n                          chain: 
chain_id,\n                          first_line: first_line,\n                          last_line: last_line });\n        });\n    }\n\n    var chains = [];\n    var reviews = {};\n\n    for (var index = 0; index < result.length; ++index)\n    {\n      var review_id = result[index].review;\n      var review = reviews[review_id] || (reviews[review_id] = new CriticReview(review_id));\n\n      chains.push({ chain: new CriticCommentChain(result[index].chain, { review: review }),\n                    lineIndex: result[index].first_line - 1,\n                    lineCount: result[index].last_line - result[index].first_line + 1 });\n    }\n\n    return chains;\n  };\n\nCriticCommentChain.prototype.getComment = function (id)\n  {\n    id = ~~id;\n\n    var result = db.execute(\"SELECT batch, uid, time, state, comment FROM comments WHERE chain=%d AND id=%d AND state='current'\", this.id, id)[0];\n    return new CriticComment(this.id, result.batch, id, result.uid, result.time, result.state, result.comment, { chain: this });\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-commitset.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticCommitSet(all)\n{\n  Object.defineProperties(this, { parents: { value: {} },\n                                  children: { value: {} },\n                                  heads: { value: [] },\n                                  tails: { value: [] }});\n\n  this.repository = null;\n\n  for (var index = 0; index < all.length; ++index)\n  {\n    var commit = all[index];\n    if (!this.repository)\n      this.repository = commit.repository;\n    else if (this.repository.id != commit.repository.id)\n      throw CriticError(format(\"invalid use: commits from multiple repositories in source array ('%s' and '%s')\", this.repository.name, commit.repository.name));\n    if (commit.sha1 in this)\n      throw CriticError(format(\"invalid use: commit %s occurs multiple times in source array\", commit.sha1));\n    Object.defineProperty(this, commit.sha1, { value: commit });\n    this.parents[commit.sha1] = [];\n    this.children[commit.sha1] = [];\n  }\n\n  for (var index = 0; index < all.length; ++index)\n  {\n    var commit = all[index];\n    var count = 0;\n\n    for (var pindex = 0; pindex < commit.parents.length; ++pindex)\n    {\n      var parent = commit.parents[pindex];\n      if (parent.sha1 in this)\n      {\n        ++count;\n        this.parents[commit.sha1].push(parent);\n 
     }\n      else if (!(parent.sha1 in this.children))\n        this.children[parent.sha1] = [];\n\n      this.children[parent.sha1].push(commit);\n    }\n  }\n\n  for (var index = 0; index < all.length; ++index)\n  {\n    var commit = all[index];\n\n    if (this.children[commit.sha1].length == 0)\n    {\n      this.heads.push(commit);\n      Object.defineProperty(this.heads, commit.sha1, { value: commit });\n    }\n    if (this.parents[commit.sha1].length < commit.parents.length)\n    {\n      for (var index1 = 0; index1 < commit.parents.length; ++index1)\n      {\n        var parent = commit.parents[index1];\n        if (!(parent.sha1 in this) && !(parent.sha1 in this.tails))\n        {\n          this.tails.push(parent);\n          Object.defineProperty(this.tails, parent.sha1, { value: parent });\n        }\n      }\n    }\n\n    Object.freeze(this.parents[commit.sha1]);\n    Object.freeze(this.children[commit.sha1]);\n  }\n\n  var added = {};\n\n  for (var index = 0; index < this.heads.length; ++index)\n  {\n    var stack = [this.heads[index]];\n    var stack_offset = 0;\n\n    while (stack_offset < stack.length)\n    {\n      var commit = stack[stack_offset++];\n\n      if (commit.sha1 in added)\n        continue;\n\n      do\n      {\n        var parents = this.parents[commit.sha1];\n\n        this.push(commit);\n\n        added[commit.sha1] = true;\n        commit = null;\n\n        for (var pindex = 0; pindex < parents.length; ++pindex)\n        {\n          var parent = parents[pindex];\n\n          if (parent.sha1 in added)\n            continue;\n          else if (commit === null)\n            commit = parent;\n          else\n            stack.push(parent);\n        }\n      }\n      while (commit);\n    }\n  }\n\n  var self = this;\n  var upstreams = null;\n\n  function getUpstreams()\n  {\n    if (!upstreams)\n    {\n      upstreams = self.tails.slice();\n\n      if (upstreams.length > 1)\n      {\n        for (var index1 = 0; index1 < 
upstreams.length; ++index1)\n          for (var index2 = 0; index2 < self.heads.length; ++index2)\n            if (self.heads[index2].isAncestorOf(upstreams[index1]))\n            {\n              upstreams[index1] = null;\n              break;\n            }\n\n        for (var index1 = 0; index1 < upstreams.length; ++index1)\n          if (upstreams[index1])\n            for (var index2 = 0; index2 < upstreams.length; ++index2)\n              if (index1 != index2 && upstreams[index2])\n                if (upstreams[index1].isAncestorOf(upstreams[index2]))\n                {\n                  upstreams[index1] = null;\n                  break;\n                }\n\n        upstreams = upstreams.filter(function (commit) { return commit !== null; });\n      }\n\n      Object.freeze(upstreams);\n    }\n\n    return upstreams;\n  }\n\n  Object.defineProperty(this, \"upstreams\", { get: getUpstreams });\n\n  Object.freeze(this.parents);\n  Object.freeze(this.children);\n  Object.freeze(this.heads);\n  Object.freeze(this.tails);\n  Object.freeze(this);\n}\n\nvar properties =\n  {\n    restrict: {\n      value: function (heads, tails)\n        {\n          var reachable = [];\n          var self = this;\n\n          var exclude = {};\n          if (tails)\n            for (var index = 0; index < tails.length; ++index)\n            {\n              var tail = tails[index];\n\n              if (!(tail.sha1 in this) && !(tail.sha1 in this.tails))\n                throw CriticError(\"CommitSet.restrict: invalid tail commits; not member or tail of commit-set\");\n\n              exclude[tail.sha1] = true;\n            }\n\n          function add(commit)\n          {\n            if (!(commit.sha1 in reachable) && !(commit.sha1 in exclude))\n            {\n              reachable.push(commit);\n              reachable[commit.sha1] = commit;\n\n              var parents = self.parents[commit.sha1];\n              for (var index = 0; index < parents.length; ++index)\n           
     add(parents[index]);\n            }\n          }\n\n          for (var index = 0; index < heads.length; ++index)\n            if (heads[index].sha1 in this)\n              add(heads[index]);\n\n          return new CriticCommitSet(reachable);\n        },\n      writable: true,\n      configurable: true },\n\n    without: {\n      value: function (commits)\n        {\n          var remaining = [];\n\n          for (var index = 0; index < this.length; ++index)\n            if (!(this[index].sha1 in commits))\n              remaining.push(this[index]);\n\n          return new CriticCommitSet(remaining);\n        },\n      writable: true,\n      configurable: true },\n\n    getChangeset: {\n      value: function (data)\n        {\n          if (this.length == 0)\n            return null;\n\n          if (this.heads.length != 1)\n            throw CriticError(format(\"commit-set has multiple heads: %d\", this.heads.length));\n          if (this.upstreams.length != 1)\n            throw CriticError(\"commit-set has multiple upstreams\");\n\n          data = data || {};\n\n          data.parent = this.upstreams[0];\n          data.child = this.heads[0];\n          data.commits = this;\n\n          return this.repository.getChangeset(data);\n        },\n      writable: true,\n      configurable: true }\n  };\n\nCriticCommitSet.prototype = Object.create(Array.prototype, properties);\n"
  },
  {
    "path": "src/library/js/v8/critic-dashboard.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticDashboard(user)\n{\n  if (!(user instanceof CriticUser))\n    throw CriticError(\"invalid user argument; expected User object\");\n\n  this.user = user;\n\n  var self = this;\n\n  var owned_finished = null;\n  var owned_accepted = null;\n  var owned_pending = null;\n  var owned_dropped = null;\n\n  var active = null;\n  var inactive = null;\n\n  function createReviews(result, filter)\n  {\n    var reviews = [];\n\n    for (var index = 0; index < result.length; ++index)\n    {\n      var review = new CriticReview(result[index].id);\n      if (!filter || filter(review))\n        reviews.push(review);\n    }\n\n    return Object.freeze(reviews);\n  }\n\n  function getOwnedFinished()\n  {\n    if (!owned_finished)\n      owned_finished = createReviews(db.execute(\"SELECT id FROM reviews JOIN reviewusers ON (review=id) WHERE uid=%d AND owner AND state='closed' ORDER BY id ASC\", self.user.id));\n    return owned_finished;\n  }\n\n  function getOwnedAccepted()\n  {\n    if (!owned_accepted)\n    {\n      var owned_open = createReviews(db.execute(\"SELECT id FROM reviews JOIN reviewusers ON (review=id) WHERE uid=%d AND owner AND state='open' ORDER BY id ASC\", self.user.id));\n\n      owned_accepted = [];\n      owned_pending = [];\n\n      for (var index = 0; index < 
owned_open.length; ++index)\n      {\n        var review = owned_open[index];\n        if (review.progress.accepted)\n          owned_accepted.push(review);\n        else\n          owned_pending.push(review);\n      }\n\n      Object.freeze(owned_accepted);\n      Object.freeze(owned_pending);\n    }\n\n    return owned_accepted;\n  }\n\n  function getOwnedPending()\n  {\n    if (!owned_pending)\n      /* Populates owned_pending as a side-effect. */\n      getOwnedAccepted();\n    return owned_pending;\n  }\n\n  function getOwnedDropped()\n  {\n    if (!owned_dropped)\n      owned_dropped = createReviews(db.execute(\"SELECT id FROM reviews JOIN reviewusers ON (review=id) WHERE uid=%d AND owner AND state='dropped' ORDER BY id ASC\", self.user.id));\n    return owned_dropped;\n  }\n\n  function getActive()\n  {\n    if (!active)\n    {\n      active = [];\n\n      Object.defineProperties(active, { hasPendingChanges: { value: Object.create(null) },\n                                        hasUnreadComments: { value: Object.create(null) },\n                                        unsharedPendingChanges: { value: Object.create(null) },\n                                        sharedPendingChanges: { value: Object.create(null) },\n                                        unreadComments: { value: Object.create(null) },\n                                        isReviewer: { value: Object.create(null) },\n                                        isWatcher: { value: Object.create(null) }});\n\n      var assignments = db.execute(\"SELECT DISTINCT reviews.id AS id, fullreviewuserfiles.state AS state FROM reviews JOIN fullreviewuserfiles ON (review=id) WHERE assignee=%d AND reviews.state='open'\", self.user.id);\n      var is_reviewer = {};\n\n      for (var index = 0; index < assignments.length; ++index)\n      {\n        var row = assignments[index];\n        var review_id = row.id;\n        if (row.state == \"pending\")\n        {\n          var review = new 
CriticReview(review_id);\n          active.push(review);\n          active.hasPendingChanges[review_id] = review;\n          active.isReviewer[review_id] = review;\n        }\n        is_reviewer[review_id] = true;\n      }\n\n      var before = Date.now();\n\n      for (var review_id in active.hasPendingChanges)\n      {\n        var pending = db.execute(\"SELECT SUM(reviewfiles.deleted) AS deleted, \" +\n                                 \"       SUM(reviewfiles.inserted) AS inserted, \" +\n                                 \"       EXTRACT('epoch' FROM (NOW() - MIN(reviewuserfiles.time))) AS seconds, \" +\n                                 \"       reviewfilesharing.reviewers<2 AS unshared \" +\n                                 \"FROM reviews \" +\n                                 \"JOIN reviewfiles ON (reviewfiles.review=reviews.id) \" +\n                                 \"JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id \" +\n                                 \"                     AND reviewuserfiles.uid=%d) \" +\n                                 \"JOIN reviewfilesharing ON (reviewfilesharing.review=reviews.id \" +\n                                 \"                       AND reviewfilesharing.file=reviewfiles.id) \" +\n                                 \"WHERE reviews.id=%d \" +\n                                 \"  AND reviewfiles.state='pending' \" +\n                                 \"GROUP BY reviewfilesharing.reviewers<2\",\n                                 self.user.id, review_id);\n\n        for (var index = 0; index < pending.length; ++index)\n        {\n          var row = pending[index];\n          if (row.unshared)\n            active.unsharedPendingChanges[review_id] = Object.freeze({ deleted: row.deleted, inserted: row.inserted, seconds: row.seconds });\n          else\n            active.sharedPendingChanges[review_id] = Object.freeze({ deleted: row.deleted, inserted: row.inserted, seconds: row.seconds });\n        }\n      }\n\n      
var after = Date.now();\n\n      active.qt = after - before;\n\n      var with_unread = db.execute(\"SELECT reviews.id AS id, COUNT(comments.id) AS count FROM reviews JOIN commentchains ON (commentchains.review=reviews.id) JOIN comments ON (comments.chain=commentchains.id) JOIN commentstoread ON (commentstoread.comment=comments.id) WHERE commentstoread.uid=%d AND reviews.state='open' GROUP BY reviews.id\", self.user.id);\n\n      for (var index = 0; index < with_unread.length; ++index)\n      {\n        var review_id = with_unread[index].id, review = active.hasPendingChanges[review_id];\n        if (!review)\n          active.push(review = new CriticReview(review_id));\n        active.hasUnreadComments[review_id] = review;\n        active.unreadComments[review_id] = with_unread[index].count;\n        if (is_reviewer[review_id])\n          active.isReviewer[review_id] = review;\n        else\n          active.isWatcher[review_id] = review;\n      }\n\n      active.sort(function (a, b) { switch (true) { case a.id < b.id: return -1; case a.id > b.id: return 1; default: return 0; } });\n\n      Object.freeze(active.hasPendingChanges);\n      Object.freeze(active.hasUnreadComments);\n      Object.freeze(active.unsharedPendingChanges);\n      Object.freeze(active.sharedPendingChanges);\n      Object.freeze(active.unreadComments);\n      Object.freeze(active.isReviewer);\n      Object.freeze(active.isWatcher);\n      Object.freeze(active);\n    }\n\n    return active;\n  }\n\n  function getInactive()\n  {\n    if (!inactive)\n    {\n      inactive = [];\n\n      Object.defineProperties(inactive, { isReviewer: { value: {} },\n                                          isWatcher: { value: {} }});\n\n      var is_reviewer = db.execute(\"SELECT DISTINCT reviews.id AS id, fullreviewuserfiles.state AS state FROM reviews JOIN fullreviewuserfiles ON (review=id) WHERE assignee=%d AND reviews.state='open'\", self.user.id);\n      var include = {}, exclude = {};\n\n      for (var 
index = 0; index < is_reviewer.length; ++index)\n      {\n        var review_id = is_reviewer[index].id;\n        if (is_reviewer[index].state == 'pending')\n          exclude[review_id] = true;\n        else\n          include[review_id] = true;\n      }\n\n      for (var review_id in include)\n        if (!exclude[review_id])\n        {\n          var review = new CriticReview(~~review_id);\n          inactive.push(review);\n          inactive.isReviewer[review_id] = review;\n        }\n\n      var is_watcher = db.execute(\"SELECT id FROM reviews JOIN reviewusers ON (review=id) WHERE uid=%d AND state='open'\", self.user.id);\n\n      for (var index = 0; index < is_watcher.length; ++index)\n      {\n        var review_id = is_watcher[index].id;\n        if (!include[review_id] && !exclude[review_id])\n        {\n          var review = new CriticReview(review_id);\n          inactive.push(review);\n          inactive.isWatcher[review_id] = review;\n        }\n      }\n\n      inactive.sort(function (a, b) { switch (true) { case a.id < b.id: return -1; case a.id > b.id: return 1; default: return 0; } });\n\n      Object.freeze(inactive.isReviewer);\n      Object.freeze(inactive.isWatcher);\n      Object.freeze(inactive);\n    }\n\n    return inactive;\n  }\n\n  this.owned = Object.create(null, { finished: { get: getOwnedFinished, enumerable: true },\n                                     accepted: { get: getOwnedAccepted, enumerable: true },\n                                     pending: { get: getOwnedPending, enumerable: true },\n                                     dropped: { get: getOwnedDropped, enumerable: true }});\n\n  Object.defineProperties(this, { active: { get: getActive, enumerable: true },\n                                  inactive: { get: getInactive, enumerable: true }});\n\n  Object.freeze(this.owned);\n  Object.freeze(this);\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-file.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\n/* Constructor for internal use.  External code (and in practice most internal\n   code as well) uses CriticFile.find() to create or find previously created\n   CriticFile objects.  That function is also exposed as the File \"constructor\"\n   externally, to emulate the real constructors old behavior. */\nfunction CriticFile(data)\n{\n  var result;\n\n  if (\"id\" in data)\n  {\n    result = db.execute(\"SELECT id, path FROM files WHERE id=%d\", data.id)[0];\n    if (!result)\n      throw CriticError(format(\"%s: invalid file ID\", data.id));\n  }\n  else if (\"path\" in data)\n  {\n    result = db.execute(\"SELECT id, path FROM files WHERE MD5(path)=MD5(%s)\", data.path)[0];\n    if (!result)\n      throw CriticError(format(\"%s: no such file\", data.path));\n  }\n  else\n    throw CriticError(\"invalid use; expected data.id or data.path\");\n\n  this.id = result.id;\n  this.path = result.path;\n\n  /* CriticFile has sub-classes whose constructors will first call this function\n     and then extend the object further.  If that's the case, we don't freeze\n     the object (expecting the sub-class constructor to do so instead.) 
*/\n  var is_subclass = Object.getPrototypeOf(this) !== CriticFile.prototype;\n\n  if (!is_subclass)\n    Object.freeze(this);\n}\n\nObject.defineProperties(\n  CriticFile.prototype,\n  {\n    toString: { value: function () { return this.path; }, writable: true, configurable: true },\n    valueOf: { value: function () { return this.id; }, writable: true, configurable: true }\n  }\n);\n\nvar file_cache_by_id = {};\nvar file_cache_by_path = {};\n\nCriticFile.find = function (path_or_id)\n  {\n    var cached;\n\n    if (typeof path_or_id == \"string\")\n    {\n      /* Normalize forward slashes in the path so that the cache lookup always\n         uses the same format. */\n      var path = path_or_id.replace(/^\\/+|\\/+(?=\\/)|\\/+$/g, \"\");\n\n      if (cached = file_cache_by_path[path])\n        return cached;\n      else\n        return file_cache_by_path[path] = new CriticFile({ path: path });\n    }\n    else\n    {\n      var id = ~~path_or_id;\n\n      if (cached = file_cache_by_id[id])\n        return cached;\n      else\n        return file_cache_by_id[id] = new CriticFile({ id: id });\n    }\n  };\n\n/* Externally, CriticFile.find() doubles as the File constructor, so set its\n   prototype property to make the instanceof operator work as expected. */\nCriticFile.find.prototype = CriticFile.prototype;\n/* It also needs to reference itself... 
*/\nCriticFile.find.find = CriticFile.find;\n\nfunction CriticFileVersion(repository, file, mode, size, sha1, data)\n{\n  CriticFile.call(this, { path: file });\n\n  this.repository = repository;\n  this.mode = mode;\n  this.size = size;\n  this.sha1 = sha1;\n\n  var self = this;\n  var review = null;\n  var bytes, lines;\n  var commentChains = null;\n\n  if (data)\n    if (data.review)\n      review = data.review;\n\n  function isBinary(data)\n  {\n    /* Check if the file appears to be binary.\n\n       Using information from .gitattributes would be nice, but git doesn't seem\n       to have a command that directly exposes information from it, and parsing\n       it here is quite sub-optimal.  And if the file really is binary then this\n       function will typically find that out too, so using .gitattributes is\n       mostly an optimization anyway.\n\n       This code is essentially a copy of git's heuristics for determining if a\n       file is binary, in convert.c::gather_stats() and convert.c::is_binary(). 
*/\n\n    var printable = 0, nonprintable = 0;\n\n    for (var index = 0; index < data.length; ++index)\n    {\n      var byte = data[index];\n\n      if (byte < 32)\n      {\n        switch (byte)\n        {\n        case 0:\n          return true;\n\n        case 8:\n        case 9:\n        case 10:\n        case 12:\n        case 13:\n        case 27:\n          ++printable;\n          break;\n\n        default:\n          ++nonprintable;\n        }\n      }\n      else if (byte == 127)\n        ++nonprintable;\n      else\n        ++printable;\n    }\n\n    return (printable >> 7) < nonprintable;\n  }\n\n  function getBytes()\n  {\n    if (bytes === void 0)\n      bytes = self.repository.fetch(self.sha1).data;\n    return bytes;\n  }\n\n  function getLines()\n  {\n    if (lines === void 0)\n    {\n      /* Re-use 'bytes' if set, but if not don't set 'bytes' since that keeps\n         the array (which might be huge) alive longer than what is probably\n         necessary. */\n      var data = bytes || self.repository.fetch(self.sha1).data;\n\n      if (isBinary(data))\n        lines = null;\n      else\n      {\n        var source = data.decode();\n\n        lines = source.split(/\\r\\n|\\n/g);\n\n        /* If the file ends in a line-break (as it typically should) the last\n           element in the array will be empty.  We don't want to keep that\n           empty element; it will only make it seem like there's an empty line\n           at the end of the file. 
*/\n        if (lines.length && !lines[lines.length - 1])\n          lines.pop();\n\n        Object.freeze(lines);\n      }\n    }\n\n    return lines;\n  }\n\n  function getCommentChains()\n  {\n    if (!commentChains)\n    {\n      commentChains = [];\n\n      var result = db.execute(\"SELECT DISTINCT id, first_line, last_line FROM commentchains JOIN commentchainlines ON (chain=id) WHERE commentchains.state!='draft' AND commentchainlines.state!='draft' AND review=%d AND sha1=%s ORDER BY first_line ASC, last_line ASC\", review.id, self.sha1);\n\n      for (var index = 0; index < result.length; ++index)\n        commentChains.push(new CriticCommentChain(result[index].id, { review: review }));\n\n      Object.freeze(commentChains);\n    }\n\n    return commentChains;\n  }\n\n  Object.defineProperties(this, { lines: { get: getLines, enumerable: true },\n                                  bytes: { get: getBytes, enumerable: true }});\n\n  if (review)\n    Object.defineProperty(this, \"commentChains\", { get: getCommentChains, enumerable: true });\n  else\n    this.commentChains = null;\n\n  var is_subclass = Object.getPrototypeOf(this) !== CriticFileVersion.prototype;\n\n  if (!is_subclass)\n    Object.freeze(this);\n}\n\nCriticFileVersion.prototype = Object.create(CriticFile.prototype);\n"
  },
  {
    "path": "src/library/js/v8/critic-filters.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticFilters(data)\n{\n  var cli_input = {};\n\n  if (data.review)\n  {\n    cli_input.review_id = data.review.id;\n    if (data.added_review_filters)\n    {\n      cli_input.added_review_filters = [];\n      for (var index = 0; index < data.added_review_filters.length; ++index)\n      {\n        var filter = data.added_review_filters[index];\n        cli_input.added_review_filters.push([filter.uid, filter.path, filter.type, filter.delegate]);\n      }\n    }\n    if (data.removed_review_filters)\n    {\n      cli_input.removed_review_filters = [];\n      for (var index = 0; index < data.removed_review_filters.length; ++index)\n      {\n        var filter = data.removed_review_filters[index];\n        cli_input.removed_review_filters.push([filter.uid, filter.path, filter.type, filter.delegate]);\n      }\n    }\n  }\n  else if (data.repository)\n  {\n    cli_input.repository_id = data.repository.id;\n    cli_input.recursive = !!data.recursive;\n    cli_input.file_ids = data.files.map(\n      function (item)\n      {\n        if (!(item instanceof CriticFile))\n          item = CriticFile.find(item);\n        return item.id;\n      });\n  }\n\n  if (data.user)\n    cli_input.user_id = data.user.id;\n\n  cli_input = JSON.stringify(cli_input) + \"\\n\";\n\n  var cli_args = 
[python_executable, \"-m\", \"cli\", \"apply-filters\"];\n  var cli_process = new OS.Process(python_executable, { argv: cli_args,\n                                                        environ: { PYTHONPATH: python_path }});\n  var cli_output = cli_process.call(cli_input);\n\n  this.files = JSON.parse(cli_output.trim());\n\n  for (var file_id in this.files)\n  {\n    Object.freeze(this.files[file_id]);\n    for (var user_id in this.files[file_id])\n      Object.freeze(this.files[file_id][user_id]);\n  }\n\n  Object.freeze(this);\n}\n\nfunction getUserFileAssociation(filters, user_id, file_id)\n{\n  user_id = Number(user_id);\n  file_id = Number(file_id);\n\n  var data;\n\n  if ((data = filters.files[file_id]) && (data = data[user_id]))\n    return data[0];\n  else\n    return null;\n}\n\nCriticFilters.prototype.isReviewer = function (user_id, file_id)\n  {\n    return getUserFileAssociation(this, user_id, file_id) == \"reviewer\";\n  };\nCriticFilters.prototype.isWatcher = function (user_id, file_id)\n  {\n    return getUserFileAssociation(this, user_id, file_id) == \"watcher\";\n  };\nCriticFilters.prototype.isRelevant = function (user_id, file_id)\n  {\n    var association = getUserFileAssociation(this, user_id, file_id);\n    return association == \"reviewer\" || association == \"watcher\";\n  };\n\nCriticFilters.prototype.listUsers = function (file_id)\n  {\n    var data = this.files[file_id];\n    if (data)\n      return data;\n    else\n      return {};\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-filterstransaction.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar filterstransaction_internals = {};\nvar filterstransaction_id_counter = 0;\n\nfunction CriticFiltersTransaction(repository)\n{\n  var internal_id = filterstransaction_id_counter++;\n\n  filterstransaction_internals[internal_id] =\n    { transaction: this,\n      added: {},\n      removed: {} };\n\n  Object.defineProperty(this, \"__id__\", { value: internal_id });\n\n  this.repository = repository;\n}\n\nfunction CriticFiltersTransaction_getInternals(transaction)\n{\n  var internals = filterstransaction_internals[transaction.__id__];\n  if (!internals || transaction != internals.transaction)\n    throw CriticError(\"invalid use\");\n  return internals;\n}\n\nCriticFiltersTransaction.prototype.addFilter =\n  function (filter)\n  {\n    var internals = CriticFiltersTransaction_getInternals(this);\n\n    if (!filter || typeof filter != \"object\")\n      throw CriticError(\"invalid 'data' argument; expected object\");\n\n    var user = filter.user || {};\n    var what = filter.what || {};\n    var type = filter.type || \"\";\n    var delegates = filter.delegates || null;\n\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid 'data.user' argument; expected User object\");\n    if (!(what instanceof CriticDirectory || what instanceof CriticFile))\n      throw 
CriticError(\"invalid 'data.what' argument; expected Directory or File object\");\n\n    type = String(type);\n\n    if (type != \"reviewer\" && type != \"watcher\")\n      throw CriticError(\"invalid 'type' argument; expected \\\"reviewer\\\" or \\\"watcher\\\"\");\n    if (delegates && !Array.prototype.every.call(delegates, function (item) { return item instanceof CriticUser; }))\n      throw CriticError(\"invalid 'delegates' argument; expected array of User objects\");\n\n    var existing = this.repository.filters.users[user.name];\n    var removed = internals.removed[user.id];\n\n    function isNonConflicting(filter)\n    {\n      return what.path != filter.path || type != filter.type;\n    }\n\n    if (removed && !removed.every(isNonConflicting))\n      throw CriticError(\"added filter is identical to existing filter removed in this transaction\");\n\n    if (existing && !existing.every(isNonConflicting))\n      throw CriticError(\"added filter is identical to existing filter\");\n\n    var added = internals.added[user.id];\n\n    if (!added)\n      added = internals.added[user.id] = [];\n\n    added.push({ path: what.path,\n                 directory_id: what instanceof CriticDirectory ? what.id : 0,\n                 file_id: what instanceof CriticFile ? 
what.id : 0,\n                 type: type,\n                 delegate: delegates ? delegates.map(function (user) { return user.name; }).join(\",\") : \"\" });\n  };\n\nCriticFiltersTransaction.prototype.removeFilter =\n  function (filter)\n  {\n    var internals = CriticFiltersTransaction_getInternals(this);\n\n    if (!filter || typeof filter != \"object\")\n      throw CriticError(\"invalid 'data' argument; expected object\");\n\n    var user = filter.user || {};\n    var what = filter.what || {};\n    var type = filter.type || \"\";\n\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid 'user' argument; expected User object\");\n    if (!(what instanceof CriticDirectory || what instanceof CriticFile))\n      throw CriticError(\"invalid 'what' argument; expected Directory or File object\");\n\n    type = String(type);\n\n    if (type != \"reviewer\" && type != \"watcher\")\n      throw CriticError(\"invalid 'type' argument; expected \\\"reviewer\\\" or \\\"watcher\\\"\");\n\n    function isNonConflicting(filter)\n    {\n      return what.path != filter.path || type != filter.type;\n    }\n\n    var added = internals.added[user.id];\n\n    if (added && !added.every(isNonConflicting))\n      throw CriticError(\"removed filter is identical to existing filter added in this transaction\");\n\n    /* Record the removal so that addFilter() can detect conflicts against it. */\n    var removed = internals.removed[user.id];\n\n    if (!removed)\n      removed = internals.removed[user.id] = [];\n\n    removed.push({ path: what.path, type: type });\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-git.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar RE_LSTREE_LINE = /^([0-9]{6}) (blob|tree|commit) ([0-9a-f]{40})  *([0-9]+|-)\\t(.*)$/\n\nvar all_repositories = [];\n\nfunction GitObject(sha1, type, size, data)\n{\n  this.sha1 = sha1;\n  this.type = type;\n  this.size = size;\n  this.data = data;\n\n  Object.freeze(this);\n}\n\nfunction CriticCommitFileVersion(commit, path_or_id, data)\n{\n  var path;\n\n  if (typeof path_or_id == \"number\")\n    path = CriticFile.find(path_or_id).path;\n  else\n    path = path_or_id;\n\n  var match = /((?:[^\\/]+\\/)*[^\\/]+)/.exec(path);\n  if (!match)\n    throw CriticError(format(\"invalid path: %s\", path));\n  path = match[1];\n\n  var match = /^(?:(.*)\\/)?([^/]+)$/.exec(path);\n  var dirname = match[1] || \"\";\n  var basename = match[2];\n\n  var line = commit.repository.run(\"ls-tree\", \"--long\", format(\"%s:%s\", commit.sha1, dirname), basename);\n\n  try { match = RE_LSTREE_LINE.exec(line.trim()); }\n  catch (e) { throw new TypeError(Object.prototype.toString.apply(line)); }\n\n  if (!match)\n    throw CriticError(format(\"path doesn't exist in commit: '%s'\", path));\n\n  var mode = match[1];\n  var type = match[2];\n  var sha1 = match[3];\n  var size = match[4];\n  var name = match[5];\n\n  if (type != \"blob\")\n    throw CriticError(format(\"path names a directory: '%s'\", 
path));\n\n  CriticFileVersion.call(this, commit.repository, path, mode, size, sha1, data);\n\n  this.commit = commit;\n\n  Object.freeze(this);\n}\n\nCriticCommitFileVersion.prototype = Object.create(CriticFileVersion.prototype);\n\nfunction CriticCommitDirectory(commit, path)\n{\n  if (path == \"/\")\n    path = \"\";\n  else\n  {\n    var match = /^((?:[^\\/]+\\/)*[^\\/]+)/.exec(path);\n    if (!match)\n      throw CriticError(format(\"invalid path: %s\", path));\n    path = match[1];\n  }\n\n  var data = commit.repository.run(\"ls-tree\", \"--long\", format(\"%s:%s\", commit.sha1, path));\n  var lines = data.trim().split(\"\\n\");\n  var low_directories = [], low_files = [];\n\n  for (var index = 0; index < lines.length; ++index)\n  {\n    var match = RE_LSTREE_LINE.exec(lines[index]);\n\n    if (!match)\n      throw Error(format(\"Unexpected line: %r\", lines[index]));\n\n    var mode = match[1];\n    var type = match[2];\n    var sha1 = match[3];\n    var size = match[4];\n    var name = match[5];\n\n    if (type == \"blob\")\n      low_files.push([mode, sha1, size, name]);\n    else if (type == \"tree\")\n      low_directories.push(name);\n  }\n\n  this.path = path || \"/\";\n  this.commit = commit;\n\n  var self = this;\n  var parent;\n  var directories;\n  var files;\n\n  function getParent()\n  {\n    if (parent === void 0)\n    {\n      var match = /(.+)\\/[^\\/]+/.exec(self.path);\n      if (!match)\n        parent = null;\n      else\n        parent = new CriticCommitDirectory(self.commit, match[1]);\n    }\n\n    return parent;\n  }\n\n  function getDirectories()\n  {\n    if (!directories)\n    {\n      directories = [];\n\n      for (var index = 0; index < low_directories.length; ++index)\n      {\n        var path = low_directories[index];\n        if (self.path != \"/\")\n          path = format(\"%s/%s\", self.path, path);\n        directories.push(new CriticCommitDirectory(self.commit, path));\n      }\n    }\n\n    return directories;\n  }\n\n  
function getFiles()\n  {\n    if (!files)\n    {\n      files = [];\n\n      for (var index = 0; index < low_files.length; ++index)\n      {\n        var path = low_files[index][3];\n        if (self.path != \"/\")\n          path = format(\"%s/%s\", self.path, path);\n        files.push(new CriticCommitFileVersion(self.commit, path));\n      }\n    }\n\n    return files;\n  }\n\n  Object.defineProperties(this, { parent: { get: getParent, enumerable: true },\n                                  directories: { get: getDirectories, enumerable: true },\n                                  files: { get: getFiles, enumerable: true }});\n  Object.freeze(this);\n}\n\nfunction runGitCommand(path, args)\n{\n  var env;\n\n  if (typeof args[args.length - 1] == \"object\")\n    env = args.pop();\n  else\n    env = {};\n\n  env[\"REMOTE_USER\"] = global.user.name;\n\n  var stdin;\n\n  if (\"stdin\" in env)\n  {\n    stdin = env[\"stdin\"];\n    delete env[\"stdin\"];\n  }\n\n  var argv = [git_executable];\n\n  [].push.apply(argv, args);\n\n  var process = new OS.Process(git_executable, { argv: argv, cwd: path, environ: env });\n\n  if (stdin !== void 0)\n    process.stdin = new IO.MemoryFile(stdin, \"r\");\n\n  process.stdout = new IO.MemoryFile;\n  process.stderr = new IO.MemoryFile;\n\n  try\n  {\n    process.run();\n  }\n  catch (error)\n  {\n    var message;\n\n    if (process.exitStatus !== null)\n      message = format(\"Git exited with status %d\", process.exitStatus);\n    else\n      message = format(\"Git terminated by signal %d\", process.terminationSignal);\n\n    var stderr = process.stderr.value.decode();\n\n    if (stderr.trim())\n      message += format(\"\\n%s\", stderr);\n\n    throw CriticError(message);\n  }\n\n  return process.stdout.value.decode();\n}\n\nfunction CriticRepository(name_or_id)\n{\n  var repository_id;\n\n  if (typeof name_or_id == \"number\")\n    repository_id = name_or_id;\n  else\n  {\n    var result = db.execute(\"SELECT id FROM repositories 
WHERE name=%s\", name_or_id);\n\n    if (result.length)\n      repository_id = result[0].id;\n    else\n      throw CriticError(format(\"%s: no such repository\", name_or_id));\n  }\n\n  for (var index = 0; index < all_repositories.length; ++index)\n    if (all_repositories[index].id == repository_id)\n      return all_repositories[index];\n\n  var result = db.execute(\"SELECT name, path, parent FROM repositories WHERE id=%d\", repository_id)[0];\n\n  if (!result)\n    throw CriticError(format(\"%s: invalid repository ID\", repository_id));\n\n  var command = {\n    name: \"check-repository-access\",\n    data: {\n      repository_id: repository_id\n    }\n  };\n\n  this.access = JSON.parse(executeCLI([command])[0]);\n\n  if (!this.access.read)\n    throw CriticError(format(\"access denied\"));\n\n  this.id = repository_id;\n  this.name = result.name;\n  this.path = result.path;\n\n  var self = this;\n  var catfile = null, catfile_in, catfile_out, catfile_buffer;\n  var filters = null;\n\n  this.fetch = function (sha1)\n    {\n      if (!catfile)\n      {\n        var stdin = new IO.Pipe();\n        var stdout = new IO.Pipe();\n        var argv = [git_executable, \"cat-file\", \"--batch\"];\n\n        catfile = new OS.Process(git_executable, { argv: argv, cwd: self.path });\n        catfile.stdin = stdin.input;\n        catfile.stdout = stdout.output;\n        catfile.stderr = new IO.MemoryFile();\n\n        catfile_in = stdin.output;\n        catfile_in.setCloseOnExec(true);\n        catfile_out = stdout.input;\n        catfile_buffer = new IO.Buffered(catfile_out);\n\n        catfile.start();\n      }\n\n      try\n      {\n        catfile_in.write(format(\"%s\\n\", sha1));\n\n        var line = catfile_buffer.readln();\n\n        if (!line)\n          throw CriticError(format(\"failed to fetch %s: empty response\", sha1));\n\n        var match = /^([0-9a-f]{40}) (commit|tree|blob|tag) (\\d+)$/.exec(line);\n\n        if (!match || match[1] != sha1)\n          
throw CriticError(format(\"failed to fetch %s: invalid response: %s\", sha1, JSON.stringify(line)));\n\n        var type = match[2];\n        var size = parseInt(match[3]);\n        var data = catfile_buffer.read(size);\n\n        catfile_buffer.read(1);\n\n        return new GitObject(sha1, type, size, data);\n      }\n      catch (error)\n      {\n        this.shutdown();\n        throw error;\n      }\n    };\n\n  this.shutdown = function ()\n    {\n      if (catfile)\n      {\n        try { catfile.kill(9); catfile.wait(); } catch (e) {}\n        try { catfile_in.close(); } catch (e) {}\n        try { catfile_out.close(); } catch (e) {}\n\n        catfile = catfile_in = catfile_out = catfile_buffer = null;\n      }\n    };\n\n  all_repositories.push(this);\n\n  function getFilters()\n  {\n    var users = {};\n\n    function getUser(name_or_id)\n    {\n      var user = users[name_or_id];\n\n      if (user)\n        return user;\n      else\n        user = new CriticUser(name_or_id);\n\n      return users[user.id] = users[user.name] = user;\n    }\n\n    if (!filters)\n    {\n      filters = [];\n\n      Object.defineProperties(filters, { users: { value: {} },\n                                         paths: { value: {} }});\n\n      var result = db.execute(\"SELECT uid, path, type, delegate FROM filters WHERE repository=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        var user = getUser(row.uid);\n        var path = row.path;\n        var delegates = [];\n\n        if (row.delegate)\n          row.delegate.split(\",\").forEach(\n            function (name)\n            {\n              try\n              {\n                delegates.push(getUser(name));\n              }\n              catch (error)\n              {\n                /* Ignore invalid user names in the 'delegate' column. 
*/\n              }\n            });\n\n        var filter = Object.freeze({ user: user,\n                                     path: path,\n                                     type: row.type,\n                                     delegates: delegates });\n\n        filters.push(filter);\n\n        (filters.users[user.name] || (filters.users[user.name] = [])).push(filter);\n        (filters.paths[path] || (filters.paths[path] = [])).push(filter);\n      }\n\n      for (var name in filters.users)\n        Object.freeze(filters.users[name]);\n      Object.freeze(filters.users);\n\n      for (var path in filters.paths)\n        Object.freeze(filters.paths[path]);\n      Object.freeze(filters.paths);\n\n      Object.freeze(filters);\n    }\n\n    return filters;\n  }\n\n  Object.defineProperties(this, { filters: { get: getFilters, enumerable: true }});\n  Object.freeze(this);\n}\n\nfunction GitUserTime(fullname, email, utc)\n{\n  this.fullname = fullname;\n  this.email = email;\n  this.time = new Date(parseInt(utc) * 1000);\n\n  var self = this;\n  var user = null;\n\n  function getUser()\n  {\n    if (!user)\n      try\n      {\n        user = new CriticUser({ email: self.email });\n      }\n      catch (error)\n      {\n        return null;\n      }\n\n    return user;\n  }\n\n  Object.defineProperty(this, \"user\", { get: getUser, enumerable: true });\n  Object.freeze(this);\n}\n\nGitUserTime.prototype.toString = function ()\n  {\n    return format(\"%s <%s> at %s\", this.fullname, this.email, this.time);\n  };\n\nfunction CriticCommit(repository, sha1)\n{\n  var object = repository.fetch(sha1);\n\n  while (object.type == \"tag\")\n  {\n    sha1 = object.data.decode().split(\"\\n\")[0].split(\" \")[1];\n    object = repository.fetch(sha1);\n  }\n\n  if (object.type != \"commit\")\n    throw CriticError(\"not a commit\");\n\n  var text = object.data.decode();\n  var tree, author, committer, parents_sha1 = [], message;\n\n  while (true)\n  {\n    var line_length = 
text.indexOf(\"\\n\");\n\n    if (line_length == 0)\n    {\n      message = text.substring(1);\n      break;\n    }\n    else\n    {\n      var line = text.substring(0, line_length), match;\n\n      if (match = /^tree ([0-9a-f]{40})$/i.exec(line))\n        tree = match[1];\n      else if (match = /^parent ([0-9a-f]{40})$/i.exec(line))\n        parents_sha1.push(match[1]);\n      else if (match = /^author (.+?) <([^>]+)> (\\d+) ([-+]\\d+)$/i.exec(line))\n        author = new GitUserTime(match[1], match[2], match[3]);\n      else if (match = /^committer (.+?) <([^>]+)> (\\d+) ([-+]\\d+)$/i.exec(line))\n        committer = new GitUserTime(match[1], match[2], match[3]);\n\n      text = text.substring(line_length + 1);\n    }\n  }\n\n  var self = this;\n  var commit_id = null;\n  var parents = null;\n\n  function getId()\n  {\n    if (commit_id === null)\n    {\n      var result = db.execute(\"SELECT id FROM commits WHERE sha1=%s\", sha1);\n\n      if (result.length)\n        commit_id = result[0].id;\n      else\n        throw CommitError(format(\"%s: commit not registered in the database\", sha1));\n    }\n\n    return commit_id;\n  }\n\n  function getParents()\n  {\n    if (parents === null)\n      parents = parents_sha1.map(function (sha1) { return new CriticCommit(repository, sha1); });\n\n    return parents;\n  }\n\n  function getSummary()\n  {\n    var match = /^(fixup|squash)! 
[^\\n]+\\n+([^\\n]+)/.exec(message);\n    if (match)\n      return format(\"[%s] %s\", match[1], match[2]);\n    else\n      return /^[^\\n]*/.exec(message)[0];\n  }\n\n  function getShort()\n  {\n    return repository.revparse(self.sha1, true);\n  }\n\n  Object.defineProperties(this, { id: { get: getId, enumerable: true },\n                                  parents: { get: getParents, enumerable: true },\n                                  summary: { get: getSummary, enumerable: true },\n                                  short: { get: getShort, enumerable: true }});\n\n  this.repository = repository;\n  this.sha1 = sha1;\n  this.tree = tree;\n  this.author = author;\n  this.committer = committer;\n  this.message = message;\n\n  Object.freeze(this);\n}\n\nObject.defineProperties(CriticCommit.prototype,\n  {\n    toString: {\n      value: function ()\n        {\n          return format(\"%s: %s\", this.sha1.substring(0, 8), this.summary);\n        },\n      writable: true,\n      configurable: true },\n\n    getDirectory: {\n      value: function (path)\n        {\n          return new CriticCommitDirectory(this, path);\n        },\n      writable: true,\n      configurable: true },\n\n    getFile: {\n      value: function (path, data)\n        {\n          return new CriticCommitFileVersion(this, path, data);\n        },\n      writable: true,\n      configurable: true },\n\n    isAncestorOf: {\n      value: function (other)\n        {\n          if (!(this instanceof CriticCommit))\n            throw CriticError(\"invalid this object: expected Commit object\");\n          if (!(other instanceof CriticCommit))\n            throw CriticError(\"invalid 'other' argument: expected Commit object\");\n          if (this.repository.id != other.repository.id)\n            return false;\n          else if (this.sha1 == other.sha1)\n            return true;\n          else\n            return this.repository.run(\"merge-base\", this.sha1, other.sha1).trim() == this.sha1;\n    
    },\n      writable: true,\n      configurable: true }\n  });\n\nCriticRepository.prototype.run = function ()\n  {\n    return runGitCommand(this.path, [].slice.apply(arguments));\n  };\nCriticRepository.prototype.run.supportsInput = true;\n\nCriticRepository.prototype.revparse = function (ref, use_short)\n  {\n    if (!ref || typeof ref != \"string\")\n      throw CriticError(\"invalid ref argument: expected non-empty string\");\n    else\n    {\n      var short = \"--short=40\";\n      if (use_short)\n        if (use_short === true)\n          short = \"--short\";\n        else\n          short = format(\"--short=%d\", use_short);\n      return this.run(\"rev-parse\", \"--verify\", \"--quiet\", short, ref).trim();\n    }\n  };\n\nCriticRepository.prototype.revlist = function (data)\n  {\n    var args = [\"rev-list\"];\n\n    if (data.options)\n    {\n      [].push.apply(args, [].map.call(data.options,\n        function (option)\n        {\n          if (!/-[a-z]|--[a-z-]+(?:=.*)/.test(option))\n            throw CriticError(\"invalid option: \" + option);\n          return option;\n        }));\n    }\n\n    if (data.included)\n    {\n      [].push.apply(args, [].map.call(data.included, String));\n\n      if (data.excluded)\n        [].push.apply(args, [].map.call(data.excluded, function (ref) { return \"^\" + String(ref); }));\n    }\n    else if (data.range)\n    {\n      var range = String(data.range);\n      var dots = range.match(/\\.\\.\\.?/g);\n      if (!dots || dots.length != 1)\n        throw CriticError(\"invalid range\");\n      args.push(range);\n    }\n    else\n      throw CriticError(\"invalid argument: data.included or data.range must be specified\");\n\n    return this.run.apply(this, args).trim().split(\"\\n\");\n  };\n\nCriticRepository.prototype.getCommit = function (ref_or_id)\n  {\n    var sha1;\n\n    if (typeof ref_or_id == \"number\")\n      sha1 = db.execute(\"SELECT sha1 FROM commits WHERE id=%d\", ref_or_id)[0].sha1;\n    else\n      sha1 = 
this.revparse(ref_or_id);\n\n    return new CriticCommit(this, sha1);\n  };\n\nCriticRepository.prototype.getBranch = function (name)\n  {\n    return new CriticBranch({ repository: this, name: name });\n  };\n\nfunction createChangeset(repository, type, parent, child)\n{\n  var socket = new IO.Socket(\"unix\", \"stream\");\n  var request = { repository_name: repository.name,\n                  changeset_type: type };\n\n  switch (type)\n  {\n  case \"direct\":\n  case \"merge\":\n    request.child_sha1 = child.sha1;\n    break;\n  case \"custom\":\n  case \"conflicts\":\n    request.parent_sha1 = parent.sha1;\n    request.child_sha1 = child.sha1;\n  }\n\n  socket.connect(IO.SocketAddress.unix(changeset_address));\n  socket.send(JSON.stringify([request]));\n  socket.shutdown(\"write\");\n\n  var result = \"\";\n\n  while (true)\n  {\n    var data = socket.recv(4096);\n    if (data === null)\n      break;\n    result += data.decode();\n  }\n\n  result = JSON.parse(result);\n\n  if (!(result instanceof Array) || result.length != 1 || \"error\" in result[0])\n    throw CriticError(format(\"failed to create changeset!\"));\n}\n\nCriticRepository.prototype.getChangeset = function (data)\n  {\n    var data_argument = data;\n\n    if (typeof data == \"number\")\n      data = { id: data };\n    else if (typeof data.id == \"number\")\n    {\n      data = { id: data.id };\n      if (data_argument.parent)\n        data.parent = data_argument.parent;\n      if (data_argument.child)\n        data.child = data_argument.child;\n      if (data_argument.commit)\n        data.commit = data_argument.commit;\n    }\n    else\n    {\n      var commit, parent, child, type;\n\n      if (data instanceof CriticCommit)\n        commit = data;\n      else if (data.commit)\n        if (data.commit instanceof CriticCommit)\n          commit = data.commit;\n        else\n          commit = this.getCommit(data.commit);\n\n      if (commit)\n      {\n        if (commit.parents.length != 1)\n      
    throw CriticError(\"invalid use; commit is a merge, use getMergeChangeset() instead\");\n\n        child = commit;\n        parent = commit.parents[0];\n        type = \"direct\";\n      }\n      else\n      {\n        if (data.parent instanceof CriticCommit)\n          parent = data.parent;\n        else\n          parent = this.getCommit(data.parent);\n\n        if (data.child instanceof CriticCommit)\n          child = data.child;\n        else\n          child = this.getCommit(data.child);\n\n        if (child.parents.length == 1 && child.parents[0].sha1 == parent.sha1)\n          type = \"direct\";\n        else\n          type = \"custom\";\n      }\n\n      for (var attempt = 0; attempt < 2; ++attempt)\n      {\n        var result = db.execute(\"SELECT id FROM changesets WHERE parent=%d AND child=%d AND type IN ('direct', 'custom')\", parent.id, child.id);\n\n        if (result.length)\n        {\n          data = { id: result[0].id, parent: parent, child: child };\n          break;\n        }\n        else if (attempt == 0)\n          createChangeset(this, type, parent, child);\n      }\n\n      if (attempt == 2)\n        throw CriticError(\"mysterious error creating/finding cached changeset\");\n    }\n\n    if (typeof data_argument == \"object\")\n      for (var name in data_argument)\n        switch (name)\n        {\n        case \"id\":\n        case \"commit\":\n        case \"child\":\n        case \"parent\":\n          break;\n\n        default:\n          data[name] = data_argument[name];\n        }\n\n    return new CriticChangeset(this, data);\n  };\n\nCriticRepository.prototype.getMergeChangeset = function (commit, data)\n  {\n    var commit;\n\n    data = data || {};\n\n    if (!(commit instanceof CriticCommit))\n      commit = this.getCommit(commit);\n\n    if (commit.parents.length < 2)\n      throw CriticError(format(\"invalid use; %s is not a merge commit\", commit.sha1));\n\n    for (var attempt = 0; attempt < 2; ++attempt)\n    {\n   
   var result = db.execute(\"SELECT id FROM changesets WHERE child=%d AND type='merge'\", commit.id);\n\n      if (result.length)\n      {\n        var changesets = [];\n\n        for (var index = 0; index < result.length; ++index)\n        {\n          data.id = result[index].id;\n          data.child = commit;\n\n          changesets.push(new CriticChangeset(this, data));\n        }\n\n        return new CriticMergeChangeset(changesets);\n      }\n      else if (attempt == 0)\n        createChangeset(this, \"merge\", null, commit);\n    }\n\n    throw CriticError(\"mysterious error creating/finding cached changeset\");\n  };\n\nfunction CriticRepositoryWorkCopy(repository, branch)\n{\n  if (!repository_work_copy_path ||\n      !IO.File.isDirectory(repository_work_copy_path) ||\n      global.user.id === null)\n    throw CriticError(\"operation not available\");\n\n  var name = format(\"%s/%d/%s\", global.user.name, extension_id, repository.name);\n  var path = repository_work_copy_path + \"/\" + name;\n\n  this.repository = repository;\n  this.path = path;\n\n  if (IO.File.isDirectory(path))\n  {\n    this.run(\"clean\", \"-d\", \"-x\", \"-f\", \"-f\");\n    this.run(\"reset\", \"--hard\");\n\n    if (branch)\n    {\n      this.run(\"fetch\", \"origin\", \"refs/heads/\" + branch);\n      this.run(\"checkout\", \"-q\", \"FETCH_HEAD\");\n      try { this.run(\"branch\", \"-D\", branch); } catch (e) {}\n      this.run(\"checkout\", \"-q\", \"-b\", branch);\n    }\n    else\n    {\n      try\n      {\n        var ref = repository.run(\"symbolic-ref\", \"--quiet\", \"HEAD\").trim();\n        this.run(\"fetch\", \"origin\", ref);\n        ref = \"FETCH_HEAD\";\n      }\n      catch (error)\n      {\n        ref = repository.run(\"rev-parse\", \"HEAD\").trim();\n      }\n\n      this.run(\"checkout\", \"-q\", ref);\n    }\n\n    IO.File.utimes(path);\n  }\n  else\n  {\n    var argv = [git_executable, \"clone\"];\n\n    if (branch)\n      argv.push(\"-b\", branch);\n\n    
argv.push(repository.path, name);\n\n    var process = new OS.Process(git_executable, { argv: argv, cwd: repository_work_copy_path });\n\n    return process.call();\n  }\n}\n\nCriticRepositoryWorkCopy.prototype.run = function ()\n  {\n    return runGitCommand(this.path, [].slice.apply(arguments));\n  };\nCriticRepositoryWorkCopy.prototype.run.supportsInput = true;\n\nCriticRepository.prototype.getWorkCopy = function ()\n  {\n    return new CriticRepositoryWorkCopy(this);\n  };\n\nCriticRepository.prototype.startFiltersTransaction = function ()\n  {\n    return new CriticFiltersTransaction(this);\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-html.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar standard_stylesheets = { \"third-party/jquery-ui.css\": 1, \"overrides.css\": 1, \"basic.css\": 1 };\nvar standard_scripts = { \"third-party/jquery.js\": 1, \"third-party/jquery-ui.js\": 1, \"basic.js\": 1 };\nvar standard_links = { home: \"Home\",\n                       dashboard: \"Dashboard\",\n                       branches: \"Branches\",\n                       services: \"Services\",\n                       repositories: \"Repositories\",\n                       search: \"Search\",\n                       manageextensions: \"Extensions\",\n                       config: \"Config\",\n                       tutorial: \"Tutorial\" };\nvar entities = { \"<\": \"&lt;\", \"&\": \"&amp;\", \">\": \"&gt;\", \"'\": \"&apos;\", '\"': \"&quot;\" };\n\nfunction htmlify(text)\n{\n  return String(text).replace(/[<&>'\"]/g, function (ch) { return entities[ch]; });\n}\n\nfunction writeStandardHeader(title, data)\n{\n  if (title instanceof CriticUser)\n    throw CriticError(\"API change: the user argument was moved to the HeaderData dictionary (and is now optional)\");\n\n  write(\"<!DOCTYPE html><html><head><title>%s</title>\", htmlify(title));\n\n  data = data || {};\n\n  var user = data.user || global.user;\n\n  for (var stylesheet in standard_stylesheets)\n    write(\"<link rel=stylesheet 
type=text/css href=/static-resource/%s>\", stylesheet);\n  if (data.stylesheets)\n    for (var index = 0; index < data.stylesheets.length; ++index)\n      write(\"<link rel=stylesheet type=text/css href='%s'>\", htmlify(data.stylesheets[index]));\n\n  if (data.links)\n    for (var link in data.links)\n    {\n      var match = /^rel=(.*)$/.exec(link);\n      if (match)\n        write(\"<link rel='%s' href='%s'>\", match[1], data.links[link]);\n    }\n\n  write(\"<link rel=icon href=/static-resource/favicon%s.png>\", is_development ? \"-dev\" : \"\");\n  write(\"<style type=text/css>body { %s }</style>\", user.getPreference(\"style.defaultFont\"));\n\n  for (var script in standard_scripts)\n    write(\"<script type=text/javascript src=/static-resource/%s></script>\", script);\n  if (data.scripts)\n    for (var index = 0; index < data.scripts.length; ++index)\n      write(\"<script type=text/javascript src='%s'></script>\", htmlify(data.scripts[index]));\n\n  if (typeof data.review == \"object\")\n    write(\"<script type=text/javascript>var critic = critic || {}; critic.review = { id: %d }; critic.repository = { id: %d };</script>\", data.review.id, data.review.repository.id);\n\n  var opera_class = \"opera\";\n\n  if (is_development)\n    opera_class += \" development\";\n\n  write(\"</head><body><table class=pageheader width=100%%><tr><td class=left valign=bottom align=left><b onclick='location.href=\\\"/\\\"'><b class='%s'>Opera</b><b class=critic>Critic</b></b><ul>\", opera_class);\n\n  var first = true;\n\n  for (var link in standard_links)\n  {\n    if (link == \"services\" && !user.hasRole(\"administrator\"))\n      continue;\n    if (link == \"repositories\" && !user.hasRole(\"repositories\"))\n      continue;\n    write(\"<li><a href=/%s>%s</a></li>\", link, standard_links[link]);\n  }\n\n  var result = db.execute(\"SELECT COUNT(*) AS unread FROM newsitems LEFT OUTER JOIN newsread ON (item=id AND uid=%d) WHERE uid IS NULL\", user.id)[0];\n\n  if 
(result.unread)\n    write(\"<li><a href=/news style=color:red>News (%d)</a></li>\", result.unread);\n  else\n    write(\"<li><a href=/news>News</a></li>\");\n\n  if (data.links)\n    for (var link in data.links)\n      if (!/^rel=/.test(link))\n        write(\"<li><a href=%s>%s</a></li>\", data.links[link], link);\n\n  if (data.review)\n  {\n    var review_id;\n    if (typeof data.review == \"number\")\n      review_id = data.review;\n    else\n      review_id = data.review.id;\n    write(\"<li><a href=/r/%d>Back to review</a></li>\", review_id);\n  }\n\n  write(\"</ul></td><td class=right><span class=buttonscope-global></span></td></tr></table>\");\n}\n\nfunction writeStandardFooter(data)\n{\n  var result = db.execute(\"SELECT extensions.name, users.fullname \" +\n                          \"FROM extensions \" +\n                          \"JOIN users ON (users.id=extensions.author) \" +\n                          \"WHERE extensions.id=%d\", extension_id)[0];\n\n  write(\"<div class=pagefooter>\");\n  write(\"Generated by extension: %s by %s\", result[0], result[1]);\n  write(\"</div></body></html>\");\n}\n\nfunction PaleYellowTable(title)\n{\n  this.title = title;\n  this.rows = [];\n}\n\nPaleYellowTable.prototype.addHeading = function (title)\n  {\n    this.rows.push(format(\"<tr><td class=h2 colspan=3><h2>%s</h2></td></tr>\", htmlify(title)));\n  };\n\nPaleYellowTable.prototype.addItem = function (data)\n  {\n    if (data.html)\n      this.rows.push(format(\"<tr class=html><td colspan=3>%s</td></tr>\", data.html));\n    else if (data.name)\n    {\n      var buttons_html;\n\n      if (data.buttons)\n      {\n        buttons_html = \"<div class=buttons>\";\n        for (var button_name in data.buttons)\n          buttons_html += format(\"<button onclick='%s'>%s</button>\", htmlify(data.buttons[button_name]), htmlify(button_name));\n        buttons_html += \"</div>\";\n      }\n      else\n        buttons_html = \"\";\n\n      this.rows.push(format(\"<tr 
class=item><td class=name>%s:</td><td class=value colspan=2>%s%s</td></tr>\", htmlify(data.name), data.value, buttons_html));\n      this.rows.push(format(\"<tr class=help><td colspan=3>%s</td></tr>\", htmlify(data.description)));\n    }\n    else if (data.buttons)\n    {\n      var buttons_html = \"<div class=buttons>\";\n      for (var button_name in data.buttons)\n        buttons_html += format(\"<button onclick='%s'>%s</button>\", htmlify(data.buttons[button_name]), htmlify(button_name));\n      buttons_html += \"</div>\";\n\n      this.rows.push(format(\"<tr class=centered><td colspan=3>%s</td></tr>\", buttons_html));\n    }\n    else if (data.separator)\n      this.rows.push(\"<tr class=separator><td colspan=3><div></div></td></tr>\");\n    else\n      throw CriticError(\"invalid argument\");\n  };\n\nPaleYellowTable.prototype.write = function ()\n  {\n    write(\"<div class=main><table class=paleyellow align=center><colgroup><col width=10%><col width=60%><col width=30%></colgroup><tbody>\");\n    write(\"<tr><td class=h1 colspan=3><h1>%s</h1></td></tr>\", htmlify(this.title));\n\n    for (var index = 0; index < this.rows.length; ++index)\n      write(this.rows[index]);\n\n    write(\"</tbody></table></div>\");\n  };\n\nvar CriticHtml = { writeStandardHeader: writeStandardHeader,\n                   writeStandardFooter: writeStandardFooter,\n                   PaleYellowTable: PaleYellowTable,\n                   escape: htmlify };\n\nObject.freeze(CriticHtml);\n"
  },
  {
    "path": "src/library/js/v8/critic-launcher-fork.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar data = JSON.parse(readln());\n\nvar critic = new Module();\ncritic.load(data.criticjs_path);\ncritic.close();\n\nvar server_socket = new IO.Socket(\"unix\", \"stream\");\n\nserver_socket.bind(IO.SocketAddress.unix(data.address));\nserver_socket.listen(4);\n\nwriteln(\"listening\");\n\nfunction Child(socket)\n{\n  this.socket = socket;\n  this.stdout = IO.File.pipe();\n  this.stderr = IO.File.pipe();\n  this.check = IO.File.pipe();\n}\n\nChild.prototype.start =\n  function ()\n  {\n    this.socket.sendfd(this.stdout.input);\n    this.stdout.input.close();\n\n    this.socket.sendfd(this.stderr.input);\n    this.stderr.input.close();\n\n    this.process = new Process();\n    this.process.start();\n\n    if (this.process.isSelf)\n    {\n      this.check.input.close();\n\n      File.dup2(this.socket, 0);\n      this.socket.close();\n\n      File.dup2(this.stdout.output, 1);\n      this.stdout.output.close();\n\n      File.dup2(this.stderr.output, 2);\n      this.stderr.output.close();\n\n      this.execute();\n\n      this.check.output.close();\n\n      Process.exit(0);\n    }\n    else\n    {\n      this.stdout.output.close();\n      this.stderr.output.close();\n      this.check.output.close();\n    }\n  };\n\nChild.prototype.execute =\n  function ()\n  {\n    try\n    {\n      var line = 
read().decode();\n      var data = JSON.parse(line);\n    }\n    catch (e)\n    {\n      throw JSON.stringify(line);\n    }\n\n    critic.setup(data);\n\n    try\n    {\n      var script = new Module();\n\n      script.global.critic = critic;\n      script.load(data.script_path);\n      script.global[data.fn].apply(null, eval(data.argv));\n    }\n    finally\n    {\n      critic.shutdown();\n    }\n  };\n\nChild.prototype.finish =\n  function ()\n  {\n    if (this.process.wait(true))\n    {\n      var result = JSON.stringify({ exitStatus: this.process.exitStatus,\n                                    terminationSignal: this.process.terminationSignal });\n\n      this.socket.send(result + \"\\n\");\n      this.socket.close();\n\n      return true;\n    }\n    else\n      return false;\n  };\n\nvar children = {};\nvar poll = new IO.Poll();\n\npoll.register(server_socket);\n\nwhile (true)\n{\n  if (poll.poll(1000))\n  {\n    poll.read.forEach(function (file) {\n      if (file == server_socket)\n      {\n        var client_socket = server_socket.accept();\n\n        writeln(\"%.3f: client connection opened\", Date.now());\n\n        var child = new Child(client_socket);\n\n        child.start();\n        children[child.process.pid] = child;\n\n        poll.register(child.check.input);\n      }\n    });\n  }\n\n  for (var pid in children)\n  {\n    var child = children[pid];\n\n    if (child.finish())\n    {\n      poll.unregister(child.check.input);\n      child.check.input.close();\n\n      delete children[pid];\n\n      writeln(\"%.3f: child process finished\", Date.now());\n    }\n  }\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-launcher.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar critic = new Module();\n\ncritic.load(IO.Path.absolute(\"critic.js\"));\n\nvar line = readln();\nvar data = JSON.parse(line);\n\ncritic.setup(data);\ncritic.close();\n\nfunction run()\n{\n  try\n  {\n    var script = new Module({ PostgreSQL: false });\n    script.global.critic = critic;\n    script.eval(\"var Encodings = { decode: function (bytes) { return typeof bytes == 'object' ? bytes.decode.apply(bytes, [].slice.call(arguments, 1)) : bytes; } };\");\n\n    try\n    {\n      script.load(data.script_path);\n    }\n    catch (error)\n    {\n      IO.File.stderr.write(format(\"Failed to load '%s':\\n  %s\",\n                                  data.script_path, error));\n      return 1;\n    }\n\n    try\n    {\n      script.global[data.fn].apply(null, eval(data.argv));\n    }\n    catch (error)\n    {\n      IO.File.stderr.write(format(\"Failed to call '%s::%s()':\\n  %s\\n    %s\",\n                                  data.script_path, data.fn,\n                                  error,\n                                  error.stack.replace(/\\n/g, \"\\n    \")));\n      return 1;\n    }\n\n    return 0;\n  }\n  finally\n  {\n    critic.shutdown();\n  }\n}\n\nOS.Process.exit(run());\n"
  },
  {
    "path": "src/library/js/v8/critic-log.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticLog(user)\n{\n  this.write = function ()\n    {\n      var text, options, category = \"default\";\n\n      if (typeof arguments[arguments.length - 1] == \"object\")\n      {\n        options = arguments[arguments.length - 1];\n        text = format.apply(null, Array.prototype.slice.call(arguments, 0, arguments.length - 1));\n      }\n      else\n        text = format.apply(null, arguments);\n\n      if (options)\n        category = \"\" + options.category;\n\n      db.execute(\"INSERT INTO extensionlog (extension, uid, category, text) VALUES (%d, %d, %s, %s)\",\n                 extension_id, user.id, category, text);\n      db.commit();\n    };\n\n  function getQuery(data)\n  {\n    var terms = [\"extensionlog.extension = %d\", \"extensionlog.uid = %d\"];\n    var parameters = [extension_id, global.user.id];\n\n    if (data && typeof data == \"object\")\n    {\n      if (data.timeStart)\n      {\n        if (typeof data.timeStart == \"object\")\n        {\n          terms.push(\"extensionlog.time >= %s::timestamp\");\n          parameters.push(Date.prototype.toSQLTimestamp.call(data.timeStart));\n        }\n        else\n        {\n          terms.push(\"extensionlog.time >= now() - %s::interval\");\n          parameters.push(String(data.timeStart));\n        }\n      
}\n\n      if (data.timeEnd)\n      {\n        if (typeof data.timeEnd == \"object\")\n        {\n          terms.push(\"extensionlog.time <= %s::timestamp\");\n          parameters.push(Date.prototype.toSQLTimestamp.call(data.timeEnd));\n        }\n        else\n        {\n          terms.push(\"extensionlog.time <= now() - %s::interval\");\n          parameters.push(String(data.timeEnd));\n        }\n      }\n\n      if (data.category)\n      {\n        terms.push(\"extensionlog.category = %s\");\n        parameters.push(String(data.category));\n      }\n    }\n\n    return { where: terms.join(\" AND \"), parameters: parameters };\n  }\n\n  this.fetch = function (data)\n    {\n      var query = getQuery(data);\n      var result = db.execute.apply(db, [\"SELECT uid, category, time, text FROM extensionlog WHERE \" + query.where + \" ORDER BY time ASC\"].concat(query.parameters));\n      var users = {};\n      var log = [];\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        var user = users[row.uid];\n\n        if (!user)\n          user = users[row.uid] = new CriticUser({ id: row.uid });\n\n        log.push(Object.freeze({ user: user, time: row.time, category: row.category, text: row.text }));\n      }\n\n      return log;\n    };\n\n  this.remove = function (data)\n    {\n      var query = getQuery(data);\n\n      db.execute.apply(db, [\"DELETE FROM extensionlog WHERE \" + query.where].concat(query.parameters));\n      db.commit();\n    };\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-mail.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction sendMail(filename)\n{\n  IO.File.rename(filename, /^(.*)\\.pending$/.exec(filename)[1]);\n}\n\nfunction CriticMailTransaction()\n{\n  this.mails = [];\n}\n\nCriticMailTransaction.prototype.add = function (data)\n  {\n    if (!(\"to\" in data) && !(\"review\" in data))\n      throw CriticError(\"invalid argument; at least one of data.to or data.review must be specified\");\n    if (!(\"subject\" in data))\n      throw CriticError(\"invalid argument; data.subject is required\");\n    if (!(\"body\" in data))\n      throw CriticError(\"invalid argument; data.body is required\");\n\n    var mail = { subject: String(data.subject),\n                 body: String(data.body) };\n\n    if (\"to\" in data)\n    {\n      mail.recipients = [];\n      if (typeof data.to == \"object\" && \"length\" in data.to)\n      {\n        for (var index = 0; index < data.to.length; ++index)\n          mail.recipients.push((new CriticUser(data.to[index])).id);\n      }\n      else\n        mail.recipients.push((new CriticUser(data.to)).id);\n    }\n\n    if (\"from\" in data)\n      mail.sender = (new CriticUser(data.from)).id;\n    else\n      mail.sender = global.user.id;\n\n    if (\"review\" in data)\n      mail.review_id = (new CriticReview(data.review)).id;\n\n    if (\"headers\" in data)\n    {\n  
    mail.headers = {};\n      for (var name in data.headers)\n        mail.headers[name] = String(data.headers[name]);\n    }\n\n    this.mails.push(mail);\n  };\n\nCriticMailTransaction.prototype.finish = function ()\n  {\n    var argv = [python_executable, \"-m\", \"cli\", \"generate-custom-mails\"];\n    var stdin_data = format(\"%r\\n\", this.mails);\n\n    var process = new OS.Process(python_executable,\n                                 { argv: argv,\n                                   environ: { PYTHONPATH: python_path }});\n\n/*\n    process.stdout = new IO.MemoryFile;\n    process.stderr = new IO.MemoryFile;\n    process.start();\n    process.wait();\n\n    if (process.exitStatus !== 0)\n      throw CriticError(process.stderr.value.decode());\n\n    var stdout_data = process.stdout.value.decode().trim();\n*/\n\n    var stdout_data = process.call(stdin_data).trim();\n    var response = JSON.parse(stdout_data);\n\n    if (typeof response == \"string\")\n      throw CriticError(response);\n\n    response.forEach(sendMail);\n\n    var maildelivery_pid =\n      parseInt(IO.File.read(maildelivery_pid_path).decode().trim());\n\n    OS.Process.kill(maildelivery_pid, 1);\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-review.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nvar review_internals = {};\n\nfunction CriticReview_isCreated(review)\n{\n  return review_internals[review.id].created;\n}\n\nfunction CriticReviewFilter(review, user_id, path, type, creator_id)\n{\n  user_id = Number(user_id);\n  path = String(path);\n  creator_id = creator_id && Number(creator_id);\n\n  this.review = review;\n  this.path = path;\n  this.type = type;\n\n  var user = null;\n  var creator = null;\n\n  function getUser()\n  {\n    if (!user)\n      user = new CriticUser(user_id);\n    return user;\n  }\n\n  function getCreator()\n  {\n    if (!creator)\n      if (creator_id)\n        creator = new CriticUser(creator_id);\n      else\n        return null;\n    return creator;\n  }\n\n  Object.defineProperties(this, { user: { get: getUser, enumerable: true },\n                                  creator: { get: getCreator, enumerable: true }});\n  Object.freeze(this);\n}\n\nfunction CriticReviewRebase(review, user_id, old_head_id, new_head_id, new_upstream_id, equivalent_merge_id, replayed_rebase_id, branch_name)\n{\n  this.review = review;\n  this.user = new CriticUser(user_id);\n  this.oldHead = review.repository.getCommit(old_head_id);\n  this.newHead = review.repository.getCommit(new_head_id);\n  this.newUpstream = new_upstream_id && 
review.repository.getCommit(new_upstream_id);\n  this.equivalentMerge = equivalent_merge_id && review.repository.getCommit(equivalent_merge_id);\n  this.replayedRebase = replayed_rebase_id && review.repository.getCommit(replayed_rebase_id);\n  this.branchName = branch_name;\n\n  Object.freeze(this);\n}\n\nfunction CriticReviewCreated(review_id)\n{\n  this.id = review_id;\n}\n\nfunction CriticReview(arg)\n{\n  var review_id, created = false;\n\n  if (arg && typeof arg == \"object\")\n  {\n    if (arg instanceof CriticReview)\n      return arg;\n    else if (arg instanceof CriticReviewCreated)\n    {\n      review_id = arg.id;\n      created = true;\n    }\n    else\n      review_id = ~~arg;\n  }\n  else\n    review_id = ~~arg;\n\n  var result = db.execute(\"SELECT branch, state, closed_by, dropped_by, summary, description FROM reviews WHERE id=%d\", review_id)[0];\n\n  if (!result)\n    throw CriticError(format(\"%d: invalid review ID\", review_id));\n\n  this.id = review_id;\n  this.owners = [];\n  this.state = result.state;\n  this.closedBy = result.closed_by && new CriticUser(result.closed_by);\n  this.droppedBy = result.dropped_by && new CriticUser(result.dropped_by);\n  this.summary = result.summary || \"\";\n  this.description = result.description || \"\";\n  this.branch = new CriticBranch({ id: result.branch, review: this });\n  this.repository = this.branch.repository;\n\n  var owners = db.execute(\"SELECT uid FROM reviewusers WHERE review=%d AND owner\", review_id);\n  for (var index = 0; index < owners.length; ++index)\n    this.owners.push(new CriticUser(owners[index].uid));\n\n  var self = this;\n  var commits = null;\n  var comment_chains = null;\n  var users = null;\n  var reviewers = null;\n  var watchers = null;\n  var progress = null;\n  var batches = null;\n  var filters = null;\n  var rebases = null;\n  var trackedBranch;\n\n  var internal = review_internals[review_id] = {};\n  internal.filters = filters;\n  internal.created = created;\n\n  result 
= null;\n  owners = null;\n\n  function getCommits()\n  {\n    if (!commits)\n    {\n      var all = [];\n\n      var result = db.execute(\"SELECT DISTINCT child FROM changesets JOIN reviewchangesets ON (changeset=id) WHERE review=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n        all.push(self.repository.getCommit(result[index].child));\n\n      commits = new CriticCommitSet(all);\n    }\n\n    return commits;\n  }\n\n  function getCommentChains()\n  {\n    if (!comment_chains)\n    {\n      comment_chains = [];\n      comment_chains.issues = [];\n      comment_chains.notes = [];\n\n      var result = db.execute(\"SELECT id, review, batch, uid, time, type, state, origin, file, first_commit, last_commit, closed_by, addressed_by FROM commentchains WHERE review=%d AND state NOT IN ('draft', 'empty') ORDER BY time ASC\", review_id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var chain = new CriticCommentChain(result[index], { review: self });\n        comment_chains.push(chain);\n        if (chain.type == CriticCommentChain.TYPE_ISSUE)\n          comment_chains.issues.push(chain);\n        else\n          comment_chains.notes.push(chain);\n      }\n\n      Object.freeze(comment_chains.issues);\n      Object.freeze(comment_chains.notes);\n      Object.freeze(comment_chains);\n    }\n\n    return comment_chains;\n  }\n\n  function getUsers()\n  {\n    if (!users)\n    {\n      users = [];\n      users.type = {};\n\n      var result = db.execute(\"SELECT uid, type FROM reviewusers WHERE review=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var user = new CriticUser(result[index].uid);\n\n        users.push(user);\n        users.type[user.id] = result[index].type;\n      }\n\n      Object.freeze(users.type);\n      Object.freeze(users);\n    }\n\n    return users;\n  }\n\n  function getReviewers()\n  {\n    if (!reviewers)\n    {\n      reviewers = 
{};\n\n      var result = db.execute(\"SELECT DISTINCT assignee FROM fullreviewuserfiles WHERE review=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var user = new CriticUser(result[index].assignee);\n\n        reviewers[user.id] = user;\n        reviewers[user.name] = user;\n      }\n\n      Object.freeze(reviewers);\n    }\n\n    return reviewers;\n  }\n\n  function getWatchers()\n  {\n    if (!watchers)\n    {\n      watchers = {};\n\n      var result = db.execute(\"SELECT reviewusers.uid FROM reviewusers LEFT OUTER JOIN fullreviewuserfiles ON (reviewusers.review=fullreviewuserfiles.review AND reviewusers.uid=fullreviewuserfiles.assignee) WHERE reviewusers.review=%d AND fullreviewuserfiles.assignee IS NULL\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var user = new CriticUser(result[index].uid);\n\n        watchers[user.id] = user;\n        watchers[user.name] = user;\n      }\n\n      Object.freeze(watchers);\n    }\n\n    return watchers;\n  }\n\n  function getBatches()\n  {\n    if (!batches)\n    {\n      batches = [];\n\n      var result = db.execute(\"SELECT id FROM batches WHERE review=%d ORDER BY id ASC\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n        batches.push(new CriticBatch({ id: result[index].id, review: self }));\n\n      Object.freeze(batches);\n    }\n\n    return batches;\n  }\n\n  function getFilters()\n  {\n    if (!filters)\n    {\n      filters = [];\n\n      var result = db.execute(\"SELECT uid, path, type, creator FROM reviewfilters WHERE review=%d\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        filters.push(new CriticReviewFilter(self, row.uid, row.path, row.type, row.creator));\n      }\n\n      Object.freeze(filters);\n    }\n\n    return filters;\n  }\n\n  function getRebases()\n  {\n    if (!rebases)\n    {\n      rebases = 
[];\n\n      var result = db.execute(\"SELECT id, uid, old_head, new_head, new_upstream, equivalent_merge, replayed_rebase, branch FROM reviewrebases WHERE review=%d AND new_head IS NOT NULL ORDER BY id DESC\", self.id);\n\n      for (var index = 0; index < result.length; ++index)\n      {\n        var row = result[index];\n        rebases.push(new CriticReviewRebase(self, row.uid, row.old_head, row.new_head, row.new_upstream, row.equivalent_merge, row.replayed_rebase, row.branch));\n      }\n\n      Object.freeze(rebases);\n    }\n\n    return rebases;\n  }\n\n  function getProgress()\n  {\n    if (!progress)\n    {\n      var pending_lines = 0;\n      var pending_files = 0;\n      var reviewed_lines = 0;\n      var reviewed_files = 0;\n      var issues;\n\n      var result = db.execute(\"SELECT state, SUM(deleted) + SUM(inserted) AS count FROM reviewfiles WHERE review=%d GROUP BY state\", self.id);\n      for (var index = 0; index < result.length; ++index)\n        if (result[index].state == \"pending\")\n          pending_lines = result[index].count;\n        else\n          reviewed_lines = result[index].count;\n\n      var result = db.execute(\"SELECT state, COUNT(*) AS count FROM reviewfiles WHERE review=%d GROUP BY state\", self.id);\n      for (var index = 0; index < result.length; ++index)\n        if (result[index].state == \"pending\")\n          pending_files = result[index].count;\n        else\n          reviewed_files = result[index].count;\n\n      result = null;\n      issues = db.execute(\"SELECT COUNT(id) AS count FROM commentchains WHERE review=%d AND type='issue' AND state='open'\", self.id)[0].count;\n\n      progress = { accepted: self.state == \"open\" && pending_files == 0 && issues == 0,\n                   finished: self.state == \"closed\",\n                   dropped: self.state == \"dropped\",\n                   pendingLines: pending_lines,\n                   pendingFiles: pending_files,\n                   reviewedLines: 
reviewed_lines,\n                   reviewedFiles: reviewed_files,\n                   openIssues: issues,\n                   toString: function ()\n                     {\n                       if (this.finished)\n                         return \"Finished!\";\n                       else if (this.accepted)\n                         return \"Accepted!\";\n                       else if (this.dropped)\n                         return \"Dropped...\";\n\n                       var percent, pending = this.pendingLines, reviewed = this.reviewedLines;\n\n                       if (pending_lines == 0 && reviewed_lines == 0)\n                         percent = \"?? %\";\n                       else\n                       {\n                         var percent_exact = 100 * reviewed / (pending + reviewed);\n                         var percent_rounded = Math.round(percent_exact);\n\n                         if (percent_exact == 100)\n                           percent = \"100 %\";\n                         else if (reviewed == 0)\n                           percent = \"No progress\";\n                         else if (percent_rounded > 0 && percent_rounded < 100)\n                           percent = format(\"%d %%\", percent_rounded);\n                         else\n                         {\n                           for (var precision = 1; precision < 10; ++precision)\n                           {\n                             percent = format(format(\"%%.%df\", precision), percent_exact);\n                             if (percent.charAt(percent.length - 1) != '0')\n                               break;\n                           }\n                           percent += \" %\";\n                         }\n                       }\n\n                       if (this.openIssues)\n                         return format(\"%s and %d issue%s\", percent, this.openIssues, this.openIssues > 1 ? 
\"s\" : \"\");\n                       else\n                         return percent;\n                     }};\n\n      Object.freeze(progress);\n    }\n\n    return progress;\n  }\n\n  function getTrackedBranch()\n  {\n    if (trackedBranch === void 0)\n    {\n      var result = db.execute(\"SELECT id FROM trackedbranches WHERE repository=%d AND local_name=%s\",\n                              self.repository.id, self.branch.name)[0];\n\n      if (result)\n        trackedBranch = new CriticTrackedBranch(result.id, { repository: self.repository,\n                                                             review: self });\n      else\n        trackedBranch = null;\n    }\n\n    return trackedBranch;\n  }\n\n  Object.defineProperties(this, { commits: { get: getCommits, enumerable: true },\n                                  commentChains: { get: getCommentChains, enumerable: true },\n                                  users: { get: getUsers, enumerable: true },\n                                  reviewers: { get: getReviewers, enumerable: true },\n                                  watchers: { get: getWatchers, enumerable: true },\n                                  batches: { get: getBatches, enumerable: true },\n                                  filters: { get: getFilters, enumerable: true },\n                                  rebases: { get: getRebases, enumerable: true },\n                                  progress: { get: getProgress, enumerable: true },\n                                  trackedBranch: { get: getTrackedBranch, enumerable: true }});\n  Object.freeze(this);\n}\n\nCriticReview.prototype.getBatch = function (id)\n  {\n    return new CriticBatch({ id: Number(id) });\n  };\n\nCriticReview.prototype.getCommentChain = function (id)\n  {\n    return new CriticCommentChain(id, { review: this });\n  };\n\nCriticReview.prototype.getComment = function (id)\n  {\n    id = ~~id;\n\n    var result = db.execute(\"SELECT chain, batch, uid, time, state, comment 
FROM comments WHERE id=%d AND state='current'\", id)[0];\n    var chain = new CriticCommentChain(result.chain, { review: this });\n\n    return new CriticComment(chain.id, result.batch, id, result.uid, result.time, result.state, result.comment, { chain: chain });\n  };\n\nCriticReview.prototype.getChangeset = function (commit)\n  {\n    if (commit.parents.length > 1)\n    {\n      var result = db.execute(\"SELECT id FROM changesets WHERE child=%d AND type='merge'\", commit.id);\n      var changesets = [];\n\n      for (var index = 0; index < result.length; ++index)\n        changesets.push(new CriticChangeset(this.repository, { id: result[index].id, child: commit, review: this }));\n\n      return new CriticMergeChangeset(changesets);\n    }\n    else\n    {\n      var result = db.execute(\"SELECT id FROM changesets WHERE child=%d AND type='direct'\", commit.id)[0];\n      return new CriticChangeset(this.repository, { id: result.id, parent: commit.parents[0], child: commit, review: this });\n    }\n  };\n\nCriticReview.prototype.startBatch = function (user)\n  {\n    user = user || global.user;\n\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid argument; expected user object\");\n\n    return new CriticBatch({ internals: batch_internals, review: this, user: user, review_created: CriticReview_isCreated(this) });\n  };\n\nCriticReview.prototype.generateSubjectLine = function (user, preference)\n  {\n    var data = { id: format(\"r/%s\", this.id), summary: this.summary, progress: String(this.progress), branch: this.branch.name };\n    var user_format = user.getPreference(preference);\n\n    try\n    {\n      return format(user_format, data);\n    }\n    catch (e)\n    {\n      var default_format = db.execute(\"SELECT default_string FROM preferences WHERE item=%s\", preference)[0].default_string;\n      return format(default_format, data);\n    }\n  };\n\nCriticReview.prototype.getReviewableCommits = function (user)\n  {\n    user = user || global.user;\n\n    
if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid argument; expected user object\");\n\n    var result = db.execute(\"SELECT DISTINCT child FROM changesets JOIN fullreviewuserfiles ON (changeset=id) WHERE review=%d AND assignee=%d\", this.id, user.id);\n\n    var commits = [];\n\n    for (var index = 0; index < result.length; ++index)\n      commits.push(this.repository.getCommit(result[index].child));\n\n    return new CriticCommitSet(commits);\n  };\n\nCriticReview.prototype.getFullChangeset = function ()\n  {\n    return this.branch.commits.getChangeset({ review: this });\n  };\n\nCriticReview.prototype.getReviewableChangeset = function (user)\n  {\n    var commits = this.getReviewableCommits(user);\n\n    if (commits.heads.length > 1 && commits.upstreams.length == 1)\n      commits = this.commits.restrict(commits.heads, commits.upstreams);\n\n    return commits.getChangeset({ review: this });\n  };\n\nCriticReview.prototype.increaseSerial = function ()\n  {\n    db.rollback();\n    db.execute(\"UPDATE reviews SET serial=serial+1 WHERE id=%d\", this.id);\n    db.commit();\n  };\n\nfunction CriticPartition(review, commits, rebase)\n{\n  this.review = review;\n  this.commits = commits;\n  this.rebase = rebase;\n\n  Object.freeze(this);\n}\n\nCriticReview.prototype.getCommitPartitions = function ()\n  {\n    var rebases = this.rebases;\n\n    if (rebases.length == 0)\n      return new CriticPartition(this, this.commits, null);\n\n    var partition_commits = this.commits.restrict([this.branch.head]);\n    var remaining_commits = this.commits.without(partition_commits);\n    var partitions = [];\n\n    for (var index = rebases.length - 1; index >= 0; --index)\n    {\n      var rebase = rebases[index];\n\n      partitions.push(new CriticPartition(this, partition_commits, rebase));\n\n      partition_commits = remaining_commits.restrict([rebase.oldHead]);\n      if (partition_commits.length != 0)\n        remaining_commits = 
remaining_commits.without(partition_commits);\n    }\n\n    partitions.push(new CriticPartition(this, partition_commits, null));\n\n    return Object.freeze(partitions);\n  };\n\nCriticReview.prototype.prepareRebase = function (data)\n  {\n    db.rollback();\n\n    if (db.execute(\"SELECT 1 FROM reviewrebases WHERE review=%d AND new_head IS NULL\", this.id).length != 0)\n      throw CriticError(\"review rebase already in progress\");\n\n    var user = data.user || global.user;\n\n    if (!!data.historyRewrite + !!data.singleCommit + !!data.newUpstream != 1)\n      throw CriticError(\"invalid argument; exactly one of data.historyRewrite, data.singleCommit and data.newUpstream must be specified\");\n\n    var old_head_id = this.branch.head.id;\n\n    if (data.historyRewrite)\n      db.execute(\"INSERT INTO reviewrebases (review, old_head, uid) VALUES (%d, %d, %d)\",\n                 this.id, old_head_id, user.id);\n    else\n    {\n      var upstreams = this.branch.commits.upstreams;\n\n      if (upstreams.length > 1)\n        throw CriticError(\"rebase not supported; review has multiple upstreams\");\n\n      var branch = data.branch || null;\n      var old_upstream = upstreams[0];\n\n      if (data.singleCommit)\n        db.execute(\"INSERT INTO reviewrebases (review, old_head, old_upstream, uid, branch) VALUES (%d, %d, %d, %d, %s)\",\n                   this.id, old_head_id, old_upstream.id, user.id, branch);\n      else\n        db.execute(\"INSERT INTO reviewrebases (review, old_head, old_upstream, new_upstream, uid, branch) VALUES (%d, %d, %d, %d, %d, %s)\",\n                   this.id, old_head_id, old_upstream.id, data.newUpstream.id, user.id, branch);\n    }\n\n    db.execute(\"UPDATE reviews SET serial=serial + 1 WHERE id=%d\", this.id);\n    db.commit();\n  };\n\nCriticReview.prototype.cancelRebase = function (data)\n  {\n    var result = db.execute(\"SELECT id FROM reviewrebases WHERE review=%d AND new_head IS NULL\", this.id)[0];\n\n    if (!result)\n   
   throw CriticError(\"no review rebase in progress\");\n\n    db.execute(\"DELETE FROM reviewrebases WHERE id=%d\", result.id);\n    db.commit();\n  };\n\nfunction setReviewState(review, user, new_state, verb)\n{\n  user = user || global.user;\n\n  if (!(user instanceof CriticUser))\n    throw CriticError(\"invalid argument; expected user object\");\n\n  db.rollback();\n\n  var error = executeCLI([{ name: \"set-review-state\",\n                            data: { user_id: user.id,\n                                    review_id: review.id,\n                                    old_state: review.state,\n                                    new_state: new_state }}])[0];\n\n  if (error)\n    throw CriticError(format(\"error encountered while %s review: %s\", verb, error));\n\n  db.commit();\n}\n\nCriticReview.prototype.close = function (user)\n  {\n    if (this.state != \"open\")\n      throw CriticError(\"review is not open\");\n    if (!this.progress.accepted)\n      throw CriticError(\"review is not accepted\");\n\n    setReviewState(this, user, \"closed\", \"closing\");\n  };\n\nCriticReview.prototype.drop = function (user)\n  {\n    if (this.state != \"open\")\n      throw CriticError(\"review is not open\");\n    if (this.progress.accepted)\n      throw CriticError(\"review is accepted\");\n\n    setReviewState(this, user, \"dropped\", \"dropping\");\n  };\n\nCriticReview.prototype.reopen = function (user)\n  {\n    if (this.state == \"open\")\n      throw CriticError(\"review is already open\");\n\n    setReviewState(this, user, \"open\", \"opening\");\n  };\n\nCriticReview.create = function (data)\n  {\n    if (!(\"upstream\" in data))\n      throw CriticError(\"missing argument: upstream\");\n    if (!(\"summary\" in data))\n      throw CriticError(\"missing argument: summary\");\n    if (!(\"branch\" in data))\n      throw CriticError(\"missing argument: branch\");\n    if (!(\"owner\" in data))\n      throw CriticError(\"missing argument: owner\");\n\n    var 
upstream = data.upstream;\n    var summary = String(data.summary);\n    var description = data.description || null;\n    var branch = String(data.branch);\n    var owner = data.owner;\n\n    if (!(upstream instanceof CriticCommit))\n      throw CriticError(\"invalid argument: upstream\");\n    if (branch.substring(0, 2) != \"r/\")\n      throw CriticError(\"invalid argument: branch (doesn't have 'r/' prefix)\");\n    if (!(owner instanceof CriticUser))\n      throw CriticError(\"invalid argument: owner\");\n\n    var repository = data.repository;\n\n    if (!(repository instanceof CriticRepository))\n      throw CriticError(\"invalid argument: repository\");\n\n    try\n    {\n      repository.revparse(branch);\n      branch = false;\n    }\n    catch (exception)\n    {\n    }\n\n    if (!branch)\n      throw CriticError(\"invalid argument: branch (already exists)\");\n\n    repository.run(\"branch\", branch, upstream.sha1);\n\n    var branch_id = db.execute(\"INSERT INTO branches (name, head, tail, repository) VALUES (%s, %d, %d, %d) RETURNING id\",\n                               branch, upstream.id, upstream.id, repository.id)[0].id;\n\n    var review_id = db.execute(\"INSERT INTO reviews (type, branch, state, summary, description) VALUES (%s, %d, %s, %s, %s) RETURNING id\",\n                               \"official\", branch_id, \"open\", summary, description)[0].id;\n\n    db.execute(\"INSERT INTO reviewusers (review, uid, owner) VALUES (%d, %d, TRUE)\", review_id, owner.id);\n\n    return new CriticReview(review_id);\n  };\n\nCriticReview.find = function (data)\n  {\n    var result;\n\n    if (data.repositoryURL && data.branchName)\n    {\n      if (data.repositoryURL.substring(0, hostname.length + 1) == hostname + \":\")\n        result = db.execute(\"SELECT reviews.id \" +\n                            \"  FROM reviews \" +\n                            \"  JOIN branches ON (branches.id=reviews.branch) \" +\n                            \"  JOIN repositories ON (repositories.id=branches.repository) \" +\n                            \" WHERE branches.name=%s \" +\n    
                        \"   AND repositories.path=%s\",\n                            data.branchName,\n                            data.repositoryURL.substring(hostname.length + 1));\n      else\n        result = db.execute(\"SELECT reviews.id \" +\n                            \"  FROM reviews \" +\n                            \"  JOIN branches ON (branches.id=reviews.branch) \" +\n                            \"  JOIN trackedbranches ON (trackedbranches.repository=branches.repository \" +\n                            \"                       AND trackedbranches.local_name=branches.name) \" +\n                            \" WHERE trackedbranches.remote_name=%s \" +\n                            \"   AND trackedbranches.remote=%s\",\n                            data.branchName,\n                            data.repositoryURL);\n    }\n\n    return scoped(\n      result,\n      function ()\n      {\n        return this.apply(\n          function (review_id)\n          {\n            return new CriticReview(review_id);\n          });\n      });\n  };\n\nCriticReview.list = function (data)\n  {\n    data = data || {};\n\n    var tables = [\"reviews\"];\n    var conditions = [\"TRUE\"];\n    var argv = [];\n\n    if (data.repository)\n    {\n      var repository_id, repository_name;\n\n      if (data.repository instanceof CriticRepository)\n        repository_id = data.repository.id;\n      else if (parseInt(data.repository) === data.repository)\n        repository_id = data.repository;\n      else\n        repository_name = String(data.repository);\n\n      tables.push(\"branches ON (branches.id=reviews.branch)\");\n\n      if (repository_id !== void 0)\n      {\n        conditions.push(\"branches.repository=%d\");\n        argv.push(repository_id);\n      }\n      else\n      {\n        tables.push(\"repositories ON (repositories.id=branches.repository)\");\n        conditions.push(\"repositories.name=%s\");\n        argv.push(repository_name);\n      }\n    }\n\n    
if (data.state)\n    {\n      var valid_states = { open: true, closed: true, dropped: true };\n\n      if (!(data.state in valid_states))\n        throw CriticError(format(\"invalid argument: data.state=%r not valid\", String(data.state)));\n\n      conditions.push(\"reviews.state=%s\");\n      argv.push(data.state);\n    }\n\n    if (data.owner)\n    {\n      var owner_id, owner_name;\n\n      if (data.owner instanceof CriticUser)\n        owner_id = data.owner.id;\n      else if (parseInt(data.owner) === data.owner)\n        owner_id = data.owner;\n      else\n        owner_name = String(data.owner);\n\n      tables.push(\"reviewusers ON (reviewusers.review=reviews.id)\");\n      conditions.push(\"reviewusers.owner\");\n\n      if (owner_id !== void 0)\n      {\n        conditions.push(\"reviewusers.uid=%d\");\n        argv.push(owner_id);\n      }\n      else\n      {\n        tables.push(\"users ON (users.id=reviewusers.uid)\");\n        conditions.push(\"users.name=%s\");\n        argv.push(owner_name);\n      }\n    }\n\n    var query = format(\n      \"SELECT reviews.id FROM %(tables)s WHERE %(conditions)s ORDER BY reviews.id\",\n      { tables: tables.join(\" JOIN \"),\n        conditions: conditions.join(\" AND \") });\n\n    return scoped(\n      db.execute.bind(db, query).apply(null, argv),\n      function ()\n      {\n        return this.apply(\n          function (review_id)\n          {\n            return new CriticReview(review_id);\n          });\n      });\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-statistics.js",
"content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticStatistics()\n{\n  this.review = null;\n  this.repository = null;\n  this.interval_start = null;\n  this.interval_end = null;\n  this.user = null;\n  this.directories = null;\n  this.files = null;\n\n  Object.seal(this);\n}\n\nCriticStatistics.prototype.setReview = function (review)\n  {\n    if (!(review instanceof CriticReview))\n      throw CriticError(\"invalid argument; expected Review object\");\n\n    this.review = review;\n  };\n\nCriticStatistics.prototype.setRepository = function (repository)\n  {\n    if (!(repository instanceof CriticRepository))\n      throw CriticError(\"invalid argument; expected Repository object\");\n\n    this.repository = repository;\n  };\n\nCriticStatistics.prototype.setInterval = function (start, end)\n  {\n    function mangle(item, argument)\n    {\n      if (Object.prototype.toString.call(item) == \"[object Date]\")\n        return item;\n\n      var string = String(item);\n\n      if (/^\\d+ (?:second|minute|hour|day|week|month|year)s?$/.test(string))\n        return string;\n      else\n        throw CriticError(format(\"invalid %s argument; expected Date object or string specifying an interval\", argument));\n    }\n\n    this.interval_start = mangle(start, \"start\");\n\n    if (end)\n      this.interval_end = mangle(end, \"end\");\n  };\n\nCriticStatistics.prototype.setUser = function (user)\n  {\n    if (!(user instanceof CriticUser))\n      throw CriticError(\"invalid argument; expected User object\");\n    if (this.user)\n      throw CriticError(\"invalid use; user already set\");\n\n    this.user = user;\n  };\n\nCriticStatistics.prototype.addDirectory = function (directory)\n  {\n    if (!this.directories)\n      this.directories = [];\n\n    /* Strip any trailing slashes; the directory need not be given with one. */\n    this.directories.push(String(directory).replace(/\\/+$/, \"\"));\n  };\n\nCriticStatistics.prototype.addFile = function (file)\n  {\n    if (!this.files)\n      this.files = [];\n\n    this.files.push(String(file));\n  };\n\nCriticStatistics.prototype.getReviewedLines = function (data)\n  {\n    var grouping = (data && data.grouping) || [\"user\"];\n\n    if (!(\"length\" in grouping))\n      throw CriticError(\"invalid data.grouping argument; expected Array object\");\n    else if (!grouping.length)\n      throw CriticError(\"invalid data.grouping argument; expected non-empty array\");\n\n    var grouping_columns = [];\n\n    for (var index = 0; index < grouping.length; ++index)\n      switch (grouping[index])\n      {\n      case \"user\":\n        grouping_columns.push(\"reviewfilechanges.uid\");\n        break;\n      case \"file\":\n        grouping_columns.push(\"reviewfiles.file\");\n        break;\n      case \"review\":\n        grouping_columns.push(\"reviewfiles.review\");\n        break;\n      default:\n        throw CriticError(format(\"invalid data.grouping[%d] value; expected 'user', 'file' or 'review'\", index));\n      }\n\n    grouping_columns = grouping_columns.join(\", \");\n\n    var filteredfiles_join;\n\n    if (this.directories || this.files)\n      filteredfiles_join = \" filteredfiles JOIN reviewfiles ON (filteredfiles.file=reviewfiles.file)\";\n    else\n      filteredfiles_join = \" reviewfiles\";\n\n    var repository_join;\n\n    if (this.repository)\n      repository_join = \" JOIN reviews ON 
(reviews.id=reviewfiles.review) JOIN branches ON (branches.id=reviews.branch)\";\n    else\n      repository_join = \"\";\n\n    var query = \"SELECT \" + grouping_columns + \", SUM(deleted) AS deleted, SUM(inserted) AS inserted FROM\" + filteredfiles_join + \" JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\" + repository_join + \" WHERE reviewfilechanges.state='performed' AND reviewfilechanges.to='reviewed'\";\n\n    var params = {};\n\n    if (this.review)\n    {\n      query += \" AND (reviewfiles.review=%(review.id)d)\";\n      params[\"review.id\"] = this.review.id;\n    }\n\n    if (this.repository)\n    {\n      query += \" AND (branches.repository=%(repository.id)d)\";\n      params[\"repository.id\"] = this.repository.id;\n    }\n\n    if (this.user)\n    {\n      query += \" AND (reviewfilechanges.uid=%(user.id)d)\";\n      params[\"user.id\"] = this.user.id;\n    }\n\n    if (this.directories || this.files)\n    {\n      db.execute(\"CREATE TEMPORARY TABLE filteredfiles ( file INTEGER PRIMARY KEY ) ON COMMIT DROP\");\n\n      if (this.directories)\n        for (var index = 0; index < this.directories.length; ++index)\n          db.execute(\"INSERT INTO filteredfiles (file) SELECT id FROM files WHERE path LIKE %s\", this.directories[index] + \"/%\");\n\n      if (this.files)\n        for (var index = 0; index < this.files.length; ++index)\n          db.execute(\"INSERT INTO filteredfiles (file) SELECT id FROM files WHERE MD5(path)=MD5(%s)\", this.files[index]);\n    }\n\n    if (this.interval_start)\n    {\n      if (typeof this.interval_start === \"object\")\n      {\n        query += \" AND (reviewfilechanges.time >= %(start)s::timestamp)\";\n        params[\"start\"] = Date.prototype.toSQLTimestamp.call(this.interval_start);\n      }\n      else\n      {\n        query += \" AND (reviewfilechanges.time >= now() - %(start)s::interval)\";\n        params[\"start\"] = this.interval_start;\n      }\n    }\n\n    if (this.interval_end)\n  
  {\n      if (typeof this.interval_end === \"object\")\n      {\n        query += \" AND (reviewfilechanges.time <= %(end)s::timestamp)\";\n        params[\"end\"] = Date.prototype.toSQLTimestamp.call(this.interval_end);\n      }\n      else\n      {\n        query += \" AND (reviewfilechanges.time <= now() - %(end)s::interval)\";\n        params[\"end\"] = this.interval_end;\n      }\n    }\n\n    query += \" GROUP BY \" + grouping_columns;\n\n    var result = db.execute(query, params);\n    var all_data = {};\n\n    for (var row_index = 0; row_index < result.length; ++row_index)\n    {\n      var row = result[row_index], data = all_data;\n\n      for (var column_index = 0; column_index < grouping.length - 1; ++column_index)\n        data = data[row[column_index]] || (data[row[column_index]] = {});\n\n      var counts = data[row[column_index]] || (data[row[column_index]] = Object.create(null, { deleteCount: { value: 0, writable: true },\n                                                                                               insertCount: { value: 0, writable: true }}));\n\n      counts.deleteCount += row.deleted;\n      counts.insertCount += row.inserted;\n    }\n\n    /* This drops the temporary table created above. 
*/\n    db.rollback();\n\n    return all_data;\n  };\n\nCriticStatistics.prototype.getWrittenComments = function ()\n  {\n    var filteredfiles_join;\n\n    if (this.directories || this.files)\n      filteredfiles_join = \" filteredfiles JOIN commentchains ON (filteredfiles.file=commentchains.file)\";\n    else\n      filteredfiles_join = \" commentchains\";\n\n    var chains_query = \"SELECT uid, type, COUNT(id) AS count FROM\" + filteredfiles_join + \" WHERE state!='draft' AND state!='empty'\";\n    var comments_query = \"SELECT comments.uid AS uid, COUNT(comments.id) AS count, SUM(CHARACTER_LENGTH(comments.comment)) AS characters FROM\" + filteredfiles_join + \" JOIN comments ON (comments.chain=commentchains.id) WHERE comments.state='current'\";\n    var params = {};\n\n    if (this.review)\n    {\n      chains_query += \" AND (review=%(review.id)d)\";\n      comments_query +=  \" AND (review=%(review.id)d)\";\n      params[\"review.id\"] = this.review.id;\n    }\n\n    if (this.user)\n    {\n      chains_query += \" AND (uid=%(user.id)d)\";\n      comments_query += \" AND (comments.uid=%(user.id)d)\";\n      params[\"user.id\"] = this.user.id;\n    }\n\n    if (this.directories || this.files)\n    {\n      db.execute(\"CREATE TEMPORARY TABLE filteredfiles ( file INTEGER PRIMARY KEY ) ON COMMIT DROP\");\n\n      if (this.directories)\n        for (var index = 0; index < this.directories.length; ++index)\n          db.execute(\"INSERT INTO filteredfiles (file) SELECT id FROM files WHERE path LIKE %s\", this.directories[index] + \"/%\");\n\n      if (this.files)\n        for (var index = 0; index < this.files.length; ++index)\n          db.execute(\"INSERT INTO filteredfiles (file) SELECT id FROM files WHERE MD5(path)=MD5(%s)\", this.files[index]);\n    }\n\n    if (this.interval_start)\n    {\n      if (typeof this.interval_start === \"object\")\n      {\n        chains_query += \" AND (time >= %(start)s::timestamp)\";\n        comments_query += \" AND 
(comments.time >= %(start)s::timestamp)\";\n        params[\"start\"] = this.interval_start.toSQLTimestamp();\n      }\n      else\n      {\n        chains_query += \" AND (time >= now() - %(start)s::interval)\";\n        comments_query += \" AND (comments.time >= now() - %(start)s::interval)\";\n        params[\"start\"] = this.interval_start;\n      }\n    }\n\n    if (this.interval_end)\n    {\n      if (typeof this.interval_end === \"object\")\n      {\n        chains_query += \" AND (time <= %(end)s::timestamp)\";\n        comments_query += \" AND (comments.time <= %(end)s::timestamp)\";\n        params[\"end\"] = this.interval_end.toSQLTimestamp();\n      }\n      else\n      {\n        chains_query += \" AND (time <= now() - %(end)s::interval)\";\n        comments_query += \" AND (comments.time <= now() - %(end)s::interval)\";\n        params[\"end\"] = this.interval_end;\n      }\n    }\n\n    chains_query += \" GROUP BY uid, type\";\n    comments_query += \" GROUP BY comments.uid\";\n\n    var data = {};\n\n    function getPerUser(user_id)\n    {\n      return data[user_id] || (data[user_id] = Object.create(null, { raisedIssues: { value: 0, writable: true },\n                                                                     writtenNotes: { value: 0, writable: true },\n                                                                     totalComments: { value: 0, writable: true },\n                                                                     totalCharacters: { value: 0, writable: true }}));\n    }\n\n    var result = db.execute(chains_query, params);\n    for (var index = 0; index < result.length; ++index)\n    {\n      var row = result[index];\n      var per_user = getPerUser(row.uid)\n\n      if (row.type == \"issue\")\n        per_user.raisedIssues += row.count;\n      else\n        per_user.writtenNotes += row.count;\n    }\n\n    var result = db.execute(comments_query, params);\n    for (var index = 0; index < result.length; ++index)\n    
{\n      var row = result[index];\n      var per_user = getPerUser(row.uid);\n\n      per_user.totalComments += row.count;\n      per_user.totalCharacters += row.characters;\n    }\n\n    /* This drops the temporary table created above. */\n    db.rollback();\n\n    return data;\n  };\n"
  },
  {
    "path": "src/library/js/v8/critic-storage.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticStorage(user)\n{\n  this.has = function (key)\n    {\n      if (key.length > 64)\n        throw CriticError(format(\"%s: key length exceeds 64 characters\", key));\n\n      var result = db.execute(\"SELECT 1 FROM extensionstorage WHERE extension=%d AND uid=%d AND key=%s\", extension_id, user.id, key)[0];\n      if (result)\n        return true;\n      else\n        return false;\n    };\n\n  this.get = function (key)\n    {\n      if (key.length > 64)\n        throw CriticError(format(\"%s: key length exceeds 64 characters\", key));\n\n      var result = db.execute(\"SELECT text FROM extensionstorage WHERE extension=%d AND uid=%d AND key=%s\", extension_id, user.id, key)[0];\n      if (result)\n        return result.text;\n      else\n        return null;\n    };\n\n  this.set = function (key, text)\n    {\n      if (key.length > 64)\n        throw CriticError(format(\"%s: key length exceeds 64 characters\", key));\n\n      /* Roll the current transaction back first just to make sure we don't\n         commit anything else by mistake.  The current transaction (if there is\n         one) should be \"empty,\" and if it isn't, we don't want to keep it. 
*/\n      db.rollback();\n\n      var result = db.execute(\"SELECT 1 FROM extensionstorage WHERE extension=%d AND uid=%d AND key=%s\", extension_id, user.id, key)[0];\n      if (result)\n        db.execute(\"UPDATE extensionstorage SET text=%s WHERE extension=%d AND uid=%d AND key=%s\", text, extension_id, user.id, key);\n      else\n        db.execute(\"INSERT INTO extensionstorage (extension, uid, key, text) VALUES (%d, %d, %s, %s)\", extension_id, user.id, key, text);\n\n      /* This code has a possible race-condition: someone else might insert a\n         row for this extension-user-key triple between the SELECT and the\n         INSERT above.  It would be excessively unlikely, though, and it would\n         simply cause the transaction commit to fail with a constraint violation\n         error (the extension-user-key triple is the table's primary key.) */\n      db.commit();\n    };\n\n  this.remove = function (key)\n    {\n      /* Roll the current transaction back first just to make sure we don't\n         commit anything else by mistake.  The current transaction (if there is\n         one) should be \"empty,\" and if it isn't, we don't want to keep it. 
*/\n      db.rollback();\n      db.execute(\"DELETE FROM extensionstorage WHERE extension=%d AND uid=%d AND key=%s\", extension_id, user.id, key);\n      db.commit();\n    };\n\n  this.list = function (data)\n    {\n      var condition, value;\n\n      if (!data)\n      {\n        condition = \"%s\";\n        value = true;\n      }\n      else if (data.like)\n      {\n        condition = \"key LIKE %s\";\n        value = data.like;\n      }\n      else if (data.regexp)\n      {\n        condition = \"key ~ %s\";\n        value = data.regexp;\n      }\n      else\n        throw new CriticError(\"invalid arguments\");\n\n      var result = db.execute(\"SELECT key FROM extensionstorage WHERE extension=%d AND uid=%d AND \" + condition + \" ORDER BY key ASC\", extension_id, user.id, value);\n\n      return result.apply(function (key) { return key; });\n    };\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-text.js",
"content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction reflow(text, line_length, indent)\n{\n  /* Normalize CRLF and lone CR line endings to LF. */\n  text = text.replace(/\\r\\n?/g, \"\\n\");\n  indent = indent || \"\";\n\n  /* Zero line-length means reflowing is disabled (by user configuration.) */\n  if (line_length == 0)\n    return text;\n\n  var paragraphs = text.split(/\\n\\n+/g);\n\nparagraph_loop:\n  for (var pindex = 0; pindex < paragraphs.length; ++pindex)\n  {\n    var lines = paragraphs[pindex].split(\"\\n\");\n\n    for (var lindex = 0; lindex < lines.length; ++lindex)\n    {\n      var line = lines[lindex];\n      if (/^[ \\t\\-*]/.test(line) || lindex != lines.length - 1 && line.length < line_length * .5)\n      {\n        /* Paragraph seems to be something other than plain text; don't reflow. 
*/\n        if (indent)\n          paragraphs[pindex] = lines.map(function (line) { return indent + line; }).join(\"\\n\");\n        continue paragraph_loop;\n      }\n    }\n\n    var new_lines = [];\n    var new_line = indent;\n    var words = paragraphs[pindex].split(/(\\s+)/g);\n    var ws = \"\";\n\n    for (var windex = 0; windex < words.length; ++windex)\n    {\n      var word = words[windex];\n      if (!word.trim())\n        if (word.indexOf(\"\\n\") != -1)\n          ws = \" \";\n        else\n          ws = word;\n      else\n      {\n        if (new_line != indent)\n          if (new_line.length + ws.length + word.length > line_length)\n          {\n            new_lines.push(new_line);\n            new_line = indent;\n          }\n          else\n            new_line += ws;\n\n        new_line += word;\n      }\n    }\n\n    if (new_line)\n      new_lines.push(new_line);\n\n    paragraphs[pindex] = new_lines.join(\"\\n\");\n  }\n\n  return paragraphs.join(format(\"\\n%s\\n\", indent.trim()));\n}\n\nfunction repeat(s, n)\n{\n  return Array(n + 1).join(s);\n}\n\nfunction spaces(n)\n{\n  return repeat(\" \", n);\n}\n\n/* items = [(path, deleted, inserted), ...] */\nfunction renderFilesLines(items, indent)\n{\n  items = items.slice().sort(function (x, y) { x = x[0]; y = y[0]; switch (true) { case x < y: return -1; case x > y: return 1; default: return 0; } });\n\n  var path_width = Math.max.apply(null, items.map(function (item) { return item[0].length; }));\n  var delete_max = Math.max.apply(null, items.map(function (item) { return item[1]; }));\n  var delete_width = delete_max ? String(delete_max).length : 0;\n  var insert_max = Math.max.apply(null, items.map(function (item) { return item[2]; }));\n  var insert_width = insert_max ? 
String(insert_max).length : 0;\n\n  function renderLines(item)\n  {\n    var result;\n\n    if (item[1])\n      result = format(\"%s-%d\", spaces(delete_width - String(item[1]).length), item[1]);\n    else if (item[2])\n      result = spaces(delete_width + 1);\n\n    if (item[2])\n      result += format(\" %s+%d\", spaces(insert_width - String(item[2]).length), item[2]);\n\n    return result;\n  }\n\n  var item = items[0];\n  var path = item[0];\n  var result = format(\"%s%s%s %s\\n\", indent, path, spaces(path_width - path.length), renderLines(item));\n  var previous = path.split(\"/\");\n\n  for (var iindex = 1; iindex < items.length; ++iindex)\n  {\n    item = items[iindex];\n    path = item[0];\n\n    var components = path.split(\"/\");\n    var common_prefix_length = 0;\n\n    for (var cindex = 0, ccount = Math.min(previous.length, components.length); cindex < ccount; ++cindex)\n      if (previous[cindex] == components[cindex])\n        common_prefix_length += 1 + components[cindex].length;\n      else\n        break;\n\n    if (common_prefix_length > 4)\n      path = format(\"%s.../%s\", spaces(common_prefix_length - 4), path.substring(common_prefix_length));\n\n    result += format(\"%s%s%s %s\\n\", indent, path, spaces(path_width - path.length), renderLines(item));\n\n    previous = components;\n  }\n\n  return result;\n}\n"
  },
  {
    "path": "src/library/js/v8/critic-trackedbranch.js",
"content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2014 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction signalBranchTracker()\n{\n  if (branchtracker_pid_path)\n  {\n    var pid = parseInt(IO.File.read(branchtracker_pid_path));\n    var SIGHUP = 1;\n\n    OS.Process.kill(pid, SIGHUP);\n  }\n}\n\nfunction CriticTrackedBranch(id, data) {\n  data = data || {};\n\n  var self = this;\n  var branch_id, branch = data.branch || void 0;\n  var review_id, review = data.review || void 0;\n  var remote_name, disabled, pending;\n\n  this.id = id;\n\n  scoped(\n    db.execute((\"SELECT branches.id, remote, remote_name, forced, disabled, updating,\" +\n                \"       next IS NULL AS pending\" +\n                \"  FROM trackedbranches\" +\n                \"  JOIN branches ON (branches.repository=trackedbranches.repository\" +\n                \"                AND branches.name=trackedbranches.local_name)\" +\n                \" WHERE trackedbranches.id=%d\"),\n               id),\n    function () {\n      var row = this[0];\n\n      if (!row)\n        throw CriticError(format(\"invalid tracked branch id: %d\", id));\n\n      branch_id = row.id;\n      remote_name = row.remote_name;\n      disabled = row.disabled;\n      pending = row.pending;\n\n      self.remote = row.remote;\n      self.forced = row.forced;\n      self.updating = row.updating;\n    });\n\n  if (!review) {\n    scoped(\n   
   db.execute((\"SELECT id\" +\n                  \"  FROM reviews\" +\n                  \" WHERE branch=%d\"),\n                 branch_id),\n      function () {\n        var row = this[0];\n\n        if (row)\n          review_id = row.id;\n      });\n  }\n\n  function getRemoteName() {\n    return remote_name;\n  }\n\n  function getDisabled() {\n    return disabled;\n  }\n\n  function getPending() {\n    return pending;\n  }\n\n  function getBranch() {\n    if (branch === void 0)\n      branch = new CriticBranch({ id: branch_id });\n    return branch;\n  }\n\n  function getReview() {\n    if (review === void 0) {\n      if (review_id !== void 0)\n        review = new CriticReview(review_id);\n      else\n        review = null;\n    }\n    return review;\n  }\n\n  this.enable = function (new_name) {\n    disabled = false;\n\n    if (new_name)\n      remote_name = new_name;\n    else\n      new_name = remote_name;\n\n    db.execute((\"UPDATE trackedbranches\" +\n                \"   SET remote_name=%s,\" +\n                \"       disabled=FALSE\" +\n                \" WHERE id=%d\"),\n               new_name, this.id);\n\n    this.triggerUpdate();\n  };\n\n  this.disable = function () {\n    disabled = true;\n\n    db.execute((\"UPDATE trackedbranches\" +\n                \"   SET disabled=TRUE\" +\n                \" WHERE id=%d\"),\n               this.id);\n    db.commit();\n  };\n\n  this.triggerUpdate = function () {\n    pending = true;\n\n    db.execute((\"UPDATE trackedbranches\" +\n                \"   SET next=NULL\" +\n                \" WHERE id=%d\"),\n               this.id);\n    db.commit();\n\n    signalBranchTracker();\n  };\n\n  Object.defineProperties(this, { name: { get: getRemoteName,\n                                          enumerable: true },\n                                  disabled: { get: getDisabled,\n                                              enumerable: true },\n                                  pending: { get: getPending,\n 
                                            enumerable: true },\n                                  branch: { get: getBranch,\n                                            enumerable: true },\n                                  review: { get: getReview,\n                                            enumerable: true } });\n  Object.freeze(this);\n}\n\nCriticTrackedBranch.prototype.getLogEntry = function (value) {\n  if (value === void 0)\n    value = this.branch.head.sha1;\n\n  var result = db.execute((\"  SELECT time, from_sha1, to_sha1, hook_output, \" +\n                           \"         successful\" +\n                           \"    FROM trackedbranchlog\" +\n                           \"   WHERE branch=%d\" +\n                           \"     AND to_sha1=%s\" +\n                           \"ORDER BY time DESC\" +\n                           \"   LIMIT 1\"),\n                          this.id, value);\n  var row = result[0];\n\n  if (!row)\n    return null;\n\n  var from_commit = null, to_commit = null;\n\n  if (row.from_sha1 && !/0{40}/.test(row.from_sha1))\n    from_commit = this.branch.repository.getCommit(row.from_sha1);\n  if (row.to_sha1 && !/0{40}/.test(row.to_sha1))\n    to_commit = this.branch.repository.getCommit(row.to_sha1);\n\n  return Object.freeze({ time: row.time,\n                         oldValue: from_commit,\n                         newValue: to_commit,\n                         hookOutput: row.hook_output,\n                         successful: row.successful });\n};\n\nCriticTrackedBranch.find = function (data) {\n  var result;\n\n  if (data.branch) {\n    result = db.execute((\"SELECT id\" +\n                         \"  FROM trackedbranches\" +\n                         \" WHERE repository=%d\" +\n                         \"   AND local_name=%s\"),\n                        data.branch.repository.id, data.branch.name);\n  } else if (data.remote && data.name) {\n    result = db.execute((\"SELECT id\" +\n                         \"  FROM 
trackedbranches\" +\n                         \" WHERE remote=%s\" +\n                         \"   AND remote_name=%s\"),\n                        data.remote, data.name);\n  } else {\n    throw CriticError(\"invalid input\");\n  }\n\n  var row = result[0];\n\n  if (row)\n    return new CriticTrackedBranch(row.id, data);\n  else\n    return null;\n};\n"
  },
  {
    "path": "src/library/js/v8/critic-user.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction CriticFilter(user, repository, data)\n{\n  this.id = data.id;\n  this.user = user;\n  this.repository = repository;\n  this.path = data.path;\n  this.type = data.type;\n\n  if (this.type == \"reviewer\")\n    if (data.delegates)\n    {\n      this.delegates = data.delegates.split(/\\s*,\\s*|\\s+/g).map(\n        function (name)\n        {\n          return new CriticUser({ name: name });\n        });\n      Object.freeze(this.delegates);\n    }\n    else\n      this.delegates = [];\n  else\n    this.delegates = null;\n\n  Object.freeze(this);\n}\n\nfunction CriticUser(data)\n{\n  var user_id;\n\n  if (data && typeof data == \"object\" && data instanceof CriticUser)\n    return data;\n  else if (typeof data == \"number\")\n    user_id = data;\n  else if (data.id)\n    user_id = data.id;\n  else\n  {\n    var result, name;\n\n    if (typeof data == \"string\")\n      name = data;\n\n    if (name || data.name)\n    {\n      result = db.execute(\"SELECT id FROM users WHERE name=%s\", name || data.name)[0];\n\n      if (!result)\n        throw CriticError(format(\"%s: no such user\", name || data.name));\n    }\n    else\n      throw CriticError(\"invalid argument; dictionary must specify one of id and name\");\n\n    user_id = result.id;\n  }\n\n  var result = db.execute(\"SELECT 
name, useremails.email, verified, fullname \" +\n                            \"FROM users \" +\n                \" LEFT OUTER JOIN useremails ON (useremails.id=users.email) \" +\n                           \"WHERE users.id=%d\",\n                          user_id)[0];\n\n  if (!result)\n    throw CriticError(format(\"%d: invalid user ID\", user_id));\n\n  this.id = user_id;\n  this.name = result.name;\n  this.email = result.verified !== false ? result.email : null;\n  this.fullname = result.fullname;\n  this.isAnonymous = false;\n\n  if (data.name && this.name != data.name)\n    throw CriticError(\"invalid argument; name in dictionary doesn't match id in dictionary\");\n\n  result = null;\n\n  Object.freeze(this);\n}\n\nObject.defineProperties(CriticUser.prototype, {\n  toString: { value: function () { return format(\"%(fullname)s <%(email)s>\", this); } },\n  valueOf: { value: function () { return this.id; } },\n\n  getPreference: {\n    value: function (item)\n      {\n        var result = db.execute(\"SELECT type FROM preferences WHERE item=%s\", item)[0];\n\n        if (!result)\n          throw CriticError(format(\"%s: no such preference\", item));\n\n        var value_column, value_filter = function (value) { return value; };\n\n        switch (result.type)\n        {\n        case \"boolean\":\n          value_filter = Boolean;\n\n        case \"number\":\n          value_column = \"integer\";\n          break;\n\n        case \"string\":\n          value_column = \"string\";\n        }\n\n        var result = db.execute(\"  SELECT \" + value_column + \" AS value\" +\n                                \"    FROM userpreferences\" +\n                                \"   WHERE item=%s\" +\n                                \"     AND (uid=%d OR uid IS NULL)\" +\n                                \"     AND repository IS NULL\" +\n                                \"     AND filter IS NULL \" +\n                                \"ORDER BY uid NULLS LAST\",\n             
                   item, this.id)[0];\n\n        return value_filter(result.value);\n      }\n  },\n\n  isAuthor: {\n    value: function (commit)\n      {\n        if (this.isAnonymous)\n          return false;\n\n        return this.email == commit.author.email;\n      }\n  },\n\n  hasRole: {\n    value: function (role)\n      {\n        if (this.isAnonymous)\n          return false;\n\n        return Boolean(db.execute(\"SELECT 1 FROM userroles WHERE uid=%d AND role=%s\", this.id, role)[0]);\n      }\n  },\n\n  getFilters: {\n    value: function (repository)\n      {\n        if (this.isAnonymous)\n          throw CriticError(\"not supported; user is anonymous\");\n\n        if (!(repository instanceof CriticRepository))\n          throw CriticError(\"invalid argument: expected Repository object\");\n\n        var result = db.execute(\"SELECT id, \" +\n                                \"       path, \" +\n                                \"       type, \" +\n                                \"       delegate AS delegates \" +\n                                \"  FROM filters \" +\n                                \" WHERE uid=%d \" +\n                                \"   AND repository=%d\",\n                                this.id, repository.id);\n        var filters = [];\n\n        for (var index = 0; index < result.length; ++index)\n          filters.push(new CriticFilter(this, repository, result[index]));\n\n        Object.freeze(filters);\n\n        return filters;\n      }\n  },\n\n  addFilter: {\n    value: function (repository, path, filter_type, delegates)\n      {\n        if (this.isAnonymous)\n          throw CriticError(\"not supported; user is anonymous\");\n\n        if (!(repository instanceof CriticRepository))\n          throw CriticError(\"invalid argument: expected Repository object\");\n\n        path = String(path);\n        filter_type = String(filter_type);\n\n        if (filter_type != \"reviewer\" && filter_type != \"watcher\" && 
filter_type != \"ignored\")\n          throw CriticError(\"invalid argument: filter type must be 'reviewer', 'watcher' or 'ignored'\");\n        if (/[^\\/]\\*\\*|\\*\\*[^\\/]/.test(path))\n          throw CriticError(\"invalid wildcards in path argument\");\n\n        if (delegates)\n        {\n          delegates = String(delegates).split(/\\s*,\\s*|\\s+/g);\n          for (var index = 0; index < delegates.length; ++index)\n          {\n            if (db.execute(\"SELECT 1 FROM users WHERE name=%s\", delegates[index]).length == 0)\n              throw CriticError(format(\"invalid delegate '%s': no such user\", delegates[index]));\n          }\n          delegates = delegates.join(\",\");\n        }\n        if (!delegates)\n          delegates = null;\n\n        if (filter_type != \"reviewer\" && delegates)\n          throw CriticError(format(\"'%s' filter should not have delegates\", filter_type));\n\n        var filter_id = db.execute(\"INSERT INTO filters (repository, uid, path, type, delegate) \" +\n                                   \"     VALUES (%d, %d, %s, %s, %s) \" +\n                                   \"  RETURNING id\",\n                                   repository.id, this.id, path, filter_type, delegates)[0].id;\n        db.commit();\n\n        return new CriticFilter(this, repository, { id: filter_id,\n                                                    path: path,\n                                                    type: filter_type,\n                                                    delegates: delegates });\n      }\n  }\n});\n\nfunction CriticAnonymousUser()\n{\n  this.id = null;\n  this.name = null;\n  this.email = null;\n  this.fullname = null;\n  this.isAnonymous = true;\n\n  Object.freeze(this);\n}\n\nCriticAnonymousUser.prototype = Object.create(CriticUser.prototype);\n\nObject.defineProperty(CriticUser, \"current\", { get: function () { return global.user; }, enumerable: true });\n"
  },
  {
    "path": "src/library/js/v8/critic.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nDate.prototype.format = function ()\n  {\n    return format(\"%04d-%02d-%02d %02d:%02d\", this.getFullYear(), this.getMonth() + 1, this.getDate(), this.getHours(), this.getMinutes());\n  };\n\nDate.prototype.toSQLTimestamp = function ()\n  {\n    return format(\"%04d-%02d-%02d %02d:%02d:%02d\", this.getFullYear(), this.getMonth() + 1, this.getDate(), this.getHours(), this.getMinutes(), this.getSeconds());\n  };\n\nfunction CriticError(message, exception)\n{\n  if (!this)\n    return new CriticError(message, exception);\n\n  this.name = \"CriticError\";\n  this.message = message;\n\n  if (exception)\n  {\n    this.message += \" (\" + exception.message + \")\";\n    this.stack = exception.stack;\n  }\n  else\n  {\n    try\n    {\n      /* Trigger a native exception to copy a stack trace from. 
*/\n      \"foo\"();\n    }\n    catch (exception)\n    {\n      this.stack = exception.stack.replace(/^([^\\n]+\\n){2}/, \"\");\n    }\n  }\n}\n\nCriticError.prototype = Object.create(Error.prototype);\n\nif (!String.prototype.startsWith)\n{\n  String.prototype.startsWith = function (prefix)\n    {\n      return this.length >= prefix.length && this.substring(0, prefix.length) == prefix;\n    };\n}\n\nError.prepareStackTrace = function (error, callsites)\n  {\n    /* Calling methods on CallSite objects sometimes throws an exception saying\n       \"illegal access.\"  Unknown whether this is a bug in V8 or in v8-jsshell,\n       but the latter seems more likely.  This has in particular been observed\n       when calling the \"isConstructor\" method. */\n    function checked(obj, name)\n    {\n      try\n      {\n        return obj[name]();\n      }\n      catch (error)\n      {\n        return format(\"<%s: %s>\", name, String(error));\n      }\n    }\n\n    function describeCallSite(callsite)\n    {\n      if (!callsite || typeof callsite == \"string\")\n        return \"<unknown call-site>\";\n      if (callsite.isEval())\n        return format(\"eval at %s\", describeCallSite(callsite.getEvalOrigin()));\n\n      var filename = checked(callsite, \"getFileName\");\n      var where;\n\n      if (filename)\n      {\n        if (filename.startsWith(library_path + \"/\"))\n          filename = \"<Library>/\" + filename.substring(library_path.length + 1);\n        else\n          filename = \"<Extension>/\" + filename;\n\n        where = format(\"at %s:%d\", filename, checked(callsite, \"getLineNumber\"));\n      }\n      else if (checked(callsite, \"isNative\"))\n        where = \"in native code\";\n      else\n        where = \"at unknown location\";\n\n      var fn = checked(callsite, \"getFunction\");\n      var fnname = checked(callsite, \"getFunctionName\");\n      var what;\n\n      if (fnname)\n      {\n        if (checked(callsite, \"isConstructor\") === 
true)\n          what = format(\"new %s()\", fnname);\n        else\n          what = format(\"%s()\", fnname);\n      }\n      /* Oddly enough, top-level program code also has a function, whose\n         .toString() returns \"function <program source>\". */\n      else if (fn && String(fn).startsWith(\"function (\"))\n        what = \"<unnamed function>\";\n      else\n        what = \"<program code>\";\n\n      return format(\"%s %s\", what, where);\n    }\n\n    return callsites.map(describeCallSite).join(\"\\n\");\n  };\n\nvar configuration = {\n  maxCommits: 1024\n};\n\nModule.load(\"critic-user.js\");\nModule.load(\"critic-file.js\");\nModule.load(\"critic-git.js\");\nModule.load(\"critic-commitset.js\");\nModule.load(\"critic-changeset.js\");\nModule.load(\"critic-branch.js\");\nModule.load(\"critic-dashboard.js\");\nModule.load(\"critic-review.js\");\nModule.load(\"critic-comment.js\");\nModule.load(\"critic-batch.js\");\nModule.load(\"critic-filters.js\");\nModule.load(\"critic-mail.js\");\nModule.load(\"critic-text.js\");\nModule.load(\"critic-html.js\");\nModule.load(\"critic-storage.js\");\nModule.load(\"critic-log.js\");\nModule.load(\"critic-statistics.js\");\nModule.load(\"critic-trackedbranch.js\");\nModule.load(\"critic-cli.js\");\n\nModule.assign(\"CriticError\", CriticError);\nModule.assign(\"Error\", CriticError);\n\nModule.assign(\"User\", CriticUser);\nModule.assign(\"AnonymousUser\", CriticAnonymousUser);\nModule.assign(\"File\", CriticFile.find);\nModule.assign(\"Repository\", CriticRepository);\nModule.assign(\"Branch\", CriticBranch);\nModule.assign(\"Dashboard\", CriticDashboard);\nModule.assign(\"Review\", CriticReview);\nModule.assign(\"OldBatch\", CriticBatch);\nModule.assign(\"CommentChain\", CriticCommentChain);\nModule.assign(\"CommitSet\", CriticCommitSet);\nModule.assign(\"Filters\", CriticFilters);\nModule.assign(\"Statistics\", CriticStatistics);\nModule.assign(\"Storage\", CriticStorage);\nModule.assign(\"MailTransaction\", 
CriticMailTransaction);\nModule.assign(\"TrackedBranch\", CriticTrackedBranch);\nModule.assign(\"Log\", CriticLog);\nModule.assign(\"html\", CriticHtml);\nModule.assign(\"text\", Object.freeze({ reflow: reflow }));\n\nvar extras_dir = format(\"%s/extras\", Module.path);\nvar extra_modules = [];\n\nif (IO.File.isDirectory(extras_dir))\n{\n  IO.File.listDirectory(extras_dir).forEach(\n    function (module_name)\n    {\n      var module_dir = format(\"%s/%s\", extras_dir, module_name);\n      var module_main_js = format(\"%s/main.js\", module_dir);\n      if (IO.File.isDirectory(module_dir) &&\n          IO.File.isRegularFile(module_main_js))\n      {\n        var module = new Module();\n        module.load(module_main_js);\n        if (typeof module.name == \"string\")\n          module_name = module.name;\n        Module.assign(module_name, module);\n        extra_modules.push(module);\n      }\n    });\n}\n\nModule.assign(\"Changeset\", Object.freeze(Object.defineProperty(function () {}, \"prototype\", { value: CriticChangeset.prototype })));\nModule.assign(\"MergeChangeset\", Object.freeze(Object.defineProperty(function () {}, \"prototype\", { value: CriticMergeChangeset.prototype })));\nModule.assign(\"ChangesetLine\", Object.freeze(CriticChangesetLineConstants));\n\nModule.assign(\"printProfilingData\", function () {});\n\nvar global = {};\nvar db = null, dbparams = null;\n\nvar library_path;\nvar hostname;\nvar extension_id;\nvar user_id;\nvar authentication_labels;\nvar role;\nvar git_executable;\nvar python_executable;\nvar python_path;\nvar repository_work_copy_path;\nvar changeset_address;\nvar branchtracker_pid_path;\nvar maildelivery_pid_path;\nvar is_development;\n\nfunction setup(data)\n{\n  if (!db)\n  {\n    var dbparams = {};\n\n    if (data.dbname)\n      dbparams.dbname = data.dbname;\n    if (data.dbuser)\n      dbparams.user = data.dbuser;\n    if (data.dbpass)\n      dbparams.password = data.dbpass;\n\n    db = new 
PostgreSQL.Connection(dbparams);\n  }\n\n  if (data.user_id)\n  {\n    global.user = new CriticUser(data.user_id);\n    if (data.extension_id)\n    {\n      Module.assign(\"storage\", global.storage = new CriticStorage(global.user));\n      Module.assign(\"log\", global.log = new CriticLog(global.user));\n    }\n  }\n  else\n  {\n    global.user = new CriticAnonymousUser();\n  }\n\n  library_path = data.library_path;\n  hostname = data.hostname;\n  extension_id = data.extension_id;\n  user_id = data.user_id;\n  authentication_labels = data.authentication_labels;\n  role = data.role;\n  git_executable = data.git;\n  python_executable = data.python;\n  python_path = data.python_path;\n  repository_work_copy_path = data.repository_work_copy_path;\n  changeset_address = data.changeset_address;\n  branchtracker_pid_path = data.branchtracker_pid_path;\n  maildelivery_pid_path = data.maildelivery_pid_path;\n  is_development = data.is_development;\n\n  IO.File.chdir(data.extension_path);\n\n  var pagesize = parseInt((new OS.Process(\"getconf PAGESIZE\", { shell: true })).call());\n\n  if (data.rlimit.cpu)\n    OS.Process.setrlimit(\"cpu\", data.rlimit.cpu);\n  if (data.rlimit.rss)\n    OS.Process.setrlimit(\"rss\", data.rlimit.rss * (1024 * 1024) / pagesize);\n\n  var critic = this;\n\n  extra_modules.forEach(\n    function (module)\n    {\n      if (typeof module.global.setup == \"function\")\n        module.global.setup(critic, data);\n      module.close();\n    });\n}\n\nfunction shutdown()\n{\n  if (db)\n    db.close();\n\n  for (var index = 0; index < all_repositories.length; ++index)\n    all_repositories[index].shutdown();\n}\n\nfunction connect(data)\n{\n  dbparams = {};\n\n  if (data.dbname)\n    dbparams.dbname = data.dbname;\n  if (data.dbuser)\n    dbparams.user = data.dbuser;\n  if (data.dbpass)\n    dbparams.password = data.dbpass;\n\n  db = new PostgreSQL.Connection(dbparams);\n}\n\nfunction reconnect()\n{\n  db.reset();\n  db = new 
PostgreSQL.Connection(dbparams);\n}\n\nModule.assign(\"setup\", setup);\nModule.assign(\"shutdown\", shutdown);\nModule.assign(\"connect\", connect);\nModule.assign(\"reconnect\", reconnect);\n"
  },
  {
    "path": "src/linkify.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nALL_LINKTYPES = []\n\nclass Context(object):\n    def __init__(self, db=None, request=None, repository=None, review=None, **kwargs):\n        self.db = db\n        self.request = request\n        self.repository = repository or (review.repository if review else None)\n        self.review = review\n        self.extra = kwargs\n\nclass LinkType(object):\n    \"\"\"\n    A link type object is responsible for providing a regexp fragment\n    that matches the words (or substrings) that the link type produces\n    hyper-links from, and for constructing actual URLs from such\n    words.\n    \"\"\"\n\n    def __init__(self, fragment):\n        \"\"\"\n        LinkType(regexp) -> link type object\n\n        Create a link type object and add it to the global list of\n        link type objects.  
The 'fragment' argument should be a string\n        containing a regexp fragment without captures suitable to\n        insert into the complete regexp\n\n          (?:^|\\b)(wordA|wordB|...)(?:\\b|$)\n\n        which is then used to split text into \"words\" which are\n        individually turned into links or left as-is.\n        \"\"\"\n\n        self.fragment = fragment\n        self.fragment_regexp = re.compile(\"%s$\" % fragment)\n\n        ALL_LINKTYPES.append(self)\n\n    def match(self, word):\n        return bool(self.fragment_regexp.match(word))\n\n    def linkify(self, word, context):\n        \"\"\"\n        linkify(word, context) -> None or a string.\n\n        If the whole word matches what this link type handles,\n        constructs a URL to which this word should be made a link,\n        otherwise returns None.  Implementations should expect to be\n        called with words that don't match what they handle.\n\n        Sub-classes must override this method.\n        \"\"\"\n        pass\n\nclass SimpleLinkType(LinkType):\n    \"\"\"\n    Base class for link types where the word itself contains the URL.\n    \"\"\"\n\n    def __init__(self, fragment, regexp=None):\n        super(SimpleLinkType, self).__init__(fragment)\n        if isinstance(regexp, basestring):\n            self.regexp = re.compile(regexp)\n        else:\n            self.regexp = regexp\n\n    def linkify(self, word, context):\n        if self.regexp:\n            return self.regexp.match(word).group(1)\n        else:\n            return word\n\nclass HTTP(SimpleLinkType):\n    \"\"\"\n    Link type \"plain URL string\".\n    \"\"\"\n\n    def __init__(self):\n        super(HTTP, self).__init__(\"https?://\\\\S+[^\\\\s.,:;!?)]\")\n\nclass URL(SimpleLinkType):\n    \"\"\"\n    Link type <URL:...>.\n    \"\"\"\n\n    def __init__(self):\n        super(URL, self).__init__(\"<URL:[^>]+>\", \"<URL:([^>]+)>$\")\n\nclass SHA1(LinkType):\n    \"\"\"\n    SHA-1 link type.\n\n    Converts SHA-1 sums in text (either 
full or abbreviated) into\n    links to the diff of the referenced commit.  When processed in the\n    context of a repository, a matching commit in that repository is\n    preferred (assuming it exists.)  When processed in the context of\n    a review, a 'review=<id>' parameter is appended to the URL, which\n    links to the diff of the referenced commit in the context of the\n    review (which includes comments and allows reviewing.)\n    \"\"\"\n\n    def __init__(self):\n        super(SHA1, self).__init__(\"[0-9A-Fa-f]{8,40}\")\n\n    def linkify(self, word, context):\n        sha1 = word\n        if context.repository \\\n                and context.repository.iscommit(word):\n            sha1 = context.repository.revparse(sha1)\n            if context.review \\\n                    and context.review.containsCommit(context.db, sha1):\n                return \"/%s/%s?review=%d\" % (context.repository.name, sha1, context.review.id)\n            else:\n                return \"/%s/%s\" % (context.repository.name, sha1)\n        else:\n            return \"/%s\" % sha1\n\nclass Diff(LinkType):\n    \"\"\"\n    Diff link type.\n\n    Like the SHA-1 link type, but with two sums separated by '..', and\n    links to the diff between the two referenced commits.\n    \"\"\"\n\n    def __init__(self):\n        super(Diff, self).__init__(\"[0-9A-Fa-f]{8,40}\\\\.\\\\.[0-9A-Fa-f]{8,40}\")\n\n    def linkify(self, word, context):\n        from_sha1, _, to_sha1 = word.partition(\"..\")\n        if context.repository \\\n                and context.repository.iscommit(from_sha1) \\\n                and context.repository.iscommit(to_sha1):\n            from_sha1 = context.repository.revparse(from_sha1)\n            to_sha1 = context.repository.revparse(to_sha1)\n            if context.review \\\n                    and context.review.containsCommit(context.db, from_sha1) \\\n                    and context.review.containsCommit(context.db, to_sha1):\n                return 
\"/%s/%s..%s?review=%d\" % (context.repository.name, from_sha1, to_sha1, context.review.id)\n            else:\n                return \"/%s/%s..%s\" % (context.repository.name, from_sha1, to_sha1)\n        else:\n            return \"/%s..%s\" % (from_sha1, to_sha1)\n\nclass Review(LinkType):\n    \"\"\"\n    Review link type.\n\n    Converts 'r/<id>' in text into a link to the front-page of the\n    corresponding review.\n    \"\"\"\n\n    def __init__(self):\n        super(Review, self).__init__(\"r/\\\\d+\")\n\n    def linkify(self, word, context):\n        return \"/\" + word\n\nHTTP()\nURL()\nDiff()\nSHA1()\nReview()\n\ntry: import customization.linktypes\nexcept ImportError: pass\n"
  },
  {
    "path": "src/log/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n"
  },
  {
    "path": "src/log/commitset.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport gitutils\n\nclass CommitSet:\n    def __init__(self, commits):\n        self.__commits = dict([(str(commit), commit) for commit in commits])\n        self.__merges = set()\n        self.__children = {}\n\n        parents = set()\n\n        for commit in self.__commits.values():\n            for parent in commit.parents:\n                parents.add(parent)\n                self.__children.setdefault(parent, set()).add(commit)\n            if len(commit.parents) > 1:\n                self.__merges.add(commit)\n\n        commit_set = set(self.__commits.values())\n\n        # Heads: commits that aren't the parent of a commit in the set.\n        self.__heads = commit_set - parents\n\n        # Tails: parent commits not included in the set.\n        self.__tails = parents - commit_set\n\n    def __contains__(self, commit):\n        return str(commit) in self.__commits\n\n    def __getitem__(self, key):\n        return self.__commits[str(key)]\n\n    def __len__(self):\n        return len(self.__commits)\n\n    def __iter__(self):\n        return iter(self.__commits.values())\n\n    def __repr__(self):\n        return repr(self.__commits)\n\n    def get(self, key):\n        return self.__commits.get(str(key))\n\n    def getHeads(self):\n        return self.__heads.copy()\n\n    def getTails(self):\n 
       return self.__tails.copy()\n\n    def getMerges(self):\n        return self.__merges.copy()\n\n    def getChildren(self, commit):\n        children = self.__children.get(commit)\n        if children: return children.copy()\n        else: return set()\n\n    def getParents(self, commit):\n        return set([self.__commits[sha1] for sha1 in commit.parents if sha1 in self.__commits])\n\n    def getFilteredTails(self, repository):\n        \"\"\"Return a set containing each tail commit of the set of commits that isn't an\nancestor of another tail commit of the set.  If the tail commits of the set\nare all different commits on an upstream branch, then this will return only\nthe latest one.\"\"\"\n\n        candidates = self.getTails()\n        result = set()\n\n        while candidates:\n            tail = candidates.pop()\n\n            eliminated = set()\n            for other in candidates:\n                base = repository.mergebase([tail, other])\n                if base == tail:\n                    # Tail is an ancestor of other: tail should not be included\n                    # in the returned set.\n                    break\n                elif base == other:\n                    # Other is an ancestor of tail: other should not be included\n                    # in the returned set.\n                    eliminated.add(other)\n            else:\n                result.add(tail)\n            candidates -= eliminated\n\n        return result\n\n    def getTailsFrom(self, commit):\n        \"\"\"\n        Return a set containing the each tail commit of the set of commits that\n        are ancestors of 'commit' and that are members of this commit set.\n\n        A tail commit of a set is a commit that is not a member of the set but\n        that is a parent of a commit that is a member of the set.\n        \"\"\"\n\n        assert commit in self.__commits\n\n        stack = set([commit.sha1])\n        processed = set()\n        tails = set()\n\n        
while stack:\n            commit = self.__commits[stack.pop()]\n\n            if commit not in processed:\n                processed.add(commit)\n\n                for sha1 in commit.parents:\n                    parent = self.__commits.get(sha1)\n                    if parent: stack.add(parent)\n                    else: tails.add(sha1)\n\n        return tails\n\n    def getCommonAncestors(self, commit):\n        \"\"\"Return a set of each commit in this set that is an ancestor of each parent of\n'commit' (which must be a member of the set) or None if the parents of 'commit'\nhave no common ancestor within this set.\"\"\"\n\n        common_ancestors = set()\n        branches = []\n\n        for sha1 in commit.parents:\n            if sha1 not in self.__commits: return common_ancestors\n            branches.append(set())\n\n        for index, sha1 in enumerate(commit.parents):\n            stack = set([sha1])\n            branch = branches[index]\n\n            while stack:\n                commit = self.__commits.get(stack.pop())\n\n                if commit and commit not in branch:\n                    branch.add(commit)\n\n                    for other_index, other_branch in enumerate(branches):\n                        if commit not in other_branch: break\n                    else:\n                        common_ancestors.add(commit)\n                        continue\n\n                    stack.update(set(commit.parents))\n\n        return common_ancestors\n\n    def filtered(self, commits):\n        filtered = set()\n        commits = set(commits)\n\n        while commits:\n            commit = commits.pop()\n\n            if commit not in filtered:\n                filtered.add(commit)\n                commits.update(self.getParents(commit))\n\n        return CommitSet(filtered)\n\n    def without(self, commits):\n        \"\"\"\n        Return a copy of this commit set without 'commit' and any ancestors of\n        'commit' that don't have other 
descendants in the commit set.\n        \"\"\"\n\n        pending = set(filter(None, (self.__commits.get(str(commit)) for commit in commits)))\n        commits = self.__commits.copy()\n        children = self.__children.copy()\n\n        while pending:\n            commit = pending.pop()\n\n            del commits[commit]\n            if commit in children:\n                del children[commit]\n\n            for parent_sha1 in commit.parents:\n                if parent_sha1 in commits:\n                    children0 = children.get(parent_sha1, set())\n                    children0 -= set([commit])\n                    if not children0:\n                        pending.add(commits[parent_sha1])\n\n        return CommitSet(commits.values())\n\n    def isAncestorOf(self, ancestor, commit):\n        if ancestor == commit:\n            return False\n        else:\n            descendants = self.__children.get(ancestor, set()).copy()\n            pending = descendants.copy()\n\n            while pending and not commit in descendants:\n                descendant = pending.pop()\n                children = self.__children.get(descendant, set()) - descendants\n\n                descendants.update(children)\n                pending.update(children)\n\n            return commit in descendants\n\n    @staticmethod\n    def fromRange(db, from_commit, to_commit, commits=None):\n        repository = to_commit.repository\n        # Keep the optional sha1 => Commit mapping argument under a separate\n        # name; 'commits' is reused below for the set of collected commits.\n        commit_map = commits\n        commits = set()\n\n        class NotPossible(Exception): pass\n\n        if commit_map:\n            def getCommit(sha1):\n                return commit_map[sha1]\n        else:\n            def getCommit(sha1):\n                return gitutils.Commit.fromSHA1(db, repository, sha1)\n\n        def process(iter_commit):\n            while iter_commit != from_commit and iter_commit not in commits:\n                commits.add(iter_commit)\n\n                if len(iter_commit.parents) > 1:\n                    # A merge commit.  
Check if 'from_commit' is an ancestor of\n                    # all its parents.  If not, we don't support constructing a\n                    # commit-set from this range of commits (not because it is\n                    # particularly difficult, but because such a commit-set\n                    # would contain \"unexpected\" merged-in commits.)\n\n                    if from_commit.isAncestorOf(repository.mergebase(iter_commit)):\n                        map(process, [getCommit(sha1) for sha1 in iter_commit.parents])\n                        return\n                    else:\n                        raise NotPossible\n                elif iter_commit.parents:\n                    iter_commit = getCommit(iter_commit.parents[0])\n                else:\n                    return\n\n        if from_commit == to_commit:\n            return CommitSet([to_commit])\n\n        try:\n            process(to_commit)\n            return CommitSet(commits)\n        except NotPossible:\n            return None\n"
  },
  {
    "path": "src/log/html.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom gitutils import Commit\nfrom htmlutils import htmlify, jsify\nfrom profiling import Profiler\nfrom time import time, mktime, strftime, localtime\n\nimport re\nimport log.commitset\n\ndef formatWhen(when):\n    def relative_time(delta, time_unit_singular):\n        time_unit = time_unit_singular\n        if delta > 1:\n            time_unit += \"s\"\n        return \"%d %s ago\" % (delta, time_unit)\n    def inner(when):\n        delta = int(time() - mktime(when))\n        if delta < 60: return relative_time(delta, \"second\")\n        elif delta < 60 * 60: return relative_time(delta / 60, \"minute\")\n        elif delta < 60 * 60 * 24: return relative_time(delta / (60 * 60), \"hour\")\n        elif delta < 60 * 60 * 24 * 30: return relative_time(delta / (60 * 60 * 24), \"day\")\n        else: return strftime(\"%Y-%m-%d\", localtime(mktime(when)))\n    return inner(when).replace(\" \", \"&nbsp;\")\n\ndef renderWhen(target, when):\n    target.innerHTML(formatWhen(when))\n\ndef linkToCommit(commit, overrides={}):\n    if \"review\" in overrides:\n        review = overrides[\"review\"]\n        if \"replayed_rebase\" in overrides:\n            return \"%s..%s?review=%d&conflicts=yes\" % (overrides[\"replayed_rebase\"].sha1[:8], commit.sha1[:8], review.id)\n        return \"%s?review=%d\" % 
(commit.sha1[:8], review.id)\n    return \"%s/%s\" % (commit.repository.name, commit.sha1)\n\nre_remote_into_local = re.compile(\"^Merge (?:branch|commit) '([^']+)' of [^ ]+ into \\\\1$\")\nre_side_into_main = re.compile(\"^Merge (?:remote )?(?:branch|commit) '[^']+' into .+$\")\nre_octopus = re.compile(\"^Merge ((?:(?:branches|,| and) '[^']+')+) into [^ ]+$\")\n\nclass WhenColumn:\n    def className(self, db, commit):\n        return \"when\"\n    def heading(self, target):\n        target.text(\"When\")\n    def render(self, db, commit, target, overrides={}):\n        renderWhen(target, commit.committer.time)\n\nclass TypeColumn:\n    def className(self, db, commit):\n        return \"type\"\n    def heading(self, target):\n        target.text()\n    def render(self, db, commit, target, overrides={}):\n        if \"type\" in overrides: target.text(overrides[\"type\"])\n        if len(commit.parents) > 1: target.text(\"Merge\")\n        else: target.text()\n\nclass SummaryColumn:\n    def __init__(self, linkToCommit=linkToCommit):\n        self.linkToCommit = linkToCommit\n        self.isFixupOrSquash = None\n    def className(self, db, commit):\n        return \"summary clickable\"\n    def heading(self, target):\n        target.text(\"Summary\")\n    def render(self, db, commit, target, overrides={}):\n        summary = overrides.get(\"summary\", commit.summary())\n        classnames = ([\"commit\", \"clickable-target\"] +\n                      overrides.get(\"summary_classnames\", []))\n\n        if self.isFixupOrSquash is not None:\n            data = self.isFixupOrSquash(commit)\n            if data:\n                what, ref = data\n                target.span(what, critic_ref=ref).text(\"[%s] \" % what)\n                lines = commit.message.splitlines()[1:]\n                while lines and not lines[0].strip():\n                    lines.pop(0)\n                if lines: summary = lines[0]\n                else: summary = None\n                if not 
summary:\n                    classnames.append(\"nocomment\")\n                    summary = \"(no comment)\"\n\n        url = self.linkToCommit(commit, overrides)\n\n        if summary:\n            target.a(\" \".join(classnames), href=url).text(summary)\n\nclass AuthorColumn:\n    def __init__(self):\n        self.cache = {}\n    def className(self, db, commit):\n        return \"author\"\n    def heading(self, target):\n        target.text(\"Author\")\n    def render(self, db, commit, target, overrides={}):\n        if \"author\" in overrides:\n            fullname = overrides[\"author\"].fullname\n        else:\n            fullname = commit.author.getFullname(db)\n        target.text(fullname)\n\nDEFAULT_COLUMNS = [(10, WhenColumn()),\n                   (5, TypeColumn()),\n                   (65, SummaryColumn()),\n                   (20, AuthorColumn())]\n\ndef render(db, target, title, branch=None, commits=None, columns=DEFAULT_COLUMNS, title_right=None, listed_commits=None, rebases=None, branch_name=None, bottom_right=None, review=None, highlight=None, profiler=None, collapsable=False, user=None, extra_commits=None):\n    addResources(target)\n\n    if not profiler: profiler = Profiler()\n\n    profiler.check(\"log: start\")\n\n    if branch is not None:\n        repository = branch.repository\n        commits = branch.getCommits(db)[:]\n        commit_set = log.commitset.CommitSet(commits)\n    else:\n        assert commits is not None\n        repository = commits[0].repository if len(commits) else None\n        commit_set = log.commitset.CommitSet(commits)\n\n    profiler.check(\"log: commits\")\n\n    heads = commit_set.getHeads()\n    tails = commit_set.getTails()\n\n    rebase_old_heads = set()\n\n    if rebases:\n        class Rebase(object):\n            def __init__(self, rebase_id, old_head, new_head, user,\n                         new_upstream, equivalent_merge, replayed_rebase,\n                         target_branch_name):\n                
self.id = rebase_id\n                self.old_head = equivalent_merge or old_head\n                self.new_head = new_head\n                self.user = user\n                self.new_upstream = new_upstream\n                self.equivalent_merge = equivalent_merge\n                self.replayed_rebase = replayed_rebase\n                self.target_branch_name = target_branch_name\n\n        # The first element in the tuples in 'rebases' is the rebase id, which\n        # is an ever-increasing serial number that we can use as an indication\n        # of the order in which the rebases were made.\n        rebases = [Rebase(*rebase) for rebase in sorted(rebases)]\n        rebase_old_heads = set(rebase.old_head for rebase in rebases)\n        heads -= rebase_old_heads\n\n        assert 0 <= len(heads) <= 1\n\n        if not heads:\n            heads = set([rebases[-1].new_head])\n\n    if repository:\n        target.addInternalScript(repository.getJS())\n\n    processed = set()\n    summaries = {}\n\n    for commit in commits:\n        summary = commit.summary().strip()\n        summaries[summary] = commit\n        summaries[commit.sha1] = commit\n\n    if extra_commits:\n        for commit in extra_commits:\n            summary = commit.summary().strip()\n            summaries[summary] = commit\n            summaries[commit.sha1] = commit\n\n    def isFixupOrSquash(commit):\n        key, _, summary = commit.summary().partition(\" \")\n\n        if key in (\"fixup!\", \"squash!\"):\n            what = key[:-1]\n        else:\n            return None\n\n        summary = summary.strip()\n        commit = summaries.get(summary)\n\n        if not commit and re.match(\"[0-9A-Fa-f]{40}$\", summary):\n            commit = summaries.get(summary)\n\n            if not commit:\n                try:\n                    sha1 = repository.revparse(summary)\n                    commit = Commit.fromSHA1(db, repository, sha1)\n                except Exception:\n                    
pass\n\n            if commit:\n                summary = commit.summary()\n\n        return what, summary\n\n    for width, column in columns:\n        if isinstance(column, SummaryColumn):\n            column.isFixupOrSquash = isFixupOrSquash\n            break\n\n    def output(table, commit, overrides={}):\n        if commit not in processed:\n            classes = [\"commit\"]\n            row_id = None\n\n            if len(commit.parents) > 1:\n                classes.append(\"merge\")\n\n            if highlight == commit:\n                classes.append(\"highlight\")\n                row_id = commit.sha1\n\n            if review:\n                overrides[\"review\"] = review\n\n            row = table.tr(\" \".join(classes), id=row_id)\n            profiler.check(\"log: rendering: row\")\n            for index, (width, column) in enumerate(columns):\n                column.render(db, commit, row.td(column.className(db, commit)), overrides=overrides)\n                profiler.check(\"log: rendering: column %d\" % (index + 1))\n            processed.add(commit)\n\n            return row\n        else:\n            return None\n\n    cursor = db.cursor()\n\n    def emptyChangeset(child, parent=None):\n        if parent is None:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM fileversions\n                                JOIN changesets ON (changesets.id=fileversions.changeset)\n                                JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                               WHERE changesets.child=%s\n                                 AND reviewchangesets.review=%s\"\"\",\n                           (child.getId(db), review.id))\n        else:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM fileversions\n                                JOIN changesets ON (changesets.id=fileversions.changeset)\n                                JOIN reviewchangesets ON 
(reviewchangesets.changeset=changesets.id)\n                               WHERE changesets.parent=%s\n                                 AND changesets.child=%s\n                                 AND reviewchangesets.review=%s\"\"\",\n                           (parent.getId(db), child.getId(db), review.id))\n        return not cursor.fetchone()\n\n    def inner(target, head, tails, align='right', title=None, table=None, silent_if_empty=set(), upstream=None):\n        if not table:\n            table = target.table('log', align=align, cellspacing=0)\n\n            for width, column in columns: table.col(width=('%d%%' % width))\n\n            if title:\n                thead = table.thead()\n                row = thead.tr(\"title\")\n                header = row.td(\"h1\", colspan=len(columns)).h1()\n                header.text(title)\n                if callable(title_right):\n                    title_right(db, header.span(\"right\"))\n\n                row = thead.tr('headings')\n                for width, column in columns:\n                    column.heading(row.td(column.className(db, None)))\n            elif head is None or head in tails:\n                if upstream:\n                    tag = upstream.findInterestingTag(db)\n                    if tag: what = tag\n                    else: what = upstream.sha1[:8]\n                    message = \"Merged with base branch (%s).\" % what\n                else: message = \"Merged with base branch.\"\n\n                thead = table.thead()\n                row = thead.tr('basemerge')\n                row.td(colspan=len(columns), align='center').text(message)\n                return (None, None, None, False)\n\n        tbody = table.tbody()\n        commit = head\n\n        last_commit = None\n        skipped = True\n\n        while commit and commit not in tails:\n            suppress = False\n            optional_merge = False\n            listed = listed_commits is None or commit.getId(db) in 
listed_commits\n\n            if commit in silent_if_empty and emptyChangeset(commit):\n                # This is a clean automatically generated merge commit; pretend it isn't here at all.\n                suppress = True\n\n            if not suppress and not listed:\n                suppress = len(commit.parents) == 1\n                optional_merge = not suppress\n\n            if not suppress: commit_tr = output(tbody, commit)\n            else: commit_tr = None\n\n            if listed: skipped = False\n            last_commit = commit\n\n            if len(commit.parents) == 0:\n                break\n            elif len(commit.parents) == 1:\n                commit = commit_set.get(commit.parents[0])\n            elif len(commit.parents) > 1:\n                if len(commit.parents) > 2:\n                    common_ancestors = commit_set.getCommonAncestors(commit)\n                    match = re_octopus.match(commit.message.split(\"\\n\", 1)[0])\n\n                    if match:\n                        titles = re.findall(\"'([^']+)'\", match.group(1))\n                        if len(titles) != len(commit.parents):\n                            titles = None\n                    else:\n                        titles = None\n\n                    for index, sha1 in enumerate(commit.parents):\n                        if sha1 in commit_set:\n                            sublog = tbody.tr('sublog')\n                            inner_last_commit, inner_table, inner_tail, inner_skipped = inner(sublog.td(colspan=len(columns)), commit_set[sha1], common_ancestors, title=titles and titles[index] or None)\n                            if inner_skipped:\n                                sublog.remove()\n                                if optional_merge and commit_tr: commit_tr.remove()\n\n                    if not common_ancestors: return (None, None, None, False)\n\n                    commit = common_ancestors.pop()\n                    continue\n\n                
parent1_sha1 = commit.parents[0]\n                parent2_sha1 = commit.parents[1]\n\n                parent1 = commit_set.get(parent1_sha1)\n                parent2 = commit_set.get(parent2_sha1)\n\n                # TODO: Try to remember what this code actually does, and why...\n                if parent1_sha1 in rebase_old_heads:\n                    if parent2:\n                        sublog = tbody.tr('sublog')\n                        inner_last_commit, inner_table, inner_tail, inner_skipped = \\\n                            inner(sublog.td(colspan=len(columns)), parent2, tails)\n                        if inner_skipped:\n                            sublog.remove()\n                            if optional_merge and commit_tr: commit_tr.remove()\n                    return (commit, table, parent1_sha1, False)\n                elif parent2_sha1 in rebase_old_heads:\n                    if parent1:\n                        sublog = tbody.tr('sublog')\n                        inner_last_commit, inner_table, inner_tail, inner_skipped = \\\n                            inner(sublog.td(colspan=len(columns)), parent1, tails)\n                        if inner_skipped:\n                            sublog.remove()\n                            if optional_merge and commit_tr: commit_tr.remove()\n                    return (commit, table, parent2_sha1, False)\n\n                if parent1 and parent2:\n                    common_ancestors = commit_set.getCommonAncestors(commit)\n\n                    merged_remote_into_local = re_remote_into_local.match(commit.summary()) or re_side_into_main.match(commit.summary())\n\n                    def rankPaths(commit, tails):\n                        shortest = None\n                        shortest_length = len(commit_set)\n                        longest = None\n                        longest_length = 0\n\n                        for sha1 in commit.parents:\n                            parent = commit_set[sha1]\n\n              
              counted = set()\n                            pending = set([parent])\n\n                            while pending:\n                                candidate = pending.pop()\n\n                                if candidate in counted: continue\n                                if candidate in tails: continue\n\n                                counted.add(candidate)\n                                pending.update(commit_set.getParents(candidate))\n\n                            length = len(counted)\n\n                            if length < shortest_length:\n                                shortest = parent\n                                shortest_length = length\n                            if length >= longest_length:\n                                longest = parent\n                                longest_length = length\n\n                        return shortest, shortest_length, longest, longest_length\n\n                    show_merged, shortest_length, show_normal, longest_length = rankPaths(commit, common_ancestors | tails)\n                    display_parallel = False\n\n                    if merged_remote_into_local and shortest_length * 2 > longest_length:\n                        if len(common_ancestors) == 1 and len(commit_set.filtered([commit]).getTails()) == 1:\n                            display_parallel = True\n                        else:\n                            show_merged = parent2\n                            show_normal = parent1\n\n                    if display_parallel:\n                        all_empty = True\n\n                        for sha1 in commit.parents:\n                            sublog = tbody.tr('sublog')\n                            inner_last_commit, inner_table, inner_tail, inner_skipped = inner(sublog.td(colspan=len(columns)), commit_set[sha1], common_ancestors | tails)\n                            if inner_skipped:\n                                sublog.remove()\n                            else:\n 
                               all_empty = False\n\n                        if all_empty and optional_merge and commit_tr: commit_tr.remove()\n\n                        commit = common_ancestors.pop()\n                    else:\n                        sublog = tbody.tr('sublog')\n                        inner_last_commit, inner_table, inner_tail, inner_skipped = inner(sublog.td(colspan=len(columns)), show_merged, common_ancestors | tails)\n                        if inner_skipped:\n                            sublog.remove()\n                            if optional_merge and commit_tr: commit_tr.remove()\n\n                        commit = show_normal\n                else:\n                    if parent1: upstream_sha1 = parent2_sha1\n                    else: upstream_sha1 = parent1_sha1\n\n                    if not commit in silent_if_empty:\n                        # Merge with the base branch.\n                        inner(tbody.tr('sublog').td(colspan=len(columns)), None, None, upstream=Commit.fromSHA1(db, repository, upstream_sha1))\n\n                    if parent1: commit = parent1\n                    else: commit = parent2\n\n        return (last_commit, table, last_commit.parents[0] if last_commit and last_commit.parents else None, skipped)\n\n    class_name = \"paleyellow log\"\n\n    if collapsable:\n        class_name += \" collapsable\"\n\n    table = target.table(class_name, align='center', cellspacing=0)\n\n    for width, column in columns: table.col(width=('%d%%' % width))\n\n    thead = table.thead(\"title\")\n    row = thead.tr(\"title\")\n    header = row.td(\"h1\", colspan=len(columns)).h1()\n\n    error_message = None\n\n    if len(commit_set) == 0:\n        thead = table.thead()\n        row = thead.tr('error')\n        cell = row.td(colspan=len(columns), align='center')\n        cell.text(\"No commits. 
\")\n        if review:\n            cell.a(href=\"showtree?sha1=%s&review=%d\" % (review.branch.head_sha1, review.id)).text(\"[Browse tree]\")\n        return\n    elif len(heads) > 1:\n        error_message = \"Invalid commit set: Multiple heads.\"\n    elif len(heads) == 0:\n        error_message = \"Invalid commit set: No heads.\"\n\n    if error_message is not None:\n        thead = table.thead()\n        row = thead.tr('error')\n        cell = row.td(colspan=len(columns), align='center')\n        cell.text(error_message)\n        return\n\n    head = heads.pop() if heads else None\n\n    row = thead.tr('headings')\n    for width, column in columns:\n        column.heading(row.td(column.className(db, None)))\n\n    first_rebase = True\n    silent_if_empty = set()\n\n    if rebases:\n        for rebase in rebases:\n            if rebase.equivalent_merge:\n                silent_if_empty.add(rebase.equivalent_merge)\n\n        top_rebases = []\n\n        while rebases and head == rebases[-1].new_head:\n            rebase = rebases.pop()\n            top_rebases.append((head, rebase))\n            head = rebase.old_head\n\n        for rebase_head, rebase in top_rebases:\n            thead = table.thead(\"rebase\")\n            row = thead.tr('rebase')\n            cell = row.td(colspan=len(columns), align='center')\n\n            if rebase.new_upstream is None and not rebase.target_branch_name:\n                cell.text(\"History rewritten\")\n            else:\n                cell.text(\"Branch rebased onto \")\n                if rebase.target_branch_name:\n                    anchor = cell.a(href=(\"/checkbranch?repository=%d&commit=%s\"\n                                          % (repository.id, rebase.target_branch_name)))\n                    anchor.text(rebase.target_branch_name)\n                else:\n                    upstream_description = repository.describe(db, rebase.new_upstream.sha1)\n                    if upstream_description is None:\n     
                   upstream_description = rebase.new_upstream.sha1[:8]\n                    anchor = cell.a(href=\"/%s/%s\" % (repository.name, rebase.new_upstream.sha1))\n                    anchor.text(upstream_description)\n\n            cell.text(\" by %s\" % rebase.user.fullname)\n\n            if first_rebase:\n                cell.text(\": \")\n                review_param = \"&review=%d\" % review.id if review else \"\"\n                cell.a(href=\"log?repository=%d&branch=%s%s\" % (repository.id, branch_name, review_param)).text(\"[actual log]\")\n\n                if user and user == rebase.user:\n                    cell.text(\" \")\n                    cell.a(href=\"javascript:revertRebase(%d)\" % rebase.id).text(\"[revert]\")\n\n                first_rebase = False\n            else:\n                cell.text(\".\")\n\n            if rebase.replayed_rebase and not emptyChangeset(parent=rebase.replayed_rebase,\n                                                             child=rebase.new_head):\n                output(table, rebase.new_head,\n                       overrides={ \"type\": \"Rebase\",\n                                   \"summary\": \"Changes introduced by rebase\",\n                                   \"summary_classnames\": [\"rebase\"],\n                                   \"author\": rebase.user,\n                                   \"replayed_rebase\": rebase.replayed_rebase })\n\n    while True:\n        # 'local_tails' is the set of commits that, when reached, should make\n        # inner() stop outputting commits and instead return.  
This set of\n        # commits contains all the \"tails\" of the whole commit-set we're\n        # rendering (in the 'tails' set here), as well as the \"new head\" of the\n        # next rebase to be output.\n\n        local_tails = tails.copy()\n\n        if rebases:\n            local_tails.add(rebases[-1].new_head)\n\n        last_commit, table, tail, skipped = inner(\n            target, head, local_tails, 'center', title, table, silent_if_empty)\n\n        if rebases:\n            rebase = rebases.pop()\n\n            assert tail == rebase.new_head, \"tail (%s) != rebase.new_head (%s)\" % (tail, rebase.new_head)\n\n            while True:\n                head = rebase.old_head\n\n                thead = table.thead(\"rebase\")\n                row = thead.tr('rebase')\n                cell = row.td(colspan=len(columns), align='center')\n\n                if rebase.new_upstream is None and not rebase.target_branch_name:\n                    cell.text(\"History rewritten\")\n                else:\n                    cell.text(\"Branch rebased onto \")\n                    if rebase.target_branch_name:\n                        anchor = cell.a(href=(\"/checkbranch?repository=%d&commit=%s\"\n                                              % (repository.id, rebase.target_branch_name)))\n                        anchor.text(rebase.target_branch_name)\n                    else:\n                        upstream_description = repository.describe(db, rebase.new_upstream.sha1)\n                        if upstream_description is None:\n                            upstream_description = rebase.new_upstream.sha1[:8]\n                        anchor = cell.a(href=\"/%s/%s\" % (repository.name, rebase.new_upstream.sha1))\n                        anchor.text(upstream_description)\n\n                cell.text(\" by %s\" % rebase.user.fullname)\n\n                if first_rebase:\n                    cell.text(\": \")\n                    review_param = \"&review=%d\" % review.id 
if review else \"\"\n                    cell.a(href=\"log?repository=%d&branch=%s%s\" % (repository.id, branch_name, review_param)).text(\"[actual log]\")\n                    first_rebase = False\n                else:\n                    cell.text(\".\")\n\n                if rebase.replayed_rebase and not emptyChangeset(parent=rebase.replayed_rebase,\n                                                                 child=rebase.new_head):\n                    output(table, rebase.new_head,\n                           overrides={ \"type\": \"Rebase\",\n                                       \"summary\": \"Changes introduced by rebase\",\n                                       \"summary_classnames\": [\"rebase\"],\n                                       \"author\": rebase.user,\n                                       \"replayed_rebase\": rebase.replayed_rebase })\n\n                if rebases and rebases[-1].new_head == head:\n                    rebase = rebases.pop()\n                else:\n                    break\n\n            continue\n\n        if last_commit:\n            if len(last_commit.parents) == 1:\n                upstream = Commit.fromSHA1(db, repository, last_commit.parents[0])\n                upstream_description = repository.describe(db, upstream.sha1)\n\n                if not upstream_description:\n                    upstream_description = upstream.sha1[:8]\n\n                row = table.thead(\"rebase\").tr('upstream')\n                cell = row.td(colspan=len(columns), align='center')\n                cell.text(\"Based on: \")\n                anchor = cell.a(href=\"/%s/%s\" % (repository.name, upstream.sha1))\n                anchor.text(upstream_description)\n\n        if callable(bottom_right):\n            bottom_right(db, table.tfoot().tr().td(colspan=len(columns)))\n\n        break\n\n    profiler.check(\"log: rendering\")\n\n    if \"%d\" in title: header.text(title % len(processed))\n    else: header.text(title)\n\n    if 
callable(title_right):\n        title_right(db, header.span(\"right\"))\n\ndef renderList(db, target, title, commits, columns=DEFAULT_COLUMNS, title_right=None, bottom_right=None, hide_merges=False, className=\"log\"):\n    addResources(target)\n\n    table = target.table(className, align=\"center\", cellspacing=0)\n\n    for width, column in columns: table.col(width=(\"%d%%\" % width))\n\n    thead = table.thead()\n    title_h1 = None\n\n    if title:\n        row = thead.tr(\"title\")\n        title_h1 = row.td(\"h1\", colspan=len(columns)).h1()\n        title_h1.text(title)\n\n        row = thead.tr(\"headings\")\n        for width, column in columns:\n            column.heading(row.td(column.className(db, None)))\n\n    tbody = table.tbody()\n    merges = 0\n\n    for commit in commits:\n        classname = \"commit\"\n\n        if hide_merges:\n            is_merge = len(commit.parents) > 1\n            if is_merge:\n                classname += \" merge\"\n                merges += 1\n\n        row = tbody.tr(classname, id=commit.sha1)\n\n        for width, column in columns:\n            column.render(db, commit, row.td(column.className(db, commit)))\n\n    if merges and title_h1:\n        title_h1.a(href=\"javascript:void(0);\", onclick=\"showRelevantMerges(event);\").text(\"[Show %d merge commits]\" % merges)\n\n    if callable(title_right):\n        title_right(db, title_h1.span(\"right\"))\n\n    if callable(bottom_right):\n        bottom_right(db, table.tfoot().tr().td(colspan=len(columns)))\n\ndef addResources(target):\n    target.addExternalStylesheet(\"resource/log.css\")\n    target.addExternalScript(\"resource/log.js\")\n"
  },
  {
    "path": "src/mailutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\nimport os\nimport signal\nimport email.utils\n\nimport configuration\nimport dbutils\n\ndef generateMessageId(index=1):\n    now = time.time()\n\n    timestamp = time.strftime(\"%Y%m%d%H%M%S\", time.gmtime(now))\n    timestamp_ms = \"%04d\" % ((now * 10000) % 10000)\n\n    return \"%s.%s.%04d\" % (timestamp, timestamp_ms, index)\n\ndef queueMail(from_user, to_user, recipients, subject, body, message_id=None,\n              parent_message_id=None, headers=None):\n    if not message_id:\n        message_id = generateMessageId()\n\n    if headers is None:\n        headers = {}\n    else:\n        headers = headers.copy()\n\n    if parent_message_id:\n        parent_message_id = \"<%s@%s>\" % (parent_message_id, configuration.base.HOSTNAME)\n\n    filename = \"%s/%s_%s_%s.txt.pending\" % (configuration.paths.OUTBOX,\n                                            from_user.name, to_user.name,\n                                            message_id)\n\n    with open(filename, \"w\") as file:\n        print >> file, repr({ \"message_id\": message_id,\n                              \"parent_message_id\": parent_message_id,\n                              \"headers\": headers,\n                              \"time\": time.time(),\n                              \"from_user\": from_user,\n              
                \"to_user\": to_user,\n                              \"recipients\": recipients,\n                              \"subject\": subject,\n                              \"body\": body })\n\n    return filename\n\nclass User:\n    def __init__(self, *args):\n        if len(args) == 1:\n            self.name = configuration.base.SYSTEM_USER_NAME\n            self.fullname, self.email = email.utils.parseaddr(args[0])\n        else:\n            self.name, self.email, self.fullname = args\n\n    def __repr__(self):\n        return \"User(%r, %r)\" % (self.email, self.fullname)\n\ndef sendMessage(recipients, subject, body):\n    from_user = User(configuration.base.SYSTEM_USER_NAME, configuration.base.SYSTEM_USER_EMAIL, \"Critic System\")\n    filenames = []\n\n    for to_user in recipients:\n        filenames.append(queueMail(from_user, to_user, recipients, subject, body))\n\n    sendPendingMails(filenames)\n\ndef sendAdministratorMessage(source, summary, message):\n    recipients = []\n\n    for recipient in configuration.base.SYSTEM_RECIPIENTS:\n        recipients.append(User(recipient))\n\n    sendMessage(recipients, \"%s: %s\" % (source, summary), message)\n\ndef sendAdministratorErrorReport(db, source, summary, message):\n    if db:\n        installed_sha1 = dbutils.getInstalledSHA1(db)\n    else:\n        installed_sha1 = \"<unknown>\"\n    sendAdministratorMessage(source, summary, \"\"\"\\\n\nCritic encountered an unexpected error.  
If you know a series of steps that can\nreproduce this error it would be very useful if you submitted a bug report\nincluding the steps plus the information below (see bug reporting URL at the\nbottom of this e-mail).\n\n%(message)s\n\nCritic version: %(installed_sha1)s\nCritic bug reports can be filed here: https://github.com/jensl/critic/issues/new\n\"\"\" % { \"message\": message, \"installed_sha1\": installed_sha1 })\n\ndef sendExceptionMessage(db, source, exception):\n    lines = exception.splitlines()\n    sendAdministratorErrorReport(db, source, lines[-1], exception.rstrip())\n\ndef sendPendingMails(filenames):\n    for filename in filenames:\n        if filename.endswith(\".txt.pending\"):\n            os.rename(filename, filename[:-len(\".pending\")])\n\n    try:\n        pid = int(open(configuration.services.MAILDELIVERY[\"pidfile_path\"]).read().strip())\n        os.kill(pid, signal.SIGHUP)\n    except:\n        pass\n"
  },
  {
    "path": "src/maintenance/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n"
  },
  {
    "path": "src/maintenance/check-branches.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport os.path\nimport argparse\nimport errno\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport dbutils\nimport gitutils\nimport log.commitset\nimport progress\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--exclude\", action=\"append\", help=\"exclude repository\")\nparser.add_argument(\"--include\", action=\"append\", help=\"include (only) repository\")\nparser.add_argument(\"--dry-run\", \"-n\", action=\"store_true\", help=\"don't touch the database or repositories\")\nparser.add_argument(\"--force\", \"-f\", action=\"store_true\", help=\"update the database and/or repositories\")\n\narguments = parser.parse_args()\n\nif not arguments.dry_run and not arguments.force:\n    print \"One of --dry-run/-n and --force/-f must be specified.\"\n    sys.exit(1)\nelif arguments.dry_run and arguments.force:\n    print \"Only one of --dry-run/-n and --force/-f can be specified.\"\n    sys.exit(1)\n\nforce = arguments.force\n\ndb = dbutils.Database.forSystem()\ncursor = db.cursor()\n\ndef getBranchCommits(repository, branch_id):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT sha1 FROM commits JOIN reachable ON (commit=id) WHERE branch=%s\", (branch_id,))\n\n    return 
log.commitset.CommitSet(gitutils.Commit.fromSHA1(db, repository, sha1) for (sha1,) in cursor)\n\ndef getReview(branch_id):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT id FROM reviews WHERE branch=%s\", (branch_id,))\n    return cursor.fetchone()[0]\n\ndef getReviewCommits(repository, review_id):\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT changesets.child\n                        FROM changesets\n                        JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                       WHERE reviewchangesets.review=%s\"\"\",\n                   (review_id,))\n\n    return log.commitset.CommitSet(gitutils.Commit.fromId(db, repository, commit_id) for (commit_id,) in cursor)\n\ndef getReviewHead(repository, review_id):\n    commits = getReviewCommits(repository, review_id)\n    heads = commits.getHeads()\n\n    if len(heads) == 1: return heads.pop()\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT commits.sha1\n                        FROM commits\n                        JOIN reviewrebases ON (reviewrebases.old_head=commits.id)\n                       WHERE reviewrebases.review=%s\"\"\",\n                   (review_id,))\n\n    for (sha1,) in cursor: heads.remove(sha1)\n\n    if len(heads) == 1: return heads.pop()\n    else: return None\n\nif arguments.include:\n    cursor.execute(\"SELECT id FROM repositories WHERE name=ANY (%s)\", (arguments.include,))\nelse:\n    cursor.execute(\"SELECT id FROM repositories\")\nrepository_ids = cursor.fetchall()\n\nincorrect_reviews = []\n\nfor (repository_id,) in repository_ids:\n    repository = gitutils.Repository.fromId(db, repository_id)\n\n    if arguments.exclude and repository.name in arguments.exclude:\n        print \"Repository: %s (skipped)\" % repository.name\n        continue\n\n    cursor.execute(\"\"\"SELECT branches.id, branches.name, branches.type, branches.base, commits.sha1\n                        FROM branches\n                        
JOIN commits ON (commits.id=branches.head)\n                       WHERE branches.repository=%s\"\"\",\n                   (repository_id,))\n\n    branches = cursor.fetchall()\n    refs = {}\n    batch = []\n\n    try:\n        for line in open(os.path.join(repository.path, \"packed-refs\")):\n            if not line.startswith(\"#\"):\n                try:\n                    sha1, ref = line.split()\n                    if len(sha1) == 40 and ref.startswith(\"refs/heads/\"):\n                        refs[ref[11:]] = sha1\n                except ValueError:\n                    pass\n    except IOError as error:\n        if error.errno == errno.ENOENT: pass\n        else: raise\n\n    progress.start(len(branches), \"Repository: %s\" % repository.name)\n\n    heads_path = os.path.join(repository.path, \"refs\", \"heads\")\n\n    branches_in_db = set()\n\n    for branch_id, branch_name, branch_type, branch_base_id, branch_sha1 in branches:\n        progress.update()\n\n        branches_in_db.add(branch_name)\n\n        try:\n            try: repository_sha1 = open(os.path.join(heads_path, branch_name)).read().strip()\n            except: repository_sha1 = refs.get(branch_name)\n\n            if repository_sha1 != branch_sha1:\n                progress.write(\"NOTE[%s]: %s differs (db:%s != repo:%s)\" % (repository.name, branch_name, branch_sha1[:8], repository_sha1[:8]))\n\n                if branch_type == \"review\":\n                    head = getReviewHead(repository, getReview(branch_id))\n\n                    if not head:\n                        progress.write(\"  invalid review meta-data: r/%d\" % getReview(branch_id))\n                        continue\n\n                    if head.sha1 == branch_sha1:\n                        progress.write(\"  branches.head matches review meta-data; repository is wrong\")\n                        if force: repository.run(\"update-ref\", \"refs/heads/%s\" % branch_name, head.sha1, repository_sha1)\n                      
  progress.write(\"  repository updated\")\n                    elif head.sha1 == repository_sha1:\n                        progress.write(\"  repository matches review meta-data; branches.head is wrong\")\n                        if force: cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\", (head.getId(db), branch_id))\n                        db.commit()\n                    else:\n                        progress.write(\"  review meta-data matches neither branches.head nor repository\")\n                        incorrect_reviews.append((getReview(branch_id), \"review meta-data matches neither branches.head nor repository\"))\n                else:\n                    try:\n                        gitutils.Commit.fromSHA1(db, repository, branch_sha1)\n                        progress.write(\"  branches.head exists in repository\")\n                    except KeyboardInterrupt: sys.exit(1)\n                    except:\n                        progress.write(\"  branches.head not in repository; updating branches.head\")\n                        head = gitutils.Commit.fromSHA1(db, repository, repository_sha1)\n                        if force: cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\", (head.getId(db), branch_id))\n                        db.commit()\n                        continue\n\n                    try:\n                        commits = getBranchCommits(repository, branch_id)\n                        heads = commits.getHeads()\n\n                        if len(heads) > 1:\n                            progress.write(\"  reachable commit-set has multiple heads\")\n                            continue\n\n                        head = heads.pop()\n\n                        if head.sha1 == branch_sha1:\n                            progress.write(\"  reachable agrees with branches.head; repository is wrong\")\n                            if force: repository.run(\"update-ref\", \"refs/heads/%s\" % branch_name, head.sha1, 
repository_sha1)\n                            progress.write(\"  repository updated\")\n                        elif head.sha1 == repository_sha1:\n                            progress.write(\"  reachable agrees with repository; branches.head is wrong\")\n                            if force: cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\", (head.getId(db), branch_id))\n                            db.commit()\n                            continue\n                    except KeyboardInterrupt: sys.exit(1)\n                    except:\n                        progress.write(\"  reachable contains missing commits\")\n        except KeyboardInterrupt: sys.exit(1)\n        except:\n            progress.write(\"WARNING[%s]: %s missing!\" % (repository.name, branch_name))\n\n            if branch_type == \"normal\":\n                cursor.execute(\"SELECT id FROM branches WHERE base=%s\", (branch_id,))\n\n                sub_branches = cursor.fetchall()\n                if sub_branches:\n                    progress.write(\"  branch has sub-branches\")\n\n                    base_branch = dbutils.Branch.fromId(db, branch_base_id)\n\n                    for (sub_branch_id,) in sub_branches:\n                        sub_branch = dbutils.Branch.fromId(db, sub_branch_id)\n                        sub_branch.rebase(db, base_branch)\n                        progress.write(\"    rebased sub-branch %s\" % sub_branch.name)\n\n                try:\n                    if force:\n                        cursor.execute(\"DELETE FROM branches WHERE id=%s\", (branch_id,))\n                        db.commit()\n                    progress.write(\"  deleted from database\")\n                except KeyboardInterrupt: sys.exit(1)\n                except:\n                    progress.write(\"  failed to delete from database\")\n                    db.rollback()\n            else:\n                try: review_id = getReview(branch_id)\n                except KeyboardInterrupt: 
sys.exit(1)\n                except:\n                    progress.write(\"  review branch without review; deleting\")\n                    try:\n                        if force: cursor.execute(\"DELETE FROM branches WHERE id=%s\", (branch_id,))\n                        db.commit()\n                    except KeyboardInterrupt: sys.exit(1)\n                    except:\n                        progress.write(\"  failed to delete from database\")\n                        db.rollback()\n                    continue\n\n                try: commits = getReviewCommits(repository, getReview(branch_id))\n                except KeyboardInterrupt: sys.exit(1)\n                except:\n                    progress.write(\"  review meta-data references missing commits\")\n                    incorrect_reviews.append((getReview(branch_id), \"branches.head = %s\" % branch_sha1))\n                    continue\n\n                heads = commits.getHeads()\n\n                if len(heads) > 1:\n                    progress.write(\"  multiple heads: r/%d\" % review_id)\n                    continue\n\n                head = heads.pop()\n\n                try:\n                    if force: repository.run(\"update-ref\", \"refs/heads/%s\" % branch_name, head.sha1, \"0\" * 40)\n                    progress.write(\"  re-created review branch\")\n                except KeyboardInterrupt: sys.exit(1)\n                except:\n                    progress.write(\"  failed to re-create review branch\")\n                    incorrect_reviews.append((getReview(branch_id), \"failed to re-create review branch\"))\n\n    processed = set()\n\n    def exists_in_db(branch_name):\n        return branch_name in branches_in_db\n\n    def process(path, prefix=None):\n        for entry in os.listdir(path):\n            entry_path = os.path.join(path, entry)\n            branch_name = os.path.join(prefix, entry) if prefix else entry\n            if os.path.isdir(entry_path):\n                
process(entry_path, branch_name)\n            elif not exists_in_db(branch_name):\n                progress.write(\"WARNING[%s]: %s exists in the repository but not in the database!\" % (repository.name, branch_name))\n                if force: repository.run(\"update-ref\", \"-d\", \"refs/heads/%s\" % branch_name)\n                progress.write(\"  deleted from repository\")\n            processed.add(branch_name)\n\n    # Walk the loose refs first, so that branches already handled there are\n    # skipped by the 'processed' check when scanning packed-refs below.\n    process(heads_path)\n\n    for branch_name in refs.keys():\n        if branch_name not in processed and not exists_in_db(branch_name):\n            progress.write(\"WARNING[%s]: %s exists in the repository but not in the database!\" % (repository.name, branch_name))\n            if force: repository.run(\"update-ref\", \"-d\", \"refs/heads/%s\" % branch_name)\n            progress.write(\"  deleted from repository\")\n\n    progress.end(\".\")\n\nif incorrect_reviews:\n    print \"\\nReviews that need attention:\"\n\n    for review_id, message in incorrect_reviews:\n        print \"  %5d: %s\" % (review_id, message)\n"
  },
  {
    "path": "src/maintenance/check-commits.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport cPickle\n\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), \"..\")))\n\nimport dbutils\nimport gitutils\nimport progress\n\ndb = dbutils.Database.forSystem()\ncursor = db.cursor()\n\ncommits = {}\npending_commits = set()\n\ncursor.execute(\"SELECT COUNT(*) FROM commits\")\n\nprint\n\nprogress.start(cursor.fetchone()[0], prefix=\"Fetching commits ...\")\n\ncursor.execute(\"SELECT id, sha1 FROM commits\")\n\nfor commit_id, commit_sha1 in cursor:\n    commits[commit_id] = commit_sha1\n    pending_commits.add(commit_id)\n\n    progress.update()\n\nprogress.end(\" %d commits.\" % len(commits))\n\nprint\n\ncursor.execute(\"SELECT MAX(CHARACTER_LENGTH(name)) FROM repositories\")\n\nrepository_name_length = cursor.fetchone()[0]\n\ncursor.execute(\"SELECT id FROM repositories ORDER BY id ASC\")\n\nrepositories = [repository_id for (repository_id,) in cursor]\n\ndef processCommits(process_commits):\n    global commits\n\n    processed_commits = set()\n\n    for commit_id in process_commits:\n        try:\n            gitobject = repository.fetch(commits[commit_id])\n            if gitobject.type == \"commit\": processed_commits.add(commit_id)\n        except gitutils.GitError:\n            pass\n        except KeyboardInterrupt:\n            
sys.exit(1)\n        except:\n            raise\n\n        progress.update()\n\n    return processed_commits\n\nfor repository_id in repositories:\n    repository = gitutils.Repository.fromId(db, repository_id)\n    repository.disableCache()\n\n    cursor.execute(\"SELECT commit FROM reachable JOIN branches ON (branch=id) WHERE repository=%s\", (repository.id,))\n    process_commits = set(commit_id for (commit_id,) in cursor if commit_id in pending_commits)\n\n    progress.start(len(process_commits), \"Scanning repository: %-*s\" % (repository_name_length, repository.name))\n\n    processed_commits = processCommits(process_commits)\n\n    missing = len(process_commits) - len(processed_commits)\n    if missing:\n        message = \" %d commits found; %d commits missing!\" % (len(processed_commits), missing)\n    else:\n        message = \" %d commits found.\" % len(processed_commits)\n    progress.end(message)\n\n    pending_commits -= processed_commits\n\nif pending_commits:\n    print\n    print \"%d commits still unaccounted for.  Re-scanning all repositories.\" % len(pending_commits)\n    print\n\n    for repository_id in repositories:\n        repository = gitutils.Repository.fromId(db, repository_id)\n        repository.disableCache()\n\n        progress.start(len(pending_commits), \"Re-scanning repository: %-*s\" % (repository_name_length, repository.name))\n\n        processed_commits = processCommits(pending_commits)\n        pending_commits -= processed_commits\n\n        progress.end(\" %d commits found, %d remaining.\" % (len(processed_commits), len(pending_commits)))\n\n        if not pending_commits: break\n\n    if pending_commits:\n        cPickle.dump(pending_commits, open(\"commits-to-purge.pickle\", \"w\"), 2)\n\n        print\n        print \"%d commits that were not found in any repository should be purged.\" % len(pending_commits)\n        print \"Run purge-commits.py to do this.\"\n"
  },
  {
    "path": "src/maintenance/configtest.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport contextlib\nimport traceback\n\ndef reflow(text, indent):\n    try:\n        import textutils\n        return textutils.reflow(text, indent=indent)\n    except Exception:\n        # The 'textutils' module depends on 'configuration', so make our\n        # dependency on it conditional.\n        return text\n\nclass ConfigurationValue(object):\n    def __init__(self, module, name):\n        self.path = module.__file__\n        if self.path.endswith(\".pyc\") or self.path.endswith(\".pyo\"):\n            self.path = self.path[:-1]\n        self.name = name\n\nclass ConfigurationIssue(object):\n    def __init__(self, issue_type, message, values):\n        self.type = issue_type\n        self.message = message\n        self.values = values[:]\n\n    def __str__(self):\n        result = self.type.upper() + \"\\n\"\n        if self.values:\n            result += \"  Relating to settings:\\n\"\n            for value in self.values:\n                result += \"    %s :: %s\\n\" % (value.path, value.name)\n        result += \"  Message:\\n\"\n        result += reflow(self.message, indent=4)\n        return result\n\ndef doTestConfiguration():\n    \"\"\"Do not call directly; call testConfiguration()\"\"\"\n\n    import configuration\n\n    errors = []\n    warnings = []\n    values = []\n\n    
def error(message):\n        errors.append(ConfigurationIssue(\"error\", message, values))\n    def warn(message):\n        warnings.append(ConfigurationIssue(\"warning\", message, values))\n\n    class MissingValue(Exception):\n        pass\n\n    @contextlib.contextmanager\n    def value(module, name):\n        values.append(ConfigurationValue(module, name))\n        if not hasattr(module, name):\n            error(\"Configuration value missing: %s.%s\" % (module.__name__, name))\n            raise MissingValue\n        try:\n            yield getattr(module, name)\n        finally:\n            del values[-1]\n\n    try:\n        with value(configuration.base, \"WEB_SERVER_INTEGRATION\") \\\n                as web_server_integration:\n            if web_server_integration not in (\"apache\", \"nginx+uwsgi\",\n                                              \"uwsgi\", \"none\"):\n                error(\"Invalid web server integration: must be one of \"\n                      \"'apache', 'nginx+uwsgi', 'uwsgi' and 'none'.\")\n    except MissingValue:\n        pass\n\n    def checkProvider(providers, name):\n        provider = providers[name]\n        if provider.get(\"enabled\"):\n            if not provider.get(\"client_id\"):\n                error(\"Enabled external authentication provider %r must have \"\n                      \"'client_id' set.\" % name)\n            if not provider.get(\"client_secret\"):\n                error(\"Enabled external authentication provider %r must have \"\n                      \"'client_secret' set.\" % name)\n            if name == \"google\" and not provider.get(\"redirect_uri\"):\n                error(\"Enabled external authentication provider %r must have \"\n                      \"'redirect_uri' set.\" % name)\n            if provider.get(\"bypass_createuser\") \\\n                    and provider.get(\"verify_email_addresses\"):\n                error(\"Enabled external authentication provider %r can't have \"\n          
            \"both 'bypass_createuser' and 'verify_email_addresses' \"\n                      \"enabled.\" % name)\n\n    try:\n        with value(configuration.base, \"AUTHENTICATION_MODE\") \\\n                as authentication_mode:\n            if authentication_mode == \"critic\":\n                with value(configuration.base, \"SESSION_TYPE\") as session_type:\n                    if session_type not in (\"httpauth\", \"cookie\"):\n                        error(\"Invalid session type: must be one of 'httpauth' \"\n                              \"and 'cookie'.\")\n            elif authentication_mode != \"host\":\n                # Unconditional external authentication mode\n                with value(configuration.base, \"SESSION_TYPE\") as session_type:\n                    if session_type != \"cookie\":\n                        error(\"Invalid session type: must be 'cookie' (with \"\n                              \"external authentication.)\")\n                with value(configuration.auth, \"PROVIDERS\") as providers:\n                    if authentication_mode not in providers:\n                        error(\"Authentication mode must be 'host', 'critic' or \"\n                              \"name an external authentication provider.\")\n                    else:\n                        provider = providers[authentication_mode]\n                        if not provider.get(\"enabled\"):\n                            error(\"External authentication provider %r must be \"\n                                  \"enabled.\" % authentication_mode)\n                with value(configuration.base, \"REPOSITORY_URL_TYPES\") \\\n                        as repository_url_types:\n                    if \"http\" in repository_url_types:\n                        warn(\"HTTP/HTTPS repository URL type is incompatible \"\n                             \"with using an external authentication provider.\")\n    except MissingValue:\n        pass\n\n    try:\n        with 
value(configuration.auth, \"PROVIDERS\") as providers:\n            for name in providers.keys():\n                checkProvider(providers, name)\n    except MissingValue:\n        pass\n\n    return (errors, warnings)\n\ndef testConfiguration():\n    \"\"\"Test the system configuration\n\n       Returns a tuple containing two lists of ConfigurationIssue objects.  The\n       first list contains errors, the second warnings.  If the first list is\n       empty, the configuration should be usable.\"\"\"\n\n    try:\n        return doTestConfiguration()\n    except Exception:\n        error = ConfigurationIssue(\n            \"error\",\n            \"FATAL: Failed to test configuration!\\n\\n\" + traceback.format_exc(),\n            [])\n        return ([error], [])\n"
  },
  {
    "path": "src/maintenance/criticctl.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport argparse\n\nimport auth\nimport configuration\nimport dbutils\nimport inpututils\n\ndb = dbutils.Database.forSystem()\n\ncursor = db.cursor()\ncursor.execute(\"SELECT name FROM roles\")\n\nroles = [role for (role,) in cursor]\n\ndef valid_user(name):\n    try:\n        dbutils.User.fromName(db, name)\n    except dbutils.NoSuchUser:\n        return \"no such user\"\n\ndef valid_role(role):\n    if role not in roles:\n        return \"invalid role; must be one of %s\" % \", \".join(roles)\n\ndef invalid_user(name):\n    try:\n        dbutils.User.fromName(db, name)\n        return \"user exists\"\n    except dbutils.NoSuchUser:\n        pass\n\ndef check_argument(argument, check):\n    if argument and check:\n        error = check(argument)\n        if error:\n            print >>sys.stderr, \"%s: %s\" % (argument, error)\n            sys.exit(-1)\n\ndef use_argument_or_ask(argument, prompt, check=None):\n    if argument:\n        check_argument(argument, check)\n        return argument\n    else:\n        return inpututils.string(prompt, check=check)\n\ndef listusers(argv):\n    formats = {\n        \"tuples\": {\n            \"pre\": \"# id, name, email, fullname, status\\n[\",\n            \"row\": \" (%r, %r, %r, %r, %r),\",\n            \"post\": \"]\",\n        },\n        
\"dicts\":  {\n            \"pre\": \"[\",\n            \"row\": \" {'id': %r, 'name': %r, 'email': %r, 'fullname': %r, 'status': %r},\",\n            \"post\": \"]\",\n        },\n        \"table\": {\n            \"pre\": \"  id |    name    |              email             |            fullname            | status\\n\" \\\n                   \"-----+------------+--------------------------------+--------------------------------+--------\",\n            \"row\": \"%4u | %10s | %30s | %-30s | %s\",\n            \"post\": \"\",\n        },\n    }\n\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: listusers\",\n        prog=\"criticctl [options] listusers\")\n\n    parser.add_argument(\"--format\", \"-f\", choices=formats.keys(), default=\"table\",\n                        help='output format (defaults to \"table\")')\n\n    arguments = parser.parse_args(argv)\n\n    cursor.execute(\"\"\"SELECT users.id, name, useremails.email, fullname, status\n                        FROM users\n             LEFT OUTER JOIN useremails ON (useremails.id=users.email)\n                    ORDER BY users.id\"\"\")\n\n    print formats[arguments.format][\"pre\"]\n    for row in cursor:\n        print formats[arguments.format][\"row\"] % row\n    print formats[arguments.format][\"post\"]\n\ndef adduser(argv):\n    class NoEmail:\n        pass\n    class NoPassword:\n        pass\n\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: adduser\",\n        prog=\"criticctl [options] adduser\")\n\n    parser.add_argument(\"--name\", help=\"user name\")\n    parser.add_argument(\"--email\", \"-e\", help=\"email address\")\n    parser.add_argument(\"--no-email\", dest=\"email\", action=\"store_const\",\n                        const=NoEmail, help=\"create user without email address\")\n    parser.add_argument(\"--fullname\", \"-f\", help=\"full name\")\n    parser.add_argument(\"--password\", \"-p\", 
help=\"password\")\n    parser.add_argument(\"--no-password\", dest=\"password\", action=\"store_const\",\n                        const=NoPassword, help=\"create user without password\")\n\n    arguments = parser.parse_args(argv)\n\n    name = use_argument_or_ask(arguments.name, \"Username:\", check=invalid_user)\n    fullname = use_argument_or_ask(arguments.fullname, \"Full name:\")\n\n    if arguments.email is NoEmail:\n        email = None\n    else:\n        email = use_argument_or_ask(arguments.email, \"Email address:\")\n        if not email.strip():\n            email = None\n\n    if arguments.password is NoPassword:\n        hashed_password = None\n    else:\n        if arguments.password is None:\n            password = inpututils.password(\"Password:\")\n        else:\n            password = arguments.password\n        hashed_password = auth.hashPassword(password)\n\n    dbutils.User.create(db, name, fullname, email, email_verified=None,\n                        password=hashed_password)\n\n    print \"%s: user added\" % name\n\ndef deluser(argv):\n    import reviewing.utils\n\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: deluser\",\n        prog=\"criticctl [options] deluser\")\n\n    parser.add_argument(\"--name\", help=\"user name\")\n\n    arguments = parser.parse_args(argv)\n\n    name = use_argument_or_ask(arguments.name, \"Username:\", check=valid_user)\n\n    reviewing.utils.retireUser(db, dbutils.User.fromName(db, name))\n\n    db.commit()\n\n    print \"%s: user retired\" % name\n\ndef role(command, argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: %s\" % command,\n        prog=\"criticctl [options] %s\" % command)\n\n    parser.add_argument(\"--name\", help=\"user name\")\n    parser.add_argument(\"--role\", choices=roles, help=\"role name\")\n\n    arguments = parser.parse_args(argv)\n\n    name = use_argument_or_ask(arguments.name, 
\"Username:\", check=valid_user)\n    role = use_argument_or_ask(arguments.role, \"Role:\", check=valid_role)\n\n    user = dbutils.User.fromName(db, name)\n\n    cursor.execute(\"\"\"SELECT 1\n                        FROM userroles\n                       WHERE uid=%s\n                         AND role=%s\"\"\",\n                   (user.id, role))\n\n    if command == \"addrole\":\n        if cursor.fetchone():\n            print \"%s: user already has role '%s'\" % (name, role)\n        else:\n            cursor.execute(\"\"\"INSERT INTO userroles (uid, role)\n                                   VALUES (%s, %s)\"\"\",\n                           (user.id, role))\n            db.commit()\n\n            print \"%s: role '%s' added\" % (name, role)\n    else:\n        if not cursor.fetchone():\n            print \"%s: user doesn't have role '%s'\" % (name, role)\n        else:\n            cursor.execute(\"\"\"DELETE FROM userroles\n                                    WHERE uid=%s\n                                      AND role=%s\"\"\",\n                           (user.id, role))\n\n            db.commit()\n\n            print \"%s: role '%s' removed\" % (name, role)\n\ndef passwd(argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: passwd\",\n        prog=\"criticctl [options] passwd\")\n\n    class NoPassword:\n        pass\n\n    parser.add_argument(\"--name\", help=\"user name\")\n    parser.add_argument(\"--password\", help=\"password\")\n    parser.add_argument(\"--no-password\", dest=\"password\", action=\"store_const\",\n                        const=NoPassword, help=\"delete the user's password\")\n\n    arguments = parser.parse_args(argv)\n\n    name = use_argument_or_ask(arguments.name, \"Username:\", check=valid_user)\n\n    if arguments.password is NoPassword:\n        hashed_password = None\n    else:\n        if arguments.password is None:\n            password = inpututils.password(\"Password:\")\n   
     else:\n            password = arguments.password\n        hashed_password = auth.hashPassword(password)\n\n    cursor.execute(\"\"\"UPDATE users\n                         SET password=%s\n                       WHERE name=%s\"\"\",\n                   (hashed_password, name))\n\n    db.commit()\n\n    if hashed_password:\n        print \"%s: password changed\" % name\n    else:\n        print \"%s: password deleted\" % name\n\ndef connect(command, argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: %s\" % command,\n        prog=\"criticctl [options] %s\" % command)\n\n    providers = sorted(provider_name for provider_name, provider\n                       in configuration.auth.PROVIDERS.items()\n                       if command == \"disconnect\" or provider.get(\"enabled\"))\n\n    if len(providers) == 0:\n        print >>sys.stderr, \"No external authentication providers configured!\"\n        return 1\n\n    parser.add_argument(\"--name\", help=\"user name\")\n    parser.add_argument(\"--provider\", choices=providers,\n                        help=\"external authentication provider name\")\n\n    if command == \"connect\":\n        parser.add_argument(\"--account\", help=\"external account identifier\")\n\n    arguments = parser.parse_args(argv)\n\n    def valid_provider(provider):\n        if provider not in providers:\n            return (\"invalid authentication provider; must be one of %s\"\n                    % \", \".join(providers))\n\n    name = use_argument_or_ask(arguments.name, \"Username:\", check=valid_user)\n\n    if len(providers) == 1:\n        check_argument(arguments.provider, check=valid_provider)\n        provider = providers[0]\n    else:\n        provider = use_argument_or_ask(\n            arguments.provider, \"Authentication provider:\", check=valid_provider)\n\n    user = dbutils.User.fromName(db, name)\n    provider = auth.PROVIDERS[provider]\n\n    if command == \"connect\":\n     
   cursor.execute(\"\"\"SELECT 1\n                            FROM externalusers\n                           WHERE uid=%s\n                             AND provider=%s\"\"\",\n                       (user.id, provider.name))\n\n        if cursor.fetchone():\n            print >>sys.stderr, (\"%s: user already connected to a %s\"\n                                 % (user.name, provider.getTitle()))\n            return 1\n\n        account = use_argument_or_ask(\n            arguments.account, provider.getAccountIdDescription() + \":\")\n\n        cursor.execute(\"\"\"SELECT id, uid\n                            FROM externalusers\n                           WHERE provider=%s\n                             AND account=%s\"\"\",\n                       (provider.name, account))\n\n        row = cursor.fetchone()\n\n        if row:\n            external_id, user_id = row\n\n            if user_id is not None:\n                user = dbutils.User.fromId(db, user_id)\n\n                print >>sys.stderr, (\"%s %r: already connected to local user %s\"\n                                     % (provider.getTitle(), account, user.name))\n                return 1\n\n            cursor.execute(\"\"\"UPDATE externalusers\n                                 SET uid=%s\n                               WHERE id=%s\"\"\",\n                           (user.id, external_id))\n        else:\n            cursor.execute(\"\"\"INSERT INTO externalusers (uid, provider, account)\n                                   VALUES (%s, %s, %s)\"\"\",\n                           (user.id, provider.name, account))\n\n        print \"%s: connected to %s %r\" % (name, provider.getTitle(), account)\n    else:\n        cursor.execute(\"\"\"SELECT account\n                            FROM externalusers\n                           WHERE uid=%s\n                             AND provider=%s\"\"\",\n                       (user.id, provider.name))\n\n        row = cursor.fetchone()\n\n        if not row:\n          
  print >>sys.stderr, (\"%s: user not connected to a %s\"\n                                 % (name, provider.getTitle()))\n            return 1\n\n        account, = row\n\n        cursor.execute(\"\"\"DELETE FROM externalusers\n                                WHERE uid=%s\n                                  AND provider=%s\"\"\",\n                       (user.id, provider.name))\n\n        print (\"%s: disconnected from %s %r\"\n               % (name, provider.getTitle(), account))\n\n    db.commit()\n\n    return 0\n\ndef configtest(command, argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: configtest\",\n        prog=\"criticctl [options] configtest\")\n\n    parser.add_argument(\"--quiet\", \"-q\", action=\"store_true\",\n                        help=\"Suppress non-error/warning output\")\n\n    arguments = parser.parse_args(argv)\n\n    import maintenance.configtest\n\n    errors, warnings = maintenance.configtest.testConfiguration()\n\n    def printIssue(issue):\n        print str(issue)\n        print\n\n    for error in errors:\n        printIssue(error)\n    for warning in warnings:\n        printIssue(warning)\n\n    if not errors:\n        if not arguments.quiet:\n            print \"System configuration valid.\"\n        return 0\n    else:\n        return 1\n\ndef restart(command, argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: restart\",\n        prog=\"criticctl [options] restart\")\n\n    parser.parse_args(argv)\n\n    result = configtest(\"configtest\", [\"--quiet\"])\n\n    if result != 0:\n        print >>sys.stderr, \"ERROR: System configuration is not valid.\"\n        return result\n\n    import os\n    import subprocess\n\n    system_identity = configuration.base.SYSTEM_IDENTITY\n\n    try:\n        os.seteuid(0)\n        os.setegid(0)\n    except OSError:\n        print >>sys.stderr, \"ERROR: 'criticctl restart' must be run as root.\"\n 
       return 1\n\n    if configuration.base.WEB_SERVER_INTEGRATION == \"apache\":\n        web_server_service = \"apache2\"\n    elif configuration.base.WEB_SERVER_INTEGRATION in (\"nginx+uwsgi\", \"uwsgi\"):\n        web_server_service = \"uwsgi\"\n    else:\n        web_server_service = None\n\n    if web_server_service:\n        subprocess.check_call([\"service\", web_server_service, \"stop\"])\n    subprocess.check_call([\"service\", \"critic-\" + system_identity, \"restart\"])\n    if web_server_service:\n        subprocess.check_call([\"service\", web_server_service, \"start\"])\n\n    return 0\n\ndef stop(command, argv):\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: stop\",\n        prog=\"criticctl [options] stop\")\n\n    parser.parse_args(argv)\n\n    import os\n    import subprocess\n\n    system_identity = configuration.base.SYSTEM_IDENTITY\n\n    try:\n        os.seteuid(0)\n        os.setegid(0)\n    except OSError:\n        print >>sys.stderr, \"ERROR: 'criticctl stop' must be run as root.\"\n        return 1\n\n    if configuration.base.WEB_SERVER_INTEGRATION == \"apache\":\n        web_server_service = \"apache2\"\n    elif configuration.base.WEB_SERVER_INTEGRATION in (\"nginx+uwsgi\", \"uwsgi\"):\n        web_server_service = \"uwsgi\"\n    else:\n        web_server_service = None\n\n    if web_server_service:\n        subprocess.check_call([\"service\", web_server_service, \"stop\"])\n    subprocess.check_call([\"service\", \"critic-\" + system_identity, \"stop\"])\n\n    return 0\n\ndef interactive(command, argv):\n    try:\n        import IPython\n    except ImportError:\n        print >>sys.stderr, \"ERROR: IPython must be installed (import failed)\"\n        return 1\n\n    parser = argparse.ArgumentParser(\n        description=\"Critic administration interface: interactive\",\n        prog=\"criticctl [options] interactive\")\n\n    parser.add_argument(\"--user\", \"-u\", help=\"Impersonate 
this user\")\n\n    arguments = parser.parse_args(argv)\n\n    import api\n\n    critic = api.critic.startSession(for_user=bool(arguments.user),\n                                     for_system=not bool(arguments.user))\n    db = critic.database\n\n    if arguments.user:\n        db.setUser(dbutils.User.fromName(db, arguments.user))\n\n    IPython.embed()\n\n    return 0\n\ndef main(parser, show_help, command, argv):\n    returncode = 0\n\n    if show_help or command is None:\n        parser.print_help()\n    else:\n        if command == \"listusers\":\n            listusers(argv)\n            return 0\n        elif command == \"adduser\":\n            adduser(argv)\n            return 0\n        elif command == \"deluser\":\n            deluser(argv)\n            return 0\n        elif command in (\"addrole\", \"delrole\"):\n            role(command, argv)\n            return 0\n        elif command == \"passwd\":\n            passwd(argv)\n            return 0\n        elif command in (\"connect\", \"disconnect\"):\n            return connect(command, argv)\n        elif command == \"configtest\":\n            return configtest(command, argv)\n        elif command == \"restart\":\n            return restart(command, argv)\n        elif command == \"stop\":\n            return stop(command, argv)\n        elif command == \"interactive\":\n            return interactive(command, argv)\n        else:\n            print >>sys.stderr, \"ERROR: Invalid command: %s\" % command\n            returncode = 1\n\n    print \"\"\"\nAvailable commands are:\n\n  listusers List all users.\n  adduser   Add a user.\n  deluser   Retire a user.\n  addrole   Add a role to a user.\n  delrole   Remove a role from a user.\n  passwd    Set or delete a user's password.\n\n  connect    Set up connection between user and external authentication\n             provider.\n  disconnect Remove such connection.\n\n  configtest Test system configuration.\n  restart    Restart host WSGI container 
and Critic's background services.\n  stop       Stop host WSGI container and Critic's background services.\n\n  interactive Drop into an interactive IPython shell.\n\nUse 'criticctl COMMAND --help' to see per-command options.\"\"\"\n\n    return returncode\n"
  },
  {
    "path": "src/maintenance/dumppreferences.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os.path\n\nsys.path.insert(0, os.path.dirname(os.path.dirname(sys.argv[0])))\n\nfrom dbaccess import connect\n\ndb = connect()\ncursor = db.cursor()\n\ncursor.execute(\"SELECT item, type, default_integer, default_string, description FROM preferences\")\n\npreferences = cursor.fetchall()\n\ninstallpreferences_py = open(os.path.join(os.path.dirname(sys.argv[0]), \"installpreferences.py\"), \"w\")\n\nprint >>installpreferences_py, \"PREFERENCES = [ \",\n\nfor index, (item, type, default_integer, default_string, description) in enumerate(preferences):\n    if index != 0:\n        installpreferences_py.write(\"\"\",\n                \"\"\")\n\n    installpreferences_py.write(\"\"\"{ \"item\": %r,\n                  \"type\": %r,\"\"\" % (item, type))\n\n    if type == \"string\":\n        installpreferences_py.write(\"\"\"\n                  \"default_string\": %r,\"\"\" % default_string)\n    else:\n        installpreferences_py.write(\"\"\"\n                  \"default_integer\": %r,\"\"\" % default_integer)\n\n    installpreferences_py.write(\"\"\"\n                  \"description\": %r }\"\"\" % description)\n\nprint >>installpreferences_py, \" ]\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"def installPreferences(db, quiet):\"\nprint >>installpreferences_py, \"  
  cursor = db.cursor()\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"    for preference in PREFERENCES:\"\nprint >>installpreferences_py, \"        item = preference[\\\"item\\\"]\"\nprint >>installpreferences_py, \"        type = preference[\\\"type\\\"]\"\nprint >>installpreferences_py, \"        default_integer = preference.get(\\\"default_integer\\\")\"\nprint >>installpreferences_py, \"        default_string = preference.get(\\\"default_string\\\")\"\nprint >>installpreferences_py, \"        description = preference[\\\"description\\\"]\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"        cursor.execute(\\\"SELECT 1 FROM preferences WHERE item=%s\\\", (item,))\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"        if cursor.fetchone():\"\nprint >>installpreferences_py, \"            if not quiet: print \\\"Updating: %s\\\" % item\"\nprint >>installpreferences_py, \"            cursor.execute(\\\"UPDATE preferences SET type=%s, default_integer=%s, default_string=%s, description=%s WHERE item=%s\\\", (type, default_integer, default_string, description, item))\"\nprint >>installpreferences_py, \"        else:\"\nprint >>installpreferences_py, \"            if not quiet: print \\\"Adding:   %s\\\" % item\"\nprint >>installpreferences_py, \"            cursor.execute(\\\"INSERT INTO preferences (item, type, default_integer, default_string, description) VALUES (%s, %s, %s, %s, %s)\\\", (item, type, default_integer, default_string, description))\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"if __name__ == \\\"__main__\\\":\"\nprint >>installpreferences_py, \"    import sys\"\nprint >>installpreferences_py, \"    import os.path\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"    sys.path.insert(0, os.path.dirname(os.path.dirname(sys.argv[0])))\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"    import dbaccess\"\nprint 
>>installpreferences_py\nprint >>installpreferences_py, \"    db = dbaccess.connect()\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"    installPreferences(db, \\\"--quiet\\\" in sys.argv or \\\"-q\\\" in sys.argv)\"\nprint >>installpreferences_py\nprint >>installpreferences_py, \"    db.commit()\"\n"
  },
  {
    "path": "src/maintenance/progress.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\n\n__prefix = \"\"\n__current = 0\n__total = 0\n__previous = \"\"\n\ndef __output(string=None):\n    global __prefix, __current, __total, __previous\n\n    if string is None:\n        percent = int(round((100.0 * __current) / __total))\n        string = \"%s [%3d %%]\" % (__prefix, percent)\n\n    if string != __previous:\n        sys.stdout.write(\"\\r%s%s\" % (string, \" \" * (len(__previous) - len(string))))\n        sys.stdout.flush()\n\n        __previous = string\n\ndef start(total, prefix=\"\"):\n    global __prefix, __current, __total, __previous\n\n    __prefix = prefix\n    __current = 0\n    __total = total\n    __previous = \"\"\n\n    if total: __output()\n\ndef update(count=1):\n    global __current\n\n    __current += count\n    __output()\n\ndef end(message):\n    __output(__prefix + message)\n    sys.stdout.write(\"\\n\")\n\ndef write(string):\n    __output(string)\n    sys.stdout.write(\"\\n\")\n    __output()\n\nif __name__ == \"__main__\":\n    import time\n\n    start(1000, prefix=\"Testing: \")\n\n    for index in range(200):\n        time.sleep(0.01)\n        update(5)\n\n    end(\"Finished!\")\n"
  },
  {
    "path": "src/operation/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport traceback\n\nimport base\nimport dbutils\nimport extensions\n\nfrom textutils import json_encode, json_decode\n\nfrom operation.basictypes import (OperationResult, OperationError,\n                                  OperationFailure, OperationFailureMustLogin)\n\nfrom operation.typechecker import (Optional, Request, RestrictedString, SHA1,\n                                   RestrictedInteger, NonNegativeInteger,\n                                   PositiveInteger, Review, Repository, Commit,\n                                   File, User, Extension)\n\nclass Operation(object):\n\n    \"\"\"\n    Base class for operation implementations.\n\n    Sub-classes  must call Operation.__init__() to define the structure of\n    expected input data.\n\n    An operation accepts input in the form of a JSON object literal and returns\n    a result in the form of a JSON object literal.  The object contains a\n    property named \"status\" whose value should be \"ok\" or \"error\".  If it is\n    \"error\", the object contains a property named \"error\" whose value is an\n    error message.  
If the HTTP request method is POST, the input is the request\n    body (this is the usual case); otherwise, if the HTTP request method is GET,\n    the input is the value of the \"data\" URI query parameter (this is supported\n    to simplify ad-hoc testing).\n\n    Operation implementations should inherit this class and implement the\n    process() method.  This method is called with two positional arguments, 'db'\n    and 'user', and one keyword argument per property in the input value.  The\n    process() method should return an OperationResult object, or either return\n    or raise an OperationError object.  Any other raised exceptions are caught\n    and converted to OperationError objects.\n    \"\"\"\n\n    def __init__(self, parameter_types, accept_anonymous_user=False):\n        \"\"\"\n        Initialize input data type checker.\n\n        The parameter_types argument must be a dict object.  See TypeChecker and\n        sub-classes for details on how it works.  A parameter types argument of\n\n          { \"name\": str,\n            \"points\": [{\"x\": int, \"y\": int }],\n            \"what\": Optional(str) }\n\n        would, for instance, represent an input object with two required\n        properties named \"name\" and \"points\", and an optional property named\n        \"what\".  The \"name\" and \"what\" property values should be strings.  
The\n        \"points\" property value should be an array of objects, each with two\n        properties named \"x\" and \"y\", whose values should be integer.\n\n        The operation's process() method would be called with the keyword\n        arguments \"name\", \"points\" and \"what\".\n        \"\"\"\n        from operation.typechecker import TypeChecker\n        if not type(parameter_types) is dict:\n            raise base.ImplementationError(\"invalid source type\")\n        self.__checker = TypeChecker.make(parameter_types)\n        self.__accept_anonymous_user = accept_anonymous_user\n\n    def __call__(self, req, db, user):\n        import auth\n        from operation.typechecker import TypeCheckerContext\n\n        if user.isAnonymous() and not self.__accept_anonymous_user:\n            return OperationFailureMustLogin()\n\n        if req.method == \"POST\": data = req.read()\n        else: data = req.getParameter(\"data\")\n\n        if not data: raise OperationError(\"no input\")\n\n        try: value = json_decode(data)\n        except ValueError as error: raise OperationError(\"invalid input: %s\" % str(error))\n\n        try:\n            self.__checker(value, TypeCheckerContext(req, db, user))\n            return self.process(db, user, **value)\n        except OperationError as error:\n            return error\n        except OperationFailure as failure:\n            return failure\n        except dbutils.NoSuchUser as error:\n            return OperationFailure(code=\"nosuchuser\",\n                                    title=\"Who is '%s'?\" % error.name,\n                                    message=\"There is no user in Critic's database named that.\")\n        except dbutils.NoSuchReview as error:\n            return OperationFailure(code=\"nosuchreview\",\n                                    title=\"Invalid review ID\",\n                                    message=\"The review ID r/%d is not valid.\" % error.id)\n        except auth.AccessDenied 
as error:\n            return OperationFailure(code=\"accessdenied\",\n                                    title=\"Access denied\",\n                                    message=error.message)\n        except dbutils.TransactionRollbackError:\n            return OperationFailure(code=\"transactionrollback\",\n                                    title=\"Transaction rolled back\",\n                                    message=\"Your database transaction rolled back, probably due to a deadlock.  Please try again.\")\n        except extensions.extension.ExtensionError as error:\n            return OperationFailure(\n                code=\"invalidextension\",\n                title=\"Invalid extension\",\n                message=error.message)\n        except:\n            # Decode value again since the type checkers might have modified it.\n            value = json_decode(data)\n\n            error_message = (\"User: %s\\nReferrer: %s\\nData: %s\\n\\n%s\"\n                             % (user.name,\n                                req.getReferrer(),\n                                json_encode(self.sanitize(value), indent=2),\n                                traceback.format_exc()))\n\n            db.rollback()\n\n            import mailutils\n            import configuration\n\n            if not user.hasRole(db, \"developer\"):\n                mailutils.sendExceptionMessage(db, \"wsgi[%s]\" % req.path, error_message)\n\n            if configuration.debug.IS_DEVELOPMENT or user.hasRole(db, \"developer\"):\n                return OperationError(error_message)\n            else:\n                return OperationError(\"An unexpected error occurred.  
\" +\n                                      \"A message has been sent to the system administrator(s) \" +\n                                      \"with details about the problem.\")\n\n    def process(self, *args, **kwargs):\n        raise OperationError(\"not implemented!?!\")\n\n    def sanitize(self, value):\n        \"\"\"Sanitize arguments value for use in error messages or logs.\"\"\"\n        return value\n\n    @staticmethod\n    def requireRole(db, role, user):\n        if not user.hasRole(db, role):\n            raise OperationFailure(\n                code=\"notallowed\",\n                title=\"Not allowed!\",\n                message=\"Operation not permitted, user that lacks role '%s'.\" % role)\n"
  },
  {
    "path": "src/operation/addrepository.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport subprocess\nimport os\nimport signal\n\nimport configuration\n\nimport gitutils\nimport htmlutils\nfrom operation import Operation, OperationResult, OperationFailure, Optional, RestrictedString\n\nclass AddRepository(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"name\": RestrictedString(allowed=lambda ch: ch != \"/\", minlength=1, maxlength=64, ui_name=\"short name\"),\n                                   \"path\": RestrictedString(minlength=1, ui_name=\"path\"),\n                                   \"mirror\": Optional({ \"remote_url\": RestrictedString(maxlength=256, ui_name=\"source repository\"),\n                                                        \"remote_branch\": str,\n                                                        \"local_branch\": str }) })\n\n    def process(self, db, user, name, path, mirror=None):\n        if not user.hasRole(db, \"repositories\"):\n            raise OperationFailure(code=\"notallowed\",\n                                   title=\"Not allowed!\",\n                                   message=\"Only users with the 'repositories' role can add new repositories.\")\n        if name.endswith(\".git\"):\n            raise OperationFailure(code=\"badsuffix_name\",\n                                   title=\"Invalid short name\",\n     
                              message=\"The short name must not end with .git\")\n\n        if name == \"r\":\n            raise OperationFailure(code=\"invalid_name\",\n                                   title=\"Invalid short name\",\n                                   message=\"The short name 'r' is not allowed since corresponding /REPOSHORTNAME/{SHA1|BRANCH} URLs would conflict with r/REVIEW_ID URLs.\")\n\n        path = path.strip(\"/\").rsplit(\"/\", 1)\n\n        if len(path) == 2: base, repository_name = path\n        else: base, repository_name = None, path[0]\n\n        if base:\n            main_base_path = os.path.join(configuration.paths.GIT_DIR, base)\n        else:\n            main_base_path = configuration.paths.GIT_DIR\n\n        main_path = os.path.join(main_base_path, repository_name + \".git\")\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT name FROM repositories WHERE path=%s\"\"\", (main_path,))\n        row = cursor.fetchone()\n        if row:\n            raise OperationFailure(code=\"duplicaterepository\",\n                                   title=\"Duplicate repository\",\n                                   message=\"The specified path is already used by repository %s\" % row[0])\n        cursor.execute(\"\"\"SELECT name FROM repositories WHERE name=%s\"\"\", (name,))\n        row = cursor.fetchone()\n        if row:\n            raise OperationFailure(code=\"duplicateshortname\",\n                                   title=\"Duplicate short name\",\n                                   message=\"The specified short name is already in use, please select a different short name.\")\n\n        if not os.path.isdir(main_base_path):\n            os.makedirs(main_base_path, mode=0775)\n\n        def git(arguments, cwd):\n            argv = [configuration.executables.GIT] + arguments\n            git = subprocess.Popen(argv, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd)\n            stdout, stderr = 
git.communicate()\n            if git.returncode != 0:\n                raise gitutils.GitError(\"unexpected output from '%s': %s\" % (\" \".join(argv), stderr))\n\n        if mirror:\n            try:\n                subprocess.check_output([configuration.executables.GIT, \"ls-remote\", mirror[\"remote_url\"]], stderr=subprocess.STDOUT)\n            except subprocess.CalledProcessError as e:\n                raise OperationFailure(code=\"failedreadremote\",\n                                       title=\"Failed to read source repository\",\n                                       message=\"Critic failed to read from the specified source repository. The error reported from git \" +\n                                               \"(when running as the system user '%s') was: <pre>%s</pre>\" % (configuration.base.SYSTEM_USER_NAME, htmlutils.htmlify(e.output)),\n                                       is_html=True)\n\n        git([\"init\", \"--bare\", \"--shared\", repository_name + \".git\"], cwd=main_base_path)\n        git([\"config\", \"receive.denyNonFastforwards\", \"false\"], cwd=main_path)\n        git([\"config\", \"critic.name\", name], cwd=main_path)\n\n        if configuration.debug.IS_QUICKSTART:\n            git([\"config\", \"critic.socket\", os.path.join(configuration.paths.SOCKETS_DIR, \"githook.unix\")], cwd=main_path)\n\n        os.symlink(os.path.join(configuration.paths.INSTALL_DIR, \"hooks\", \"pre-receive\"), os.path.join(main_path, \"hooks\", \"pre-receive\"))\n\n        cursor.execute(\"\"\"INSERT INTO repositories (name, path)\n                               VALUES (%s, %s)\n                            RETURNING id\"\"\",\n                       (name, main_path))\n        repository_id = cursor.fetchone()[0]\n\n        if mirror:\n            cursor.execute(\"\"\"INSERT INTO trackedbranches (repository, local_name, remote, remote_name, forced, delay)\n                                   VALUES (%s, '*', %s, '*', true, '1 day')\"\"\",\n        
                   (repository_id, mirror[\"remote_url\"]))\n\n            cursor.execute(\"\"\"INSERT INTO trackedbranches (repository, local_name, remote, remote_name, forced, delay)\n                                   VALUES (%s, %s, %s, %s, true, '1 day')\"\"\",\n                           (repository_id, mirror[\"local_branch\"], mirror[\"remote_url\"], mirror[\"remote_branch\"]))\n\n            git([\"symbolic-ref\", \"HEAD\", \"refs/heads/\" + mirror[\"local_branch\"]], cwd=main_path)\n\n        db.commit()\n\n        if mirror:\n            pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n            os.kill(pid, signal.SIGHUP)\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/applyfilters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport reviewing.utils\n\nfrom operation import Operation, OperationResult\n\nclass QueryGlobalFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        reviewers, watchers = reviewing.utils.queryFilters(db, user, review, globalfilters=True)\n        return OperationResult(reviewers=[dbutils.User.fromId(db, user_id).getJSON() for user_id in reviewers],\n                               watchers=[dbutils.User.fromId(db, user_id).getJSON() for user_id in watchers])\n\nclass ApplyGlobalFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        reviewing.utils.applyFilters(db, user, review, globalfilters=True)\n        return OperationResult()\n\nclass QueryParentFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        reviewers, watchers = reviewing.utils.queryFilters(db, user, review, parentfilters=True)\n        return 
OperationResult(reviewers=[dbutils.User.fromId(db, user_id).getJSON() for user_id in reviewers],\n                               watchers=[dbutils.User.fromId(db, user_id).getJSON() for user_id in watchers])\n\nclass ApplyParentFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        reviewing.utils.applyFilters(db, user, review, parentfilters=True)\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/autocompletedata.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport reviewing.filters\n\nfrom operation import Operation, OperationResult, OperationError, Optional\n\nclass GetAutoCompleteData(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"values\": [set([\"users\", \"paths\"])],\n                                   \"review_id\": Optional(int),\n                                   \"changeset_ids\": Optional([int]) })\n\n    def process(self, db, user, values, review_id=None, changeset_ids=None):\n        cursor = db.cursor()\n        data = {}\n\n        if \"users\" in values:\n            cursor.execute(\"SELECT name, fullname FROM users WHERE status!='retired'\")\n            data[\"users\"] = dict(cursor)\n\n        if \"paths\" in values:\n            if review_id is not None:\n                cursor.execute(\"\"\"SELECT files.path, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                                    FROM files\n                                    JOIN reviewfiles ON (reviewfiles.file=files.id)\n                                   WHERE reviewfiles.review=%s\n                                GROUP BY files.id\"\"\",\n                               (review_id,))\n            elif changeset_ids is not None:\n                cursor.execute(\"\"\"SELECT files.path, 
SUM(chunks.deleteCount), SUM(chunks.insertCount)\n                                    FROM files\n                                    JOIN chunks ON (chunks.file=files.id)\n                                   WHERE chunks.changeset=ANY (%s)\n                                GROUP BY files.id\"\"\",\n                               (changeset_ids,))\n            else:\n                raise OperationError(\"paths requested, but neither review_id nor changeset_ids given\")\n\n            paths = {}\n\n            for filename, deleted, inserted in cursor:\n                paths[filename] = (0, deleted, inserted)\n\n                components = filename.split(\"/\")\n                for index in range(len(components) - 1, 0, -1):\n                    directory = \"/\".join(components[:index]) + \"/\"\n                    nfiles, current_deleted, current_inserted = paths.get(directory, (0, 0, 0))\n                    paths[directory] = nfiles + 1, current_deleted + deleted, current_inserted + inserted\n\n            data[\"paths\"] = paths\n\n        return OperationResult(**data)\n\nclass GetRepositoryPaths(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"prefix\": str,\n                                   \"repository_id\": Optional(int),\n                                   \"repository_name\": Optional(str) })\n\n    def process(self, db, user, prefix, repository_id=None, repository_name=None):\n        if reviewing.filters.hasWildcard(prefix):\n            return OperationResult(paths={})\n\n        prefix = reviewing.filters.sanitizePath(prefix)\n\n        if repository_id is not None:\n            repository = gitutils.Repository.fromId(db, repository_id)\n        else:\n            repository = gitutils.Repository.fromName(db, repository_name)\n\n        if repository.isEmpty():\n            return OperationResult(paths={})\n\n        paths = {}\n\n        use_prefix = prefix.rpartition(\"/\")[0]\n\n        if use_prefix:\n            names 
= repository.run(\"ls-tree\", \"-r\", \"--name-only\", \"HEAD\", use_prefix).splitlines()\n        else:\n            names = repository.run(\"ls-tree\", \"-r\", \"--name-only\", \"HEAD\").splitlines()\n\n        def add(path):\n            if path.endswith(\"/\"):\n                if path not in paths:\n                    paths[path] = { \"files\": 0 }\n                paths[path][\"files\"] += 1\n            else:\n                paths[path] = {}\n\n        for name in names:\n            if not name.startswith(prefix):\n                continue\n\n            relname = name[len(prefix):]\n            use_prefix = prefix\n            if prefix.endswith(\"/\"):\n                add(prefix)\n            elif relname.startswith(\"/\"):\n                add(prefix + \"/\")\n                use_prefix = prefix + \"/\"\n                relname = relname[1:]\n\n            localname, pathsep, _ = relname.partition(\"/\")\n\n            add(use_prefix + localname + pathsep)\n\n        return OperationResult(paths=paths)\n"
  },
  {
    "path": "src/operation/basictypes.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport htmlutils\nimport textutils\n\nclass OperationResult:\n\n    \"\"\"\n    Simple container for successful operation result.\n\n    The constructor builds a dictionary from all keyword arguments,\n    and adds {\"status\": \"ok\"} unless a different \"status\" is specified\n    as a keyword argument.\n\n    Converting an OperationResult object to string converts this\n    dictionary to a JSON object literal.\n    \"\"\"\n\n    def __init__(self, **kwargs):\n        self.__value = kwargs\n        if \"status\" not in self.__value:\n            self.__value[\"status\"] = \"ok\"\n        self.__cookies = {}\n    def __str__(self):\n        return textutils.json_encode(self.__value)\n    def set(self, key, value):\n        self.__value[key] = value\n\nclass OperationError(Exception):\n\n    \"\"\"\n    Exception class for unexpected operation errors.\n\n    Converting an OperationError object to string produces a JSON\n    object literal with the properties status=\"error\" and\n    error=<message>.\n    \"\"\"\n\n    def __init__(self, message):\n        self.__message = message\n    def __str__(self):\n        return textutils.json_encode({ \"status\": \"error\",\n                                       \"error\": self.__message })\n\nclass OperationFailure(Exception):\n\n    \"\"\"\n    Exception 
class for operation failures caused by invalid input.\n\n    Converting an OperationFailure object to string produces a JSON\n    object literal with the properties status=\"failure\", title=<title>\n    and message=<message>.\n    \"\"\"\n\n    def __init__(self, code, title, message, is_html=False):\n        self.__code = code\n        self.__title = htmlutils.htmlify(title)\n        self.__message = message if is_html else htmlutils.htmlify(message)\n    def __str__(self):\n        return textutils.json_encode({ \"status\": \"failure\",\n                                       \"code\": self.__code,\n                                       \"title\": self.__title,\n                                       \"message\": self.__message })\n\nclass OperationFailureMustLogin(OperationFailure):\n    def __init__(self):\n        super(OperationFailureMustLogin, self).__init__(\n            code=\"mustlogin\",\n            title=\"Login Required\",\n            message=\"You have to sign in to perform this operation.\")\n"
  },
  {
    "path": "src/operation/basictypes_unittest.py",
    "content": "import json\n\ndef basic():\n    from operation.basictypes import (\n        OperationResult, OperationError, OperationFailure,\n        OperationFailureMustLogin)\n\n    def convert(value):\n        return json.loads(str(value))\n\n    #\n    # OperationResult\n    #\n\n    # OperationResult has status=ok by default.\n    assert convert(OperationResult()) == { \"status\": \"ok\" }\n\n    # But status can be overridden.\n    assert convert(OperationResult(status=\"bananas\")) == { \"status\": \"bananas\" }\n\n    # Other values can be set as well.\n    assert convert(OperationResult(foo=10)) == { \"status\": \"ok\", \"foo\": 10 }\n\n    # Even to None/null.\n    assert convert(OperationResult(foo=None)) == { \"status\": \"ok\", \"foo\": None }\n\n    # And test OperationResult.set().\n    result = OperationResult()\n    result.set(\"foo\", 10)\n    assert convert(result) == { \"status\": \"ok\", \"foo\": 10 }\n    result.set(\"foo\", [1, 2, 3])\n    assert convert(result) == { \"status\": \"ok\", \"foo\": [1, 2, 3] }\n    result.set(\"foo\", None)\n    assert convert(result) == { \"status\": \"ok\", \"foo\": None }\n\n    #\n    # OperationError\n    #\n\n    assert convert(OperationError(\"wrong!\")) == { \"status\": \"error\",\n                                                  \"error\": \"wrong!\" }\n\n    #\n    # OperationFailure\n    #\n\n    assert (convert(OperationFailure(\"the code\", \"the title\", \"the message\"))\n            == { \"status\": \"failure\", \"code\": \"the code\", \"title\": \"the title\",\n                 \"message\": \"the message\" })\n\n    # Check HTML escaping.\n    assert (convert(OperationFailure(\"<code>\", \"<title>\", \"<message>\"))\n            == { \"status\": \"failure\", \"code\": \"<code>\",\n                 \"title\": \"&lt;title&gt;\", \"message\": \"&lt;message&gt;\" })\n\n    # Check HTML escaping with is_html=True (title still escaped, but not the\n    # message.)\n    assert 
(convert(OperationFailure(\"<code>\", \"<title>\", \"<message>\", True))\n            == { \"status\": \"failure\", \"code\": \"<code>\",\n                 \"title\": \"&lt;title&gt;\", \"message\": \"<message>\" })\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/operation/blame.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport itertools\nimport diff\n\nfrom operation import Operation, OperationResult, OperationError, Optional\nfrom log.commitset import CommitSet\nfrom changeset.utils import createChangeset\nfrom changeset.load import loadChangesetsForCommits\n\nclass LineAnnotator:\n    class NotSupported: pass\n\n    def __init__(self, db, parent, child, file_ids=None, commits=None, changeset_cache=None):\n        self.parent = parent\n        self.child = child\n        self.commitset = CommitSet.fromRange(db, parent, child, commits=commits)\n        self.changesets = {}\n\n        if not self.commitset: raise LineAnnotator.NotSupported\n\n        commits = []\n\n        if not changeset_cache: changeset_cache = {}\n\n        for commit in self.commitset:\n            if len(commit.parents) > 1: raise LineAnnotator.NotSupported\n\n            if commit in changeset_cache:\n                self.changesets[commit.sha1] = changeset_cache[commit]\n            else:\n                commits.append(commit)\n\n        for changeset in loadChangesetsForCommits(db, parent.repository, commits, filtered_file_ids=file_ids):\n            self.changesets[changeset.child.sha1] = changeset_cache[changeset.child] = changeset\n\n        for commit in set(self.commitset) - 
set(self.changesets.keys()):\n            changesets = createChangeset(db, None, commit.repository, commit=commit, filtered_file_ids=file_ids, do_highlight=False)\n            assert len(changesets) == 1\n            self.changesets[commit.sha1] = changeset_cache[commit] = changesets[0]\n\n        self.commits = [parent]\n        self.commit_index = { parent.sha1: 0 }\n\n        for commit in self.commitset:\n            self.commit_index[commit.sha1] = len(self.commits)\n            self.commits.append(commit)\n\n    class Line:\n        def __init__(self, sha1, primary):\n            self.sha1 = sha1\n            self.primary = primary\n            self.untouched = True\n        def touch(self, sha1):\n            if self.untouched:\n                self.sha1 = sha1\n                self.untouched = False\n        def __repr__(self):\n            return self.sha1[:8]\n\n    def annotate(self, file_id, first, last, check_user=None):\n        offset = first\n        count = last - first + 1\n\n        initial_lines = [LineAnnotator.Line(sha1, True) for sha1 in itertools.repeat(self.parent.sha1, count)]\n        lines = initial_lines[:]\n        commit = self.commitset.getHeads().pop()\n\n        while True:\n            changeset = self.changesets[commit.sha1]\n            changeset_file = changeset.getFile(file_id)\n\n            if changeset_file:\n                changeset_file.loadOldLines()\n                changeset_file.loadNewLines()\n\n                offset_delta = 0\n                modifications = []\n\n                for chunk in changeset_file.chunks:\n                    if chunk.insertEnd() < offset:\n                        offset_delta -= chunk.delta()\n                        continue\n\n                    if chunk.insert_offset < offset + count:\n                        if not chunk.deleted_lines: chunk.deleted_lines = changeset_file.getOldLines(chunk)\n                        if not chunk.inserted_lines: chunk.inserted_lines = 
changeset_file.getNewLines(chunk)\n\n                        for line in chunk.getLines():\n                            if line.new_offset < offset:\n                                if line.type == line.DELETED:\n                                    offset_delta += 1\n                                elif line.type == line.INSERTED:\n                                    offset_delta -= 1\n                            elif line.new_offset < offset + count:\n                                if line.type == line.CONTEXT:\n                                    pass\n                                elif line.type == line.DELETED:\n                                    modifications.append((line.new_offset, -1))\n                                else:\n                                    if line.type == line.INSERTED:\n                                        modifications.append((line.new_offset, 1))\n                                    line = lines[line.new_offset - offset]\n                                    if check_user and line.primary and line.untouched and commit.author.email == check_user.email:\n                                        return True\n                                    line.touch(commit.sha1)\n                            else:\n                                break\n\n                modification_offset = offset\n\n                for line_offset, delta in modifications:\n                    if delta > 0:\n                        del lines[line_offset - modification_offset]\n                        count -= 1\n                        modification_offset += 1\n                    else:\n                        lines.insert(line_offset - modification_offset, LineAnnotator.Line(None, False))\n                        count += 1\n                        modification_offset -= 1\n\n                offset += offset_delta\n\n            parents = self.commitset.getParents(commit)\n\n            if len(parents) > 1: raise LineAnnotator.NotSupported\n\n            if 
parents: commit = parents.pop()\n            else: break\n\n        if check_user:\n            if self.parent.author.email == check_user.email:\n                return any(itertools.imap(lambda line: line.untouched, initial_lines))\n            else:\n                return False\n        else:\n            return [(first + index, self.commit_index[line.sha1]) for index, line in enumerate(initial_lines)]\n\nclass Blame(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_id\": int,\n                                   \"changeset_id\": int,\n                                   \"files\": [{ \"id\": int,\n                                               \"blocks\": [{ \"first\": int,\n                                                            \"last\": int }]\n                                               }]\n                                   })\n\n    def process(self, db, user, repository_id, changeset_id, files):\n        repository = gitutils.Repository.fromId(db, repository_id)\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT parent, child FROM changesets WHERE id=%s\", (changeset_id,))\n\n        parent_id, child_id = cursor.fetchone()\n        parent = gitutils.Commit.fromId(db, repository, parent_id)\n        child = gitutils.Commit.fromId(db, repository, child_id)\n\n        try:\n            annotator = LineAnnotator(db, parent, child)\n\n            for file in files:\n                for block in file[\"blocks\"]:\n                    lines = annotator.annotate(file[\"id\"], block[\"first\"], block[\"last\"])\n                    block[\"lines\"] = [{ \"offset\": offset, \"commit\": commit } for offset, commit in lines]\n\n            return OperationResult(commits=[{ \"sha1\": commit.sha1,\n                                              \"author_name\": commit.author.name,\n                                              \"author_email\": commit.author.email,\n                                              
\"summary\": commit.niceSummary(),\n                                              \"message\": commit.message,\n                                              \"original\": commit == parent,\n                                              \"current\": commit == child }\n                                            for commit in annotator.commits],\n                                   files=files)\n        except LineAnnotator.NotSupported:\n            blame = gitutils.Blame(parent, child)\n\n            paths = {}\n\n            for file in files:\n                file_id = file[\"id\"]\n\n                path = paths.get(file_id)\n\n                if not path:\n                    path = paths[file_id] = dbutils.describe_file(db, file_id)\n\n                for block in file[\"blocks\"]:\n                    block[\"lines\"] = blame.blame(db, path, block[\"first\"], block[\"last\"])\n\n            return OperationResult(commits=blame.commits, files=files)\n"
  },
  {
    "path": "src/operation/brancharchiving.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\n\nfrom operation import Operation, OperationResult, OperationFailure, Review\n\n# This operation has no UI entry point; it is used during testing.\nclass ArchiveBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review })\n\n    def process(self, db, user, review):\n        if review.branch.archived:\n            raise OperationFailure(\n                code=\"invalidstate\",\n                title=\"Branch already archived!\",\n                message=\"The review's branch has already been archived.\")\n\n        if review.state not in (\"closed\", \"dropped\"):\n            raise OperationFailure(\n                code=\"invalidstate\",\n                title=\"Invalid review state!\",\n                message=(\"The review must be closed or dropped to archive \"\n                         \"its branch.\"))\n\n        review.branch.archive(db)\n        review.cancelScheduledBranchArchival(db)\n\n        db.commit()\n\n        return OperationResult()\n\nclass ResurrectBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review })\n\n    def process(self, db, user, review):\n        if not review.branch.archived:\n            raise OperationFailure(\n                code=\"invalidstate\",\n         
       title=\"Branch not archived!\",\n                message=\"The review's branch has not been archived.\")\n\n        review.branch.resurrect(db)\n        delay = review.scheduleBranchArchival(db)\n\n        db.commit()\n\n        return OperationResult(delay=delay)\n\n# This operation has no UI entry point; it is used during testing.\nclass ScheduleBranchArchival(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review,\n                                   \"delay\": int })\n\n    def process(self, db, user, review, delay):\n        # This operation intentionally doesn't check that the review is closed\n        # or dropped, or that the branch isn't already archived.  Those checks\n        # are performed by dbutils.Review.scheduleBranchArchival(), which is a\n        # no-op if the conditions aren't met.\n\n        review.scheduleBranchArchival(db, delay=delay)\n\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/checkrebase.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import Operation, OperationResult, Optional\n\nimport dbutils\nimport gitutils\nimport reviewing.rebase\nimport changeset.utils as changeset_utils\nimport log.commitset\n\nclass CheckMergeStatus(Operation):\n    def __init__(self):\n        super(CheckMergeStatus, self).__init__({ \"review_id\": int,\n                                                 \"new_head_sha1\": str,\n                                                 \"new_upstream_sha1\": str })\n\n    def process(self, db, user, review_id, new_head_sha1, new_upstream_sha1):\n        review = dbutils.Review.fromId(db, review_id)\n        upstreams = log.commitset.CommitSet(review.branch.getCommits(db)).getFilteredTails(review.repository)\n\n        if len(upstreams) > 1:\n            return OperationResult(rebase_supported=False)\n\n        old_head = review.branch.getHead(db)\n        old_upstream = gitutils.Commit.fromSHA1(db, review.repository, upstreams.pop())\n        new_head = gitutils.Commit.fromSHA1(db, review.repository, new_head_sha1)\n        new_upstream = gitutils.Commit.fromSHA1(db, review.repository, new_upstream_sha1)\n\n        equivalent_merge = reviewing.rebase.createEquivalentMergeCommit(\n            db, review, user, old_head, old_upstream, new_head, new_upstream)\n\n        changesets = 
changeset_utils.createChangeset(\n            db, user, review.repository, equivalent_merge, do_highlight=False)\n\n        for changeset in changesets:\n            if changeset.files:\n                has_conflicts = True\n                break\n        else:\n            has_conflicts = False\n\n        return OperationResult(rebase_supported=True,\n                               has_conflicts=has_conflicts,\n                               merge_sha1=equivalent_merge.sha1)\n\nclass CheckConflictsStatus(Operation):\n    def __init__(self):\n        super(CheckConflictsStatus, self).__init__({ \"review_id\": int,\n                                                     \"merge_sha1\": Optional(str),\n                                                     \"new_head_sha1\": Optional(str),\n                                                     \"new_upstream_sha1\": Optional(str) })\n\n    def process(self, db, user, review_id, merge_sha1=None, new_head_sha1=None, new_upstream_sha1=None):\n        review = dbutils.Review.fromId(db, review_id)\n\n        if merge_sha1 is not None:\n            merge = gitutils.Commit.fromSHA1(db, review.repository, merge_sha1)\n\n            changesets = changeset_utils.createChangeset(\n                db, user, review.repository, merge, conflicts=True, do_highlight=False)\n\n            url = \"/showcommit?repository=%d&sha1=%s&conflicts=yes\" % (review.repository.id, merge.sha1)\n        else:\n            upstreams = review.getCommitSet(db).getFilteredTails(review.repository)\n\n            if len(upstreams) > 1:\n                return OperationResult(rebase_supported=False)\n\n            old_head = review.branch.getHead(db)\n            old_upstream = gitutils.Commit.fromSHA1(db, review.repository, upstreams.pop())\n            new_head = gitutils.Commit.fromSHA1(db, review.repository, new_head_sha1)\n            new_upstream = gitutils.Commit.fromSHA1(db, review.repository, new_upstream_sha1)\n\n            replay = 
reviewing.rebase.replayRebase(db, review, user, old_head, old_upstream, new_head, new_upstream)\n\n            changesets = changeset_utils.createChangeset(\n                db, user, review.repository, from_commit=replay, to_commit=new_head, conflicts=True, do_highlight=False)\n\n            url = \"/showcommit?repository=%d&from=%s&to=%s&conflicts=yes\" % (review.repository.id, replay.sha1, new_head.sha1)\n\n        has_changes = False\n        has_conflicts = False\n\n        for changed_file in changesets[0].files:\n            changed_file.loadOldLines()\n\n            file_has_conflicts = False\n\n            for chunk in changed_file.chunks:\n                lines = changed_file.getOldLines(chunk)\n                for line in lines:\n                    if line.startswith(\"<<<<<<<\"):\n                        has_conflicts = file_has_conflicts = True\n                        break\n                if file_has_conflicts:\n                    break\n\n            if not file_has_conflicts:\n                has_changes = True\n\n        return OperationResult(has_conflicts=has_conflicts, has_changes=has_changes, url=url)\n\nclass CheckHistoryRewriteStatus(Operation):\n    def __init__(self):\n        super(CheckHistoryRewriteStatus, self).__init__({ \"review_id\": int,\n                                                          \"new_head_sha1\": str })\n\n    def process(self, db, user, review_id, new_head_sha1):\n        review = dbutils.Review.fromId(db, review_id)\n\n        old_head = review.branch.getHead(db)\n        new_head = gitutils.Commit.fromSHA1(db, review.repository, new_head_sha1)\n\n        mergebase = review.repository.mergebase([old_head, new_head])\n        sha1s = review.repository.revlist([new_head], [mergebase])\n\n        valid = True\n\n        for sha1 in sha1s:\n            commit = gitutils.Commit.fromSHA1(db, review.repository, sha1)\n            if commit.tree == old_head.tree:\n                break\n        else:\n            
valid = False\n\n        return OperationResult(valid=valid)\n"
  },
  {
    "path": "src/operation/createcomment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\n\nfrom operation import (Operation, OperationResult, OperationFailure, Optional,\n                       NonNegativeInteger, PositiveInteger, Review, Commit, File)\n\nfrom reviewing.comment import CommentChain, validateCommentChain, createCommentChain, createComment\n\nclass ValidateCommentChain(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review,\n                                   \"origin\": set([\"old\", \"new\"]),\n                                   \"parent\": Optional(Commit),\n                                   \"child\": Commit,\n                                   \"file\": File,\n                                   \"offset\": PositiveInteger,\n                                   \"count\": PositiveInteger })\n\n    def process(self, db, user, review, origin, child, file, offset, count, parent=None):\n        verdict, data = validateCommentChain(db, review, origin, parent, child, file, offset, count)\n        return OperationResult(verdict=verdict, **data)\n\ndef checkComment(text):\n    if not text.strip():\n        raise OperationFailure(code=\"emptycomment\",\n                               title=\"Empty comment!\",\n                               message=\"Creating empty (or white-space only) comments is not allowed.\")\n\nclass 
CreateCommentChain(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review,\n                                   \"chain_type\": set([\"issue\", \"note\"]),\n                                   \"commit_context\": Optional({ \"commit\": Commit,\n                                                                \"offset\": NonNegativeInteger,\n                                                                \"count\": PositiveInteger }),\n                                   \"file_context\": Optional({ \"origin\": set([\"old\", \"new\"]),\n                                                              \"parent\": Optional(Commit),\n                                                              \"child\": Commit,\n                                                              \"file\": File,\n                                                              \"offset\": PositiveInteger,\n                                                              \"count\": PositiveInteger }),\n                                   \"text\": str })\n\n    def process(self, db, user, review, chain_type, text, commit_context=None, file_context=None):\n        checkComment(text)\n\n        if commit_context:\n            chain_id = createCommentChain(db, user, review, chain_type, **commit_context)\n        elif file_context:\n            chain_id = createCommentChain(db, user, review, chain_type, **file_context)\n        else:\n            chain_id = createCommentChain(db, user, review, chain_type)\n\n        comment_id = createComment(db, user, chain_id, text, first=True)\n\n        db.commit()\n\n        return OperationResult(chain_id=chain_id, comment_id=comment_id, draft_status=review.getDraftStatus(db, user))\n\nclass CreateComment(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"chain_id\": int,\n                                   \"text\": str })\n\n    def process(self, db, user, chain_id, text):\n        checkComment(text)\n\n     
   chain = CommentChain.fromId(db, chain_id, user)\n        comment_id = createComment(db, user, chain_id, text)\n\n        db.commit()\n\n        return OperationResult(comment_id=comment_id, draft_status=chain.review.getDraftStatus(db, user))\n"
  },
  {
    "path": "src/operation/createreview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport os\nimport signal\n\nimport dbutils\nimport gitutils\nimport htmlutils\nimport configuration\n\nfrom operation import (Operation, OperationResult, OperationError, Optional,\n                       OperationFailure, Repository)\nfrom reviewing.utils import (parseReviewFilters, parseRecipientFilters,\n                             createReview, getReviewersAndWatchers)\nfrom page.createreview import generateReviewersAndWatchersTable\nfrom log.commitset import CommitSet\n\nif configuration.extensions.ENABLED:\n    import extensions.role.processcommits\n\nfrom cStringIO import StringIO\n\nclass ReviewersAndWatchers(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_id\": int,\n                                   \"commit_ids\": [int],\n                                   \"reviewfilters\": [{ \"username\": str,\n                                                       \"type\": set([\"reviewer\", \"watcher\"]),\n                                                       \"path\": str }],\n                                   \"applyfilters\": bool,\n                                   \"applyparentfilters\": bool })\n\n    def process(req, db, user, repository_id, commit_ids, reviewfilters, applyfilters, applyparentfilters):\n        reviewfilters = 
parseReviewFilters(db, reviewfilters)\n\n        repository = gitutils.Repository.fromId(db, repository_id)\n        commits = [gitutils.Commit.fromId(db, repository, commit_id) for commit_id in commit_ids]\n\n        all_reviewers, all_watchers = getReviewersAndWatchers(db, repository, commits,\n                                                              reviewfilters=reviewfilters,\n                                                              applyfilters=applyfilters,\n                                                              applyparentfilters=applyparentfilters)\n        document = htmlutils.Document(req)\n\n        generateReviewersAndWatchersTable(db, repository, document,\n                                          all_reviewers, all_watchers,\n                                          applyfilters=applyfilters,\n                                          applyparentfilters=applyparentfilters)\n\n        return OperationResult(html=document.render(plain=True))\n\nclass SubmitReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository\": Repository,\n                                   \"branch\": str,\n                                   \"summary\": str,\n                                   \"commit_ids\": Optional([int]),\n                                   \"commit_sha1s\": Optional([str]),\n                                   \"applyfilters\": Optional(bool),\n                                   \"applyparentfilters\": Optional(bool),\n                                   \"reviewfilters\": Optional([{ \"username\": str,\n                                                                \"type\": set([\"reviewer\", \"watcher\"]),\n                                                                \"path\": str }]),\n                                   \"recipientfilters\": Optional({ \"mode\": set([\"opt-in\", \"opt-out\"]),\n                                                                  \"included\": Optional([str]),\n       
                                                           \"excluded\": Optional([str]) }),\n                                   \"description\": Optional(str),\n                                   \"frombranch\": Optional(str),\n                                   \"trackedbranch\": Optional({ \"remote\": str,\n                                                               \"name\": str }) })\n\n    def process(self, db, user, repository, branch, summary, commit_ids=None,\n                commit_sha1s=None, applyfilters=True, applyparentfilters=True,\n                reviewfilters=None, recipientfilters=None, description=None,\n                frombranch=None, trackedbranch=None):\n        # Raises auth.AccessDenied if access should not be allowed.\n        repository.checkAccess(db, \"modify\")\n\n        if not branch.startswith(\"r/\"):\n            raise OperationFailure(code=\"invalidbranch\",\n                                   title=\"Invalid review branch name\",\n                                   message=\"'%s' is not a valid review branch name; it must have an \\\"r/\\\" prefix.\" % branch)\n\n        if reviewfilters is None:\n            reviewfilters = []\n        if recipientfilters is None:\n            recipientfilters = {}\n\n        components = branch.split(\"/\")\n        for index in range(1, len(components)):\n            try:\n                repository.revparse(\"refs/heads/%s\" % \"/\".join(components[:index]))\n            except gitutils.GitReferenceError:\n                continue\n\n            message = (\"Cannot create branch with name<pre>%s</pre>since there is already a branch named<pre>%s</pre>in the repository.\" %\n                       (htmlutils.htmlify(branch), htmlutils.htmlify(\"/\".join(components[:index]))))\n            raise OperationFailure(code=\"invalidbranch\",\n                                   title=\"Invalid review branch name\",\n                                   message=message,\n                               
    is_html=True)\n\n        if commit_sha1s is not None:\n            commits = [gitutils.Commit.fromSHA1(db, repository, commit_sha1) for commit_sha1 in commit_sha1s]\n        elif commit_ids is not None:\n            commits = [gitutils.Commit.fromId(db, repository, commit_id) for commit_id in commit_ids]\n        else:\n            commits = []\n\n        commitset = CommitSet(commits)\n\n        reviewfilters = parseReviewFilters(db, reviewfilters)\n        recipientfilters = parseRecipientFilters(db, recipientfilters)\n\n        review = createReview(db, user, repository, commits, branch, summary, description,\n                              from_branch_name=frombranch,\n                              reviewfilters=reviewfilters,\n                              recipientfilters=recipientfilters,\n                              applyfilters=applyfilters,\n                              applyparentfilters=applyparentfilters)\n\n        extensions_output = StringIO()\n        kwargs = {}\n\n        if configuration.extensions.ENABLED:\n            if extensions.role.processcommits.execute(db, user, review, commits, None, commitset.getHeads().pop(), extensions_output):\n                kwargs[\"extensions_output\"] = extensions_output.getvalue().lstrip()\n\n        if trackedbranch:\n            cursor = db.cursor()\n\n            cursor.execute(\"\"\"SELECT 1\n                                FROM knownremotes\n                               WHERE url=%s\n                                 AND pushing\"\"\",\n                           (trackedbranch[\"remote\"],))\n\n            if cursor.fetchone():\n                delay = \"1 week\"\n            else:\n                delay = \"1 hour\"\n\n            cursor.execute(\"\"\"INSERT INTO trackedbranches (repository, local_name, remote, remote_name, forced, delay)\n                                   VALUES (%s, %s, %s, %s, false, INTERVAL %s)\n                                RETURNING id\"\"\",\n                          
 (repository.id, branch, trackedbranch[\"remote\"], trackedbranch[\"name\"], delay))\n\n            trackedbranch_id = cursor.fetchone()[0]\n            kwargs[\"trackedbranch_id\"] = trackedbranch_id\n\n            cursor.execute(\"\"\"INSERT INTO trackedbranchusers (branch, uid)\n                                   VALUES (%s, %s)\"\"\",\n                           (trackedbranch_id, user.id))\n\n            db.commit()\n\n            pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n            os.kill(pid, signal.SIGHUP)\n\n        return OperationResult(review_id=review.id, **kwargs)\n\nclass FetchRemoteBranches(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"remote\": str,\n                                   \"pattern\": Optional(str) },\n                           accept_anonymous_user=True)\n\n    def process(self, db, user, remote, pattern=None):\n        if pattern: regexp = re.compile(pattern.replace(\"*\", \".*\"))\n        else: regexp = None\n\n        try:\n            refs = gitutils.Repository.lsremote(remote, regexp=regexp)\n        except gitutils.GitCommandError as error:\n            if error.output.splitlines()[0].endswith(\"does not appear to be a git repository\"):\n                raise OperationFailure(\n                    code=\"invalidremote\",\n                    title=\"Invalid remote!\",\n                    message=(\"<code>%s</code> does not appear to be a valid Git repository.\"\n                             % htmlutils.htmlify(remote)),\n                    is_html=True)\n            else:\n                raise\n        else:\n            branches = dict([(ref[1], ref[0]) for ref in refs])\n            return OperationResult(branches=branches)\n\nclass FetchRemoteBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository\": Repository,\n                                   \"remote\": str,\n                                   
\"branch\": str,\n                                   \"upstream\": Optional(str) },\n                           accept_anonymous_user=True)\n\n    def process(self, db, user, repository, remote, branch, upstream=\"refs/heads/master\"):\n        # Raises auth.AccessDenied if access should not be allowed.\n        repository.checkAccess(db, \"modify\")\n\n        cursor = db.cursor()\n\n        # Check if only other repositories are currently tracking branches from\n        # this remote.  If that's the case, then the user most likely either\n        # selected the wrong repository or entered the wrong remote.\n        cursor.execute(\"\"\"SELECT repositories.name\n                            FROM repositories\n                            JOIN trackedbranches ON (trackedbranches.repository=repositories.id)\n                           WHERE trackedbranches.remote=%s\"\"\",\n                       (remote,))\n        repository_names = set(repository_name for repository_name, in cursor)\n        if repository_names and repository.name not in repository_names:\n            raise OperationFailure(\n                code=\"badremote\",\n                title=\"Bad remote!\",\n                message=(\"The remote <code>%s</code> appears to be related to \"\n                         \"%s on this server (<code>%s</code>).  
\"\n                         \"You most likely shouldn't be importing branches from \"\n                         \"it into the selected repository (<code>%s</code>).\"\n                         % (htmlutils.htmlify(remote),\n                            (\"another repository\"\n                             if len(repository_names) == 1 else\n                             \"other repositories\"),\n                            htmlutils.htmlify(\", \".join(sorted(repository_names))),\n                            htmlutils.htmlify(repository.name))),\n                is_html=True)\n\n        if not branch.startswith(\"refs/\"):\n            branch = \"refs/heads/%s\" % branch\n\n        try:\n            with repository.fetchTemporaryFromRemote(db, remote, branch) as sha1:\n                head_sha1 = repository.keepalive(sha1)\n        except gitutils.GitReferenceError as error:\n            if error.repository:\n                raise OperationFailure(\n                    code=\"refnotfound\",\n                    title=\"Remote ref not found!\",\n                    message=(\"Could not find the ref <code>%s</code> in the repository <code>%s</code>.\"\n                             % (htmlutils.htmlify(error.ref), htmlutils.htmlify(error.repository))),\n                    is_html=True)\n            else:\n                raise OperationFailure(\n                    code=\"invalidref\",\n                    title=\"Invalid ref!\",\n                    message=(\"The specified ref is invalid: <code>%s</code>.\"\n                             % htmlutils.htmlify(error.ref)),\n                    is_html=True)\n        except gitutils.GitCommandError as error:\n            if error.output.splitlines()[0].endswith(\"does not appear to be a git repository\"):\n                raise OperationFailure(\n                    code=\"invalidremote\",\n                    title=\"Invalid remote!\",\n                    message=(\"<code>%s</code> does not appear to be a valid Git 
repository.\"\n                             % htmlutils.htmlify(remote)),\n                    is_html=True)\n            else:\n                raise\n\n        if upstream.startswith(\"refs/\"):\n            try:\n                with repository.fetchTemporaryFromRemote(db, remote, upstream) as sha1:\n                    upstream_sha1 = repository.keepalive(sha1)\n            except gitutils.GitReferenceError:\n                raise OperationFailure(\n                    code=\"refnotfound\",\n                    title=\"Remote ref not found!\",\n                    message=(\"Could not find the ref <code>%s</code> in the repository <code>%s</code>.\"\n                             % (htmlutils.htmlify(upstream), htmlutils.htmlify(remote))),\n                    is_html=True)\n        else:\n            try:\n                upstream_sha1 = repository.revparse(upstream)\n            except gitutils.GitReferenceError:\n                raise OperationFailure(\n                    code=\"refnotfound\",\n                    title=\"Local ref not found!\",\n                    message=(\"Could not find the ref <code>%s</code> in the repository <code>%s</code>.\"\n                             % (htmlutils.htmlify(upstream), htmlutils.htmlify(str(repository)))),\n                    is_html=True)\n\n        try:\n            resolved_upstream_sha1 = gitutils.getTaggedCommit(repository, upstream_sha1)\n        except gitutils.GitReferenceError:\n            resolved_upstream_sha1 = None\n\n        if not resolved_upstream_sha1:\n            raise OperationFailure(\n                code=\"missingcommit\",\n                title=\"Upstream commit is missing!\",\n                message=(\"<p>Could not find the commit <code>%s</code> in the \"\n                         \"repository <code>%s</code>.</p>\"\n                         \"<p>Since it would have been fetched along with the \"\n                         \"branch if it actually was a valid upstream commit, \"\n         
                \"this means it's not valid.</p>\"\n                         % (htmlutils.htmlify(upstream_sha1), htmlutils.htmlify(str(repository)))),\n                is_html=True)\n\n        commit_sha1s = repository.revlist(included=[head_sha1], excluded=[resolved_upstream_sha1])\n\n        if not commit_sha1s:\n            raise OperationFailure(\n                code=\"emptybranch\",\n                title=\"Branch contains no commits!\",\n                message=(\"All commits referenced by <code>%s</code> are reachable from <code>%s</code>.\"\n                         % (htmlutils.htmlify(branch), htmlutils.htmlify(upstream))),\n                is_html=True)\n\n        cursor.execute(\"SELECT id FROM commits WHERE sha1=ANY (%s)\", (commit_sha1s,))\n\n        return OperationResult(commit_ids=[commit_id for (commit_id,) in cursor],\n                               head_sha1=head_sha1, upstream_sha1=resolved_upstream_sha1)\n"
  },
  {
    "path": "src/operation/draftchanges.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport profiling\n\nfrom operation import Operation, OperationResult, Optional\nfrom reviewing.comment import CommentChain, createCommentChain, createComment\nfrom reviewing.mail import sendPendingMails\nfrom reviewing.utils import generateMailsForBatch\n\nclass ReviewStateChange(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        cursor = db.cursor()\n\n        def unaccepted():\n            # Raised issues.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commentchains\n                               WHERE commentchains.review=%s\n                                 AND commentchains.uid=%s\n                                 AND commentchains.type='issue'\n                                 AND commentchains.state='draft'\"\"\",\n                           (review_id, user.id))\n            if cursor.fetchone(): return True\n\n            # Reopened issues.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commentchainchanges\n                                JOIN commentchains ON (commentchains.id=commentchainchanges.chain)\n                               WHERE commentchains.review=%s\n                                 AND 
commentchains.type='issue'\n                                 AND commentchainchanges.uid=%s\n                                 AND commentchainchanges.state='draft'\n                                 AND commentchainchanges.from_state=commentchains.state\n                                 AND commentchainchanges.to_state='open'\n                                 AND commentchainchanges.to_type IS NULL\"\"\",\n                           (review_id, user.id))\n            if cursor.fetchone(): return True\n\n            # Notes converted into issues.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commentchainchanges\n                                JOIN commentchains ON (commentchains.id=commentchainchanges.chain)\n                               WHERE commentchains.review=%s\n                                 AND commentchains.type='note'\n                                 AND commentchainchanges.uid=%s\n                                 AND commentchainchanges.state='draft'\n                                 AND commentchainchanges.from_type=commentchains.type\n                                 AND commentchainchanges.to_type='issue'\"\"\",\n                           (review_id, user.id))\n            if cursor.fetchone(): return True\n\n            # Unreviewed lines.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviewfilechanges\n                                JOIN reviewfiles ON (reviewfiles.id=reviewfilechanges.file)\n                               WHERE reviewfiles.review=%s\n                                 AND reviewfilechanges.uid=%s\n                                 AND reviewfilechanges.state='draft'\n                                 AND reviewfilechanges.from_state=reviewfiles.state\n                                 AND reviewfilechanges.to_state='pending'\"\"\",\n                           (review_id, user.id))\n            if cursor.fetchone(): return True\n\n            # Otherwise still 
accepted (if accepted before.)\n            return False\n\n        def stillOpen():\n            if unaccepted(): return True\n\n            # Still open issues.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM commentchains\n                     LEFT OUTER JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id\n                                                         AND commentchainchanges.uid=%s\n                                                         AND commentchainchanges.state='draft'\n                                                         AND (commentchainchanges.to_state IN ('closed', 'addressed')\n                                                           OR commentchainchanges.to_type='note'))\n                               WHERE commentchains.review=%s\n                                 AND commentchains.type='issue'\n                                 AND commentchains.state='open'\n                                 AND commentchainchanges.chain IS NULL\"\"\",\n                           (user.id, review_id))\n            if cursor.fetchone(): return True\n\n            # Still pending lines.\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviewfiles\n                     LEFT OUTER JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id\n                                                       AND reviewfilechanges.uid=%s\n                                                       AND reviewfilechanges.state='draft'\n                                                       AND reviewfilechanges.to_state='reviewed')\n                               WHERE reviewfiles.review=%s\n                                 AND reviewfiles.state='pending'\n                                 AND reviewfilechanges.file IS NULL\"\"\",\n                           (user.id, review_id))\n            if cursor.fetchone(): return True\n\n            # Otherwise accepted now.\n            return 
False\n\n        if dbutils.Review.isAccepted(db, review_id):\n            return OperationResult(current_state=\"accepted\", new_state=\"open\" if unaccepted() else \"accepted\")\n        else:\n            return OperationResult(current_state=\"open\", new_state=\"open\" if stillOpen() else \"accepted\")\n\nclass SubmitChanges(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"remark\": Optional(str) })\n\n    def process(self, db, user, review_id, remark=None):\n        cursor = db.cursor()\n        profiler = profiling.Profiler()\n\n        profiler.check(\"start\")\n\n        review = dbutils.Review.fromId(db, review_id)\n\n        profiler.check(\"create review\")\n\n        was_accepted = review.state == \"open\" and review.accepted(db)\n\n        profiler.check(\"accepted before\")\n\n        if remark and remark.strip():\n            chain_id = createCommentChain(db, user, review, 'note')\n            createComment(db, user, chain_id, remark, first=True)\n        else:\n            chain_id = None\n\n        # Create a batch that groups all submitted changes together.\n        cursor.execute(\"INSERT INTO batches (review, uid, comment) VALUES (%s, %s, %s) RETURNING id\", (review.id, user.id, chain_id))\n        batch_id = cursor.fetchone()[0]\n\n        profiler.check(\"batches++\")\n\n        # Reject all draft file approvals where the affected review file isn't in\n        # the state it was in when the change was drafted.\n        cursor.execute(\"\"\"UPDATE reviewfilechanges\n                             SET state='rejected',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND file IN (SELECT id\n                                            FROM reviewfiles\n                                            JOIN reviewfilechanges ON 
(reviewfilechanges.file=reviewfiles.id)\n                                           WHERE reviewfiles.review=%s\n                                             AND reviewfilechanges.uid=%s\n                                             AND reviewfilechanges.state='draft'\n                                             AND reviewfilechanges.from_state!=reviewfiles.state)\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"reviewfilechanges reject state changes\")\n\n        # Then perform the remaining draft file approvals by updating the state of\n        # the corresponding review file.\n        cursor.execute(\"\"\"UPDATE reviewfiles\n                             SET state='reviewed',\n                                 reviewer=%s,\n                                 time=now()\n                           WHERE review=%s\n                             AND id IN (SELECT file\n                                          FROM reviewfilechanges\n                                         WHERE uid=%s\n                                           AND state='draft'\n                                           AND to_state='reviewed')\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"reviewfiles pending=>reviewed\")\n\n        # Then perform the remaining draft file disapprovals by updating the state\n        # of the corresponding review file.\n        cursor.execute(\"\"\"UPDATE reviewfiles\n                             SET state='pending',\n                                 reviewer=NULL,\n                                 time=now()\n                           WHERE review=%s\n                             AND id IN (SELECT file\n                                          FROM reviewfilechanges\n                                         WHERE uid=%s\n                                           AND state='draft'\n                                           AND to_state='pending')\"\"\",\n                       
(review.id, user.id))\n\n        profiler.check(\"reviewfiles reviewed=>pending\")\n\n        # Finally change the state of just performed approvals from draft to\n        # 'performed'.\n        cursor.execute(\"\"\"UPDATE reviewfilechanges\n                             SET batch=%s,\n                                 state='performed',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND file IN (SELECT id\n                                            FROM reviewfiles\n                                           WHERE reviewfiles.review=%s)\"\"\",\n                       (batch_id, user.id, review.id))\n\n        profiler.check(\"reviewfilechanges draft=>performed\")\n\n        # Find all chains with draft comments being submitted that the current user\n        # isn't associated with via the commentchainusers table, and associate the\n        # user with them.\n        cursor.execute(\"\"\"SELECT DISTINCT commentchains.id, commentchainusers.uid IS NULL\n                            FROM commentchains\n                            JOIN comments ON (comments.chain=commentchains.id)\n                 LEFT OUTER JOIN commentchainusers ON (commentchainusers.chain=commentchains.id\n                                                   AND commentchainusers.uid=comments.uid)\n                           WHERE commentchains.review=%s\n                             AND comments.uid=%s\n                             AND comments.state='draft'\"\"\",\n                       (review.id, user.id))\n\n        for chain_id, need_associate in cursor.fetchall():\n            if need_associate:\n                cursor.execute(\"INSERT INTO commentchainusers (chain, uid) VALUES (%s, %s)\", (chain_id, user.id))\n\n        profiler.check(\"commentchainusers++\")\n\n        # Find all chains with draft comments being submitted and add a record for\n        # every user associated 
with the chain to read the comment.\n        cursor.execute(\"\"\"INSERT\n                            INTO commentstoread (uid, comment)\n                          SELECT commentchainusers.uid, comments.id\n                            FROM commentchains, commentchainusers, comments\n                           WHERE commentchains.review=%s\n                             AND commentchainusers.chain=commentchains.id\n                             AND commentchainusers.uid!=comments.uid\n                             AND comments.chain=commentchains.id\n                             AND comments.uid=%s\n                             AND comments.state='draft'\"\"\",\n                       (review.id, user.id))\n\n        profiler.check(\"commentstoread++\")\n\n        # Associate all users associated with a draft comment chain to\n        # the review (if they weren't already.)\n        cursor.execute(\"\"\"SELECT DISTINCT commentchainusers.uid\n                            FROM commentchains\n                            JOIN commentchainusers ON (commentchainusers.chain=commentchains.id)\n                 LEFT OUTER JOIN reviewusers ON (reviewusers.review=commentchains.review AND reviewusers.uid=commentchainusers.uid)\n                           WHERE commentchains.review=%s\n                             AND commentchains.uid=%s\n                             AND commentchains.state='draft'\n                             AND reviewusers.uid IS NULL\"\"\",\n                       (review.id, user.id))\n\n        for (user_id,) in cursor.fetchall():\n            cursor.execute(\"INSERT INTO reviewusers (review, uid) VALUES (%s, %s)\", (review.id, user_id))\n\n        # Change state on all draft commentchains by the user in the review to 'open'.\n        cursor.execute(\"\"\"UPDATE commentchains\n                             SET batch=%s,\n                                 state='open',\n                                 time=now()\n                           WHERE 
commentchains.review=%s\n                             AND commentchains.uid=%s\n                             AND commentchains.state='draft'\"\"\",\n                       (batch_id, review.id, user.id))\n\n        profiler.check(\"commentchains draft=>open\")\n\n        # Reject all draft comment chain changes where the affected comment\n        # chain isn't in the state it was in when the change was drafted, or has\n        # been morphed into a note since the change was drafted.\n        cursor.execute(\"\"\"UPDATE commentchainchanges\n                             SET state='rejected',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND from_state IS NOT NULL\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                             JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id\n                                                                      AND (commentchainchanges.from_state!=commentchains.state\n                                                                        OR commentchainchanges.from_last_commit!=commentchains.last_commit\n                                                                        OR commentchains.type!='issue'))\n                                            WHERE commentchains.review=%s\n                                              AND commentchainchanges.uid=%s\n                                              AND commentchainchanges.state='draft')\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"commentchainchanges reject state changes\")\n\n        # Reject all draft comment chain changes where the affected comment chain\n        # type isn't what it was when the change was drafted.\n        cursor.execute(\"\"\"UPDATE commentchainchanges\n                   
          SET state='rejected',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND from_type IS NOT NULL\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                             JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id\n                                                                      AND commentchainchanges.from_type!=commentchains.type)\n                                            WHERE commentchains.review=%s\n                                              AND commentchainchanges.uid=%s\n                                              AND commentchainchanges.state='draft')\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"commentchainchanges reject type changes\")\n\n        # Reject all draft comment chain changes where the affected comment chain\n        # addressed_by isn't what it was when the change was drafted.\n        cursor.execute(\"\"\"UPDATE commentchainchanges\n                             SET state='rejected',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND from_addressed_by IS NOT NULL\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                             JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id\n                                                                      AND commentchainchanges.from_addressed_by!=commentchains.addressed_by)\n                                            WHERE commentchains.review=%s\n                                              AND commentchainchanges.uid=%s\n                                            
  AND commentchainchanges.state='draft')\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"commentchainchanges reject addressed_by changes\")\n\n        # Then perform the remaining draft comment chain changes by updating the\n        # state of the corresponding comment chain.\n\n        # Perform open->closed changes, including setting 'closed_by'.\n        cursor.execute(\"\"\"UPDATE commentchains\n                             SET state='closed',\n                                 closed_by=%s\n                           WHERE review=%s\n                             AND id IN (SELECT chain\n                                          FROM commentchainchanges\n                                         WHERE uid=%s\n                                           AND state='draft'\n                                           AND to_state='closed')\"\"\",\n                       (user.id, review.id, user.id))\n\n        profiler.check(\"commentchains closed\")\n\n        # Perform (closed|addressed)->open changes, including resetting 'closed_by' and\n        # 'addressed_by' to NULL.\n        cursor.execute(\"\"\"SELECT commentchainchanges.to_last_commit, commentchains.id\n                            FROM commentchains\n                            JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                           WHERE commentchains.review=%s\n                             AND commentchainchanges.uid=%s\n                             AND commentchainchanges.state='draft'\n                             AND commentchainchanges.to_state='open'\"\"\",\n                       (review.id, user.id))\n        cursor.executemany(\"\"\"UPDATE commentchains\n                                 SET state='open',\n                                     last_commit=%s,\n                                     closed_by=NULL,\n                                     addressed_by=NULL\n                               WHERE 
id=%s\"\"\",\n                           cursor.fetchall())\n\n        profiler.check(\"commentchains reopen\")\n\n        # Perform addressed->addressed changes, i.e. updating 'addressed_by'.\n        cursor.execute(\"\"\"SELECT commentchainchanges.to_addressed_by, commentchains.id\n                            FROM commentchains\n                            JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                           WHERE commentchains.review=%s\n                             AND commentchainchanges.uid=%s\n                             AND commentchainchanges.state='draft'\n                             AND commentchainchanges.to_addressed_by IS NOT NULL\"\"\",\n                       (review.id, user.id))\n        cursor.executemany(\"\"\"UPDATE commentchains\n                                 SET addressed_by=%s\n                               WHERE id=%s\"\"\",\n                           cursor.fetchall())\n\n        profiler.check(\"commentchains reopen (partial)\")\n\n        # Perform type changes.\n        cursor.execute(\"\"\"SELECT commentchainchanges.to_type, commentchains.id\n                            FROM commentchains\n                            JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                           WHERE commentchains.review=%s\n                             AND commentchainchanges.uid=%s\n                             AND commentchainchanges.state='draft'\n                             AND commentchainchanges.to_type IS NOT NULL\"\"\",\n                       (review.id, user.id))\n        cursor.executemany(\"\"\"UPDATE commentchains\n                                 SET type=%s\n                               WHERE id=%s\"\"\",\n                           cursor.fetchall())\n\n        profiler.check(\"commentchains type change\")\n\n        # Finally change the state of just performed changes from draft to\n        # 'performed'.\n        
cursor.execute(\"\"\"UPDATE commentchainchanges\n                             SET batch=%s,\n                                 state='performed',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                            WHERE review=%s)\"\"\",\n                       (batch_id, user.id, review.id))\n\n        profiler.check(\"commentchainchanges draft=>performed\")\n\n        # Change state on all draft commentchainlines by the user in the review to 'current'.\n        cursor.execute(\"\"\"UPDATE commentchainlines\n                             SET state='current',\n                                 time=now()\n                           WHERE uid=%s\n                             AND state='draft'\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                            WHERE review=%s)\"\"\",\n                       (user.id, review.id))\n\n        profiler.check(\"commentchainlines draft=>current\")\n\n        # Change state on all draft comments by the user in the review to 'current'.\n        cursor.execute(\"\"\"UPDATE comments\n                             SET batch=%s,\n                                 state='current',\n                                 time=now()\n                           WHERE comments.uid=%s\n                             AND comments.state='draft'\n                             AND chain IN (SELECT id\n                                             FROM commentchains\n                                            WHERE review=%s)\"\"\",\n                       (batch_id, user.id, review.id))\n\n        profiler.check(\"comments draft=>current\")\n\n        # Associate the submitting user with the review if he isn't already.\n        
cursor.execute(\"SELECT 1 FROM reviewusers WHERE review=%s AND uid=%s\", (review.id, user.id))\n        if not cursor.fetchone():\n            cursor.execute(\"INSERT INTO reviewusers (review, uid) VALUES (%s, %s)\", (review.id, user.id))\n\n        generate_emails = profiler.start(\"generate emails\")\n\n        is_accepted = review.state == \"open\" and review.accepted(db)\n        pending_mails = generateMailsForBatch(db, batch_id, was_accepted, is_accepted, profiler=profiler)\n\n        generate_emails.stop()\n\n        review.incrementSerial(db)\n        db.commit()\n\n        profiler.check(\"commit transaction\")\n\n        sendPendingMails(pending_mails)\n\n        profiler.check(\"finished\")\n\n        if user.getPreference(db, \"debug.profiling.submitChanges\"):\n            return OperationResult(batch_id=batch_id,\n                                   serial=review.serial,\n                                   profiling=profiler.output())\n        else:\n            return OperationResult(batch_id=batch_id,\n                                   serial=review.serial)\n\nclass AbortChanges(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"what\": { \"approval\": bool,\n                                             \"comments\": bool,\n                                             \"metacomments\": bool }})\n\n    def process(self, db, user, review_id, what):\n        cursor = db.cursor()\n        profiler = profiling.Profiler()\n\n        if what[\"approval\"]:\n            # Delete all pending review file approvals.\n            cursor.execute(\"\"\"DELETE FROM reviewfilechanges\n                                    WHERE uid=%s\n                                      AND state='draft'\n                                      AND file IN (SELECT id\n                                                     FROM reviewfiles\n                                                    WHERE 
review=%s)\"\"\",\n                           (user.id, review_id))\n            profiler.check(\"approval\")\n\n        if what[\"comments\"]:\n            # Delete all pending comment chains.  This will, via ON DELETE CASCADE,\n            # also delete all related comments and commentchainlines rows.\n            cursor.execute(\"\"\"DELETE FROM commentchains\n                                    WHERE review=%s\n                                      AND uid=%s\n                                      AND state='draft'\"\"\",\n                           (review_id, user.id))\n            profiler.check(\"chains\")\n\n            # Delete all still existing draft comments.\n            cursor.execute(\"\"\"DELETE FROM comments\n                                    WHERE uid=%s\n                                      AND state='draft'\n                                      AND chain IN (SELECT id\n                                                      FROM commentchains\n                                                     WHERE review=%s)\"\"\",\n                           (user.id, review_id))\n            profiler.check(\"replies\")\n\n        if what[\"metacomments\"]:\n            # Delete all still existing draft comment lines.\n            cursor.execute(\"\"\"DELETE FROM commentchainlines\n                                    WHERE uid=%s\n                                      AND state='draft'\n                                      AND chain IN (SELECT id\n                                                      FROM commentchains\n                                                     WHERE review=%s\n                                                       AND state!='draft')\"\"\",\n                           (user.id, review_id))\n            profiler.check(\"comment lines\")\n\n            # Delete all still existing draft comment state changes.\n            cursor.execute(\"\"\"DELETE FROM commentchainchanges\n                                    WHERE uid=%s\n   
                                   AND state='draft'\n                                      AND chain IN (SELECT id\n                                                      FROM commentchains\n                                                     WHERE review=%s)\"\"\",\n                           (user.id, review_id))\n            profiler.check(\"comment state\")\n\n        db.commit()\n\n        if user.getPreference(db, \"debug.profiling.abortChanges\"):\n            return OperationResult(profiling=profiler.output())\n        else:\n            return OperationResult()\n"
  },
  {
    "path": "src/operation/editresource.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\n\nfrom operation import Operation, OperationResult, OperationError, Optional\n\nclass StoreResource(Operation):\n    \"\"\"Store new revision of user-edited resource.\"\"\"\n\n    def __init__(self):\n        Operation.__init__(self, { \"name\": str,\n                                   \"source\": str })\n\n    def process(self, db, user, name, source):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT MAX(revision) FROM userresources WHERE uid=%s AND name=%s\", (user.id, name))\n\n        current_revision = cursor.fetchone()[0] or 0\n        next_revision = current_revision + 1\n\n        cursor.execute(\"INSERT INTO userresources (uid, name, revision, source) VALUES (%s, %s, %s, %s)\", (user.id, name, next_revision, source))\n\n        db.commit()\n\n        return OperationResult()\n\nclass ResetResource(Operation):\n    \"\"\"Reset user-edited resource back to its original.\"\"\"\n\n    def __init__(self):\n        Operation.__init__(self, { \"name\": str })\n\n    def process(self, db, user, name):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT MAX(revision) FROM userresources WHERE uid=%s AND name=%s\", (user.id, name))\n\n        current_revision = cursor.fetchone()[0] or 0\n\n        if current_revision > 0:\n            
cursor.execute(\"SELECT source FROM userresources WHERE uid=%s AND name=%s AND revision=%s\", (user.id, name, current_revision))\n\n            if cursor.fetchone()[0] is not None:\n                next_revision = current_revision + 1\n\n                cursor.execute(\"INSERT INTO userresources (uid, name, revision) VALUES (%s, %s, %s)\", (user.id, name, next_revision))\n\n                db.commit()\n\n        return OperationResult()\n\nclass RestoreResource(Operation):\n    \"\"\"Restore last user-edited revision of resource after it's been reset.\"\"\"\n\n    def __init__(self):\n        Operation.__init__(self, { \"name\": str })\n\n    def process(self, db, user, name):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT MAX(revision) FROM userresources WHERE uid=%s AND name=%s\", (user.id, name))\n\n        current_revision = cursor.fetchone()[0] or 0\n\n        if current_revision > 1:\n            cursor.execute(\"SELECT source FROM userresources WHERE uid=%s AND name=%s AND revision=%s\", (user.id, name, current_revision))\n\n            if cursor.fetchone()[0] is None:\n                cursor.execute(\"DELETE FROM userresources WHERE uid=%s AND name=%s AND revision=%s\", (user.id, name, current_revision))\n\n                db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/extensioninstallation.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport reviewing.filters\n\nfrom operation import (Operation, OperationResult, OperationFailure,\n                       OperationError, Optional, typechecker)\nfrom extensions.installation import (installExtension, uninstallExtension,\n                                     reinstallExtension, InstallationError,\n                                     getExtension)\nfrom extensions.extension import Extension, ExtensionError\nfrom extensions.manifest import FilterHookRole\n\nclass ExtensionOperation(Operation):\n    def __init__(self, perform):\n        Operation.__init__(self, { \"extension_name\": str,\n                                   \"author_name\": Optional(str),\n                                   \"version\": Optional(str),\n                                   \"universal\": Optional(bool) })\n        self.perform = perform\n\n    def process(self, db, user, extension_name, author_name=None, version=None,\n                universal=False):\n        if universal:\n            if not user.hasRole(db, \"administrator\"):\n                raise OperationFailure(code=\"notallowed\",\n                                       title=\"Not allowed!\",\n                                       message=\"Operation not permitted.\")\n            user = None\n\n        if version 
is not None:\n            if version == \"live\":\n                version = None\n            elif version.startswith(\"version/\"):\n                version = version[8:]\n            else:\n                raise OperationError(\n                    \"invalid version, got '%s', expected 'live' or 'version/*'\"\n                    % version)\n\n        try:\n            self.perform(db, user, author_name, extension_name, version)\n        except InstallationError as error:\n            raise OperationFailure(code=\"installationerror\",\n                                   title=error.title,\n                                   message=error.message,\n                                   is_html=error.is_html)\n\n        return OperationResult()\n\nclass InstallExtension(ExtensionOperation):\n    def __init__(self):\n        ExtensionOperation.__init__(self, installExtension)\n\nclass UninstallExtension(ExtensionOperation):\n    def __init__(self):\n        ExtensionOperation.__init__(self, uninstallExtension)\n\nclass ReinstallExtension(ExtensionOperation):\n    def __init__(self):\n        ExtensionOperation.__init__(self, reinstallExtension)\n\nclass ClearExtensionStorage(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"extension_name\": str,\n                                   \"author_name\": Optional(str) })\n\n    def process(self, db, user, extension_name, author_name=None):\n        extension = getExtension(author_name, extension_name)\n        extension_id = extension.getExtensionID(db, create=False)\n\n        if extension_id is not None:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"DELETE FROM extensionstorage\n                                    WHERE extension=%s\n                                      AND uid=%s\"\"\",\n                           (extension_id, user.id))\n            db.commit()\n\n        return OperationResult()\n\nclass AddExtensionHookFilter(Operation):\n    def __init__(self):\n       
 Operation.__init__(self, { \"subject\": typechecker.User,\n                                   \"extension\": typechecker.Extension,\n                                   \"repository\": typechecker.Repository,\n                                   \"filterhook_name\": str,\n                                   \"path\": str,\n                                   \"data\": Optional(str),\n                                   \"replaced_filter_id\": Optional(int) })\n\n    def process(self, db, user, subject, extension, repository,\n                filterhook_name, path, data=None, replaced_filter_id=None):\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        path = reviewing.filters.sanitizePath(path)\n\n        if \"*\" in path:\n            try:\n                reviewing.filters.validatePattern(path)\n            except reviewing.filters.PatternError as error:\n                raise OperationFailure(\n                    code=\"invalidpattern\",\n                    title=\"Invalid path pattern\",\n                    message=\"There are invalid wild-cards in the path: %s\" % error.message)\n\n        installed_sha1, _ = extension.getInstalledVersion(db, subject)\n\n        if installed_sha1 is False:\n            raise OperationFailure(\n                code=\"invalidrequest\",\n                title=\"Invalid request\",\n                message=(\"The extension \\\"%s\\\" must be installed first!\"\n                         % extension.getTitle(db)))\n\n        manifest = extension.getManifest(sha1=installed_sha1)\n\n        for role in manifest.roles:\n            if isinstance(role, FilterHookRole) and role.name == filterhook_name:\n                break\n        else:\n            raise OperationFailure(\n                code=\"invalidrequest\",\n                title=\"Invalid request\",\n                message=(\"The extension doesn't have a filter hook role named %r!\"\n                         % 
filterhook_name))\n\n        cursor = db.cursor()\n\n        if replaced_filter_id is not None:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM extensionhookfilters\n                               WHERE id=%s\n                                 AND uid=%s\"\"\",\n                           (replaced_filter_id, subject.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(\n                    code=\"invalidoperation\",\n                    title=\"Invalid operation\",\n                    message=\"Filter to replace does not exist or belongs to another user!\")\n\n            cursor.execute(\"\"\"DELETE\n                                FROM extensionhookfilters\n                               WHERE id=%s\"\"\",\n                           (replaced_filter_id,))\n\n        cursor.execute(\"\"\"INSERT INTO extensionhookfilters\n                                        (uid, extension, repository, name,\n                                         path, data)\n                               VALUES (%s, %s, %s, %s, %s, %s)\n                            RETURNING id\"\"\",\n                       (subject.id, extension.getExtensionID(db), repository.id,\n                        filterhook_name, path, data))\n\n        filter_id, = cursor.fetchone()\n\n        db.commit()\n\n        return OperationResult(filter_id=filter_id)\n\nclass DeleteExtensionHookFilter(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"subject\": typechecker.User,\n                                   \"filter_id\": int })\n\n    def process(self, db, user, subject, filter_id):\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM extensionhookfilters\n                           WHERE id=%s\n                             AND uid=%s\"\"\",\n                       (filter_id, 
subject.id))\n\n        if not cursor.fetchone():\n            raise OperationFailure(\n                code=\"invalidoperation\",\n                title=\"Invalid operation\",\n                message=\"Filter to delete does not exist or belongs to another user!\")\n\n        cursor.execute(\"\"\"DELETE\n                            FROM extensionhookfilters\n                           WHERE id=%s\"\"\",\n                       (filter_id,))\n\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/fetchlines.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport gitutils\nimport htmlutils\nimport diff\n\nfrom operation import Operation, OperationResult\n\nclass FetchLines(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_id\": int,\n                                   \"path\": str,\n                                   \"sha1\": str,\n                                   \"ranges\": [{ \"offset\": int,\n                                                \"count\": int,\n                                                \"context\": bool }],\n                                   \"tabify\": bool },\n                           accept_anonymous_user=True)\n\n    def process(self, db, user, repository_id, path, sha1, ranges, tabify):\n        repository = gitutils.Repository.fromId(db, repository_id)\n        cursor = db.cursor()\n\n        def getContext(offset):\n            cursor.execute(\"\"\"SELECT context\n                                FROM codecontexts\n                               WHERE sha1=%s\n                                 AND %s BETWEEN first_line AND last_line\n                            ORDER BY first_line DESC\n                               LIMIT 1\"\"\",\n                           (sha1, offset))\n\n            row = cursor.fetchone()\n\n            if row: return row[0]\n            else: return None\n\n  
      file = diff.File(repository=repository, path=path, new_sha1=sha1)\n        file.loadNewLines(highlighted=True, request_highlight=True)\n\n        if tabify:\n            tabwidth = file.getTabWidth()\n            indenttabsmode = file.getIndentTabsMode()\n\n        def processRange(offset, count, context):\n            if context: context = getContext(offset)\n            else: context = None\n\n            # Offset is a 1-based line number.\n            start = offset - 1\n            # If count is -1, fetch all lines.\n            end = start + count if count > -1 else None\n\n            lines = file.newLines(highlighted=True)[start:end]\n\n            if tabify:\n                lines = [htmlutils.tabify(line, tabwidth, indenttabsmode) for line in lines]\n\n            return { \"lines\": lines, \"context\": context }\n\n        return OperationResult(ranges=[processRange(**line_range) for line_range in ranges])\n"
  },
  {
    "path": "src/operation/manipulateassignments.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport itertools\n\nimport dbutils\nimport mailutils\nimport reviewing.utils\n\nfrom operation import Operation, OperationResult\n\nclass GetAssignedChanges(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"user_name\": str })\n\n    def process(self, db, user, review_id, user_name):\n        reviewer = dbutils.User.fromName(db, user_name)\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT file FROM fullreviewuserfiles WHERE review=%s AND assignee=%s\", (review_id, reviewer.id))\n\n        return OperationResult(files=[file_id for (file_id,) in cursor])\n\nclass SetAssignedChanges(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"user_name\": str,\n                                   \"files\": [int] })\n\n    def process(self, db, user, review_id, user_name, files):\n        reviewer = dbutils.User.fromName(db, user_name)\n        new_file_ids = set(files)\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT 1 FROM reviewusers WHERE review=%s AND uid=%s\", (review_id, reviewer.id))\n\n        if not cursor.fetchone():\n            cursor.execute(\"INSERT INTO reviewusers (review, uid) VALUES (%s, %s)\", (review_id, 
reviewer.id))\n            current_file_ids = set()\n        else:\n            cursor.execute(\"SELECT file FROM fullreviewuserfiles WHERE review=%s AND assignee=%s\", (review_id, reviewer.id))\n            current_file_ids = set(file_id for (file_id,) in cursor)\n\n        delete_file_ids = current_file_ids - new_file_ids\n        new_file_ids -= current_file_ids\n\n        if delete_file_ids or new_file_ids:\n            cursor.execute(\"INSERT INTO reviewassignmentstransactions (review, assigner) VALUES (%s, %s) RETURNING id\", (review_id, user.id))\n            transaction_id = cursor.fetchone()[0]\n\n        if delete_file_ids:\n            cursor.executemany(\"\"\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned)\n                                       SELECT %s, reviewfiles.id, reviewuserfiles.uid, false\n                                         FROM reviewfiles\n                                         JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                                        WHERE reviewfiles.review=%s\n                                          AND reviewfiles.file=%s\n                                          AND reviewuserfiles.uid=%s\"\"\",\n                               itertools.izip(itertools.repeat(transaction_id),\n                                              itertools.repeat(review_id),\n                                              delete_file_ids,\n                                              itertools.repeat(reviewer.id)))\n\n            cursor.executemany(\"\"\"DELETE FROM reviewuserfiles\n                                        WHERE file IN (SELECT id\n                                                         FROM reviewfiles\n                                                        WHERE review=%s\n                                                          AND file=%s)\n                                          AND uid=%s\"\"\",\n                               
itertools.izip(itertools.repeat(review_id),\n                                              delete_file_ids,\n                                              itertools.repeat(reviewer.id)))\n\n        if new_file_ids:\n            cursor.executemany(\"\"\"INSERT INTO reviewuserfiles (file, uid)\n                                       SELECT reviewfiles.id, %s\n                                         FROM reviewfiles\n                                        WHERE reviewfiles.review=%s\n                                          AND reviewfiles.file=%s\"\"\",\n                               itertools.izip(itertools.repeat(reviewer.id),\n                                              itertools.repeat(review_id),\n                                              new_file_ids))\n\n            cursor.executemany(\"\"\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned)\n                                       SELECT %s, reviewfiles.id, %s, true\n                                         FROM reviewfiles\n                                        WHERE reviewfiles.review=%s\n                                          AND reviewfiles.file=%s\"\"\",\n                               itertools.izip(itertools.repeat(transaction_id),\n                                              itertools.repeat(reviewer.id),\n                                              itertools.repeat(review_id),\n                                              new_file_ids))\n\n        if delete_file_ids or new_file_ids:\n            cursor.execute(\"UPDATE reviews SET serial=serial+1 WHERE id=%s\", (review_id,))\n\n            pending_mails = reviewing.utils.generateMailsForAssignmentsTransaction(db, transaction_id)\n\n            db.commit()\n\n            mailutils.sendPendingMails(pending_mails)\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/manipulatecomment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\n\nfrom operation import Operation, OperationResult, OperationError, OperationFailure, Optional\nfrom reviewing.comment import Comment, CommentChain, propagate\n\nclass SetCommentChainState(Operation):\n    def __init__(self, parameters):\n        Operation.__init__(self, parameters)\n\n    def setChainState(self, db, user, chain, old_state, new_state, new_last_commit=None):\n        review = chain.review\n\n        if chain.state != old_state:\n            raise OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"The comment chain's state is not '%s'; can't change state to '%s'.\" % (old_state, new_state))\n        elif new_state == \"open\" and review.state != \"open\":\n            raise OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"Can't reopen comment chain in %s review!\" % review.state)\n\n        if chain.last_commit:\n            old_last_commit = chain.last_commit.id\n            if new_last_commit is None:\n                new_last_commit = old_last_commit\n        else:\n            old_last_commit = new_last_commit = None\n\n        cursor = 
db.cursor()\n\n        if chain.state_is_draft:\n            # The user is reverting a draft state change; just undo the draft\n            # change.\n            cursor.execute(\"\"\"DELETE FROM commentchainchanges\n                               WHERE chain=%s\n                                 AND uid=%s\n                                 AND to_state IS NOT NULL\"\"\",\n                           (chain.id, user.id))\n\n        else:\n            # Otherwise insert a new row into the commentchainchanges table.\n            cursor.execute(\"\"\"INSERT INTO commentchainchanges (uid, chain, from_state, to_state, from_last_commit, to_last_commit)\n                              VALUES (%s, %s, %s, %s, %s, %s)\"\"\",\n                           (user.id, chain.id, old_state, new_state, old_last_commit, new_last_commit))\n\n        db.commit()\n\n        return OperationResult(old_state=old_state, new_state=new_state,\n                               draft_status=review.getDraftStatus(db, user))\n\nclass ReopenResolvedCommentChain(SetCommentChainState):\n    def __init__(self):\n        SetCommentChainState.__init__(self, { \"chain_id\": int })\n\n    def process(self, db, user, chain_id):\n        return self.setChainState(db, user, CommentChain.fromId(db, chain_id, user), \"closed\", \"open\")\n\nclass ReopenAddressedCommentChain(SetCommentChainState):\n    def __init__(self):\n        SetCommentChainState.__init__(self, { \"chain_id\": int,\n                                              \"commit_id\": int,\n                                              \"sha1\": str,\n                                              \"offset\": int,\n                                              \"count\": int })\n\n    def process(self, db, user, chain_id, commit_id, sha1, offset, count):\n        chain = CommentChain.fromId(db, chain_id, user)\n        existing = chain.lines_by_sha1.get(sha1)\n\n        if chain.state != \"addressed\":\n            raise 
OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"The comment chain is not marked as addressed!\")\n\n        if not existing:\n            assert commit_id == chain.addressed_by.getId(db)\n\n            commits = chain.review.getCommitSet(db).without(chain.addressed_by.parents)\n\n            propagation = propagate.Propagation(db)\n            propagation.setExisting(chain.review, chain.id, chain.addressed_by, chain.file_id, offset, offset + count - 1, True)\n            propagation.calculateAdditionalLines(commits, chain.review.branch.getHead(db))\n\n            commentchainlines_values = []\n\n            for file_sha1, (first_line, last_line) in propagation.new_lines.items():\n                commentchainlines_values.append((chain.id, user.id, file_sha1, first_line, last_line))\n\n            cursor = db.cursor()\n            cursor.executemany(\"\"\"INSERT INTO commentchainlines (chain, uid, sha1, first_line, last_line)\n                                  VALUES (%s, %s, %s, %s, %s)\"\"\",\n                               commentchainlines_values)\n\n            if not propagation.active:\n                old_addressed_by_id = chain.addressed_by.getId(db)\n                new_addressed_by_id = propagation.addressed_by[0].child.getId(db)\n\n                if chain.addressed_by_is_draft:\n                    cursor.execute(\"\"\"UPDATE commentchainchanges\n                                         SET to_addressed_by=%s\n                                       WHERE chain=%s\n                                         AND uid=%s\n                                         AND state='draft'\n                                         AND to_addressed_by=%s\"\"\",\n                                   (new_addressed_by_id, chain.id, user.id, old_addressed_by_id))\n                else:\n                    cursor.execute(\"\"\"INSERT INTO commentchainchanges (uid, chain, 
from_addressed_by, to_addressed_by)\n                                      VALUES (%s, %s, %s, %s)\"\"\",\n                                   (user.id, chain.id, old_addressed_by_id, new_addressed_by_id))\n\n                old_last_commit_id = chain.last_commit.getId(db)\n                new_last_commit_id = chain.addressed_by.getId(db)\n\n                if chain.last_commit_is_draft:\n                    cursor.execute(\"\"\"UPDATE commentchainchanges\n                                         SET to_last_commit=%s\n                                       WHERE chain=%s\n                                         AND uid=%s\n                                         AND state='draft'\n                                         AND to_last_commit=%s\"\"\",\n                                   (new_last_commit_id, chain.id, user.id, old_last_commit_id))\n                else:\n                    cursor.execute(\"\"\"INSERT INTO commentchainchanges (uid, chain, from_last_commit, to_last_commit)\n                                      VALUES (%s, %s, %s, %s)\"\"\",\n                                   (user.id, chain.id, old_last_commit_id, new_last_commit_id))\n\n                db.commit()\n\n                return OperationResult(old_state='addressed', new_state='addressed',\n                                       draft_status=chain.review.getDraftStatus(db, user))\n        elif offset != existing[0] or count != existing[1]:\n            raise OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"The comment chain is already present at other lines in the same file version\")\n\n        return self.setChainState(db, user, chain, \"addressed\", \"open\", new_last_commit=commit_id)\n\nclass ResolveCommentChain(SetCommentChainState):\n    def __init__(self):\n        Operation.__init__(self, { \"chain_id\": int })\n\n    def process(self, db, user, chain_id):\n        return 
self.setChainState(db, user, CommentChain.fromId(db, chain_id, user), \"open\", \"closed\")\n\nclass MorphCommentChain(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"chain_id\": int,\n                                   \"new_type\": set([\"issue\", \"note\"]) })\n\n    def process(self, db, user, chain_id, new_type):\n        chain = CommentChain.fromId(db, chain_id, user)\n        review = chain.review\n\n        if chain.type == new_type:\n            raise OperationError(\"the comment chain's type is already '%s'\" % new_type)\n        elif new_type == \"note\" and chain.state in (\"closed\", \"addressed\"):\n            raise OperationError(\"can't convert resolved or addressed issue to a note\")\n\n        cursor = db.cursor()\n\n        if chain.state == \"draft\":\n            # The chain is still a draft; just change its type directly.\n            cursor.execute(\"\"\"UPDATE commentchains\n                                 SET type=%s\n                               WHERE id=%s\"\"\",\n                           (new_type, chain.id))\n\n        elif chain.type_is_draft:\n            # The user is reverting a draft chain type change; just undo the\n            # draft change.\n            cursor.execute(\"\"\"DELETE FROM commentchainchanges\n                               WHERE chain=%s\n                                 AND uid=%s\n                                 AND to_type IS NOT NULL\"\"\",\n                           (chain.id, user.id))\n\n        else:\n            # Otherwise insert a new row into the commentchainchanges table.\n            cursor.execute(\"\"\"INSERT INTO commentchainchanges (uid, chain, from_type, to_type)\n                              VALUES (%s, %s, %s, %s)\"\"\",\n                           (user.id, chain.id, chain.type, new_type))\n\n        db.commit()\n\n        return OperationResult(draft_status=review.getDraftStatus(db, user))\n\nclass UpdateComment(Operation):\n    def __init__(self):\n       
 Operation.__init__(self, { \"comment_id\": int,\n                                   \"new_text\": str })\n\n    def process(self, db, user, comment_id, new_text):\n        comment = Comment.fromId(db, comment_id, user)\n\n        if user != comment.user:\n            raise OperationError(\"can't edit comment written by another user\")\n        if comment.state != \"draft\":\n            raise OperationError(\"can't edit comment that has been submitted\")\n        if not new_text.strip():\n            raise OperationError(\"empty comment\")\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"UPDATE comments\n                             SET comment=%s, time=now()\n                           WHERE id=%s\"\"\",\n                       (new_text, comment.id))\n\n        db.commit()\n\n        return OperationResult(draft_status=comment.chain.review.getDraftStatus(db, user))\n\nclass DeleteComment(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"comment_id\": int })\n\n    def process(self, db, user, comment_id):\n        comment = Comment.fromId(db, comment_id, user)\n\n        if user != comment.user:\n            raise OperationError(\"can't delete comment written by another user\")\n        if comment.state != \"draft\":\n            raise OperationError(\"can't delete comment that has been submitted\")\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"UPDATE comments\n                             SET state='deleted'\n                           WHERE id=%s\"\"\",\n                       (comment.id,))\n\n        if comment.chain.state == \"draft\":\n            # If the comment chain was a draft, then delete it as well.\n            cursor.execute(\"\"\"UPDATE commentchains\n                                 SET state='empty'\n                               WHERE id=%s\"\"\",\n                           (comment.chain.id,))\n\n        db.commit()\n\n        return 
OperationResult(draft_status=comment.chain.review.getDraftStatus(db, user))\n\nclass MarkChainsAsRead(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"chain_ids\": Optional([int]),\n                                   \"review_ids\": Optional([int]) })\n\n    def process(self, db, user, chain_ids=None, review_ids=None):\n        cursor = db.cursor()\n\n        if chain_ids:\n            cursor.execute(\"\"\"DELETE FROM commentstoread\n                                    WHERE uid=%s\n                                      AND comment IN (SELECT id\n                                                        FROM comments\n                                                       WHERE chain=ANY (%s))\"\"\",\n                           (user.id, chain_ids))\n\n        if review_ids:\n            cursor.execute(\"\"\"DELETE FROM commentstoread\n                                    WHERE uid=%s\n                                      AND comment IN (SELECT comments.id\n                                                        FROM comments\n                                                        JOIN commentchains ON (commentchains.id=comments.chain)\n                                                       WHERE commentchains.review=ANY (%s))\"\"\",\n                           (user.id, review_ids))\n\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/manipulatefilters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport mailutils\nimport htmlutils\nimport reviewing.utils\nimport reviewing.filters\n\nfrom operation import Operation, OperationResult, OperationError, \\\n    OperationFailure, OperationFailureMustLogin, Optional\n\nclass AddFilter(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"filter_type\": set([\"reviewer\", \"watcher\", \"ignored\"]),\n                                   \"path\": str,\n                                   \"delegates\": [str],\n                                   \"repository_id\": Optional(int),\n                                   \"repository_name\": Optional(str),\n                                   \"replaced_filter_id\": Optional(int) })\n\n    def process(self, db, user, filter_type, path, delegates, repository_id=None, repository_name=None, replaced_filter_id=None):\n        path = reviewing.filters.sanitizePath(path)\n\n        if \"*\" in path:\n            try:\n                reviewing.filters.validatePattern(path)\n            except reviewing.filters.PatternError as error:\n                raise OperationFailure(code=\"invalidpattern\",\n                                       title=\"Invalid path pattern\",\n                                       message=\"There are invalid wild-cards in the path: %s\" % 
error.message)\n\n        if filter_type == \"reviewer\":\n            delegates = filter(None, delegates)\n            invalid_delegates = []\n            for delegate in delegates:\n                try:\n                    dbutils.User.fromName(db, delegate)\n                except dbutils.NoSuchUser:\n                    invalid_delegates.append(delegate)\n            if invalid_delegates:\n                raise OperationFailure(code=\"invaliduser\",\n                                       title=\"Invalid delegate(s)\",\n                                       message=\"These user-names are not valid: %s\" % \", \".join(invalid_delegates))\n        else:\n            delegates = []\n\n        cursor = db.cursor()\n\n        if repository_id is None:\n            cursor.execute(\"\"\"SELECT id\n                                FROM repositories\n                               WHERE name=%s\"\"\",\n                           (repository_name,))\n            repository_id = cursor.fetchone()[0]\n\n        if replaced_filter_id is not None:\n            cursor.execute(\"\"\"SELECT 1\n                                FROM filters\n                               WHERE id=%s\n                                 AND uid=%s\"\"\",\n                           (replaced_filter_id, user.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(code=\"invalidoperation\",\n                                       title=\"Invalid operation\",\n                                       message=\"Filter to replace does not exist or belongs to another user!\")\n\n            cursor.execute(\"\"\"DELETE\n                                FROM filters\n                               WHERE id=%s\"\"\",\n                           (replaced_filter_id,))\n\n        cursor.execute(\"\"\"SELECT 1\n                            FROM filters\n                           WHERE uid=%s\n                             AND repository=%s\n                             AND 
path=%s\"\"\",\n                       (user.id, repository_id, path))\n\n        if cursor.fetchone():\n            raise OperationFailure(code=\"duplicatefilter\",\n                                   title=\"Duplicate filter\",\n                                   message=(\"You already have a filter for the path <code>%s</code> in this repository.\"\n                                            % htmlutils.htmlify(path)),\n                                   is_html=True)\n\n        cursor.execute(\"\"\"INSERT INTO filters (uid, repository, path, type, delegate)\n                               VALUES (%s, %s, %s, %s, %s)\n                            RETURNING id\"\"\",\n                       (user.id, repository_id, path, filter_type, \",\".join(delegates)))\n\n        filter_id = cursor.fetchone()[0]\n\n        db.commit()\n\n        return OperationResult(filter_id=filter_id)\n\nclass DeleteFilter(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"filter_id\": int })\n\n    def process(self, db, user, filter_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT uid\n                            FROM filters\n                           WHERE id=%s\"\"\",\n                       (filter_id,))\n\n        row = cursor.fetchone()\n        if row:\n            if user.id != row[0]:\n                Operation.requireAdministratorRole(db, user)\n\n            cursor.execute(\"\"\"DELETE\n                                FROM filters\n                               WHERE id=%s\"\"\",\n                           (filter_id,))\n\n            db.commit()\n\n        return OperationResult()\n\nclass ReapplyFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_id\": Optional(int),\n                                   \"filter_id\": Optional(int) })\n\n    def process(self, db, user, repository_id=None, filter_id=None):\n        if user.isAnonymous():\n            return 
OperationFailureMustLogin()\n\n        cursor = db.cursor()\n\n        if filter_id is not None:\n            cursor.execute(\"\"\"SELECT repository, path, type, delegate\n                                FROM filters\n                               WHERE id=%s\"\"\",\n                           (filter_id,))\n            repository_id, filter_path, filter_type, filter_delegate = cursor.fetchone()\n\n        if repository_id is None:\n            cursor.execute(\"\"\"SELECT reviews.id, applyfilters, applyparentfilters, branches.repository\n                                FROM reviews\n                                JOIN branches ON (reviews.branch=branches.id)\n                               WHERE reviews.state!='closed'\"\"\")\n        else:\n            cursor.execute(\"\"\"SELECT reviews.id, applyfilters, applyparentfilters, branches.repository\n                                FROM reviews\n                                JOIN branches ON (reviews.branch=branches.id)\n                               WHERE reviews.state!='closed'\n                                 AND branches.repository=%s\"\"\",\n                           (repository_id,))\n\n        repositories = {}\n\n        # list(review_file_id)\n        assign_changes = []\n\n        # set(review_id)\n        assigned_reviews = set()\n\n        # set(review_id)\n        watched_reviews = set()\n\n        for review_id, applyfilters, applyparentfilters, repository_id in cursor.fetchall():\n            if repository_id in repositories:\n                repository = repositories[repository_id]\n            else:\n                repository = gitutils.Repository.fromId(db, repository_id)\n                repositories[repository_id] = repository\n\n            review = reviewing.filters.Filters.Review(review_id, applyfilters, applyparentfilters, repository)\n            filters = reviewing.filters.Filters()\n\n            filters.setFiles(db, review=review)\n\n            if filter_id is not None:\n            
    filters.addFilter(user.id, filter_path, filter_type, filter_delegate, filter_id)\n            else:\n                filters.load(db, review=review, user=user)\n\n            cursor.execute(\"\"\"SELECT commits.id, reviewfiles.file, reviewfiles.id\n                                FROM commits\n                                JOIN gitusers ON (gitusers.id=commits.author_gituser)\n                     LEFT OUTER JOIN usergitemails ON (usergitemails.email=gitusers.email\n                                                   AND usergitemails.uid=%s)\n                                JOIN changesets ON (changesets.child=commits.id)\n                                JOIN reviewfiles ON (reviewfiles.changeset=changesets.id)\n                     LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id\n                                                     AND reviewuserfiles.uid=%s)\n                               WHERE reviewfiles.review=%s\n                                 AND usergitemails.uid IS NULL\n                                 AND reviewuserfiles.uid IS NULL\"\"\",\n                            (user.id, user.id, review_id))\n\n            for commit_id, file_id, review_file_id in cursor.fetchall():\n                association = filters.getUserFileAssociation(user.id, file_id)\n\n                if association == 'reviewer':\n                    assign_changes.append(review_file_id)\n                    assigned_reviews.add(review_id)\n                elif association == 'watcher':\n                    watched_reviews.add(review_id)\n\n        cursor.execute(\"\"\"SELECT reviews.id\n                            FROM reviews\n                 LEFT OUTER JOIN reviewusers ON (reviewusers.review=reviews.id\n                                             AND reviewusers.uid=%s)\n                           WHERE reviews.id=ANY (%s)\n                             AND reviewusers.uid IS NULL\"\"\",\n                       (user.id, list(assigned_reviews) + 
list(watched_reviews)))\n\n        new_reviews = set(review_id for (review_id,) in cursor)\n\n        cursor.executemany(\"\"\"INSERT INTO reviewusers (review, uid)\n                                   VALUES (%s, %s)\"\"\",\n                           [(review_id, user.id) for review_id in new_reviews])\n\n        cursor.executemany(\"\"\"INSERT INTO reviewuserfiles (file, uid)\n                                   VALUES (%s, %s)\"\"\",\n                           [(review_file_id, user.id) for review_file_id in assign_changes])\n\n        db.commit()\n\n        watched_reviews &= new_reviews\n        watched_reviews -= assigned_reviews\n\n        cursor.execute(\"\"\"SELECT id, summary\n                            FROM reviews\n                           WHERE id=ANY (%s)\"\"\",\n                       (list(assigned_reviews | watched_reviews),))\n\n        return OperationResult(assigned_reviews=sorted(assigned_reviews),\n                               watched_reviews=sorted(watched_reviews),\n                               summaries=dict(cursor))\n\nclass CountMatchedPaths(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"single\": Optional({ \"repository_name\": str,\n                                                        \"path\": str }),\n                                   \"multiple\": Optional([int]),\n                                   \"user_id\": Optional(int) })\n\n    def process(self, db, user, single=None, multiple=None, user_id=None):\n        if user_id is None:\n            user_id = user.id\n\n        try:\n            if single:\n                repository = gitutils.Repository.fromName(db, single[\"repository_name\"])\n                path = reviewing.filters.sanitizePath(single[\"path\"])\n\n                cursor = db.cursor()\n                cursor.execute(\"\"\"SELECT path\n                                    FROM filters\n                                   WHERE repository=%s\n                                     
AND uid=%s\"\"\",\n                               (repository.id, user_id,))\n\n                paths = set(filter_path for (filter_path,) in cursor)\n                paths.add(path)\n\n                return OperationResult(count=reviewing.filters.countMatchedFiles(repository, list(paths))[path])\n\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT repository, id, path\n                                FROM filters\n                               WHERE id=ANY (%s)\n                            ORDER BY repository\"\"\",\n                           (multiple,))\n\n            per_repository = {}\n            result = []\n\n            for repository_id, filter_id, filter_path in cursor:\n                per_repository.setdefault(repository_id, []).append((filter_id, filter_path))\n\n            for repository_id, filters in per_repository.items():\n                repository = gitutils.Repository.fromId(db, repository_id)\n                counts = reviewing.filters.countMatchedFiles(\n                    repository, [filter_path for (filter_id, filter_path) in filters])\n                for filter_id, filter_path in filters:\n                    result.append({ \"id\": filter_id,\n                                    \"count\": counts[filter_path] })\n\n            return OperationResult(filters=result)\n        except reviewing.filters.PatternError as error:\n            return OperationFailure(code=\"invalidpattern\",\n                                    title=\"Invalid pattern!\",\n                                    message=str(error))\n\nclass GetMatchedPaths(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_name\": str,\n                                   \"path\": str,\n                                   \"user_id\": Optional(int) })\n\n    def process(self, db, user, repository_name, path, user_id=None):\n        if user_id is None:\n            user_id = user.id\n\n        repository = 
gitutils.Repository.fromName(db, repository_name)\n        path = reviewing.filters.sanitizePath(path)\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT path\n                            FROM filters\n                           WHERE repository=%s\n                             AND uid=%s\"\"\",\n                       (repository.id, user_id,))\n\n        paths = set(filter_path for (filter_path,) in cursor)\n        paths.add(path)\n\n        return OperationResult(paths=reviewing.filters.getMatchedFiles(repository, list(paths))[path])\n\nclass AddReviewFilters(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"filters\": [{ \"type\": set([\"reviewer\", \"watcher\"]),\n                                                 \"user_names\": Optional([str]),\n                                                 \"user_ids\": Optional([int]),\n                                                 \"paths\": Optional([str]),\n                                                 \"file_ids\": Optional([int]) }] })\n\n    def process(self, db, creator, review_id, filters):\n        review = dbutils.Review.fromId(db, review_id)\n        by_user = {}\n\n        for filter in filters:\n            if \"user_ids\" in filter:\n                user_ids = set(filter[\"user_ids\"])\n            else:\n                user_ids = set([])\n\n            if \"user_names\" in filter:\n                for user_name in filter[\"user_names\"]:\n                    user_ids.add(dbutils.User.fromName(db, user_name).id)\n\n            if \"paths\" in filter:\n                paths = set(reviewing.filters.sanitizePath(path) for path in filter[\"paths\"])\n\n                for path in paths:\n                    try:\n                        reviewing.filters.validatePattern(path)\n                    except reviewing.filters.PatternError as error:\n                        raise OperationFailure(\n               
             code=\"invalidpattern\",\n                            title=\"Invalid path pattern\",\n                            message=\"There are invalid wild-cards in the path: %s\" % error.message)\n            else:\n                paths = set()\n\n            if \"file_ids\" in filter:\n                for file_id in filter[\"file_ids\"]:\n                    paths.add(dbutils.describe_file(file_id))\n\n            for user_id in user_ids:\n                reviewer_paths, watcher_paths = by_user.setdefault(user_id, (set(), set()))\n\n                if filter[\"type\"] == \"reviewer\":\n                    reviewer_paths |= paths\n                else:\n                    watcher_paths |= paths\n\n        pending_mails = []\n\n        for user_id, (reviewer_paths, watcher_paths) in by_user.items():\n            try:\n                user = dbutils.User.fromId(db, user_id)\n            except dbutils.InvalidUserId:\n                raise OperationFailure(\n                    code=\"invaliduserid\",\n                    title=\"Invalid user ID\",\n                    message=\"At least one of the specified user IDs was invalid.\")\n            pending_mails.extend(reviewing.utils.addReviewFilters(\n                    db, creator, user, review, reviewer_paths, watcher_paths))\n\n        review = dbutils.Review.fromId(db, review_id)\n        review.incrementSerial(db)\n\n        db.commit()\n\n        mailutils.sendPendingMails(pending_mails)\n\n        return OperationResult()\n\nclass RemoveReviewFilter(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"filter_id\": int })\n\n    def process(self, db, user, filter_id):\n        cursor = db.cursor()\n\n        cursor.execute(\"SELECT review FROM reviewfilters WHERE id=%s\", (filter_id,))\n        review_id = cursor.fetchone()\n        if not review_id:\n            raise OperationFailure(\n                code=\"nosuchfilter\",\n                title=\"No such filter!\",\n             
   message=(\"Maybe the filter has been deleted since you \"\n                         \"loaded this page?\"))\n\n        cursor.execute(\"DELETE FROM reviewfilters WHERE id=%s\", (filter_id,))\n\n        # fetchone() above returned a single-column row tuple; [0] extracts\n        # the review's numeric ID.\n        review = dbutils.Review.fromId(db, review_id[0])\n        review.incrementSerial(db)\n\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/manipulatereview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\n\nimport reviewing.mail as review_mail\nimport mailutils\n\nfrom operation import (Operation, OperationResult, OperationError, Optional,\n                       OperationFailure, Review, User)\n\nclass CloseReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n\n        if review.state != \"open\":\n            raise OperationError(\"review not open; can't close\")\n        if not review.accepted(db):\n            raise OperationError(\"review is not accepted; can't close\")\n\n        review.close(db, user)\n        review.disableTracking(db)\n\n        db.commit()\n\n        return OperationResult()\n\nclass DropReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n\n        if review.state != \"open\":\n            raise OperationError(\"review not open; can't drop\")\n\n        review.drop(db, user)\n        review.disableTracking(db)\n\n        db.commit()\n\n        return OperationResult()\n\nclass ReopenReview(Operation):\n    def __init__(self):\n        
Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n\n        if review.state == \"open\":\n            raise OperationError(\"review already open; can't reopen\")\n\n        review.reopen(db, user)\n\n        db.commit()\n\n        return OperationResult()\n\nclass PingReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"note\": str })\n\n    def process(self, db, user, review_id, note):\n        review = dbutils.Review.fromId(db, review_id)\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT DISTINCT uid\n                            FROM reviewuserfiles\n                            JOIN reviewfiles ON (reviewfiles.id=reviewuserfiles.file)\n                            JOIN users ON (users.id=reviewuserfiles.uid)\n                           WHERE reviewfiles.review=%s\n                             AND reviewfiles.state='pending'\n                             AND users.status!='retired'\"\"\",\n                       (review.id,))\n\n        user_ids = set([user_id for (user_id,) in cursor.fetchall()])\n\n        # Add the pinging user and the owners (they are usually the same.)\n        user_ids.add(user.id)\n\n        for owner in review.owners: user_ids.add(owner.id)\n\n        recipients = [dbutils.User.fromId(db, user_id) for user_id in user_ids]\n\n        pending_mails = []\n        for recipient in recipients:\n            pending_mails.extend(review_mail.sendPing(db, user, recipient, recipients, review, note))\n        mailutils.sendPendingMails(pending_mails)\n\n        return OperationResult()\n\nclass UpdateReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"new_summary\": Optional(str),\n                                   \"new_description\": Optional(str),\n             
                      \"new_owners\": Optional([str]) })\n\n    def process(self, db, user, review_id, new_summary=None, new_description=None, new_owners=None):\n        review = dbutils.Review.fromId(db, review_id)\n\n        if new_summary is not None:\n            if not new_summary:\n                raise OperationError(\"invalid new summary\")\n            review.setSummary(db, new_summary)\n\n        if new_description is not None:\n            review.setDescription(db, new_description if new_description else None)\n\n        if new_owners is not None:\n            remove_owners = set(review.owners)\n            for user_name in new_owners:\n                owner = dbutils.User.fromName(db, user_name)\n                if owner in remove_owners: remove_owners.remove(owner)\n                else: review.addOwner(db, owner)\n            for owner in remove_owners:\n                review.removeOwner(db, owner)\n\n        review = dbutils.Review.fromId(db, review_id)\n        review.incrementSerial(db)\n\n        db.commit()\n\n        return OperationResult()\n\nclass WatchReview(Operation):\n    def __init__(self):\n        super(WatchReview, self).__init__({ \"review\": Review,\n                                            \"subject\": User })\n\n    def process(self, db, user, review, subject):\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor = db.readonly_cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM reviewusers\n                           WHERE review=%s\n                             AND uid=%s\"\"\",\n                       (review.id, subject.id))\n\n        if cursor.fetchone():\n            # Already a watcher (or reviewer/owner).\n            return OperationResult()\n\n        cursor.execute(\"\"\"SELECT uid, include\n                            FROM reviewrecipientfilters\n                           WHERE review=%s\n                             AND 
(uid=%s OR uid IS NULL)\"\"\",\n                       (review.id, subject.id))\n\n        default_include = True\n        user_include = None\n\n        for user_id, include in cursor:\n            if user_id is None:\n                default_include = include\n            else:\n                user_include = include\n\n        with db.updating_cursor(\n                \"reviewusers\", \"reviewrecipientfilters\") as cursor:\n            cursor.execute(\"\"\"INSERT INTO reviewusers (review, uid, type)\n                                   VALUES (%s, %s, 'manual')\"\"\",\n                           (review.id, subject.id))\n\n            if not default_include and user_include is None:\n                cursor.execute(\n                    \"\"\"INSERT INTO reviewrecipientfilters (review, uid, include)\n                            VALUES (%s, %s, TRUE)\"\"\",\n                    (review.id, subject.id))\n\n        return OperationResult()\n\nclass UnwatchReview(Operation):\n    def __init__(self):\n        super(UnwatchReview, self).__init__({ \"review\": Review,\n                                            \"subject\": User })\n\n    def process(self, db, user, review, subject):\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor = db.readonly_cursor()\n        cursor.execute(\"\"\"SELECT owner\n                            FROM reviewusers\n                           WHERE review=%s\n                             AND uid=%s\"\"\",\n                       (review.id, subject.id))\n        row = cursor.fetchone()\n\n        if not row:\n            # Already not associated.\n            return OperationResult()\n\n        is_owner, = row\n\n        if is_owner:\n            raise OperationFailure(\n                code=\"isowner\",\n                title=\"Is owner\",\n                message=\"Cannot unwatch review since user owns the review.\")\n\n        cursor.execute(\"\"\"SELECT 1\n                       
     FROM fullreviewuserfiles\n                           WHERE review=%s\n                             AND assignee=%s\"\"\",\n                       (review.id, subject.id))\n\n        if cursor.fetchone():\n            raise OperationFailure(\n                code=\"isreviewer\",\n                title=\"Is reviewer\",\n                message=(\"Cannot unwatch review since user is assigned to \"\n                         \"review changes.\"))\n\n        with db.updating_cursor(\"reviewusers\") as cursor:\n            cursor.execute(\"\"\"DELETE\n                                FROM reviewusers\n                               WHERE review=%s\n                                 AND uid=%s\"\"\",\n                           (review.id, subject.id))\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/manipulateuser.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport base64\n\nimport dbutils\nimport gitutils\nimport auth\nimport mailutils\nimport textutils\nimport configuration\n\nfrom operation import (Operation, OperationResult, OperationError,\n                       OperationFailure, Optional, User)\n\nclass SetFullname(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"user_id\": int,\n                                   \"value\": str })\n\n    def process(self, db, user, user_id, value):\n        if user.id != user_id:\n            Operation.requireRole(db, \"administrator\", user)\n\n        if not value.strip():\n            raise OperationError(\"empty display name is not allowed\")\n\n        db.cursor().execute(\"UPDATE users SET fullname=%s WHERE id=%s\", (value.strip(), user_id))\n        db.commit()\n\n        return OperationResult()\n\nclass SetGitEmails(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"subject\": User,\n                                   \"value\": [str] })\n\n    def process(self, db, user, subject, value):\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        for address in value:\n            if not address.strip():\n                raise OperationError(\"empty email address is not allowed\")\n            if 
address.count(\"@\") != 1:\n                raise OperationError(\"invalid email address\")\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT email FROM usergitemails WHERE uid=%s\", (subject.id,))\n\n        current_addresses = set(address for (address,) in cursor)\n        new_addresses = set(address.strip() for address in value)\n\n        for address in (current_addresses - new_addresses):\n            cursor.execute(\"DELETE FROM usergitemails WHERE uid=%s AND email=%s\",\n                           (subject.id, address))\n        for address in (new_addresses - current_addresses):\n            cursor.execute(\"INSERT INTO usergitemails (uid, email) VALUES (%s, %s)\",\n                           (subject.id, address))\n\n        db.commit()\n\n        return OperationResult()\n\nclass ChangePassword(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"subject\": Optional(User),\n                                   \"current_pw\": Optional(str),\n                                   \"new_pw\": str })\n\n    def process(self, db, user, new_pw, subject=None, current_pw=None):\n        if subject is None:\n            subject = user\n\n        cursor = db.cursor()\n\n        if not auth.DATABASE.supportsPasswordChange():\n            raise OperationFailure(\n                code=\"notsupported\",\n                title=\"Not supported!\",\n                message=\"Changing password is not supported via this system.\")\n\n        if not new_pw:\n            raise OperationFailure(\n                code=\"emptypassword\",\n                title=\"Empty password!\",\n                message=\"Setting an empty password is not allowed.\")\n\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n            if current_pw is None:\n                # Signal that it doesn't need to be checked.\n                current_pw = True\n\n        try:\n            auth.DATABASE.changePassword(db, subject, 
current_pw, new_pw)\n        except auth.WrongPassword:\n            raise OperationFailure(\n                code=\"wrongpassword\",\n                title=\"Wrong password!\",\n                message=\"The provided current password is not correct.\")\n\n        return OperationResult()\n\n    def sanitize(self, value):\n        sanitized = value.copy()\n        if \"current_pw\" in value:\n            sanitized[\"current_pw\"] = \"****\"\n        sanitized[\"new_pw\"] = \"****\"\n        return sanitized\n\ndef checkEmailAddressSyntax(address):\n    return bool(re.match(r\"[^@]+@[^.]+(?:\\.[^.]+)*$\", address))\n\ndef sendVerificationMail(db, user, email_id=None):\n    cursor = db.cursor()\n\n    if email_id is None:\n        cursor.execute(\"\"\"SELECT email\n                            FROM users\n                           WHERE id=%s\"\"\",\n                       (user.id,))\n\n        email_id, = cursor.fetchone()\n\n    cursor.execute(\"\"\"SELECT email, verification_token\n                        FROM useremails\n                       WHERE id=%s\"\"\",\n                   (email_id,))\n\n    email, verification_token = cursor.fetchone()\n\n    if verification_token is None:\n        verification_token = auth.getToken(encode=base64.b16encode)\n\n        with db.updating_cursor(\"useremails\") as cursor:\n            cursor.execute(\"\"\"UPDATE useremails\n                                 SET verification_token=%s\n                               WHERE id=%s\"\"\",\n                           (verification_token, email_id))\n\n    if configuration.base.ACCESS_SCHEME == \"http\":\n        protocol = \"http\"\n    else:\n        protocol = \"https\"\n\n    administrators = dbutils.getAdministratorContacts(db, indent=2)\n\n    if administrators:\n        administrators = \":\\n\\n%s\" % administrators\n    else:\n        administrators = \".\"\n\n    recipients = [mailutils.User(user.name, email, user.fullname)]\n    subject = \"[Critic] Please verify your 
email: %s\" % email\n    body = textutils.reflow(\"\"\"\nThis is a message from the Critic code review system at %(hostname)s.  The user\n'%(username)s' on this system has added this email address to his/her account.\nIf this is you, please confirm this by following this link:\n\n  %(url_prefix)s/verifyemail?email=%(email)s&token=%(verification_token)s\n\nIf this is not you, you can safely ignore this email.  If you wish to report\nabuse, please contact the Critic system's administrators%(administrators)s\n\"\"\" % { \"hostname\": configuration.base.HOSTNAME,\n        \"username\": user.name,\n        \"email\": email,\n        \"url_prefix\": \"%s://%s\" % (protocol, configuration.base.HOSTNAME),\n        \"verification_token\": verification_token,\n        \"administrators\": administrators })\n\n    mailutils.sendMessage(recipients, subject, body)\n\nclass RequestVerificationEmail(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"email_id\": int })\n\n    def process(self, db, user, email_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT uid, email, verified\n                            FROM useremails\n                           WHERE id=%s\"\"\",\n                       (email_id,))\n\n        row = cursor.fetchone()\n\n        if not row:\n            raise OperationFailure(\n                code=\"invalidemailid\",\n                title=\"No such email address\",\n                message=\"The address might have been deleted already.\")\n\n        user_id, email, verified = row\n\n        if verified is True:\n            raise OperationFailure(\n                code=\"alreadyverified\",\n                title=\"Address already verified\",\n                message=\"This address has already been verified.\")\n\n        if user != user_id:\n            Operation.requireRole(db, \"administrator\", user)\n            user = dbutils.User.fromId(db, user_id)\n\n        sendVerificationMail(db, user, email_id)\n\n    
    db.commit()\n\n        return OperationResult()\n\nclass DeleteEmailAddress(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"email_id\": int })\n\n    def process(self, db, user, email_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT uid\n                            FROM useremails\n                           WHERE id=%s\"\"\",\n                       (email_id,))\n\n        row = cursor.fetchone()\n\n        if not row:\n            raise OperationFailure(\n                code=\"invalidemailid\",\n                title=\"No such email address\",\n                message=\"The address might have been deleted already.\")\n\n        subject_id, = row\n\n        if user != subject_id:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor.execute(\"\"\"SELECT useremails.id, users.email IS NOT NULL\n                            FROM useremails\n                 LEFT OUTER JOIN users ON (users.email=useremails.id)\n                           WHERE useremails.uid=%s\"\"\",\n                       (subject_id,))\n\n        emails = dict(cursor)\n\n        # Reject if the user has more than one email address registered and is\n        # trying to delete the selected one.  The UI checks this too, but that\n        # check is not 100 % reliable since it checks the state at the time the\n        # page was loaded, not necessarily the current state.\n        if len(emails) > 1 and emails[email_id]:\n            raise OperationFailure(\n                code=\"notallowed\",\n                title=\"Will not delete current address\",\n                message=(\"This email address is your current address.  
Please \"\n                         \"select one of the other addresses as your current \"\n                         \"address before deleting it.\"))\n\n        cursor.execute(\"\"\"UPDATE users\n                             SET email=NULL\n                           WHERE id=%s\n                             AND email=%s\"\"\",\n                       (subject_id, email_id))\n\n        cursor.execute(\"\"\"DELETE FROM useremails\n                                WHERE id=%s\"\"\",\n                       (email_id,))\n\n        db.commit()\n\n        return OperationResult()\n\nclass SelectEmailAddress(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"email_id\": int })\n\n    def process(self, db, user, email_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT uid\n                            FROM useremails\n                           WHERE id=%s\"\"\",\n                       (email_id,))\n\n        row = cursor.fetchone()\n\n        if not row:\n            raise OperationFailure(\n                code=\"invalidemailid\",\n                title=\"No such email address\",\n                message=\"The address might have been deleted already.\")\n\n        user_id, = row\n\n        if user != user_id:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor.execute(\"\"\"UPDATE users\n                             SET email=%s\n                           WHERE id=%s\"\"\",\n                       (email_id, user_id))\n\n        db.commit()\n\n        return OperationResult()\n\nclass AddEmailAddress(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"subject\": User,\n                                   \"email\": str })\n\n    def process(self, db, user, subject, email):\n        if not checkEmailAddressSyntax(email):\n            raise OperationFailure(\n                code=\"invalidemail\",\n                title=\"Invalid email address\",\n                
message=\"Please provide an address of the form <user>@<host>!\")\n\n        if user != subject:\n            Operation.requireRole(db, \"administrator\", user)\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM useremails\n                           WHERE uid=%s\n                             AND email=%s\"\"\",\n                       (subject.id, email))\n\n        if cursor.fetchone():\n            raise OperationFailure(\n                code=\"invalidemail\",\n                title=\"Duplicate email address\",\n                message=\"The exact same address is already registered!\")\n\n        if user.hasRole(db, \"administrator\"):\n            verified = None\n        elif configuration.base.VERIFY_EMAIL_ADDRESSES:\n            verified = False\n        else:\n            verified = None\n\n        with db.updating_cursor(\"users\", \"useremails\") as cursor:\n            cursor.execute(\"\"\"INSERT INTO useremails (uid, email, verified)\n                                   VALUES (%s, %s, %s)\n                                RETURNING id\"\"\",\n                           (subject.id, email, verified))\n\n            email_id, = cursor.fetchone()\n\n            if subject.email is None:\n                cursor.execute(\"\"\"UPDATE users\n                                     SET email=%s\n                                   WHERE id=%s\"\"\",\n                               (email_id, subject.id))\n\n        if verified is False:\n            sendVerificationMail(db, subject, email_id)\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/markfiles.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\n\nfrom operation import Operation, OperationResult\n\nclass MarkFiles(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"reviewed\": bool,\n                                   \"changeset_ids\": [int],\n                                   \"file_ids\": [int] })\n\n    def process(self, db, user, review_id, reviewed, changeset_ids, file_ids):\n        review = dbutils.Review.fromId(db, review_id)\n\n        cursor = db.cursor()\n\n        # Revert any draft changes the user has for the specified files in\n        # the specified changesets.\n        cursor.execute(\"\"\"DELETE FROM reviewfilechanges\n                                WHERE uid=%s\n                                  AND state='draft'\n                                  AND file IN (SELECT id\n                                                 FROM reviewfiles\n                                                WHERE review=%s\n                                                  AND changeset=ANY (%s)\n                                                  AND file=ANY (%s))\"\"\",\n                       (user.id, review.id, changeset_ids, file_ids))\n\n        if reviewed:\n            from_state, to_state = 'pending', 'reviewed'\n        else:\n            from_state, to_state = 'reviewed', 'pending'\n\n        # Insert draft changes for every file whose state would be updated.\n        cursor.execute(\"\"\"INSERT INTO reviewfilechanges (file, uid, from_state, to_state)\n                               SELECT reviewfiles.id, reviewuserfiles.uid, reviewfiles.state, %s\n                                 FROM reviewfiles\n                                 JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id AND reviewuserfiles.uid=%s)\n                                WHERE reviewfiles.review=%s\n                                  AND reviewfiles.state=%s\n                                  AND reviewfiles.changeset=ANY (%s)\n                                  AND reviewfiles.file=ANY (%s)\"\"\",\n                       (to_state, user.id, review.id, from_state, changeset_ids, file_ids))\n\n        db.commit()\n\n        return OperationResult(draft_status=review.getDraftStatus(db, user))\n"
  },
  {
    "path": "src/operation/miscellaneous.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport dbutils\nimport gitutils\nimport configuration\n\nfrom operation import (Operation, OperationResult, OperationFailure, Review,\n                       Repository, SHA1)\n\nclass CheckSerial(Operation):\n    def __init__(self):\n        super(CheckSerial, self).__init__({ \"review_id\": int,\n                                            \"serial\": int })\n\n    def process(self, db, user, review_id, serial):\n        cursor = db.readonly_cursor()\n        cursor.execute(\"SELECT serial FROM reviews WHERE id=%s\", (review_id,))\n\n        current_serial, = cursor.fetchone()\n\n        if serial == current_serial:\n            interval = user.getPreference(db, \"review.updateCheckInterval\")\n            return OperationResult(stale=False, interval=interval)\n\n        return OperationResult(stale=True)\n\nclass RebaseBranch(Operation):\n    def __init__(self):\n        super(RebaseBranch, self).__init__({ \"repository\": Repository,\n                                             \"branch_name\": str,\n                                             \"base_branch_name\": str })\n\n    def process(self, db, user, repository, branch_name, base_branch_name):\n        branch = dbutils.Branch.fromName(db, repository, branch_name)\n        base_branch = dbutils.Branch.fromName(db, 
repository, base_branch_name)\n\n        branch.rebase(db, base_branch)\n\n        db.commit()\n\n        return OperationResult()\n\nclass SuggestReview(Operation):\n    def __init__(self):\n        super(SuggestReview, self).__init__({ \"repository\": Repository,\n                                              \"sha1\": SHA1 })\n\n    def process(self, db, user, repository, sha1):\n        try:\n            commit = gitutils.Commit.fromSHA1(db, repository, sha1)\n        except gitutils.GitReferenceError:\n            raise OperationFailure(\n                code=\"invalidsha1\",\n                title=\"Invalid SHA-1\",\n                message=\"No such commit: %s in %s\" % (sha1, repository.path))\n\n        suggestions = {}\n\n        cursor = db.readonly_cursor()\n        cursor.execute(\"\"\"SELECT reviews.id\n                            FROM reviews\n                           WHERE reviews.summary=%s\"\"\",\n                       (commit.summary(),))\n        for review_id, in cursor:\n            review = dbutils.Review.fromId(db, review_id)\n            if review.state != 'dropped':\n                suggestions[review_id] = review.getReviewState(db)\n\n        return OperationResult(summary=commit.summary(), reviews=suggestions)\n"
  },
  {
    "path": "src/operation/news.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import Operation, OperationResult\n\nclass AddNewsItem(Operation):\n    \"\"\"Add news item.\"\"\"\n\n    def __init__(self):\n        Operation.__init__(self, { \"text\": str })\n\n    def process(self, db, user, text):\n        Operation.requireRole(db, \"newswriter\", user)\n\n        cursor = db.cursor()\n        cursor.execute(\"INSERT INTO newsitems (text) VALUES (%s) RETURNING id\", (text,))\n        item_id = cursor.fetchone()[0]\n        db.commit()\n\n        return OperationResult(item_id=item_id)\n\nclass EditNewsItem(Operation):\n    \"\"\"Edit news item.\"\"\"\n\n    def __init__(self):\n        Operation.__init__(self, { \"item_id\": int,\n                                   \"text\": str })\n\n    def process(self, db, user, item_id, text):\n        Operation.requireRole(db, \"newswriter\", user)\n\n        db.cursor().execute(\"UPDATE newsitems SET text=%s WHERE id=%s\", (text, item_id))\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/rebasereview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport log.commitset\n\nfrom operation import (Operation, OperationResult, OperationError, Optional,\n                       Review)\n\ndef doPrepareRebase(db, user, review, new_upstream_arg=None, branch=None):\n    commitset = log.commitset.CommitSet(review.branch.getCommits(db))\n    tails = commitset.getFilteredTails(review.branch.repository)\n\n    cursor = db.cursor()\n\n    cursor.execute(\"SELECT uid FROM reviewrebases WHERE review=%s AND new_head IS NULL\", (review.id,))\n    row = cursor.fetchone()\n    if row:\n        rebaser = dbutils.User.fromId(db, row[0])\n        raise OperationError(\"The review is already being rebased by %s <%s>.\" % (rebaser.fullname, rebaser.email))\n\n    head = commitset.getHeads().pop()\n    head_id = head.getId(db)\n\n    if new_upstream_arg is not None:\n        if len(tails) > 1:\n            raise OperationError(\"Rebase to new upstream commit not supported.\")\n\n        tail = gitutils.Commit.fromSHA1(db, review.branch.repository, tails.pop())\n\n        old_upstream_id = tail.getId(db)\n        if new_upstream_arg == \"0\" * 40:\n            new_upstream_id = None\n        else:\n            if not gitutils.re_sha1.match(new_upstream_arg):\n                cursor.execute(\"SELECT sha1 FROM tags WHERE repository=%s AND 
name=%s\", (review.branch.repository.id, new_upstream_arg))\n                row = cursor.fetchone()\n                if row: new_upstream_arg = row[0]\n                else: raise OperationError(\"Specified new upstream is invalid.\")\n\n            try: new_upstream = gitutils.Commit.fromSHA1(db, review.branch.repository, new_upstream_arg)\n            except: raise OperationError(\"The specified new upstream commit does not exist in Critic's repository.\")\n\n            new_upstream_id = new_upstream.getId(db)\n    else:\n        old_upstream_id = None\n        new_upstream_id = None\n\n    cursor.execute(\"\"\"INSERT INTO reviewrebases (review, old_head, new_head, old_upstream, new_upstream, uid, branch)\n                           VALUES (%s, %s, NULL, %s, %s, %s, %s)\"\"\",\n                   (review.id, head_id, old_upstream_id, new_upstream_id, user.id, branch))\n\n    review.incrementSerial(db)\n\n    db.commit()\n\ndef doCancelRebase(db, user, review):\n    review.incrementSerial(db)\n\n    db.cursor().execute(\"DELETE FROM reviewrebases WHERE review=%s AND new_head IS NULL\", (review.id,))\n    db.commit()\n\nclass CheckRebase(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        tails = review.getFilteredTails(db)\n        available = \"both\" if len(tails) == 1 else \"inplace\"\n\n        return OperationResult(available=available)\n\nclass SuggestUpstreams(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        tails = review.getFilteredTails(db)\n\n        if len(tails) > 1:\n            raise OperationError(\"Multiple tail commits.\")\n\n        try:\n            from customization.filtertags import getUpstreamTagPattern\n        except ImportError:\n    
        def getUpstreamTagPattern(review): pass\n\n        tail = tails.pop()\n        tags = review.branch.repository.run(\"tag\", \"-l\", \"--contains\", tail, getUpstreamTagPattern(review) or \"*\").splitlines()\n\n        cursor = db.cursor()\n        upstreams = []\n\n        for tag in tags:\n            cursor.execute(\"SELECT sha1 FROM tags WHERE repository=%s AND name=%s\", (review.branch.repository.id, tag))\n            row = cursor.fetchone()\n            if row and row[0] != tail:\n                upstreams.append(tag)\n\n        return OperationResult(upstreams=upstreams)\n\nclass PrepareRebase(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review,\n                                   \"new_upstream\": Optional(str),\n                                   \"branch\": Optional(str) })\n\n    def process(self, db, user, review, new_upstream=None, branch=None):\n        doPrepareRebase(db, user, review, new_upstream, branch)\n        return OperationResult()\n\nclass CancelRebase(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int })\n\n    def process(self, db, user, review_id):\n        review = dbutils.Review.fromId(db, review_id)\n        doCancelRebase(db, user, review)\n        return OperationResult()\n\nclass RebaseReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"new_head_sha1\": str,\n                                   \"new_upstream_sha1\": Optional(str),\n                                   \"branch\": Optional(str),\n                                   \"new_trackedbranch\": Optional(str) })\n\n    def process(self, db, user, review_id, new_head_sha1, new_upstream_sha1=None, branch=None, new_trackedbranch=None):\n        review = dbutils.Review.fromId(db, review_id)\n        new_head = gitutils.Commit.fromSHA1(db, review.repository, new_head_sha1)\n\n        cursor = db.cursor()\n\n   
     if review.state == 'closed':\n            cursor.execute(\"SELECT closed_by FROM reviews WHERE id=%s\", (review.id,))\n            closed_by = cursor.fetchone()[0]\n\n            review.serial += 1\n            cursor.execute(\"UPDATE reviews SET state='open', serial=%s, closed_by=NULL WHERE id=%s\", (review.serial, review.id))\n        else:\n            closed_by = None\n\n        trackedbranch = review.getTrackedBranch(db)\n        if trackedbranch and not trackedbranch.disabled:\n            cursor.execute(\"UPDATE trackedbranches SET disabled=TRUE WHERE id=%s\", (trackedbranch.id,))\n\n        commitset = log.commitset.CommitSet(review.branch.getCommits(db))\n        tails = commitset.getFilteredTails(review.branch.repository)\n\n        if len(tails) == 1 and tails.pop() == new_upstream_sha1:\n            # This appears to be a history rewrite.\n            new_upstream_sha1 = None\n\n        doPrepareRebase(db, user, review, new_upstream_sha1, branch)\n\n        try:\n            with review.repository.relaycopy(\"RebaseReview\") as relay:\n                with review.repository.temporaryref(new_head) as ref_name:\n                    relay.run(\"fetch\", \"origin\", ref_name)\n                relay.run(\"push\", \"--force\", \"origin\",\n                          \"%s:refs/heads/%s\" % (new_head.sha1, review.branch.name),\n                          env={ \"REMOTE_USER\": user.name })\n\n            if closed_by is not None:\n                db.commit()\n                state = review.getReviewState(db)\n                if state.accepted:\n                    review.serial += 1\n                    cursor.execute(\"UPDATE reviews SET state='closed', serial=%s, closed_by=%s WHERE id=%s\", (review.serial, closed_by, review.id))\n\n            if trackedbranch and not trackedbranch.disabled:\n                cursor.execute(\"UPDATE trackedbranches SET disabled=FALSE WHERE id=%s\", (trackedbranch.id,))\n            if new_trackedbranch:\n                
cursor.execute(\"UPDATE trackedbranches SET remote_name=%s WHERE id=%s\", (new_trackedbranch, trackedbranch.id))\n\n            db.commit()\n        except:\n            doCancelRebase(db, user, review)\n            raise\n\n        return OperationResult()\n\nclass RevertRebase(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review\": Review,\n                                   \"rebase_id\": int })\n\n    def process(self, db, user, review, rebase_id):\n        cursor = db.cursor()\n\n        cursor.execute(\"\"\"SELECT old_head, new_head, new_upstream, equivalent_merge, replayed_rebase\n                            FROM reviewrebases\n                           WHERE id=%s\"\"\",\n                       (rebase_id,))\n        old_head_id, new_head_id, new_upstream_id, equivalent_merge_id, replayed_rebase_id = cursor.fetchone()\n\n        cursor.execute(\"SELECT commit FROM previousreachable WHERE rebase=%s\", (rebase_id,))\n        reachable = [commit_id for (commit_id,) in cursor]\n\n        if not reachable:\n            # Fail if rebase was done before the 'previousreachable' table was\n            # added, and we thus don't know what commits the branch contained\n            # before the rebase.\n            raise OperationError(\"Automatic revert not supported; rebase is pre-historic.\")\n\n        if review.branch.getHead(db).getId(db) != new_head_id:\n            raise OperationError(\"Commits added to review after rebase; need to remove them first.\")\n\n        old_head = gitutils.Commit.fromId(db, review.repository, old_head_id)\n        new_head = gitutils.Commit.fromId(db, review.repository, new_head_id)\n\n        cursor.execute(\"DELETE FROM reachable WHERE branch=%s\", (review.branch.id,))\n        cursor.executemany(\"INSERT INTO reachable (branch, commit) VALUES (%s, %s)\",\n                           ((review.branch.id, commit_id) for commit_id in reachable))\n\n        if new_upstream_id:\n            
generated_commit_id = equivalent_merge_id or replayed_rebase_id\n            if generated_commit_id is not None:\n                # A generated commit (equivalent merge or replayed rebase) was\n                # added when performing the rebase; remove it.\n\n                # Reopen any issues marked as addressed by the rebase.  If the\n                # rebase was a fast-forward one, issues will have been addressed\n                # by the equivalent merge commit.  Otherwise, issues will have\n                # been addressed by the new head commit (not the replayed rebase\n                # commit.)\n                addressed_by_commit_id = equivalent_merge_id or new_head_id\n                cursor.execute(\"\"\"UPDATE commentchains\n                                     SET state='open', addressed_by=NULL\n                                   WHERE review=%s\n                                     AND state='addressed'\n                                     AND addressed_by=%s\"\"\",\n                               (review.id, addressed_by_commit_id))\n\n                # Delete the review changesets (and, via cascade, all related\n                # assignments and state changes.)\n                cursor.execute(\n                    \"\"\"DELETE FROM reviewchangesets\n                             WHERE review=%s\n                               AND changeset IN (SELECT id\n                                                   FROM changesets\n                                                  WHERE child=%s OR parent=%s)\"\"\",\n                    (review.id, generated_commit_id, generated_commit_id))\n\n        cursor.execute(\"UPDATE branches SET head=%s WHERE id=%s\",\n                       (old_head_id, review.branch.id))\n        cursor.execute(\"DELETE FROM reviewrebases WHERE id=%s\", (rebase_id,))\n\n        review.incrementSerial(db)\n        db.commit()\n\n        review.repository.run(\n            \"update-ref\", \"refs/heads/%s\" % review.branch.name,\n   
         old_head.sha1, new_head.sha1)\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/recipientfilter.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\n\nfrom operation import Operation, OperationResult\n\nclass AddRecipientFilter(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"review_id\": int,\n                                   \"user_id\": int,\n                                   \"include\": bool })\n\n    def process(self, db, user, review_id, user_id, include):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT include FROM reviewrecipientfilters WHERE review=%s AND uid=%s\", (review_id, user_id))\n        row = cursor.fetchone()\n\n        if row:\n            if row[0] != include:\n                cursor.execute(\"UPDATE reviewrecipientfilters SET include=%s WHERE review=%s AND uid=%s\", (include, review_id, user_id))\n        else:\n            cursor.execute(\"INSERT INTO reviewrecipientfilters (review, uid, include) VALUES (%s, %s, %s)\", (review_id, user_id, include))\n\n        review = dbutils.Review.fromId(db, review_id)\n        review.incrementSerial(db)\n        db.commit()\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/registeruser.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport configuration\nimport auth\n\nfrom operation import Operation, OperationResult, Optional, Request\nfrom operation.manipulateuser import sendVerificationMail, checkEmailAddressSyntax\n\nclass RegisterUser(Operation):\n    def __init__(self):\n        super(RegisterUser, self).__init__(\n            { \"req\": Request,\n              \"username\": str,\n              \"fullname\": str,\n              \"email\": str,\n              \"password\": Optional(str),\n              \"external\": Optional({ \"provider\": set(auth.PROVIDERS.keys()),\n                                     \"account\": str,\n                                     \"token\": str }) },\n            accept_anonymous_user=True)\n\n    def process(self, db, user, req, username, fullname, email,\n                password=None, external=None):\n        cursor = db.cursor()\n\n        if not fullname:\n            fullname = username\n        if not email:\n            email = None\n        if not password:\n            # Empty password => disabled.\n            password = None\n\n        if external:\n            provider_config = configuration.auth.PROVIDERS[external[\"provider\"]]\n            provider = auth.PROVIDERS[external[\"provider\"]]\n\n        # Check that user registration is actually enabled.  
This would also\n        # disable the UI for user registration, of course, but the UI could be\n        # bypassed, so we should check here as well.\n        if not configuration.base.ALLOW_USER_REGISTRATION:\n            if not external or not provider_config[\"allow_user_registration\"]:\n                return OperationResult(\n                    message=\"User registration is not enabled.\")\n\n        # Check that the user name is valid.\n        try:\n            auth.validateUserName(username)\n        except auth.InvalidUserName as error:\n            return OperationResult(\n                message=\"<u>Invalid user name</u><br>\" + str(error),\n                focus=\"#newusername\")\n\n        # Check that the user name is not already taken.\n        cursor.execute(\"SELECT 1 FROM users WHERE name=%s\", (username,))\n        if cursor.fetchone():\n            return OperationResult(\n                message=\"A user named '%s' already exists!\" % username,\n                focus=\"#newusername\")\n\n        # Check that the email address has some hope of being valid.\n        if email and not checkEmailAddressSyntax(email):\n            return OperationResult(\n                message=(\"<u>Invalid email address</u><br>\"\n                         \"Please provide an address on the form user@host!\"),\n                focus=\"#email\")\n\n        # Check that we have either a password or an external authentication\n        # provider.  
If we have neither, the user wouldn't be able to sign in.\n        if password is None and external is None:\n            return OperationResult(\n                message=\"Empty password.\",\n                focus=\"#password1\")\n\n        if password:\n            password = auth.hashPassword(password)\n\n        verify_email_address = configuration.base.VERIFY_EMAIL_ADDRESSES\n\n        # Unless external authentication was used, there's no external user to\n        # connect the new Critic user to.\n        external_user_id = None\n\n        if external:\n            # Check that the external authentication token is valid.\n            if not provider.validateToken(db, external[\"account\"],\n                                          external[\"token\"]):\n                return OperationResult(\n                    message=\"Invalid external authentication state.\")\n\n            cursor.execute(\"\"\"SELECT id, uid, email\n                                FROM externalusers\n                               WHERE provider=%s\n                                 AND account=%s\"\"\",\n                           (external[\"provider\"], external[\"account\"]))\n\n            # Note: the token validation above implicitly checks that there's a\n            # matching row in the 'externalusers' table.\n            external_user_id, existing_user_id, external_email = cursor.fetchone()\n\n            # Check that we don't already have a Critic user associated with\n            # this external user.\n            if existing_user_id is not None:\n                existing_user = dbutils.User.fromId(db, existing_user_id)\n                return OperationResult(\n                    message=(\"There is already a Critic user ('%s') connected \"\n                             \"to the %s '%s'\" % (existing_user.name,\n                                                 provider.getTitle(),\n                                                 external[\"account\"])))\n\n            if email == external_email:\n                verify_email_address = provider.configuration[\"verify_email_addresses\"]\n\n            # Reset 'email' column in 
'externalusers': we only need it to detect\n            # if the user changed the email address in the \"Create user\" form.\n            # Also reset the 'token' column, which serves no further purpose\n            # beyond this point.\n            with db.updating_cursor(\"externalusers\") as cursor:\n                cursor.execute(\"\"\"UPDATE externalusers\n                                     SET email=NULL,\n                                         token=NULL\n                                   WHERE id=%s\"\"\",\n                               (external_user_id,))\n\n        email_verified = False if email and verify_email_address else None\n\n        user = dbutils.User.create(\n            db, username, fullname, email, email_verified, password,\n            external_user_id=external_user_id)\n\n        if email_verified is False:\n            sendVerificationMail(db, user)\n\n        user.sendUserCreatedMail(\"wsgi[registeruser]\", external)\n\n        auth.createSessionId(db, req, user)\n\n        return OperationResult()\n"
  },
  {
    "path": "src/operation/savesettings.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import (Operation, OperationResult, OperationFailure,\n                       OperationError, Optional, User, Repository)\n\nimport dbutils\nimport gitutils\n\nclass SaveSettings(Operation):\n    def __init__(self):\n        super(SaveSettings, self).__init__(\n            { \"subject\": Optional(User),\n              \"repository\": Optional(Repository),\n              \"filter_id\": Optional(int),\n              \"settings\": [{ \"item\": str,\n                             \"value\": Optional(set([bool, int, str])) }],\n              \"defaults\": Optional(bool) })\n\n    def process(self, db, user, settings, subject=None, repository=None, filter_id=None, defaults=False):\n        if repository is not None and filter_id is not None:\n            raise OperationError(\"invalid input: both 'repository' and 'filter_id' set\")\n\n        if (subject and subject != user) or defaults:\n            Operation.requireRole(db, \"administrator\", user)\n        else:\n            subject = user\n\n        cursor = db.cursor()\n\n        if filter_id is not None:\n            # Check that the filter exists and that it's one of the user's\n            # filters (or that the user has the administrator role.)\n            cursor.execute(\"SELECT uid FROM filters WHERE id=%s\", (filter_id,))\n        
    row = cursor.fetchone()\n            if not row:\n                raise OperationFailure(\n                    code=\"nosuchfilter\",\n                    title=\"No such filter!\",\n                    message=(\"Maybe the filter has been deleted since you \"\n                             \"loaded this page?\"))\n            elif row[0] != subject.id:\n                raise OperationFailure(\n                    code=\"invalidfilter\",\n                    title=\"The filter belongs to someone else!\",\n                    message=(\"What are you up to?\"))\n\n        saved_settings = []\n\n        for setting in settings:\n            item = setting[\"item\"]\n            value = setting.get(\"value\")\n\n            if dbutils.User.storePreference(db, item, value, subject,\n                                            repository, filter_id):\n                saved_settings.append(setting[\"item\"])\n\n        db.commit()\n\n        return OperationResult(saved_settings=sorted(saved_settings))\n"
  },
  {
    "path": "src/operation/searchreview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport urllib\n\nimport configuration\nimport dbutils\nimport gitutils\n\nfrom operation import Operation, OperationResult, OperationFailure\n\ndef globToSQLPattern(glob):\n    pattern = glob.replace(\"\\\\\", \"\\\\\\\\\").replace(\"%\", \"\\\\%\").replace(\"?\", \"_\").replace(\"*\", \"%\")\n    if \"?\" in glob or \"*\" in glob:\n        return pattern\n    return \"%\" + pattern + \"%\"\n\ndef pathToSQLRegExp(path):\n    pattern = \"\"\n    if path.startswith(\"/\"):\n        pattern += \"^\"\n        path = path.lstrip(\"/\")\n    escaped = re.sub(r\"[(){}\\[\\].\\\\+^$]\", lambda match: \"\\\\\" + match.group(), path)\n    replacements = { \"**/\": \"(?:[^/]+/)*\", \"*\": \"[^/]*\", \"?\": \".\" }\n    pattern += re.sub(\"\\*\\*/|\\*|\\?\", lambda match: replacements[match.group()], escaped)\n    return pattern\n\nclass Query(object):\n    def __init__(self, parent=None):\n        if not parent:\n            self.tables = { \"reviews\": set() }\n            self.arguments = []\n        else:\n            self.tables = parent.tables\n            self.arguments = parent.arguments\n        self.conditions = []\n\n    def addTable(self, table, *conditions):\n        self.tables.setdefault(table, set()).update(conditions)\n\nclass Review(object):\n    def __init__(self, review_id, 
summary):\n        self.review_id = review_id\n        self.summary = summary\n    def json(self):\n        return { \"id\": self.review_id, \"summary\": self.summary }\n\nclass InvalidFilter(Exception):\n    def __init__(self, title, message):\n        self.title = title\n        self.message = message\n\nclass Filter(object):\n    def __init__(self, db, value):\n        self.db = db\n        self.value = value\n        self.check(db)\n    def check(self, db):\n        pass\n    def filter(self, db, review):\n        return True\n\nclass SummaryFilter(Filter):\n    def contribute(self, query):\n        query.conditions.append(\"reviews.summary LIKE %s\")\n        query.arguments.append(globToSQLPattern(self.value))\n\nclass DescriptionFilter(Filter):\n    def contribute(self, query):\n        query.conditions.append(\"reviews.description LIKE %s\")\n        query.arguments.append(globToSQLPattern(self.value))\n\nclass BranchFilter(Filter):\n    def contribute(self, query):\n        query.addTable(\"branches\", \"branches.id=reviews.branch\")\n        query.conditions.append(\"branches.name ~ %s\")\n        query.arguments.append(pathToSQLRegExp(self.value))\n\nclass PathFilter(Filter):\n    def contribute(self, query):\n        query.addTable(\"reviewfiles\", \"reviewfiles.review=reviews.id\")\n        query.addTable(\"files\", \"files.id=reviewfiles.file\")\n\n        if configuration.database.DRIVER == \"postgresql\":\n            # This is just an optimization; with PostgreSQL we have an index\n            # that avoids matching the pattern against most paths.\n\n            static_components = []\n            for component in self.value.split(\"/\"):\n                if component and not (\"*\" in component or \"?\" in component):\n                    static_components.append(component)\n\n            if static_components:\n                query.conditions.append(\"%s <@ STRING_TO_ARRAY(path, '/')\")\n                
query.arguments.append(static_components)\n\n        query.conditions.append(\"files.path ~ %s\")\n        query.arguments.append(pathToSQLRegExp(self.value))\n\nclass UserFilter(Filter):\n    def check(self, db):\n        self.user = dbutils.User.fromName(db, self.value)\n    def contribute(self, query):\n        query.addTable(\"reviewusers\", \"reviewusers.review=reviews.id\")\n        query.conditions.append(\"reviewusers.uid=%s\")\n        query.arguments.append(self.user.id)\n\nclass OwnerFilter(UserFilter):\n    def contribute(self, query):\n        super(OwnerFilter, self).contribute(query)\n        query.conditions.append(\"reviewusers.owner\")\n\nclass ReviewerFilter(UserFilter):\n    def contribute(self, query):\n        query.addTable(\"reviewfiles\", \"reviewfiles.review=reviews.id\")\n        query.addTable(\"reviewuserfiles\", \"reviewuserfiles.file=reviewfiles.id\")\n        query.conditions.append(\"reviewuserfiles.uid=%s\")\n        query.arguments.append(self.user.id)\n\nclass StateFilter(Filter):\n    def check(self, db):\n        if self.value not in (\"open\", \"pending\", \"accepted\", \"closed\", \"dropped\"):\n            raise InvalidFilter(\n                title=\"Invalid review state: %r\" % self.value,\n                message=(\"Supported review states are open, pending, accepted, \"\n                         \"closed and dropped.\"))\n    def contribute(self, query):\n        state = \"open\" if self.value in (\"pending\", \"accepted\") else self.value\n        query.conditions.append(\"reviews.state=%s\")\n        query.arguments.append(state)\n    def filter(self, db, review):\n        if self.value == \"pending\":\n            return not dbutils.Review.isAccepted(db, review.review_id)\n        elif self.value == \"accepted\":\n            return dbutils.Review.isAccepted(db, review.review_id)\n        return True\n\nclass RepositoryFilter(Filter):\n    def check(self, db):\n        cursor = db.cursor()\n        
cursor.execute(\"SELECT id FROM repositories WHERE name=%s\", (self.value,))\n        row = cursor.fetchone()\n        if not row:\n            raise gitutils.NoSuchRepository(self.value)\n        self.repository_id = row[0]\n    def contribute(self, query):\n        query.addTable(\"branches\", \"branches.id=reviews.branch\")\n        query.conditions.append(\"branches.repository=%s\")\n        query.arguments.append(self.repository_id)\n\nclass OrFilter(Filter):\n    def __init__(self, filters):\n        self.filters = filters\n    def contribute(self, query):\n        conditions = []\n        for search_filter in self.filters:\n            subquery = Query(query)\n            search_filter.contribute(subquery)\n            conditions.append(\"(%s)\" % \" AND \".join(subquery.conditions))\n        query.conditions.append(\"(%s)\" % \" OR \".join(conditions))\n\nclass SearchReview(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"query\": str }, accept_anonymous_user=True)\n\n    def process(self, db, user, query):\n        terms = re.findall(\"\"\"((?:\"[^\"]*\"|'[^']*'|[^ \"']+)+)\"\"\", query)\n        url_terms = []\n        filters = []\n\n        for term in terms:\n            if re.match(\"[a-z\\-]+:\", term):\n                keyword, _, value = term.partition(\":\")\n\n                url_terms.append((\"q\" + keyword, value))\n\n                if keyword == \"summary\":\n                    filter_classes = [SummaryFilter]\n                elif keyword == \"description\":\n                    filter_classes = [DescriptionFilter]\n                elif keyword == \"text\":\n                    filter_classes = [SummaryFilter, DescriptionFilter]\n                elif keyword in (\"branch\", \"b\"):\n                    filter_classes = [BranchFilter]\n                elif keyword in (\"path\", \"p\"):\n                    filter_classes = [PathFilter]\n                elif keyword in (\"user\", \"u\"):\n                    
filter_classes = [UserFilter]\n                elif keyword in (\"owner\", \"o\"):\n                    filter_classes = [OwnerFilter]\n                elif keyword == \"reviewer\":\n                    filter_classes = [ReviewerFilter]\n                elif keyword == \"owner-or-reviewer\":\n                    filter_classes = [OwnerFilter, ReviewerFilter]\n                elif keyword in (\"state\", \"s\"):\n                    filter_classes = [StateFilter]\n                elif keyword in (\"repository\", \"repo\", \"r\"):\n                    filter_classes = [RepositoryFilter]\n                else:\n                    raise OperationFailure(\n                        code=\"invalidkeyword\",\n                        title=\"Invalid keyword: %r\" % keyword,\n                        message=(\"Supported keywords are summary, description, text, \"\n                                 \"branch, path, user, owner, reviewer, \"\n                                 \"owner-or-reviewer, state and repository.\"))\n\n                if re.match(\"([\\\"']).*\\\\1$\", value):\n                    value = value[1:-1]\n\n                try:\n                    if len(filter_classes) > 1:\n                        keyword_filters = [filter_class(db, value)\n                                           for filter_class in filter_classes]\n                        filters.append(OrFilter(keyword_filters))\n                    else:\n                        filters.append(filter_classes[0](db, value))\n\n                except InvalidFilter as error:\n                    raise OperationFailure(\n                        code=\"invalidterm\",\n                        title=error.title,\n                        message=error.message)\n                except dbutils.NoSuchUser as error:\n                    raise OperationFailure(\n                        code=\"invalidterm\",\n                        title=\"No such user: %r\" % error.name,\n                        message=(\"The search term following %r must be a valid user name.\"\n                           
      % (keyword + \":\")))\n                except gitutils.NoSuchRepository as error:\n                    raise OperationFailure(\n                        code=\"invalidterm\",\n                        title=\"No such repository: %r\" % error.name,\n                        message=(\"The search term following %r must be a valid repository name.\"\n                                 % (keyword + \":\")))\n            else:\n                url_terms.append((\"q\", term))\n\n                if re.match(\"([\\\"']).*\\\\1$\", term):\n                    term = term[1:-1]\n\n                auto_filters = []\n                auto_filters.append(SummaryFilter(db, term))\n                auto_filters.append(DescriptionFilter(db, term))\n                if not re.search(r\"\\s\", term):\n                    auto_filters.append(BranchFilter(db, term))\n                    if re.search(r\"\\w/\\w|\\w\\.\\w+$\", term):\n                        auto_filters.append(PathFilter(db, term))\n\n                filters.append(OrFilter(auto_filters))\n\n        if not filters:\n            raise OperationFailure(\n                code=\"nofilters\",\n                title=\"No search filter specified\",\n                message=\"Your search would find all reviews.  
Please restrict it a bit.\")\n\n        query_params = Query()\n\n        for search_filter in filters:\n            search_filter.contribute(query_params)\n\n        query_string = \"\"\"SELECT DISTINCT reviews.id, reviews.summary\n                            FROM %s\n                           WHERE %s\n                        ORDER BY reviews.id DESC\"\"\"\n\n        for conditions in query_params.tables.values():\n            query_params.conditions[0:0] = conditions\n\n        query = query_string % (\", \".join(query_params.tables.keys()),\n                                \" AND \".join(query_params.conditions))\n\n        cursor = db.cursor()\n        cursor.execute(query, query_params.arguments)\n\n        reviews = [Review(review_id, summary) for review_id, summary in cursor]\n\n        for search_filter in filters:\n            reviews = filter(lambda review: search_filter.filter(db, review), reviews)\n\n        return OperationResult(\n            reviews=map(Review.json, reviews),\n            query_string=urllib.urlencode(url_terms))\n"
  },
  {
    "path": "src/operation/servicemanager.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import Operation, OperationResult, Optional, OperationError\n\nimport configuration\nimport textutils\n\nimport os\nimport socket\nimport signal\n\nclass RestartService(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"service_name\": str })\n\n    def process(self, db, user, service_name):\n        Operation.requireRole(db, \"administrator\", user)\n\n        if service_name == \"wsgi\":\n            for pid in os.listdir(configuration.paths.WSGI_PIDFILE_DIR):\n                try: os.kill(int(pid), signal.SIGINT)\n                except: pass\n            return OperationResult()\n        else:\n            connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n            connection.connect(configuration.services.SERVICEMANAGER[\"address\"])\n            connection.send(textutils.json_encode({ \"command\": \"restart\", \"service\": service_name }))\n            connection.shutdown(socket.SHUT_WR)\n\n            data = \"\"\n            while True:\n                received = connection.recv(4096)\n                if not received: break\n                data += received\n\n            result = textutils.json_decode(data)\n\n            if result[\"status\"] == \"ok\": return OperationResult()\n            else: raise 
OperationError(result[\"error\"])\n\nclass GetServiceLog(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"service_name\": str, \"lines\": Optional(int) })\n\n    def process(self, db, user, service_name, lines=40):\n        logfile_paths = { \"manager\": configuration.services.SERVICEMANAGER[\"logfile_path\"] }\n\n        for service in configuration.services.SERVICEMANAGER[\"services\"]:\n            logfile_paths[service[\"name\"]] = service[\"logfile_path\"]\n\n        logfile_path = logfile_paths.get(service_name)\n\n        if not logfile_path:\n            raise OperationError(\"unknown service: %s\" % service_name)\n\n        try: logfile = open(logfile_path)\n        except OSError as error:\n            raise OperationError(\"failed to open logfile: %s\" % error.message)\n\n        return OperationResult(lines=logfile.read().splitlines()[-lines:])\n"
  },
  {
    "path": "src/operation/trackedbranch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import Operation, OperationResult, OperationFailure, OperationError, Optional\n\nimport dbutils\nimport gitutils\nimport htmlutils\nimport configuration\n\nimport calendar\nimport os\nimport signal\n\ndef getTrackedBranchReviewState(db, branch_id):\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT reviews.state\n                        FROM reviews\n                        JOIN branches ON (branches.id=reviews.branch)\n                        JOIN trackedbranches ON (trackedbranches.repository=branches.repository\n                                             AND trackedbranches.local_name=branches.name)\n                       WHERE trackedbranches.id=%s\"\"\",\n                   (branch_id,))\n\n    row = cursor.fetchone()\n    return row[0] if row else None\n\nclass TrackedBranchLog(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"branch_id\": int })\n\n    def process(self, db, user, branch_id):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT previous, next\n                            FROM trackedbranches\n                           WHERE id=%s\"\"\",\n                       (branch_id,))\n\n        previous, next = cursor.fetchone()\n        previous = calendar.timegm(previous.utctimetuple()) if previous else None\n        next 
= calendar.timegm(next.utctimetuple()) if next else None\n\n        cursor.execute(\"\"\"SELECT time, from_sha1, to_sha1, hook_output, successful\n                            FROM trackedbranchlog\n                           WHERE branch=%s\n                        ORDER BY time ASC\"\"\",\n                       (branch_id,))\n\n        items = []\n\n        for update_time, from_sha1, to_sha1, hook_output, successful in cursor:\n            items.append({ \"time\": calendar.timegm(update_time.utctimetuple()),\n                           \"from_sha1\": from_sha1,\n                           \"to_sha1\": to_sha1,\n                           \"hook_output\": hook_output,\n                           \"successful\": successful })\n\n        cursor.execute(\"\"\"SELECT repository\n                            FROM trackedbranches\n                           WHERE id=%s\"\"\",\n                       (branch_id,))\n\n        repository = gitutils.Repository.fromId(db, cursor.fetchone()[0])\n\n        return OperationResult(previous=previous,\n                               next=next,\n                               items=items,\n                               repository={ \"id\": repository.id, \"name\": repository.name })\n\nclass DisableTrackedBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"branch_id\": int })\n\n    def process(self, db, user, branch_id):\n        cursor = db.cursor()\n\n        if not user.hasRole(db, \"administrator\"):\n            cursor.execute(\"\"\"SELECT 1\n                                FROM trackedbranchusers\n                               WHERE branch=%s\n                                 AND uid=%s\"\"\",\n                           (branch_id, user.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(code=\"notallowed\",\n                                       title=\"Not allowed!\",\n                                       message=\"Operation not permitted.\")\n\n        
cursor.execute(\"\"\"UPDATE trackedbranches\n                             SET disabled=TRUE\n                           WHERE id=%s\"\"\",\n                       (branch_id,))\n\n        db.commit()\n\n        return OperationResult()\n\nclass TriggerTrackedBranchUpdate(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"branch_id\": int })\n\n    def process(self, db, user, branch_id):\n        cursor = db.cursor()\n\n        if not user.hasRole(db, \"administrator\"):\n            cursor.execute(\"\"\"SELECT 1\n                                FROM trackedbranchusers\n                               WHERE branch=%s\n                                 AND uid=%s\"\"\",\n                           (branch_id, user.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(code=\"notallowed\",\n                                       title=\"Not allowed!\",\n                                       message=\"Operation not permitted.\")\n\n        review_state = getTrackedBranchReviewState(db, branch_id)\n        if review_state is not None and review_state != \"open\":\n            raise OperationFailure(code=\"reviewnotopen\",\n                                   title=\"The review is not open!\",\n                                   message=\"You need to reopen the review before new commits can be added to it.\")\n\n        cursor.execute(\"\"\"UPDATE trackedbranches\n                             SET next=NULL\n                           WHERE id=%s\"\"\",\n                       (branch_id,))\n\n        db.commit()\n\n        pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n        os.kill(pid, signal.SIGHUP)\n\n        return OperationResult()\n\nclass EnableTrackedBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"branch_id\": int,\n                                   \"new_remote_name\": Optional(str) })\n\n    def process(self, db, user, branch_id, 
new_remote_name=None):\n        cursor = db.cursor()\n\n        if not user.hasRole(db, \"administrator\"):\n            cursor.execute(\"\"\"SELECT 1\n                                FROM trackedbranchusers\n                               WHERE branch=%s\n                                 AND uid=%s\"\"\",\n                           (branch_id, user.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(code=\"notallowed\",\n                                       title=\"Not allowed!\",\n                                       message=\"Operation not permitted.\")\n\n        review_state = getTrackedBranchReviewState(db, branch_id)\n        if review_state is not None and review_state != \"open\":\n            raise OperationFailure(code=\"reviewnotopen\",\n                                   title=\"The review is not open!\",\n                                   message=\"You need to reopen the review before new commits can be added to it.\")\n\n        if new_remote_name is not None:\n            cursor.execute(\"\"\"SELECT remote\n                                FROM trackedbranches\n                               WHERE id=%s\"\"\",\n                           (branch_id,))\n\n            remote = cursor.fetchone()[0]\n\n            if not gitutils.Repository.lsremote(remote, pattern=\"refs/heads/\" + new_remote_name):\n                raise OperationFailure(\n                    code=\"refnotfound\",\n                    title=\"Remote ref not found!\",\n                    message=(\"Could not find the ref <code>%s</code> in the repository <code>%s</code>.\"\n                             % (htmlutils.htmlify(\"refs/heads/\" + new_remote_name),\n                                htmlutils.htmlify(remote))),\n                    is_html=True)\n\n            cursor.execute(\"\"\"UPDATE trackedbranches\n                                 SET remote_name=%s,\n                                     disabled=FALSE,\n                             
        next=NULL\n                               WHERE id=%s\"\"\",\n                           (new_remote_name, branch_id))\n        else:\n            cursor.execute(\"\"\"UPDATE trackedbranches\n                                 SET disabled=FALSE,\n                                     next=NULL\n                               WHERE id=%s\"\"\",\n                           (branch_id,))\n\n        db.commit()\n\n        pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n        os.kill(pid, signal.SIGHUP)\n\n        return OperationResult()\n\nclass DeleteTrackedBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"branch_id\": int })\n\n    def process(self, db, user, branch_id):\n        cursor = db.cursor()\n\n        if not user.hasRole(db, \"administrator\"):\n            cursor.execute(\"\"\"SELECT 1\n                                FROM trackedbranchusers\n                               WHERE branch=%s\n                                 AND uid=%s\"\"\",\n                           (branch_id, user.id))\n\n            if not cursor.fetchone():\n                raise OperationFailure(code=\"notallowed\",\n                                       title=\"Not allowed!\",\n                                       message=\"Operation not permitted.\")\n\n        cursor.execute(\"\"\"DELETE FROM trackedbranches\n                                WHERE id=%s\"\"\",\n                       (branch_id,))\n\n        db.commit()\n\n        return OperationResult()\n\nclass AddTrackedBranch(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"repository_id\": int,\n                                   \"source_location\": str,\n                                   \"source_name\": str,\n                                   \"target_name\": str,\n                                   \"users\": [str],\n                                   \"forced\": Optional(bool) })\n\n    def process(self, db, 
user, repository_id, source_location, source_name,\n                target_name, users, forced=None):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT 1\n                            FROM trackedbranches\n                           WHERE repository=%s\n                             AND local_name=%s\"\"\",\n                       (repository_id, target_name))\n\n        if cursor.fetchone():\n            raise OperationError(\"branch '%s' already tracks another branch\" % target_name)\n\n        users = [dbutils.User.fromName(db, username) for username in users]\n\n        if target_name.startswith(\"r/\"):\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviews\n                                JOIN branches ON (branches.id=reviews.branch)\n                               WHERE branches.repository=%s\n                                 AND branches.name=%s\"\"\",\n                           (repository_id, target_name))\n\n            if not cursor.fetchone():\n                raise OperationError(\"non-existing review branch can't track another branch\")\n\n            if forced is None:\n                forced = True\n        elif forced is None:\n            forced = False\n\n        cursor.execute(\"\"\"SELECT 1\n                            FROM knownremotes\n                           WHERE url=%s\n                             AND pushing\"\"\",\n                       (source_location,))\n\n        if cursor.fetchone():\n            delay = \"1 week\"\n        else:\n            delay = \"1 hour\"\n\n        cursor.execute(\"\"\"INSERT INTO trackedbranches (repository, local_name, remote, remote_name, forced, delay)\n                               VALUES (%s, %s, %s, %s, %s, INTERVAL %s)\n                            RETURNING id\"\"\",\n                       (repository_id, target_name, source_location, source_name, forced, delay))\n\n        branch_id = cursor.fetchone()[0]\n\n        for user in users:\n        
    cursor.execute(\"\"\"INSERT INTO trackedbranchusers (branch, uid)\n                                   VALUES (%s, %s)\"\"\",\n                           (branch_id, user.id))\n\n        db.commit()\n\n        pid = int(open(configuration.services.BRANCHTRACKER[\"pidfile_path\"]).read().strip())\n        os.kill(pid, signal.SIGHUP)\n\n        return OperationResult(branch_id=branch_id)\n"
  },
  {
    "path": "src/operation/typechecker.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport base\nimport dbutils\n\nfrom operation.basictypes import OperationError, OperationFailure\n\nclass Optional:\n\n    \"\"\"Utility class for signaling that a dictionary member is optional.\"\"\"\n\n    def __init__(self, source):\n        self.source = source\n\nclass TypeCheckerContext(object):\n    def __init__(self, *args):\n        self.args = args\n        self.req, self.db, self.user = args\n        self.repository = None\n        self.review = None\n        self.__path = [\"data\"]\n    def __str__(self):\n        return \"\".join(self.__path)\n    def push(self, name):\n        self.__path.append(name)\n    def pop(self):\n        self.__path.pop()\n    def clone(self):\n        copy = TypeCheckerContext(*self.args)\n        copy.copy_from(self)\n        copy.__path = self.__path[:]\n        return copy\n    def copy_from(self, other):\n        self.repository = other.repository\n        self.review = other.review\n\nclass TypeChecker(object):\n\n    \"\"\"\n    Interface for checking operation input type correctness.\n\n    Sub-classes implement the method __call__(value, context) which raises an\n    OperationError if the input is incorrect.\n\n    A type checker structure is created using the static make() function.\n    \"\"\"\n\n    @staticmethod\n    def 
make(source):\n        \"\"\"\n        Construct a structure of TypeChecker objects.\n\n        The source argument should be a dict object, single-element list object,\n        a set object containing strings, or str, int or bool (the actual type\n        objects, not a string, integer or boolean value).\n\n        If the source argument is a dict object, per-element type checkers are\n        constructed by calling this function on the value of each item in the\n        dictionary.  See DictionaryChecker for details.\n\n        If the source argument is a list object, a per-element type checker is\n        constructed by calling this function on the value of the single element\n        in the list.\n\n        If the source argument is a set object, all elements in it should be\n        strings, and the constructed checker verifies that the value is a string\n        that is a member of the set.\n\n        Otherwise the constructed checker verifies that the value is of the type\n        of the source argument (or, in the case of source=str, that the value's\n        type is either str or unicode).\n        \"\"\"\n\n        if isinstance(source, TypeChecker):\n            return source\n        elif isinstance(source, dict):\n            if (len(source) == 1 and\n                all(key is str for key in source.keys()) and\n                all(isinstance(value, type) for value in source.values())):\n                return GenericDictionaryChecker(source)\n            return DictionaryChecker(source)\n        elif isinstance(source, list):\n            return ArrayChecker(source)\n        elif isinstance(source, set):\n            if all(type(x) is str for x in source):\n                return EnumerationChecker(source)\n            return VariantChecker(source)\n        elif source is str:\n            return StringChecker()\n        elif source is int:\n            return IntegerChecker()\n        elif source is bool:\n            return BooleanChecker()\n\n      
  try:\n            is_type_checker = issubclass(source, TypeChecker)\n        except TypeError:\n            pass\n        else:\n            if is_type_checker:\n                return source()\n\n        raise base.ImplementationError(\"invalid source type: %r\" % source)\n\n    def getSuffixedCheckers(self):\n        \"\"\"\n        Return a list of (suffix, checker) tuples.\n\n        A suffixed checker allows a parameter to be specified with an optional\n        suffix, that (optionally) restricts the acceptable input values.  For\n        instance, a checker that supports either an integer id or a string name\n        could return a list\n\n          [(\"id\", id_checker), (\"name\", name_checker)]\n\n        with the effect that the input can be one of\n\n          { \"thing\": id-or-name }\n          { \"thing_id\": id }\n          { \"thing_name\": name }\n\n        and the resulting parameter name \"thing\" in either case.\n\n        A checker can also return [(\"suffix\", None)] to signal that a suffix is\n        supported, but that the same checker (not a restricted one) should be\n        applied regardless.\n        \"\"\"\n        return []\n\nclass Implicit(object):\n\n    \"\"\"\n    Mix-in class for implicit parameters.\n\n    An implicit parameter is one we don't expect to receive from the client at\n    all, but rather already have.  The use-case is for an operation to request\n    additional information passed to it, for instance a reference to the Request\n    object.\n    \"\"\"\n\n    pass\n\nclass Prioritized(object):\n\n    \"\"\"\n    Mix-in class for prioritized parameters.\n\n    A prioritized parameter is one that stores information in the context that\n    might be required for the checkers of other parameters, and thus need to be\n    processed first.  
DictionaryChecker uses this to control the order in which\n    it processes dictionary items.\n    \"\"\"\n\n    pass\n\nclass Request(TypeChecker, Implicit):\n    def __call__(self, value, context):\n        assert value is None\n        return context.req\n\nclass BooleanChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for booleans.\n\n    Raises an OperationError if the checked value is not a boolean.\n    \"\"\"\n\n    def __call__(self, value, context):\n        if not isinstance(value, bool):\n            raise OperationError(\"invalid input: %s is not a boolean\" % context)\n\nclass StringChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for strings.\n\n    Raises an OperationError if the checked value is not a string.\n    \"\"\"\n\n    def __call__(self, value, context):\n        if not isinstance(value, basestring):\n            raise OperationError(\"invalid input: %s is not a string\" % context)\n\nclass RestrictedString(StringChecker):\n\n    \"\"\"\n    Type checker for restricted strings.\n\n    A restricted string is one that may consist only of certain characters,\n    and/or must be of a certain min/max length.\n\n    Raises an OperationFailure if the checked value is not valid.\n    \"\"\"\n\n    def __init__(self, allowed=None, minlength=None, maxlength=None, ui_name=None):\n        self.allowed = allowed\n        self.minlength = minlength\n        self.maxlength = maxlength\n        self.ui_name = ui_name\n\n    def __call__(self, value, context):\n        super(RestrictedString, self).__call__(value, context)\n        if self.ui_name:\n            ui_name = self.ui_name\n        else:\n            ui_name = context\n        if self.minlength is not None \\\n                and len(value) < self.minlength:\n            raise OperationFailure(\n                code=\"paramtooshort:%s\" % context,\n                title=\"Invalid %s\" % ui_name,\n                message=(\"invalid input: %s must be at least %d characters long\"\n        
                 % (ui_name, self.minlength)))\n        if self.maxlength is not None \\\n                and len(value) > self.maxlength:\n            raise OperationFailure(\n                code=\"paramtoolong:%s\" % context,\n                title=\"Invalid %s\" % ui_name,\n                message=(\"invalid input: %s must be at most %d characters long\"\n                         % (ui_name, self.maxlength)))\n        if self.allowed:\n            disallowed = [ch for ch in sorted(set(value))\n                          if not self.allowed(ch)]\n            if disallowed:\n                raise OperationFailure(\n                    code=\"paramcontainsillegalchar:%s\" % context,\n                    title=\"Invalid %s\" % ui_name,\n                    message=(\"invalid input: %s may not contain the character%s %s\"\n                             % (ui_name, \"s\" if len(disallowed) > 1 else \"\",\n                                \", \".join(repr(ch) for ch in disallowed))))\n\nclass SHA1(RestrictedString):\n    def __init__(self):\n        super(SHA1, self).__init__(minlength=4,\n                                   maxlength=40,\n                                   allowed=re.compile(\"[0-9A-Fa-f]$\").match)\n\nclass IntegerChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for integers.\n\n    Raises an OperationError if the checked value is not an integer.\n    \"\"\"\n\n    def __call__(self, value, context):\n        if not isinstance(value, int) or isinstance(value, bool):\n            raise OperationError(\"invalid input: %s is not an integer\" % context)\n\nclass RestrictedInteger(IntegerChecker):\n    def __init__(self, minvalue=None, maxvalue=None, ui_name=None):\n        self.minvalue = minvalue\n        self.maxvalue = maxvalue\n        self.ui_name = ui_name\n\n    def __call__(self, value, context):\n        super(RestrictedInteger, self).__call__(value, context)\n        if self.ui_name:\n            ui_name = self.ui_name\n        else:\n        
    ui_name = context\n        if self.minvalue is not None \\\n                and value < self.minvalue:\n            raise OperationFailure(\n                code=\"valuetoolow:%s\" % context,\n                title=\"Invalid %s parameter\" % ui_name,\n                message=(\"invalid input: %s must be %d or higher\"\n                         % (ui_name, self.minvalue)))\n        if self.maxvalue is not None \\\n                and value > self.maxvalue:\n            raise OperationFailure(\n                code=\"valuetoohigh:%s\" % context,\n                title=\"Invalid %s parameter\" % ui_name,\n                message=(\"invalid input: %s must be %d or lower\"\n                         % (ui_name, self.maxvalue)))\n\nclass NonNegativeInteger(RestrictedInteger):\n    def __init__(self):\n        super(NonNegativeInteger, self).__init__(minvalue=0)\n\nclass PositiveInteger(RestrictedInteger):\n    def __init__(self):\n        super(PositiveInteger, self).__init__(minvalue=1)\n\ndef check(checker, value, context):\n    \"\"\"Apply checker and return converted, or original, value.\"\"\"\n    converted = checker(value, context)\n    if converted is None:\n        return value\n    return converted\n\nclass GenericDictionaryChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for generic dictionary objects.\n\n    Simply checks that every key and every value is of the appropriate type.\n    \"\"\"\n\n    def __init__(self, source):\n        assert len(source) == 1\n        self.__key_checker = TypeChecker.make(source.keys()[0])\n        self.__value_checker = TypeChecker.make(source.values()[0])\n\n    def __call__(self, value, context):\n        if not type(value) is dict:\n            raise OperationError(\"invalid input: %s is not a dictionary\" % context)\n        for key in value:\n            context.push(\".\" + key)\n            self.__key_checker(key, context)\n            value[key] = check(self.__value_checker, value[key], context)\n            
context.pop()\n\nclass DictionaryChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for dictionary objects.\n\n    Checks two sets of members: required and optional.  Raises an OperationError\n    if the checked value is not a dictionary or if any required member is not\n    present in it, or if it contains any unexpected members.  Applies\n    per-element checkers on all required members and on all present optional\n    members.\n    \"\"\"\n\n    def __init__(self, source):\n        self.__implicit = []\n        self.__prioritized = []\n        self.__required = []\n        self.__optional = []\n        self.__expected = set()\n\n        for name, source_type in source.items():\n            if isinstance(source_type, Optional):\n                checker = TypeChecker.make(source_type.source)\n                if isinstance(checker, Implicit):\n                    raise base.ImplementationError(\n                        \"implicit parameter cannot be optional: %s\"\n                        % name)\n                self.__optional.append((name, checker))\n            else:\n                checker = TypeChecker.make(source_type)\n                if isinstance(checker, Implicit):\n                    self.__implicit.append((name, checker))\n                elif isinstance(checker, Prioritized):\n                    self.__prioritized.append((name, checker))\n                else:\n                    self.__required.append((name, checker))\n            for suffix, _ in checker.getSuffixedCheckers():\n                if name.endswith(\"_\" + suffix):\n                    raise base.ImplementationError(\n                        \"invalid parameter name: %s (includes optional suffix)\"\n                        % name)\n\n    def __call__(self, value, context):\n        if not type(value) is dict:\n            raise OperationError(\"invalid input: %s is not a dictionary\" % context)\n\n        specified_names = set(value.keys())\n\n        class Missing:\n            
pass\n\n        def read_with_suffixes(name, checker):\n            try:\n                if name in value:\n                    specified_names.remove(name)\n                    context.push(\".\" + name)\n                    return check(checker, value[name], context)\n                for suffix, suffixed_checker in checker.getSuffixedCheckers():\n                    suffixed_name = \"%s_%s\" % (name, suffix)\n                    if suffixed_name in value:\n                        specified_names.remove(suffixed_name)\n                        context.push(\".\" + suffixed_name)\n                        if suffixed_checker is not None:\n                            checker = suffixed_checker\n                        return check(checker, value.pop(suffixed_name), context)\n                context.push(\".\" + name)\n                return Missing\n            finally:\n                context.pop()\n\n        for name, checker in self.__implicit:\n            context.push(\".\" + name)\n            if name in value:\n                raise OperationError(\n                    \"invalid input: %s should not be specified\" % context)\n            value[name] = checker(None, context)\n            context.pop()\n\n        def process_members(items, required):\n            for name, checker in items:\n                converted = read_with_suffixes(name, checker)\n                if not converted is Missing:\n                    value[name] = converted\n                elif required:\n                    context.push(\".\" + name)\n                    raise OperationError(\"invalid input: %s missing\" % context)\n\n        process_members(self.__prioritized, True)\n        process_members(self.__required, True)\n        process_members(self.__optional, False)\n\n        if specified_names:\n            context.push(\".\" + specified_names.pop())\n            raise OperationError(\"invalid input: %s was not used\" % context)\n\nclass ArrayChecker(TypeChecker):\n\n    
\"\"\"\n    Type checker for arrays.\n\n    Raises an OperationError if the checked value is not an array.  Applies the\n    per-element checker on each element in the array.\n    \"\"\"\n\n    def __init__(self, source):\n        if len(source) != 1:\n            raise base.ImplementationError(\"invalid source type\")\n        self.__checker = TypeChecker.make(source[0])\n\n    def __call__(self, value, context):\n        if not type(value) is list:\n            raise OperationError(\"%s is not a list\" % context)\n        for index, item in enumerate(value):\n            context.push(\"[%d]\" % index)\n            value[index] = check(self.__checker, item, context)\n            context.pop()\n\nclass VariantChecker(TypeChecker):\n\n    \"\"\"\n    Type checker for variants (values of one of a set of types.)\n\n    Raises an OperationError if the checked value is not one of the permitted\n    types (checked by applying a per-type checker on the value.)\n    \"\"\"\n\n    def __init__(self, source):\n        self.__checkers = []\n        self.__suffixed_checkers = []\n        if isinstance(source, dict):\n            for suffix, item in source.items():\n                checker = TypeChecker.make(item)\n                self.__checkers.append(checker)\n                self.__suffixed_checkers.append((suffix, checker))\n        else:\n            self.__checkers.extend(TypeChecker.make(item) for item in source)\n\n    def __call__(self, value, context):\n        for checker in self.__checkers:\n            try:\n                variant_context = context.clone()\n                value = checker(value, variant_context)\n                context.copy_from(variant_context)\n                return value\n            except (OperationError, OperationFailure):\n                pass\n        raise OperationError(\"%s is of invalid type\" % context)\n\n    def getSuffixedCheckers(self):\n        return self.__suffixed_checkers\n\nclass EnumerationChecker(TypeChecker):\n\n    
\"\"\"\n    Type checker for enumerations.\n\n    Raises an OperationError if the checked value is not a string or if the\n    string value is not a member of the enumeration.\n    \"\"\"\n\n    def __init__(self, source):\n        self.__checker = TypeChecker.make(str)\n        for item in source:\n            if not type(item) is str:\n                raise base.ImplementationError(\"invalid source type\")\n        self.__enumeration = source\n\n    def __call__(self, value, context):\n        self.__checker(value, context)\n        if value not in self.__enumeration:\n            raise OperationError(\"invalid input: %s is not valid\" % context)\n\nclass Review(PositiveInteger, Prioritized):\n    def __call__(self, value, context):\n        super(Review, self).__call__(value, context)\n        context.review = dbutils.Review.fromId(context.db, value)\n        context.repository = context.review.repository\n        return context.review\n\n    def getSuffixedCheckers(self):\n        return [(\"id\", None)]\n\nclass RepositoryId(PositiveInteger):\n    def __call__(self, value, context):\n        import gitutils\n        super(RepositoryId, self).__call__(value, context)\n        context.repository = gitutils.Repository.fromId(context.db, value)\n        return context.repository\n\nclass RepositoryName(StringChecker):\n    def __call__(self, value, context):\n        import gitutils\n        super(RepositoryName, self).__call__(value, context)\n        context.repository = gitutils.Repository.fromName(context.db, value)\n        return context.repository\n\nclass Repository(VariantChecker, Prioritized):\n    def __init__(self):\n        super(Repository, self).__init__({ \"id\": RepositoryId,\n                                           \"name\": RepositoryName })\n\nclass CommitId(PositiveInteger):\n    def __call__(self, value, context):\n        import gitutils\n        if context.repository is None:\n            raise OperationError(\"missing repository in 
context\")\n        super(CommitId, self).__call__(value, context)\n        return gitutils.Commit.fromId(context.db, context.repository, value)\n\nclass CommitSHA1(SHA1):\n    def __call__(self, value, context):\n        import gitutils\n        if context.repository is None:\n            raise OperationError(\"missing repository in context\")\n        super(CommitSHA1, self).__call__(value, context)\n        return gitutils.Commit.fromSHA1(context.db, context.repository, value)\n\nclass Commit(VariantChecker):\n    def __init__(self):\n        super(Commit, self).__init__({ \"id\": CommitId,\n                                       \"sha1\": CommitSHA1 })\n\n    def __call__(self, value, context):\n        if context.repository is None:\n            raise OperationError(\"missing repository in context\")\n        return super(Commit, self).__call__(value, context)\n\nclass FileId(PositiveInteger):\n    def __call__(self, value, context):\n        super(FileId, self).__call__(value, context)\n        return dbutils.File.fromId(context.db, value)\n\nclass FilePath(StringChecker):\n    def __call__(self, value, context):\n        super(FilePath, self).__call__(value, context)\n        return dbutils.File.fromPath(context.db, value, insert=False)\n\nclass File(VariantChecker):\n    def __init__(self):\n        super(File, self).__init__({ \"id\": FileId,\n                                     \"path\": FilePath })\n\nclass UserId(PositiveInteger):\n    def __call__(self, value, context):\n        super(UserId, self).__call__(value, context)\n        return dbutils.User.fromId(context.db, value)\n\nclass UserName(StringChecker):\n    def __call__(self, value, context):\n        super(UserName, self).__call__(value, context)\n        return dbutils.User.fromName(context.db, value)\n\nclass User(VariantChecker):\n    def __init__(self):\n        super(User, self).__init__({ \"id\": UserId,\n                                     \"name\": UserName })\n\nclass 
ExtensionId(PositiveInteger):\n    def __call__(self, value, context):\n        from extensions.extension import Extension\n        super(ExtensionId, self).__call__(value, context)\n        return Extension.fromId(context.db, value)\n\nclass ExtensionKey(StringChecker):\n    def __call__(self, value, context):\n        from extensions.extension import Extension\n        super(ExtensionKey, self).__call__(value, context)\n        author_name, _, extension_name = value.rpartition(\"/\")\n        return Extension(author_name, extension_name)\n\nclass Extension(VariantChecker):\n    def __init__(self):\n        super(Extension, self).__init__({ \"id\": ExtensionId,\n                                          \"key\": ExtensionKey })\n"
  },
  {
    "path": "src/operation/typechecker_unittest.py",
    "content": "import copy\nimport json\n\ndef basic():\n    import htmlutils\n\n    from operation.basictypes import OperationError, OperationFailure\n    from operation.typechecker import (\n        Optional, TypeChecker, TypeCheckerContext, BooleanChecker,\n        StringChecker, RestrictedString, SHA1, IntegerChecker,\n        RestrictedInteger, PositiveInteger, NonNegativeInteger, ArrayChecker,\n        EnumerationChecker, VariantChecker, DictionaryChecker)\n\n    # Check TypeChecker.make()'s handling of basic types.\n    assert type(TypeChecker.make(bool)) is BooleanChecker\n    assert type(TypeChecker.make(str)) is StringChecker\n    assert type(TypeChecker.make(int)) is IntegerChecker\n    assert type(TypeChecker.make([bool])) is ArrayChecker\n    assert type(TypeChecker.make(set([\"foo\", \"bar\"]))) is EnumerationChecker\n    assert type(TypeChecker.make(set([bool, str, int]))) is VariantChecker\n    assert type(TypeChecker.make({ \"foo\": bool })) is DictionaryChecker\n\n    # Check TypeChecker.make()'s handling of TypeChecker sub-classes and\n    # instances thereof.\n    assert isinstance(TypeChecker.make(BooleanChecker), BooleanChecker)\n    boolean_checker = BooleanChecker()\n    assert TypeChecker.make(boolean_checker) is boolean_checker\n\n    def check(checker, *values):\n        checker = TypeChecker.make(checker)\n        results = []\n        for value in values:\n            converted = checker(value, TypeCheckerContext(None, None, None))\n            results.append(value if converted is None else converted)\n        return results\n\n    def should_match(checker, *values, **kwargs):\n        results = check(checker, *values)\n        if \"result\" in kwargs:\n            expected_result = kwargs[\"result\"]\n            for result in results:\n                assert result == expected_result, \\\n                    \"%r != %r\" % (result, expected_result)\n\n    def should_not_match(checker, *values, **expected):\n        for value in 
values:\n            try:\n                check(checker, copy.deepcopy(value))\n            except (OperationError, OperationFailure) as error:\n                error = json.loads(str(error))\n                for key, value in expected.items():\n                    if isinstance(value, str):\n                        value = set([value])\n                    assert error.get(key) in value, \\\n                        (\"%s: %r not among %r\" % (key, error.get(key), value))\n            else:\n                assert False, \"checker allowed value incorrectly: %r\" % value\n\n    # Check some simple things that should be accepted.\n    should_match(bool, True, False)\n    should_match(str, \"\", \"foo\")\n    should_match(int, -2**31, -1, 0, 1, 2**31)\n    should_match([bool], [], [True, False])\n    should_match([str], [\"\", \"foo\"])\n    should_match([int], [-2**31, -1, 0, 1, 2**31])\n    should_match(set([\"foo\", \"bar\"]), \"foo\", \"bar\")\n    should_match(set([bool, str, int]),\n                 True, False, \"\", \"foo\", -2**31, -1, 0, 1, 2**31)\n\n    # Check some equally simple things that shouldn't be accepted.\n    should_not_match(bool, 10, \"foo\",\n                     error=\"invalid input: data is not a boolean\")\n    should_not_match(str, True, 10,\n                     error=\"invalid input: data is not a string\")\n    should_not_match(int, True, \"foo\", 0.5,\n                     error=\"invalid input: data is not an integer\")\n    should_not_match([bool], [True, 10], [False, \"foo\"],\n                     error=\"invalid input: data[1] is not a boolean\")\n    should_not_match([str], [\"\", True], [\"foo\", 10],\n                     error=\"invalid input: data[1] is not a string\")\n    should_not_match([int], [0, True], [10, \"foo\"],\n                     error=\"invalid input: data[1] is not an integer\")\n    should_not_match(set([\"foo\", \"bar\"]), \"fie\",\n                     error=\"invalid input: data is not valid\")\n    
should_not_match(set([\"foo\", \"bar\"]), True, 10,\n                     error=\"invalid input: data is not a string\")\n    should_not_match(set([bool, str, int]), [True], [\"foo\"], [10],\n                     error=\"data is of invalid type\")\n\n    # Check some dictionary checkers.\n    should_match({ \"b\": bool, \"s\": str, \"i\": int },\n                 { \"b\": True, \"s\": \"foo\", \"i\": 10 })\n    should_match({ \"req\": bool, \"opt\": Optional(bool) },\n                 { \"req\": True, \"opt\": False },\n                 { \"req\": False })\n    should_not_match({ \"b\": bool }, { \"b\": \"foo\" }, { \"b\": 10 },\n                     error=\"invalid input: data.b is not a boolean\")\n    should_not_match({ \"b\": bool }, { \"i\": 10 },\n                     error=\"invalid input: data.b missing\")\n    should_not_match({ \"b\": bool }, { \"b\": True, \"i\": 10 },\n                     error=\"invalid input: data.i was not used\")\n    should_not_match({ \"b\": Optional(bool) }, { \"b\": \"foo\" }, { \"b\": 10 },\n                     error=\"invalid input: data.b is not a boolean\")\n\n    # Check suffixed variant checker in dictionary.\n    id_or_name = VariantChecker({ \"id\": int, \"name\": str })\n    should_match({ \"thing\": id_or_name },\n                 { \"thing\": 10 },\n                 { \"thing_id\": 10 },\n                 result={ \"thing\": 10 })\n    should_match({ \"thing\": id_or_name },\n                 { \"thing\": \"foo\" },\n                 { \"thing_name\": \"foo\" },\n                 result={ \"thing\": \"foo\" })\n    should_not_match({ \"thing\": id_or_name },\n                     { \"thing_id\": \"foo\" },\n                     error=\"invalid input: data.thing_id is not an integer\")\n    should_not_match({ \"thing\": id_or_name },\n                     { \"thing_name\": 10 },\n                     error=\"invalid input: data.thing_name is not a string\")\n    should_not_match({ \"thing\": id_or_name },\n           
          { \"thing_id\": 10,\n                       \"thing_name\": \"foo\" },\n                     error=(\"invalid input: data.thing_id was not used\",\n                            \"invalid input: data.thing_name was not used\"))\n\n    # Check some RestrictedString types.\n    should_match(RestrictedString, \"\", \"foo\")\n    should_match(RestrictedString(minlength=0), \"\", \"foo\")\n    should_match(RestrictedString(minlength=3), \"foo\")\n    should_match(RestrictedString(maxlength=0), \"\")\n    should_match(RestrictedString(maxlength=3), \"\", \"foo\")\n    should_match(RestrictedString(minlength=0, maxlength=3), \"\", \"foo\")\n    should_match(RestrictedString(allowed=lambda c: False), \"\")\n    should_match(RestrictedString(allowed=lambda c: True), \"\", \"foo\")\n    should_match(RestrictedString(allowed=lambda c: c in \"foo\"), \"\", \"foo\")\n    should_not_match(RestrictedString(), True, 10,\n                     error=\"invalid input: data is not a string\")\n    should_not_match(\n        RestrictedString(minlength=1), \"\",\n        code=\"paramtooshort:data\",\n        title=\"Invalid data\",\n        message=\"invalid input: data must be at least 1 characters long\")\n    should_not_match(\n        RestrictedString(maxlength=2), \"foo\",\n        code=\"paramtoolong:data\",\n        title=\"Invalid data\",\n        message=\"invalid input: data must be at most 2 characters long\")\n    should_not_match(\n        RestrictedString(allowed=lambda c: False), \"foo\",\n        code=\"paramcontainsillegalchar:data\",\n        title=\"Invalid data\",\n        message=\"invalid input: data may not contain the characters 'f', 'o'\")\n    should_not_match(\n        RestrictedString(allowed=lambda c: False, ui_name=\"gazonk\"), \"foo\",\n        code=\"paramcontainsillegalchar:data\",\n        title=\"Invalid gazonk\",\n        message=\"invalid input: gazonk may not contain the characters 'f', 'o'\")\n\n    # Check SHA1.\n    sha1 = 
\"0123456789abcdefABCDEF0123456789abcdefAB\"\n    should_match(SHA1, *[sha1[:length] for length in range(4, 41)])\n    should_not_match(SHA1, True, 10,\n                     error=\"invalid input: data is not a string\")\n    for ch in range(0, 256):\n        ch = chr(ch)\n        if ch in sha1:\n            continue\n        should_not_match(\n            SHA1, \"012\" + ch,\n            message=htmlutils.htmlify(\n                \"invalid input: data may not contain the character %r\" % ch))\n    should_not_match(\n        SHA1, \"012\",\n        message=\"invalid input: data must be at least 4 characters long\")\n    should_not_match(\n        SHA1, \"0\" * 41,\n        message=\"invalid input: data must be at most 40 characters long\")\n\n    # Check some RestrictedInteger types.\n    should_match(RestrictedInteger, -2**31, -1, 0, 1, 2**31)\n    should_match(RestrictedInteger(minvalue=-2**31), -2**31, -1, 0, 1, 2**31)\n    should_match(RestrictedInteger(minvalue=0), 0, 1, 2**31)\n    should_match(RestrictedInteger(maxvalue=0), -2**31, -1, 0)\n    should_match(RestrictedInteger(maxvalue=2**31), -2**31, -1, 0, 1, 2**31)\n    should_match(RestrictedInteger(minvalue=0, maxvalue=0), 0)\n    should_not_match(RestrictedInteger(), True, \"foo\",\n                     error=\"invalid input: data is not an integer\")\n    should_not_match(RestrictedInteger(minvalue=0), -2**31, -1,\n                     code=\"valuetoolow:data\",\n                     title=\"Invalid data parameter\",\n                     message=\"invalid input: data must be 0 or higher\")\n    should_not_match(RestrictedInteger(maxvalue=0), 1, 2**31,\n                     code=\"valuetoohigh:data\",\n                     title=\"Invalid data parameter\",\n                     message=\"invalid input: data must be 0 or lower\")\n    should_not_match(RestrictedInteger(minvalue=1, ui_name=\"gazonk\"), 0,\n                     code=\"valuetoolow:data\",\n                     title=\"Invalid gazonk 
parameter\",\n                     message=\"invalid input: gazonk must be 1 or higher\")\n\n    # Check NonNegativeInteger.\n    should_match(NonNegativeInteger, 0, 1, 2**31)\n    should_not_match(NonNegativeInteger, True, \"foo\",\n                     error=\"invalid input: data is not an integer\")\n    should_not_match(NonNegativeInteger, -2**31, -1,\n                     code=\"valuetoolow:data\",\n                     title=\"Invalid data parameter\",\n                     message=\"invalid input: data must be 0 or higher\")\n\n    # Check PositiveInteger.\n    should_match(PositiveInteger, 1, 2**31)\n    should_not_match(PositiveInteger, True, \"foo\",\n                     error=\"invalid input: data is not an integer\")\n    should_not_match(PositiveInteger, -2**31, -1, 0,\n                     code=\"valuetoolow:data\",\n                     title=\"Invalid data parameter\",\n                     message=\"invalid input: data must be 1 or higher\")\n\n    print \"basic: ok\"\n"
  },
  {
    "path": "src/operation/unittest.py",
    "content": "def independence():\n    # Simply check that operation can be imported.\n\n    import operation\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/operation/usersession.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nfrom operation import Operation, OperationResult, OperationFailure, Request\n\nimport dbutils\nimport configuration\nimport auth\n\nclass ValidateLogin(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"req\": Request,\n                                   \"fields\": { str: str }},\n                           accept_anonymous_user=True)\n\n    def process(self, db, user, req, fields):\n        if not user.isAnonymous():\n            return OperationResult()\n\n        try:\n            auth.DATABASE.authenticate(db, fields)\n        except auth.AuthenticationFailed as error:\n            return OperationResult(message=error.message)\n        except auth.WrongPassword:\n            return OperationResult(message=\"Wrong password!\")\n\n        auth.createSessionId(db, req, db.user, db.authentication_labels)\n\n        return OperationResult()\n\n    def sanitize(self, value):\n        sanitized = value.copy()\n        for field in auth.DATABASE.getFields():\n            if field[0]:\n                sanitized[\"fields\"][field[1]] = \"****\"\n        return sanitized\n\nclass EndSession(Operation):\n    def __init__(self):\n        Operation.__init__(self, { \"req\": Request })\n\n    def process(self, db, user, req):\n        if not auth.deleteSessionId(db, req, user):\n         
   raise OperationFailure(\n                code=\"notsignedout\",\n                title=\"Not signed out\",\n                message=\"You were not signed out.\")\n        if not configuration.base.ALLOW_ANONYMOUS_USER:\n            return OperationResult(target_url=\"/\")\n        return OperationResult()\n"
  },
  {
    "path": "src/page/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport functools\n\nimport request\nimport htmlutils\nimport page.utils\nimport page.parameters\n\nclass Page(object):\n    def __init__(self, name, parameters, handler):\n        self.name = name\n        self.parameters = parameters\n        self.handler = handler\n\n    def __call__(self, req, db, user):\n        parameters = {}\n\n        for name, checker in self.parameters.items():\n            if isinstance(checker, page.parameters.Optional):\n                default = None\n                checker = checker.actual\n            else:\n                default = request.NoDefault\n\n            is_list = isinstance(checker, page.parameters.ListOf)\n\n            if is_list:\n                checker = checker.actual\n\n            if issubclass(checker, page.parameters.Stateful):\n                checker = checker(req, db, user)\n\n            try:\n                value = req.getParameter(name, default, str if is_list else checker)\n            except page.parameters.InvalidParameterValue as error:\n                raise request.InvalidParameterValue(name, req.getParameter(name), error.expected)\n\n            if value is not None:\n                if is_list:\n                    values = []\n                    for item in value.split(\",\"):\n                        
values.append(checker(item))\n                    value = values\n\n                parameters[name] = value\n\n        return self.handler(**parameters).generate(self, req, db, user)\n\n    class Handler(object):\n        def __init__(self, review=None):\n            self.review = review\n\n        def setup(self, page, req, db, user):\n            self.page = page\n            self.request = req\n            self.db = db\n            self.user = user\n\n        def generate(self, page, req, db, user):\n            self.setup(page, req, db, user)\n\n            self.document = htmlutils.Document(req)\n            self.html = self.document.html()\n            self.head = self.html.head()\n            self.body = self.html.body()\n\n            self._generateHeader()\n            self.generateContent()\n            self._generateFooter()\n\n            return self.document\n\n        def _generateHeader(self):\n            page.utils.generateHeader(self.body, self.db, self.user,\n                                      current_page=self.page.name,\n                                      generate_right=self.getGenerateHeaderRight(),\n                                      extra_links=self.getExtraLinks())\n            self.generateHeader()\n\n        def generateHeader(self):\n            pass\n\n        def generateContent(self):\n            self.body.div(\"message\").h1(\"center\").text(\"Not implemented!\")\n\n        def _generateFooter(self):\n            self.generateFooter()\n            page.utils.generateFooter(self.body, self.db, self.user,\n                                      current_page=self.page.name)\n\n        def generateFooter(self):\n            pass\n\n        def getGenerateHeaderRight(self):\n            if self.review:\n                import reviewing.utils\n                return functools.partial(reviewing.utils.renderDraftItems, self.db, self.user, self.review)\n\n        def getExtraLinks(self):\n            if self.review:\n                
return [(\"r/%d\" % self.review.id, \"Back to Review\")]\n            else:\n                return []\n"
  },
  {
    "path": "src/page/addrepository.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport configuration\nimport htmlutils\nimport dbutils\n\ndef renderNewRepository(req, db, user):\n    if not user.hasRole(db, \"repositories\"):\n        raise page.utils.DisplayMessage(title=\"Not allowed!\", body=\"Only users with the 'repositories' role can add new repositories.\")\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user)\n\n    document.addExternalStylesheet(\"resource/newrepository.css\")\n    document.addExternalScript(\"resource/newrepository.js\")\n\n    target = body.div(\"main\")\n\n    basic = target.table(\"paleyellow\")\n    basic.col(width='20%')\n    basic.col(width='0')\n    basic.col(width='40%')\n    basic.col(width='40%')\n    h1 = basic.tr().td('h1', colspan=4).h1().text(\"New Repository\")\n\n    row = basic.tr(\"name\")\n    row.td(\"heading\").text(\"Short name:\")\n    row.td(\"prefix\").text()\n    row.td(\"value\").input(name=\"name\")\n    row.td(\"suffix\").text()\n\n    row = basic.tr(\"help\")\n    row.td(colspan=4).text(\"Repository short name.\")\n\n    row = basic.tr(\"path\")\n    row.td(\"heading\").text(\"Path:\")\n    row.td(\"prefix\").text(\"%s:%s/\" % (configuration.base.HOSTNAME, configuration.paths.GIT_DIR))\n    
row.td(\"value\").input(name=\"path\")\n    row.td(\"suffix\").text(\".git\")\n\n    row = basic.tr(\"help\")\n    row.td(colspan=4).text(\"Path of repository on the Critic server.\")\n\n    row = basic.tr(\"remote\")\n    row.td(\"heading\").text(\"Source repository:\")\n    row.td(\"prefix\").text()\n    row.td(\"value\").input(name=\"remote\")\n    row.td(\"suffix\").text(\"(optional)\")\n\n    row = basic.tr(\"help\")\n    row.td(colspan=4).text(\"Git URL of repository to mirror.\")\n\n    row = basic.tr(\"branch\")\n    row.td(\"heading\").text(\"Source branch:\")\n    row.td(\"prefix\").text()\n    row.td(\"value\").input(name=\"branch\", value=\"master\", disabled=\"disabled\")\n    row.td(\"suffix\").text()\n\n    row = basic.tr(\"help\")\n    row.td(colspan=4).text(\"This branch in the source repository is automatically mirrored in Critic's repository.\")\n\n    row = basic.tr(\"buttons\")\n    row.td(colspan=4).button(\"add\").text(\"Add Repository\")\n\n    return document\n"
  },
  {
    "path": "src/page/basic.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ndef generateHeader(target, generate_right=generateEmpty):\n    row = target.table(width='100%', style='border-bottom: 3px solid black').tr()\n    b = row.td(valign='bottom', align='left').b(style='font-family: sans-serif')\n    b.b(style='font-size: 40px; color: #d32226; cursor: pointer', onclick=\"location.href='/';\").text(\"Opera\")\n    b.b(style='font-size: 50px; color: #666666; cursor: pointer', onclick=\"location.href='/';\").text(\"Critic\")\n    generate_right(row.td(valign='bottom', align='right', style='padding-bottom: 10px'))\n"
  },
  {
    "path": "src/page/branches.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport itertools\n\nimport dbutils\nimport gitutils\nimport log.html\nimport htmlutils\nimport page.utils\n\ndef renderBranches(req, db, user):\n    offset = req.getParameter(\"offset\", 0, filter=int)\n    count = req.getParameter(\"count\", 50, filter=int)\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, current_page=\"branches\")\n\n    document.addExternalScript(\"resource/branches.js\")\n    document.addExternalStylesheet(\"resource/branches.css\")\n\n    cursor = db.cursor()\n\n    repository = req.getParameter(\"repository\", None, gitutils.Repository.FromParameter(db))\n\n    if not repository:\n        repository = user.getDefaultRepository(db)\n\n    if repository:\n        title = \"Branches in %s\" % repository.name\n        selected = repository.name\n    else:\n        title = \"Branches\"\n        selected = None\n\n    document.setTitle(title)\n\n    table = body.div(\"main\").table(\"paleyellow branches\", align=\"center\", cellspacing=\"0\")\n\n    row = table.tr(\"title\")\n    row.td(\"h1\", colspan=2).h1().text(title)\n\n    page.utils.generateRepositorySelect(db, user, row.td(\"repositories\", colspan=2), selected=selected)\n\n    if repository:\n        
document.addInternalScript(repository.getJS())\n\n        cursor.execute(\"\"\"SELECT branches.id, branches.name, branches.base, branches.review,\n                                 branches.commit_time, COUNT(reachable.branch)\n                            FROM (SELECT branches.id AS id, branches.name AS name, bases.name AS base,\n                                         reviews.id AS review, commits.commit_time AS commit_time\n                                    FROM branches\n                                    JOIN commits ON (commits.id=branches.head)\n                         LEFT OUTER JOIN reviews ON (reviews.origin=branches.id)\n                         LEFT OUTER JOIN branches AS bases ON (branches.base=bases.id)\n                                   WHERE branches.type='normal'\n                                     AND branches.repository=%s\n                                ORDER BY commits.commit_time DESC\n                                   LIMIT %s\n                                   OFFSET %s) AS branches\n                 LEFT OUTER JOIN reachable ON (reachable.branch=branches.id)\n                        GROUP BY branches.id, branches.name, branches.base, branches.review,\n                                 branches.commit_time\n                        ORDER BY branches.commit_time DESC\"\"\",\n                       (repository.id, count, offset))\n\n        branches_found = False\n\n        for branch_id, branch_name, base_name, review_id, commit_time, count in cursor:\n            if not branches_found:\n                row = table.tr(\"headings\")\n                row.td(\"name\").text(\"Name\")\n                row.td(\"base\").text(\"Base\")\n                row.td(\"commits\").text(\"Commits\")\n                row.td(\"when\").text(\"When\")\n                branches_found = True\n\n            row = table.tr(\"branch\")\n\n            def link_to_branch(target, repository, name):\n                url = htmlutils.URL(\"/log\", branch=name, 
repository=repository.id)\n                target.a(href=url).text(name)\n\n            td_name = row.td(\"name\")\n            link_to_branch(td_name, repository, branch_name)\n\n            if review_id is not None:\n                span = td_name.span(\"review\").preformatted()\n                span.a(href=\"r/%d\" % review_id).text(\"r/%d\" % review_id)\n            elif base_name:\n                url = htmlutils.URL(\"/checkbranch\",\n                                    repository=repository.id,\n                                    commit=branch_name,\n                                    upstream=base_name,\n                                    fetch=\"no\")\n                span = td_name.span(\"check\").preformatted().a(href=url).text(\"check\")\n\n            td_base = row.td(\"base\")\n            if base_name:\n                link_to_branch(td_base, repository, base_name)\n\n            row.td(\"commits\").text(count)\n\n            log.html.renderWhen(row.td(\"when\"), commit_time.timetuple())\n\n        if not branches_found:\n            row = table.tr(\"nothing\")\n            row.td(\"nothing\", colspan=4).text(\"No branches\")\n    else:\n        row = table.tr(\"nothing\")\n        row.td(\"nothing\", colspan=4).text(\"No repository selected\")\n\n    return document\n"
  },
  {
    "path": "src/page/checkbranch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport page.utils\nimport gitutils\nimport re\nimport log.commitset\nimport request\n\ndef addNote(req, db, user):\n    repository_id = req.getParameter(\"repository\", filter=int)\n    branch = req.getParameter(\"branch\")\n    upstream = req.getParameter(\"upstream\")\n    sha1 = req.getParameter(\"sha1\")\n    review_id = req.getParameter(\"review\", None)\n    text = req.read().strip()\n\n    if review_id is not None:\n        review = dbutils.Review.fromId(db, review_id)\n    else:\n        review = None\n\n    cursor = db.cursor()\n    cursor.execute(\"DELETE FROM checkbranchnotes WHERE repository=%s AND branch=%s AND upstream=%s AND sha1=%s\",\n                   (repository_id, branch, upstream, sha1))\n    cursor.execute(\"INSERT INTO checkbranchnotes (repository, branch, upstream, sha1, uid, review, text) VALUES (%s, %s, %s, %s, %s, %s, %s)\",\n                   (repository_id, branch, upstream, sha1, user.id, review_id, text or None))\n    db.commit()\n\n    response = \"ok\"\n\n    if review and review.repository.id == repository_id:\n        repository = gitutils.Repository.fromId(db, repository_id)\n        commit = gitutils.Commit.fromSHA1(db, repository, sha1)\n        commitset = log.commitset.CommitSet(review.branch.getCommits(db))\n\n        
upstreams = commitset.getFilteredTails(repository)\n        if len(upstreams) == 1:\n            upstream = upstreams.pop()\n            if repository.mergebase([commit.sha1, upstream]) == upstream:\n                response = \"rebase\"\n\n    return response\n\ndef deleteNote(req, db, user):\n    repository_id = req.getParameter(\"repository\", filter=int)\n    branch = req.getParameter(\"branch\")\n    upstream = req.getParameter(\"upstream\")\n    sha1 = req.getParameter(\"sha1\")\n\n    cursor = db.cursor()\n    cursor.execute(\"DELETE FROM checkbranchnotes WHERE repository=%s AND branch=%s AND upstream=%s AND sha1=%s\",\n                   (repository_id, branch, upstream, sha1))\n    db.commit()\n\n    return \"ok\"\n\ndef renderCheckBranch(req, db, user):\n    mode = \"html\" if req.path == \"checkbranch\" else \"text\"\n\n    repository_arg = req.getParameter(\"repository\", None if mode == \"html\" else request.NoDefault)\n\n    cursor = db.cursor()\n    header_right = []\n\n    if mode == \"html\":\n        document = htmlutils.Document(req)\n\n        html = document.html()\n        head = html.head()\n        body = html.body()\n\n        def generateRight(target):\n            header_right.append(target.div(\"buttons\"))\n\n        page.utils.generateHeader(body, db, user, generate_right=generateRight)\n\n        document.addExternalStylesheet(\"resource/checkbranch.css\")\n        document.addExternalScript(\"resource/checkbranch.js\")\n        document.addInternalScript(user.getJS())\n\n        target = body.div(\"main\")\n    else:\n        result = \"\"\n\n    if repository_arg is not None:\n        repository = gitutils.Repository.fromParameter(db, repository_arg)\n        branch_arg = commit = req.getParameter(\"commit\")\n        fetch = req.getParameter(\"fetch\", \"no\") == \"yes\"\n        upstream_arg = req.getParameter(\"upstream\", \"master\")\n\n        if mode == \"html\":\n            header_right[0].a(\"button\", 
href=\"tutorial?item=checkbranch\").text(\"Help\")\n            header_right[0].a(\"button\", href=\"checkbranchtext?repository=%s&commit=%s&upstream=%s\" % (repository_arg, branch_arg, upstream_arg)).text(\"Get Textual Log\")\n            header_right[0].span(\"buttonscope buttonscope-global\")\n            document.addInternalScript(repository.getJS())\n            document.addInternalScript(\"var branch = %r, upstream = %r;\" % (branch_arg, upstream_arg))\n\n        upstream = repository.revparse(upstream_arg)\n\n        if fetch:\n            if commit.startswith(\"r/\"):\n                raise page.utils.DisplayMessage(\"Won't fetch review branch from remote!\")\n            repository.updateBranchFromRemote(db, repository.getDefaultRemote(db), commit)\n\n        try: commit = repository.revparse(commit)\n        except: raise page.utils.DisplayMessage(\"Unable to interpret '%s' as a commit reference.\" % commit)\n\n        try: gitutils.Commit.fromSHA1(db, repository, commit)\n        except: raise page.utils.DisplayMessage(\"'%s' doesn't exist in the repository.\" % commit)\n\n        if mode == \"html\":\n            document.setTitle(\"Branch review status: %s\" % branch_arg)\n\n        commits = repository.revlist([commit], [upstream], \"--topo-order\")\n\n        if not commits:\n            try: merge_sha1 = repository.revlist([upstream], [commit], \"--topo-order\", \"--ancestry-path\")[-1]\n            except: raise page.utils.DisplayMessage(\"No merged or unmerged commits found.\")\n\n            merge = gitutils.Commit.fromSHA1(db, repository, merge_sha1)\n\n            if len(merge.parents) == 1:\n                candidate_merges = repository.revlist([commit], [], \"--topo-order\", \"--max-count=256\")\n                for merge_sha1 in candidate_merges:\n                    merge = gitutils.Commit.fromSHA1(db, repository, merge_sha1)\n                    if len(merge.parents) > 1:\n                        use_upstream = merge.parents[1]\n
                        break\n                else:\n                    raise page.utils.DisplayMessage(\"Merge into upstream was a fast-forward; can't figure out what was merged in.\")\n            else:\n                assert commit in merge.parents\n\n                use_upstream = None\n                for parent in merge.parents:\n                    if parent != commit:\n                        use_upstream = parent\n                        break\n\n            commits = repository.revlist([commit], [use_upstream], \"--topo-order\")\n            title = \"Merged Commits (%d)\" % len(commits)\n            display_upstream = gitutils.Commit.fromSHA1(db, repository, use_upstream).describe(db)\n        else:\n            title = \"Unmerged Commits (%d)\" % len(commits)\n            display_upstream = upstream_arg\n\n        commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in commits]\n\n        if mode == \"html\":\n            table = target.table(\"branchstatus paleyellow\", align=\"center\", cellspacing=0)\n            table.col(width=\"10%\")\n            table.col(width=\"8%\")\n            table.col(width=\"77%\")\n            table.col(width=\"5%\")\n\n            thead = table.thead()\n            h1_cell = thead.tr().td('h1', colspan=4)\n            h1_cell.h1().text(title)\n            p = h1_cell.p()\n            p.text(\"Commits returned by the command: \")\n            p.span(\"command\").text(\"git rev-list --topo-order %s ^%s\" % (branch_arg, display_upstream))\n\n            row = thead.tr(\"headings\")\n            row.th(\"sha1\").text(\"SHA-1\")\n            row.th(\"user\").text(\"Committer\")\n            row.th(\"summary\").text(\"Summary\")\n            row.th(\"review\").text(\"Review\")\n\n        cursor.execute(\"\"\"SELECT sha1, uid, review, text\n                            FROM checkbranchnotes\n                           WHERE repository=%s\n                             AND branch=%s\n
                             AND upstream=%s\"\"\",\n                       (repository.id, branch_arg, upstream_arg))\n\n        notes = {}\n        reds = False\n\n        for sha1, user_id, review_id, text in cursor:\n            notes[sha1] = dbutils.User.fromId(db, user_id), review_id, text\n\n        if commits:\n            merged = set(commits)\n            handled = set()\n\n            current_tbody = None\n            last_tbody = None\n\n            def nameFromEmail(email):\n                offset = email.find(\"@\")\n                if offset != -1: return email[:offset]\n                else: return email\n\n            review = None\n            reviewed_commits = []\n\n            text_items = {}\n            text_order = []\n\n            for commit in commits:\n                if commit not in handled:\n                    cursor.execute(\"\"\"SELECT reviews.id\n                                        FROM reviews\n                                        JOIN branches ON (branches.id=reviews.branch)\n                                        JOIN commits ON (commits.id=branches.head)\n                                       WHERE commits.sha1=%s AND reviews.state!='dropped'\"\"\",\n                                   (commit.sha1,))\n\n                    if commit not in reviewed_commits:\n                        review = None\n\n                    reviewed = set()\n                    best = 0\n\n                    for (review_id,) in cursor:\n                        candidate_review = dbutils.Review.fromId(db, review_id)\n                        candidate_reviewed = filter(lambda commit: commit in merged,\n                                                    candidate_review.branch.getCommits(db))\n\n                        if len(candidate_reviewed) > best:\n                            review = candidate_review\n                            reviewed = candidate_reviewed\n                            best = len(reviewed)\n\n                    reviewed_commits = filter(lambda commit: commit in reviewed, commits)\n\n
                    if mode == \"html\":\n                        if review:\n                            current_tbody = None\n\n                            review_tbody = table.tbody(\"reviewed\" if review.state == 'closed' or review.accepted(db) else \"pending\")\n                            first = True\n\n                            review_tbody.tr(\"empty\").td(\"empty\", colspan=4)\n\n                            for reviewed_commit in reviewed_commits:\n                                handled.add(reviewed_commit)\n\n                                row = review_tbody.tr(\"commit\")\n                                row.td(\"sha1\", title=reviewed_commit.sha1).div().text(reviewed_commit.sha1[:8])\n                                row.td(\"user\").text(nameFromEmail(reviewed_commit.committer.email))\n                                row.td(\"summary\").a(href=\"%s?review=%d\" % (reviewed_commit.sha1, review.id)).text(reviewed_commit.niceSummary())\n\n                                if first:\n                                    row.td(\"review\", rowspan=len(reviewed)).a(href=\"r/%d\" % review.id).text(\"r/%d\" % review.id)\n                                    first = False\n\n                            last_tbody = review_tbody\n                        elif commit.sha1 in notes:\n                            note_user, review_id, text = notes[commit.sha1]\n\n                            try: review = dbutils.Review.fromId(db, review_id) if review_id else None\n                            except: review = None\n\n                            current_tbody = None\n\n                            note_tbody = table.tbody(\"note\" if (not review_id or (review and (review.state == 'closed' or review.accepted(db)))) else \"pending\")\n                            note_tbody.tr(\"empty\").td(\"empty\", colspan=4)\n\n                            row = note_tbody.tr(\"commit\", id=commit.sha1)\n                            row.td(\"sha1\", 
title=commit.sha1).div().text(commit.sha1[:8])\n                            row.td(\"user\").text(nameFromEmail(commit.committer.email))\n                            row.td(\"summary\").a(href=\"%s?repository=%d\" % (commit.sha1, repository.id)).text(commit.niceSummary())\n\n                            cell = row.td(\"review\")\n\n                            if review_id is None: cell.text()\n                            else: cell.a(href=\"r/%d\" % review_id).text(\"r/%d\" % review_id)\n\n                            row = note_tbody.tr(\"note\")\n                            row.td(\"sha1\").text()\n\n                            cell = row.td(\"note\", colspan=2)\n                            cell.text(\"Set by \")\n                            cell.span(\"user\").text(note_user.fullname)\n                            if text is not None:\n                                cell.text(\": \")\n                                cell.span(\"text\").text(text)\n\n                            row.td(\"edit\").a(\"edit\", href=\"javascript:editCommit(%r, %d, true%s);\" % (commit.sha1, commit.getId(db), (\", %d\" % review_id) if review_id is not None else \"\")).text(\"[edit]\")\n\n                            last_tbody = note_tbody\n                        else:\n                            handled.add(commit)\n\n                            if not current_tbody:\n                                current_tbody = table.tbody(\"unknown\")\n                                current_tbody.tr(\"empty\").td(\"empty\", colspan=4)\n                                last_tbody = current_tbody\n\n                            row = current_tbody.tr(\"commit%s\" % (\" own\" if commit.author.email == user.email else \"\"), id=commit.sha1)\n                            row.td(\"sha1\", title=commit.sha1).div().text(commit.sha1[:8])\n                            row.td(\"user\").text(nameFromEmail(commit.committer.email))\n                            row.td(\"summary\").a(href=\"%s/%s\" % 
(repository.name, commit.sha1)).text(commit.niceSummary())\n                            row.td(\"edit\").a(\"edit\", href=\"javascript:editCommit(%r, %d, false);\" % (commit.sha1, commit.getId(db))).text(\"[edit]\")\n\n                            reds = True\n                    else:\n                        match = re.search(\"[A-Z][A-Z0-9]*-[0-9]+\", commit.summary())\n\n                        if match:\n                            title = match.group(0)\n                        else:\n                            title = commit.summary(maxlen=50)\n                            if title.endswith(\".\") and not title.endswith(\"...\"):\n                                title = title[:-1]\n\n                        if commit.sha1 in notes:\n                            note_user, review_id, text = notes[commit.sha1]\n                            if review_id: review = dbutils.Review.fromId(db, review_id)\n                        else:\n                            text = None\n\n                        if review:\n                            item = review.getURL(db)\n                            if review.state != \"closed\" and not review.accepted(db):\n                                item += \" (NOT ACCEPTED!)\"\n                            if text:\n                                item += \" (%s: %s)\" % (note_user.fullname, text)\n                        elif text:\n                            item = \"%s: %s\" % (note_user.fullname, text)\n                        else:\n                            item = \"REVIEW STATUS UNKNOWN!\"\n\n                        if title in text_items:\n                            if item not in text_items[title]:\n                                text_items[title].append(item)\n                        else:\n                            text_items[title] = [item]\n                            text_order.append(title)\n\n            if mode == \"html\":\n                last_tbody.tr(\"empty\").td(\"empty\", colspan=4)\n            else:\n  
              for title in reversed(text_order):\n                    result += \"%s: %s\\n\" % (title, \", \".join(text_items[title]))\n\n        if mode == \"html\":\n            if reds:\n                h1_cell.h2().text(\"There should be no red!\")\n\n            legend = target.div(\"legend\")\n            legend.text(\"Color legend: \")\n            legend.span(\"red\").text(\"Status unknown\")\n            legend.text(\" \")\n            legend.span(\"yellow\").text(\"Status set manually\")\n            legend.text(\" \")\n            legend.span(\"green\").text(\"Verified by Critic\")\n\n            return document\n        else:\n            return page.utils.ResponseBody(result, content_type=\"text/plain\")\n    else:\n        header_right[0].a(\"button\", href=\"tutorial?item=checkbranch\").text(\"Help\")\n\n        table = page.utils.PaleYellowTable(target, \"Check branch review status\")\n\n        def renderRepository(target):\n            page.utils.generateRepositorySelect(db, user, target, name=\"repository\")\n        def renderBranchName(target):\n            target.input(name=\"commit\")\n        def renderFetch(target):\n            target.input(name=\"fetch\", type=\"checkbox\", value=\"yes\")\n        def renderUpstream(target):\n            target.input(name=\"upstream\", value=\"master\")\n\n        table.addItem(\"Repository\", renderRepository, description=\"Repository.\")\n        table.addItem(\"Branch name\", renderBranchName,\n                      description=\"Branch name, or other reference to a commit.\")\n        table.addItem(\"Fetch branch from remote\", renderFetch,\n                      description=(\"Fetch named branch from selected repository's \"\n                                   \"primary remote (from whence its 'master' branch \"\n                                   \"is auto-updated.)\"))\n        table.addItem(\"Upstream\", renderUpstream,\n                      description=\"Name of upstream branch.\")\n\n        
def renderCheck(target):\n            target.button(\"check\").text(\"Check branch\")\n\n        table.addCentered(renderCheck)\n\n        return document\n"
  },
  {
    "path": "src/page/config.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport fnmatch\n\nimport configuration\nimport dbutils\nimport gitutils\nimport htmlutils\nimport textutils\nimport page.utils\n\ndef renderConfig(req, db, user):\n    highlight = req.getParameter(\"highlight\", None)\n    repository = req.getParameter(\"repository\", None, gitutils.Repository.FromParameter(db))\n    filter_id = req.getParameter(\"filter\", None, int)\n    defaults = req.getParameter(\"defaults\", \"no\") == \"yes\"\n\n    if filter_id is not None:\n        # There can't be system-wide defaults for one of a single user's\n        # filters.\n        defaults = False\n\n    cursor = db.cursor()\n\n    if filter_id is not None:\n        cursor.execute(\"\"\"SELECT filters.path, repositories.name\n                            FROM filters\n                            JOIN repositories ON (repositories.id=filters.repository)\n                           WHERE filters.id=%s\"\"\",\n                       (filter_id,))\n        row = cursor.fetchone()\n        if not row:\n            raise page.utils.InvalidParameterValue(\n                name=\"filter\",\n                value=str(filter_id),\n                expected=\"valid filter id\")\n        title = \"Filter preferences: %s in %s\" % row\n    elif repository is not None:\n        title = \"Repository preferences: %s\" % 
repository.name\n    else:\n        title = \"User preferences\"\n\n    document = htmlutils.Document(req)\n    document.setTitle(title)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    if user.isAnonymous():\n        disabled = \"disabled\"\n    else:\n        disabled = None\n\n    def generate_right(target):\n        if defaults:\n            url = \"/config\"\n            if repository is not None:\n                url += \"?repository=%d\" % repository.id\n            target.a(\"button\", href=url).text(\"Edit Own\")\n        elif user.hasRole(db, \"administrator\"):\n            url = \"/config?defaults=yes\"\n            if repository is not None:\n                url += \"&repository=%d\" % repository.id\n                what = \"Repository Defaults\"\n            else:\n                what = \"System Defaults\"\n            target.a(\"button\", href=url).text(\"Edit \" + what)\n\n    injected = page.utils.generateHeader(body, db, user, current_page=\"config\",\n                                         generate_right=generate_right)\n\n    document.addExternalStylesheet(\"resource/config.css\")\n    document.addExternalScript(\"resource/config.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(\"var repository_id = %s, filter_id = %s, defaults = %s;\"\n                               % (htmlutils.jsify(repository.id if repository else None),\n                                  htmlutils.jsify(filter_id),\n                                  htmlutils.jsify(defaults)))\n\n    target = body.div(\"main\")\n\n    table = target.table('preferences paleyellow', align='center', cellspacing=0)\n    h1 = table.tr().td('h1', colspan=3).h1()\n    h1.text(title)\n\n    if filter_id is None:\n        page.utils.generateRepositorySelect(\n            db, user, h1.span(\"right\"), allow_selecting_none=True,\n            selected=repository.name if repository else False)\n\n    if filter_id is not None:\n    
    conditional = \"per_filter\"\n    elif repository is not None:\n        conditional = \"per_repository\"\n    elif defaults:\n        conditional = \"per_system\"\n    else:\n        conditional = \"per_user\"\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT item, type, description, per_repository, per_filter\n                        FROM preferences\n                       WHERE %(conditional)s\"\"\"\n                   % { \"conditional\": conditional })\n\n    preferences = dict((item, [preference_type, description, None, None, per_repository, per_filter])\n                       for item, preference_type, description, per_repository, per_filter in cursor)\n\n    def set_values(rows, is_overrides):\n        index = 3 if is_overrides else 2\n        for item, integer, string in rows:\n            if preferences[item][0] == \"boolean\":\n                preferences[item][index] = bool(integer)\n            elif preferences[item][0] == \"integer\":\n                preferences[item][index] = integer\n            else:\n                preferences[item][index] = string\n\n    cursor.execute(\"\"\"SELECT item, integer, string\n                        FROM userpreferences\n                       WHERE item=ANY (%s)\n                         AND uid IS NULL\n                         AND repository IS NULL\"\"\",\n                   (preferences.keys(),))\n\n    set_values(cursor, is_overrides=False)\n\n    if repository is not None:\n        cursor.execute(\"\"\"SELECT item, integer, string\n                            FROM userpreferences\n                           WHERE item=ANY (%s)\n                             AND uid IS NULL\n                             AND repository=%s\"\"\",\n                       (preferences.keys(), repository.id))\n\n        # These are overrides if we're editing the defaults for a specific\n        # repository.\n        set_values(cursor, is_overrides=defaults)\n\n    if not defaults:\n        
cursor.execute(\"\"\"SELECT item, integer, string\n                            FROM userpreferences\n                           WHERE item=ANY (%s)\n                             AND uid=%s\n                             AND repository IS NULL\n                             AND filter IS NULL\"\"\",\n                       (preferences.keys(), user.id))\n\n        if filter_id is not None or repository is not None:\n            # We're looking at per-filter or per-repository settings, so the\n            # user's global settings are defaults, not the overrides.  If a\n            # per-filter or per-repository override is deleted, the user's\n            # global setting kicks in instead.\n            set_values(cursor, is_overrides=False)\n\n            if filter_id is not None:\n                cursor.execute(\"\"\"SELECT item, integer, string\n                                    FROM userpreferences\n                                   WHERE item=ANY (%s)\n                                     AND uid=%s\n                                     AND filter=%s\"\"\",\n                               (preferences.keys(), user.id, filter_id))\n            else:\n                cursor.execute(\"\"\"SELECT item, integer, string\n                                    FROM userpreferences\n                                   WHERE item=ANY (%s)\n                                     AND uid=%s\n                                     AND repository=%s\"\"\",\n                               (preferences.keys(), user.id, repository.id))\n\n        # Set the overrides.  
This is either the user's global settings, if\n        # we're not looking at per-filter or per-repository settings, or the\n        # user's per-filter or per-repository settings if we are.\n        set_values(cursor, is_overrides=True)\n    elif repository is None:\n        # When editing global defaults, use the values from preferences.json\n        # used when initially installing Critic as the default values.\n        defaults_path = os.path.join(configuration.paths.INSTALL_DIR,\n                                     \"data/preferences.json\")\n        with open(defaults_path) as defaults_file:\n            factory_defaults = textutils.json_decode(defaults_file.read())\n        for item, data in preferences.items():\n            data[3] = data[2]\n            if item in factory_defaults:\n                data[2] = factory_defaults[item][\"default\"]\n                if data[2] == data[3]:\n                    data[3] = None\n\n    if req.getParameter(\"recalculate\", \"no\") == \"yes\":\n        for item, data in preferences.items():\n            if data[2] == data[3]:\n                user.setPreference(db, item, None, repository=repository, filter_id=filter_id)\n                data[3] = None\n        db.commit()\n\n    debug_enabled = user.getPreference(db, \"debug.enabled\")\n\n    for item, (preference_type, description, default_value, current_value, per_repository, per_filter) in sorted(preferences.items()):\n        if item.startswith(\"debug.\") and item != \"debug.enabled\" and not debug_enabled:\n            continue\n\n        line_class_name = \"line\"\n        help_class_name = \"help\"\n\n        if highlight is not None and not fnmatch.fnmatch(item, highlight):\n            continue\n\n        if current_value is None:\n            current_value = default_value\n        else:\n            line_class_name += \" customized\"\n\n        row = table.tr(line_class_name)\n        heading = row.td(\"heading\")\n        heading.text(\"%s:\" % item)\n     
   value = row.td(\"value\", colspan=2)\n        value.preformatted()\n\n        options = None\n        optgroup = None\n        def addOption(value, name, selected=lambda value: value==current_value, **attributes):\n            (optgroup or options).option(\n                value=value, selected=\"selected\" if selected(value) else None,\n                **attributes).text(name)\n\n        if preference_type == \"boolean\":\n            value.input(\n                \"setting\", type=\"checkbox\", name=item,\n                checked=\"checked\" if current_value else None, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n        elif preference_type == \"integer\":\n            value.input(\n                \"setting\", type=\"number\", min=0, max=2**31 - 1,\n                name=item, value=current_value, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n        elif item == \"defaultRepository\":\n            page.utils.generateRepositorySelect(\n                db, user, value, allow_selecting_none=True,\n                placeholder_text=\"No default repository\", selected=current_value,\n                name=item, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n        elif item == \"defaultPage\":\n            options = value.select(\n                \"setting\", name=item, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n\n            addOption(\"home\", \"Home\")\n            addOption(\"dashboard\", \"Dashboard\")\n            addOption(\"branches\", \"Branches\")\n            addOption(\"config\", \"Config\")\n            addOption(\"tutorial\", \"Tutorial\")\n       
 elif item == \"email.urlType\":\n            cursor2 = db.cursor()\n            cursor2.execute(\"\"\"SELECT key, description, authenticated_scheme, hostname\n                                 FROM systemidentities\n                             ORDER BY description ASC\"\"\")\n\n            identities = cursor2.fetchall()\n            selected = set(current_value.split(\",\"))\n\n            options = value.select(\n                \"setting\", name=item, size=len(identities),\n                multiple=\"multiple\", disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n\n            for key, label, authenticated_scheme, hostname in identities:\n                prefix = \"%s://%s/\" % (authenticated_scheme, hostname)\n                addOption(key, label, selected=lambda value: value in selected,\n                          class_=\"url-type flex\",\n                          data_text=label,\n                          data_html=(\"<span class=label>%s</span>\"\n                                     \"<span class=prefix>%s</span>\"\n                                     % (htmlutils.htmlify(label),\n                                        htmlutils.htmlify(prefix))))\n\n        elif item == \"email.updatedReview.quotedComments\":\n            options = value.select(\n                \"setting\", name=item, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n\n            addOption(\"all\", \"All\")\n            addOption(\"first\", \"First\")\n            addOption(\"last\", \"Last\")\n            addOption(\"firstlast\", \"First & Last\")\n        elif item == \"timezone\":\n            options = value.select(\n                \"setting\", name=item, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                
critic_default=htmlutils.jsify(default_value))\n\n            for group, zones in dbutils.timezones.sortedTimezones(db):\n                optgroup = options.optgroup(label=group)\n                for name, abbrev, utc_offset in zones:\n                    seconds = utc_offset.total_seconds()\n                    offset = \"%s%02d:%02d\" % (\"-\" if seconds < 0 else \"+\", abs(seconds) / 3600, (abs(seconds) % 3600) / 60)\n                    addOption(\"%s/%s\" % (group, name), \"%s (%s / UTC%s)\" % (name, abbrev, offset))\n        elif item == \"repository.urlType\":\n            options = value.select(\n                \"setting\", name=item, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n            long_path = os.path.join(configuration.paths.GIT_DIR, \"<path>.git\")\n\n            if \"git\" in configuration.base.REPOSITORY_URL_TYPES:\n                addOption(\"git\", \"git://%s/<path>.git\" % configuration.base.HOSTNAME)\n            if \"http\" in configuration.base.REPOSITORY_URL_TYPES:\n                scheme = configuration.base.ACCESS_SCHEME\n                if scheme == \"both\":\n                    if user.isAnonymous():\n                        scheme = \"http\"\n                    else:\n                        scheme = \"https\"\n                addOption(\"http\", \"%s://%s/<path>.git\" % (scheme, configuration.base.HOSTNAME))\n            if \"ssh\" in configuration.base.REPOSITORY_URL_TYPES:\n                addOption(\"ssh\", \"ssh://%s%s\" % (configuration.base.HOSTNAME, long_path))\n            if \"host\" in configuration.base.REPOSITORY_URL_TYPES:\n                addOption(\"host\", \"%s:%s\" % (configuration.base.HOSTNAME, long_path))\n        else:\n            if item.startswith(\"email.subjectLine.\"):\n                placeholder = \"Email type disabled\"\n            else:\n                placeholder = None\n            
value.input(\n                \"setting\", type=\"text\", size=80, name=item,\n                placeholder=placeholder, value=current_value, disabled=disabled,\n                critic_current=htmlutils.jsify(current_value),\n                critic_default=htmlutils.jsify(default_value))\n\n        also_configurable_per = []\n\n        if per_repository and repository is None:\n            also_configurable_per.append(\"repository\")\n        if per_filter and filter_id is None:\n            also_configurable_per.append(\"filter\")\n\n        if also_configurable_per:\n            value.span(\"also-configurable-per\").text(\n                \"Also configurable per: %s\" % \", \".join(also_configurable_per))\n\n        reset = value.span(\"reset\")\n        reset.a(href=\"javascript:saveSettings(%s);\" % htmlutils.jsify(item)).text(\"[reset to default]\")\n\n        cell = table.tr(help_class_name).td(\"help\", colspan=3)\n\n        magic_description_links = {\n            \"format string for subject line\":\n                \"/tutorial?item=reconfigure#subject_line_formats\",\n            \"phony recipients\":\n                \"/tutorial?item=reconfigure#review_association_recipients\"\n            }\n\n        for link_text, link_href in magic_description_links.items():\n            prefix, link_text, suffix = description.partition(link_text)\n            if link_text:\n                cell.text(prefix)\n                cell.a(href=link_href).text(link_text)\n                cell.text(suffix)\n                break\n        else:\n            cell.text(description)\n\n    if injected and injected.has_key(\"preferences\") \\\n            and not defaults \\\n            and repository is None \\\n            and filter_id is None:\n        for extension_name, author, preferences in injected[\"preferences\"]:\n            if highlight is not None:\n                prefix = \"%s/%s\" % (author.name, extension_name)\n                preferences = [\n                   
 preference for preference in preferences\n                    if fnmatch.fnmatch(\"%s/%s\" % (prefix, preference[\"name\"]),\n                                       highlight)]\n\n                if not preferences:\n                    continue\n\n            h2 = table.tr(\"extension\").td(\"extension\", colspan=3).h2()\n            h2.span(\"name\").text(extension_name)\n            h2.text(\" by \")\n            h2.span(\"author\").text(author.fullname)\n\n            for preference in preferences:\n                preference_url = preference[\"url\"]\n                preference_name = preference[\"name\"]\n                preference_type = preference[\"type\"]\n                preference_value = preference[\"value\"]\n                preference_default = preference[\"default\"]\n                preference_description = preference[\"description\"]\n\n                line_class_name = \"line\"\n                help_class_name = \"help\"\n\n                if preference_value != preference_default:\n                    line_class_name += \" customized\"\n\n                row = table.tr(line_class_name)\n                heading = row.td(\"heading\")\n                heading.text(\"%s:\" % preference_name)\n                value = row.td(\"value\", colspan=2)\n                value.preformatted()\n\n                if preference_type == \"boolean\":\n                    value.input(\n                        \"setting\", type=\"checkbox\",\n                        name=preference_name, disabled=disabled,\n                        checked=\"checked\" if preference_value else None,\n                        critic_url=preference_url,\n                        critic_default=htmlutils.jsify(bool(preference_value)),\n                        critic_extension=extension_name)\n                elif preference_type == \"integer\":\n                    value.input(\n                        \"setting\", type=\"number\", min=0,\n                        name=preference_name, 
value=preference_value,\n                        disabled=disabled, critic_url=preference_url,\n                        critic_default=htmlutils.jsify(preference_default),\n                        critic_extension=extension_name)\n                elif preference_type == \"string\":\n                    value.input(\n                        \"setting\", type=\"text\",\n                        name=preference_name, value=preference_value,\n                        disabled=disabled, critic_url=preference_url,\n                        critic_default=htmlutils.jsify(preference_default),\n                        critic_extension=extension_name)\n                else:\n                    select = value.select(\n                        \"setting\", name=preference_name,\n                        disabled=disabled, critic_url=preference_url,\n                        critic_value=preference_value,\n                        critic_default=htmlutils.jsify(preference_default),\n                        critic_extension=extension_name)\n\n                    for choice in preference_type:\n                        select.option(value=choice[\"value\"], selected=\"selected\" if preference_value == choice[\"value\"] else None).text(choice[\"title\"])\n\n                cell = table.tr(help_class_name).td(\"help\", colspan=3)\n                cell.text(preference_description)\n\n    critic_installed_sha1 = dbutils.getInstalledSHA1(db)\n    div = body.div(\"installed_sha1\")\n    div.text(\"Critic version: \")\n    div.a(href=\"https://critic-review.org/critic/%s\" % critic_installed_sha1).text(critic_installed_sha1)\n\n    return document\n"
  },
  {
    "path": "src/page/confirmmerge.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport gitutils\n\nimport page.utils\nimport log.html\nimport log.commitset\n\ndef renderConfirmMerge(req, db, user):\n    confirmation_id = req.getParameter(\"id\", filter=int)\n    tail_sha1 = req.getParameter(\"tail\", None)\n    do_confirm = req.getParameter(\"confirm\", \"no\") == \"yes\"\n    do_cancel = req.getParameter(\"cancel\", \"no\") == \"yes\"\n\n    cursor = db.cursor()\n\n    cursor.execute(\"SELECT review, uid, merge, confirmed, tail FROM reviewmergeconfirmations WHERE id=%s\", (confirmation_id,))\n    row = cursor.fetchone()\n    if not row:\n        raise page.utils.DisplayMessage(\"No pending merge with that id.\")\n    review_id, user_id, merge_id, confirmed, tail_id = row\n\n    review = dbutils.Review.fromId(db, review_id)\n    merge = gitutils.Commit.fromId(db, review.repository, merge_id)\n\n    if confirmed and tail_id is not None:\n        tail_sha1 = gitutils.Commit.fromId(db, review.repository, tail_id).sha1\n\n    cursor.execute(\"SELECT merged FROM reviewmergecontributions WHERE id=%s\", (confirmation_id,))\n\n    merged = [gitutils.Commit.fromId(db, review.repository, merged_id) for (merged_id,) in cursor]\n    merged_set = log.commitset.CommitSet(merged)\n\n    if tail_sha1 is not None:\n        tail = gitutils.Commit.fromSHA1(db, 
review.repository, tail_sha1)\n        tail_id = tail.getId(db)\n\n        cut = [gitutils.Commit.fromSHA1(db, review.repository, sha1)\n               for sha1 in tail.parents if sha1 in merged_set]\n        merged_set = merged_set.without(cut)\n        merged = list(merged_set)\n    else:\n        tail_id = None\n\n    if do_confirm:\n        cursor.execute(\"UPDATE reviewmergeconfirmations SET confirmed=TRUE, tail=%s WHERE id=%s\", (tail_id, confirmation_id))\n        db.commit()\n    elif do_cancel:\n        cursor.execute(\"DELETE FROM reviewmergeconfirmations WHERE id=%s\", (confirmation_id,))\n        db.commit()\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    def renderButtons(target):\n        if not do_confirm and not do_cancel:\n            target.button(\"confirmAll\").text(\"Confirm (merge + contributed)\")\n            target.button(\"confirmNone\").text(\"Confirm (merge only)\")\n            target.button(\"cancel\").text(\"Cancel\")\n\n    page.utils.generateHeader(body, db, user, renderButtons, extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    document.addExternalStylesheet(\"resource/confirmmerge.css\")\n    document.addExternalScript(\"resource/log.js\")\n    document.addExternalScript(\"resource/confirmmerge.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(review.getJS())\n    document.addInternalScript(\"var confirmation_id = %d;\" % confirmation_id)\n    document.addInternalScript(\"var merge_sha1 = %s;\" % htmlutils.jsify(merge.sha1))\n\n    if tail_sha1 is not None:\n        document.addInternalScript(\"var tail_sha1 = %s;\" % htmlutils.jsify(tail_sha1))\n\n    if not do_confirm and not do_cancel:\n        heads = merged_set.getHeads()\n        if heads:\n            document.addInternalScript(\"var automaticAnchorCommit = %s;\" % htmlutils.jsify(heads.pop().sha1))\n        else:\n            
document.addInternalScript(\"var automaticAnchorCommit = null;\")\n\n    if do_confirm:\n        document.addInternalScript(\"var confirmed = true;\")\n    else:\n        document.addInternalScript(\"var confirmed = false;\")\n\n    target = body.div(\"main\")\n\n    basic = target.table(\"paleyellow\")\n    basic.col(width='15%')\n    basic.col(width='55%')\n    basic.col(width='30%')\n    h1 = basic.tr().td('h1', colspan=3).h1()\n\n    if do_confirm:\n        h1.text(\"CONFIRMED MERGE\")\n    elif do_cancel:\n        h1.text(\"CANCELLED MERGE\")\n    else:\n        h1.text(\"Confirm Merge\")\n\n    row = basic.tr(\"sha1\")\n    row.td(\"heading\").text(\"SHA-1:\")\n    row.td(\"value\").preformatted().text(merge.sha1)\n    row.td().text()\n\n    row = basic.tr(\"message\")\n    row.td(\"heading\").text(\"Message:\")\n    row.td(\"value\").preformatted().text(merge.message)\n    row.td().text()\n\n    if not do_confirm and not do_cancel:\n        row = basic.tr(\"instructions\")\n        row.td(\"heading\").text(\"Instructions:\")\n        row.td(\"value\").preformatted().text(\"\"\"\\\nUse the top right buttons to confirm the merge with or without the contributed commits that it brings.\n\nBy clicking 'Confirm (merge + contributed)' you will bring the merge commit plus all commits that it contributes into this code review.\n\nBy clicking 'Confirm (merge only)' you will bring only the merge commit itself into the code review and not the contributed commits.\n\nBy clicking 'Cancel' you will abort the merge. The code review will not be modified at all from its current state.\"\"\")\n        row.td().text()\n\n\n    if merged:\n        columns = [(10, log.html.WhenColumn()),\n                   (60, log.html.SummaryColumn()),\n                   (16, log.html.AuthorColumn())]\n\n        log.html.render(db, target, \"Contributed Commits\", commits=merged, columns=columns)\n\n    return document\n"
  },
  {
    "path": "src/page/createreview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport itertools\nimport re\n\nimport auth\nimport dbutils\nimport gitutils\nimport reviewing.utils\nimport log.html\nimport htmlutils\nimport page.utils\nimport diff\nimport configuration\nimport linkify\nimport changeset.utils as changeset_utils\n\nfrom textutils import json_decode, json_encode\n\ndef generateReviewersAndWatchersTable(db, repository, target, all_reviewers, all_watchers, applyfilters=True, applyparentfilters=False):\n    cursor = db.cursor()\n    teams = reviewing.utils.collectReviewTeams(all_reviewers)\n\n    reviewers = set()\n    watchers = set()\n\n    for file_id, file_reviewers in all_reviewers.items():\n        for user_id in file_reviewers:\n            reviewers.add(user_id)\n\n    for file_id, file_watchers in all_watchers.items():\n        for user_id in file_watchers:\n            if user_id not in reviewers:\n                watchers.add(user_id)\n\n    table = target.table(\"filters paleyellow\", align=\"center\")\n    table.tr().td(\"h1\", colspan=3).h1().text(\"Reviewers and Watchers\")\n\n    row = table.tr(\"applyfilters\")\n    row.td(\"value\").input(\"applyfilters\", type=\"checkbox\", checked=(\"checked\" if applyfilters else None))\n    row.td(\"legend\", colspan=2).text(\"Apply global filters. 
Only disable this in unofficial reviews!\")\n\n    table.tr(\"watchers\").td(\"spacer\", colspan=3)\n\n    if repository.parent and applyfilters:\n        parent = repository.parent\n        parents = []\n\n        while parent:\n            parents.append(parent.name)\n            parent = parent.parent\n\n        if len(parents) == 1: parents = \"repository (%s)\" % parents[0]\n        else: parents = \"repositories (%s)\" % \", \".join(parents)\n\n        row = table.tr(\"applyfilters\")\n        row.td(\"value\").input(\"applyparentfilters\", type=\"checkbox\", checked=(\"checked\" if applyparentfilters else None))\n        row.td(\"legend\", colspan=2).text(\"Apply global filters from upstream %s.\" % parents)\n\n        table.tr(\"watchers\").td(\"spacer\", colspan=3)\n\n    def formatFiles(files):\n        return diff.File.eliminateCommonPrefixes(sorted([dbutils.describe_file(db, file_id) for file_id in files]))\n\n    for team in teams:\n        if team is not None:\n            row = table.tr(\"reviewers\")\n\n            cell = row.td(\"reviewers\")\n            users = sorted([dbutils.User.fromId(db, user_id).fullname for user_id in team])\n            for user in users: cell.text(user).br()\n            row.td(\"willreview\").innerHTML(\"will&nbsp;review\")\n\n            cell = row.td(\"files\")\n            for file in formatFiles(teams[team]):\n                cell.span(\"file\").innerHTML(file).br()\n\n    if None in teams:\n        row = table.tr(\"reviewers\")\n        row.td(\"no-one\", colspan=2).text(\"No reviewers found for changes in\")\n\n        cell = row.td(\"files\")\n        for file in formatFiles(teams[None]):\n            cell.span(\"file\").innerHTML(file).br()\n\n    if watchers:\n        table.tr(\"watchers\").td(\"spacer\", colspan=3)\n\n        row = table.tr(\"watchers\")\n        row.td(\"heading\", colspan=2).text(\"Watchers\")\n        cell = row.td(\"watchers\")\n        for user_id in watchers: 
cell.text(dbutils.User.fromId(db, user_id).fullname).br()\n\n    table.tr(\"buttons\").td(\"spacer\", colspan=3)\n\n    buttons = table.tr(\"buttons\").td(\"buttons\", colspan=3)\n    buttons.button(onclick=\"addReviewer();\").text(\"Add Reviewer\")\n    buttons.button(onclick=\"addWatcher();\").text(\"Add Watcher\")\n\ndef renderSelectSource(req, db, user):\n    cursor = db.cursor()\n\n    document = htmlutils.Document(req)\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, current_page=\"createreview\")\n\n    document.addExternalStylesheet(\"resource/createreview.css\")\n    document.addExternalScript(\"resource/createreview.js\")\n    document.addExternalScript(\"resource/autocomplete.js\")\n\n    document.addInternalScript(user.getJS(db))\n    document.setTitle(\"Create Review\")\n\n    target = body.div(\"main\")\n    table = page.utils.PaleYellowTable(target, \"Create Review\")\n    table.titleRight.innerHTML(\"Step 1\")\n\n    default_repository = user.getDefaultRepository(db)\n    default_remotes = {}\n    default_branches = {}\n\n    def renderLocalRepository(target):\n        page.utils.generateRepositorySelect(db, user, target, access_type=\"modify\")\n\n        cursor.execute(\"\"\"SELECT repositories.id, repositories.name, repositories.path\n                            FROM repositories\n                        ORDER BY repositories.id\"\"\")\n\n        for repository_id, name, path in cursor.fetchall():\n            def findRemote(local_name):\n                cursor.execute(\"\"\"SELECT remote\n                                    FROM trackedbranches\n                                   WHERE repository=%s\n                                     AND local_name=%s\"\"\",\n                               (repository_id, local_name))\n                row = cursor.fetchone()\n                if row:\n                    return row[0]\n\n            try:\n                repository = 
gitutils.Repository.fromId(db, repository_id)\n            except auth.AccessDenied:\n                continue\n\n            remote = branch_name = None\n\n            for branch in repository.getSignificantBranches(db):\n                remote = findRemote(branch.name)\n                if remote:\n                    branch_name = branch.name\n                    break\n\n            if not remote:\n                remote = findRemote(\"*\")\n\n            default_remotes[name] = remote\n            default_branches[name] = branch_name\n\n        document.addInternalScript(\"var default_remotes = %s;\" % json_encode(default_remotes))\n        document.addInternalScript(\"var default_branches = %s;\" % json_encode(default_branches))\n\n    def renderRemoteRepository(target):\n        value = default_remotes.get(default_repository.name) if default_repository else None\n        target.input(\"remote\", value=value)\n\n    def renderWorkBranch(target):\n        target.text(\"refs/heads/\")\n        target.input(\"workbranch\")\n\n    def renderUpstreamCommit(target):\n        default_branch = default_branches.get(default_repository.name) if default_repository else None\n        target.input(\"upstreamcommit\", value=(\"refs/heads/%s\" % default_branch) if default_branch else \"\")\n\n    table.addItem(\"Local Repository\", renderLocalRepository, \"Critic repository to create review in.\")\n    table.addItem(\"Remote Repository\", renderRemoteRepository, \"Remote repository to fetch commits from.\")\n    table.addItem(\"Work Branch\", renderWorkBranch, \"Work branch (in remote repository) containing commits to create review of.\")\n    table.addItem(\"Upstream Commit\", renderUpstreamCommit, \"Upstream commit from which the work branch was branched.\")\n\n    def renderButtons(target):\n        target.button(\"fetchbranch\").text(\"Fetch Branch\")\n\n    table.addCentered(renderButtons)\n\n    return document\n\ndef renderCreateReview(req, db, user):\n    if 
user.isAnonymous(): raise page.utils.NeedLogin(req)\n\n    repository = req.getParameter(\"repository\", filter=gitutils.Repository.FromParameter(db), default=None)\n    applyparentfilters = req.getParameter(\"applyparentfilters\", \"yes\" if user.getPreference(db, 'review.applyUpstreamFilters') else \"no\") == \"yes\"\n\n    cursor = db.cursor()\n\n    if req.method == \"POST\":\n        data = json_decode(req.read())\n\n        summary = data.get(\"summary\")\n        description = data.get(\"description\")\n        review_branch_name = data.get(\"review_branch_name\")\n        commit_ids = data.get(\"commit_ids\")\n        commit_sha1s = data.get(\"commit_sha1s\")\n    else:\n        summary = req.getParameter(\"summary\", None)\n        description = req.getParameter(\"description\", None)\n        review_branch_name = req.getParameter(\"reviewbranchname\", None)\n\n        commit_ids = None\n        commit_sha1s = None\n\n        commits_arg = req.getParameter(\"commits\", None)\n        remote = req.getParameter(\"remote\", None)\n        upstream = req.getParameter(\"upstream\", \"master\")\n        branch_name = req.getParameter(\"branch\", None)\n\n        if commits_arg:\n            try: commit_ids = map(int, commits_arg.split(\",\"))\n            except: commit_sha1s = [repository.revparse(ref) for ref in commits_arg.split(\",\")]\n        elif branch_name:\n            cursor.execute(\"\"\"SELECT commit\n                                FROM reachable\n                                JOIN branches ON (branch=id)\n                               WHERE repository=%s\n                                 AND name=%s\"\"\",\n                           (repository.id, branch_name))\n            commit_ids = [commit_id for (commit_id,) in cursor]\n\n            if len(commit_ids) > configuration.limits.MAXIMUM_REVIEW_COMMITS:\n                raise page.utils.DisplayMessage(\n                    \"Too many commits!\",\n                    ((\"<p>The branch 
<code>%s</code> contains %d commits.  Reviews can \"\n                      \"be created from branches that contain at most %d commits.</p>\"\n                      \"<p>This limit can be adjusted by modifying the system setting \"\n                      \"<code>configuration.limits.MAXIMUM_REVIEW_COMMITS</code>.</p>\")\n                     % (htmlutils.htmlify(branch_name), len(commit_ids),\n                        configuration.limits.MAXIMUM_REVIEW_COMMITS)),\n                    html=True)\n        else:\n            return renderSelectSource(req, db, user)\n\n    req.content_type = \"text/html; charset=utf-8\"\n\n    if commit_ids:\n        commits = [gitutils.Commit.fromId(db, repository, commit_id) for commit_id in commit_ids]\n    elif commit_sha1s:\n        commits = [gitutils.Commit.fromSHA1(db, repository, commit_sha1) for commit_sha1 in commit_sha1s]\n    else:\n        commits = []\n\n    if not commit_ids:\n        commit_ids = [commit.getId(db) for commit in commits]\n    if not commit_sha1s:\n        commit_sha1s = [commit.sha1 for commit in commits]\n\n    if summary is None:\n        if len(commits) == 1:\n            summary = commits[0].summary()\n        else:\n            summary = \"\"\n\n    if review_branch_name:\n        invalid_branch_name = \"false\"\n        default_branch_name = review_branch_name\n    else:\n        invalid_branch_name = htmlutils.jsify(user.name + \"/\")\n        default_branch_name = user.name + \"/\"\n\n        match = re.search(\"(?:^|[Ff]ix(?:e[ds])?(?: +for)?(?: +bug)? 
+)([A-Z][A-Z0-9]+-[0-9]+)\", summary)\n        if match:\n            invalid_branch_name = \"false\"\n            default_branch_name = htmlutils.htmlify(match.group(1))\n\n    changesets = []\n    changeset_utils.createChangesets(db, repository, commits)\n    for commit in commits:\n        changesets.extend(changeset_utils.createChangeset(db, None, repository, commit, do_highlight=False))\n    changeset_ids = [changeset.id for changeset in changesets]\n\n    all_reviewers, all_watchers = reviewing.utils.getReviewersAndWatchers(\n        db, repository, changesets=changesets, applyparentfilters=applyparentfilters)\n\n    document = htmlutils.Document(req)\n    html = document.html()\n    head = html.head()\n\n    document.addInternalScript(user.getJS(db))\n\n    if branch_name:\n        document.addInternalScript(\"var fromBranch = %s;\" % htmlutils.jsify(branch_name))\n\n    if remote:\n        document.addInternalScript(\"var trackedbranch = { remote: %s, name: %s };\" % (htmlutils.jsify(remote), htmlutils.jsify(branch_name)))\n\n    head.title().text(\"Create Review\")\n\n    body = html.body(onload=\"document.getElementById('branch_name').focus()\")\n\n    page.utils.generateHeader(body, db, user, lambda target: target.button(onclick=\"submitReview();\").text(\"Submit Review\"))\n\n    document.addExternalStylesheet(\"resource/createreview.css\")\n    document.addExternalScript(\"resource/createreview.js\")\n    document.addExternalScript(\"resource/reviewfilters.js\")\n    document.addExternalScript(\"resource/autocomplete.js\")\n\n    document.addInternalScript(\"\"\"\nvar invalid_branch_name = %s;\nvar review_data = { commit_ids: %r,\n                    commit_sha1s: %r,\n                    changeset_ids: %r };\"\"\" % (invalid_branch_name,\n                                               commit_ids,\n                                               commit_sha1s,\n                                               changeset_ids))\n    
document.addInternalScript(repository.getJS())\n\n    main = body.div(\"main\")\n\n    table = main.table(\"basic paleyellow\", align=\"center\")\n    table.tr().td(\"h1\", colspan=3).h1().text(\"Create Review\")\n\n    row = table.tr(\"line\")\n    row.td(\"heading\").text(\"Branch Name:\")\n    row.td(\"value\").text(\"r/\").input(\"value\", id=\"branch_name\", value=default_branch_name)\n    row.td(\"status\")\n\n    row = table.tr()\n\n    if not remote:\n        row.td(\"help\", colspan=3).div().text(\"\"\"\\\nThis is the main identifier of the review.  It will be created in the review\nrepository to contain the commits below.  Reviewers can fetch it from there, and\nadditional commits can be added to the review later by pushing them to this\nbranch in the review repository.\"\"\")\n    else:\n        row.td(\"help\", colspan=3).div().text(\"\"\"\\\nThis is the main identifier of the review.  It will be created in the review\nrepository to contain the commits below, and reviewers can fetch it from there.\"\"\")\n\n    if remote:\n        row = table.tr(\"line\")\n        row.td(\"heading\").text(\"Tracked Branch:\")\n        value = row.td(\"value\")\n        value.code(\"branch inset\").text(branch_name, linkify=linkify.Context(remote=remote))\n        value.text(\" in \")\n        value.code(\"remote inset\").text(remote, linkify=linkify.Context())\n        row.td(\"status\")\n\n        row = table.tr()\n        row.td(\"help\", colspan=3).div().text(\"\"\"\\\nRather than pushing directly to the review branch in Critic's repository to add\ncommits to the review, you will be pushing to this branch (in a separate\nrepository,) from which Critic will fetch commits and add them to the review\nautomatically.\"\"\")\n\n    row = table.tr(\"line\")\n    row.td(\"heading\").text(\"Summary:\")\n    row.td(\"value\").input(\"value\", id=\"summary\", value=summary)\n    row.td(\"status\")\n\n    row = table.tr()\n    row.td(\"help\", colspan=3).div().text(\"\"\"\\\nThe 
summary should be a short summary of the changes in the review.  It will\nappear in the subject of all emails sent about the review.\n\"\"\")\n\n    row = table.tr(\"line description\")\n    row.td(\"heading\").text(\"Description:\")\n    textarea = row.td(\"value\").textarea(id=\"description\", rows=12)\n    textarea.preformatted()\n    if description: textarea.text(description)\n    row.td(\"status\")\n\n    row = table.tr()\n    row.td(\"help\", colspan=3).div().text(\"\"\"\\\nThe description should describe the changes to be reviewed.  It is usually fine\nto leave the description empty, since the commit messages are also available in\nthe review.\n\"\"\")\n\n    generateReviewersAndWatchersTable(db, repository, main, all_reviewers, all_watchers, applyparentfilters=applyparentfilters)\n\n    row = table.tr(\"line recipients\")\n    row.td(\"heading\").text(\"Recipient List:\")\n    cell = row.td(\"value\", colspan=2).preformatted()\n    cell.span(\"mode\").text(\"Everyone\")\n    cell.span(\"users\")\n    cell.text(\".\")\n    buttons = cell.div(\"buttons\")\n    buttons.button(onclick=\"editRecipientList();\").text(\"Edit Recipient List\")\n\n    row = table.tr()\n    row.td(\"help\", colspan=3).div().text(\"\"\"\\\nThe basic recipient list for e-mails sent about the review.\n\"\"\")\n\n    log.html.render(db, main, \"Commits\", commits=commits)\n\n    return document\n"
  },
  {
    "path": "src/page/createuser.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport htmlutils\nimport page\nimport auth\nimport dbutils\nimport configuration\n\nfrom page.parameters import Optional\n\nclass CreateUserHandler(page.Page.Handler):\n    def __init__(self, target=None, username=None, email=None, fullname=None,\n                 provider=None, account=None, token=None):\n        super(CreateUserHandler, self).__init__()\n\n        self.target = target\n        self.username = username\n        self.email = email\n        self.fullname = fullname\n        self.provider = provider\n        self.account = account\n        self.token = token\n\n    def generateHeader(self):\n        self.document.addExternalStylesheet(\"resource/createuser.css\")\n        self.document.addExternalScript(\"resource/createuser.js\")\n\n        if self.target:\n            self.document.addInternalScript(\n                \"var target = %s;\" % htmlutils.jsify(self.target))\n\n    def generateContent(self):\n        table = page.utils.PaleYellowTable(self.body, \"Create user\")\n\n        if self.provider and self.token and self.provider in auth.PROVIDERS:\n            provider = auth.PROVIDERS[self.provider]\n            if not provider.validateToken(self.db, self.account, self.token):\n                raise page.utils.DisplayMessage(\"Invalid OAuth2 token\")\n            
allow_user_registration = \\\n                provider.configuration.get(\"allow_user_registration\", False)\n        else:\n            provider = None\n            allow_user_registration = configuration.base.ALLOW_USER_REGISTRATION\n\n        if not allow_user_registration:\n            administrators = dbutils.getAdministratorContacts(\n                self.db, as_html=True)\n            raise page.utils.DisplayMessage(\n                title=\"User registration not enabled\",\n                body=((\"<p>The administrator of this system has not enabled \"\n                       \"registration of new users.</p>\"\n                       \"<p>Contact %s if you want to use this system.</p>\")\n                      % administrators),\n                html=True)\n\n        def render(target):\n            table = target.table(\"createuser\", align=\"center\")\n\n            def header(label):\n                row = table.tr(\"header\")\n                row.td(colspan=2).text(label)\n\n            def item(key):\n                row = table.tr(\"item\")\n                row.td(\"key\").text(\"%s:\" % key)\n                return row.td(\"value\")\n\n            def button(class_name):\n                row = table.tr(\"button\")\n                return row.td(colspan=2).button(class_name)\n\n            def separator():\n                table.tr(\"separator1\").td(colspan=2)\n                table.tr(\"separator2\").td(colspan=2)\n\n            if provider:\n                self.document.addInternalScript(\n                    \"var external = { provider: %s, account: %s, token: %s };\"\n                    % (htmlutils.jsify(self.provider),\n                       htmlutils.jsify(self.account),\n                       htmlutils.jsify(self.token)))\n\n                url = provider.getAccountURL(self.account)\n                item(provider.getTitle()).a(\"external\", href=url).text(self.account)\n                separator()\n            else:\n                
self.document.addInternalScript(\"var external = null;\")\n\n            message = table.tr(\"status disabled\").td(colspan=2).div(\"message\")\n\n            if self.username:\n                try:\n                    dbutils.User.fromName(self.db, self.username)\n                except dbutils.NoSuchUser:\n                    try:\n                        auth.validateUserName(self.username)\n                    except auth.InvalidUserName as error:\n                        message.u(\"Invalid user name\")\n                        message.br()\n                        message.text(str(error))\n                else:\n                    message.text(\"A user named '%s' already exists!\"\n                                 % self.username)\n\n            item(\"New user name\").input(id=\"newusername\", value=self.username, size=40)\n            item(\"Display name\").input(id=\"fullname\", value=self.fullname, size=40)\n            item(\"Email\").input(id=\"email\", value=self.email, size=40)\n\n            if not provider:\n                separator()\n\n                item(\"Password\").input(id=\"password1\", type=\"password\", size=40)\n                item(\"Password (again)\").input(id=\"password2\", type=\"password\", size=40)\n\n            button(\"create\").text(\"Create user\")\n\n        table.addCentered(render)\n\nclass CreateUser(page.Page):\n    def __init__(self):\n        super(CreateUser, self).__init__(\"createuser\",\n                                         { \"target\": Optional(str),\n                                           \"username\": Optional(str),\n                                           \"email\": Optional(str),\n                                           \"fullname\": Optional(str),\n                                           \"provider\": Optional(str),\n                                           \"account\": Optional(str),\n                                           \"token\": Optional(str) },\n                               
          CreateUserHandler)\n"
  },
  {
    "path": "src/page/dashboard.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport htmlutils\nimport profiling\nimport page.utils\nimport auth\n\ndef renderDashboard(req, db, user):\n    if user.isAnonymous(): default_show = \"open\"\n    else: default_show = user.getPreference(db, \"dashboard.defaultGroups\")\n\n    show = req.getParameter(\"show\", default_show)\n\n    if user.isAnonymous():\n        def possible(group):\n            return group in (\"open\", \"closed\")\n    else:\n        def possible(group):\n            return True\n\n    showlist = filter(possible, show.split(\",\"))\n    showset = set(showlist)\n\n    if user.getPreference(db, \"commit.diff.compactMode\"): default_compact = \"yes\"\n    else: default_compact = \"no\"\n\n    repository_arg = req.getParameter(\"repository\", None)\n    repository = gitutils.Repository.fromParameter(db, repository_arg) if repository_arg else None\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n\n    cursor = db.cursor()\n\n    profiler = profiling.Profiler()\n    document = htmlutils.Document(req)\n\n    document.setTitle(\"Dashboard\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    def generateRight(target):\n        def addLink(key, title=None):\n            if not title: title = key\n            if key not in showset:\n   
             target.text(\"[\")\n                target.a(href=\"dashboard?show=%s\" % \",\".join(showlist + [key])).text(\"show %s\" % title)\n                target.text(\"]\")\n\n        if user.isAnonymous():\n            addLink(\"open\", \"open\")\n            addLink(\"closed\")\n        else:\n            target.text(\"[\")\n            target.a(href=\"config?highlight=dashboard.defaultGroups\").text(\"configure defaults\")\n            target.text(\"]\")\n\n            addLink(\"owned\")\n            addLink(\"draft\")\n            addLink(\"active\")\n            addLink(\"watched\")\n            addLink(\"open\", \"other open\")\n            addLink(\"closed\")\n\n    page.utils.generateHeader(body, db, user, current_page=\"dashboard\", generate_right=generateRight, profiler=profiler)\n\n    document.addExternalStylesheet(\"resource/dashboard.css\")\n    document.addExternalScript(\"resource/dashboard.js\")\n    document.addInternalScript(user.getJS())\n\n    target = body.div(\"main\")\n\n    def flush(target):\n        return document.render(stop=target, pretty=not compact)\n\n    def includeReview(review_id):\n        if repository:\n            cursor = db.cursor()\n            cursor.execute(\"SELECT branches.repository FROM branches JOIN reviews ON (reviews.branch=branches.id) WHERE reviews.id=%s\", (review_id,))\n            return cursor.fetchone()[0] == repository.id\n        else:\n            return True\n\n    def sortedReviews(data):\n        reviews = []\n        for review_id in sorted(data.keys()):\n            reviews.append((review_id, data[review_id]))\n        return reviews\n\n    def isAccepted(review_ids):\n        cursor.execute(\"\"\"SELECT reviews.id, COUNT(reviewfiles.id)=0 AND COUNT(commentchains.id)=0\n                            FROM reviews\n                 LEFT OUTER JOIN reviewfiles ON (reviewfiles.review=reviews.id\n                                             AND reviewfiles.state='pending')\n                 LEFT 
OUTER JOIN commentchains ON (commentchains.review=reviews.id\n                                               AND commentchains.type='issue'\n                                               AND commentchains.state='open')\n                           WHERE reviews.id=ANY (%s)\n                        GROUP BY reviews.id\"\"\",\n                       (review_ids,))\n\n        return dict(cursor)\n\n    checked_repositories = {}\n    def accessRepository(repository_id):\n        already_checked = checked_repositories.get(repository_id)\n        if already_checked is not None:\n            return already_checked\n        is_allowed = auth.AccessControlProfile.isAllowedRepository(\n            db.profiles, \"read\", repository_id)\n        checked_repositories[repository_id] = is_allowed\n        return is_allowed\n\n    def renderReviews(target, reviews, lines_and_comments=True, links=True):\n        cursor.execute(\"SELECT id, repository, name FROM branches WHERE id=ANY (%s)\",\n                       (list(branch_id for _, (_, branch_id, _, _) in reviews),))\n\n        branch_data = { branch_id: (repository_id, name)\n                        for branch_id, repository_id, name in cursor }\n\n        for review_id, (summary, branch_id, lines, comments) in reviews:\n            repository_id, branch_name = branch_data[branch_id]\n            if not accessRepository(repository_id):\n                continue\n            row = target.tr(\"review\")\n            row.td(\"name\").text(branch_name)\n            row.td(\"title\").a(href=\"r/%d\" % review_id).text(summary)\n\n            if lines_and_comments:\n                if lines:\n                    if links:\n                        row.td(\"lines\").a(href=\"showcommit?review=%d&filter=pending\" % review_id).text(\"%d lines\" % (sum(lines)))\n                    else:\n                        row.td(\"lines\").text(\"%d lines\" % (sum(lines)))\n                else: row.td(\"lines\").text()\n                if 
comments:\n                    if links:\n                        row.td(\"comments\").a(href=\"showcomments?review=%s&filter=toread\" % review_id).text(\"%d comment%s\" % (comments, \"s\" if comments > 1 else \"\"))\n                    else:\n                        row.td(\"comments\").text(\"%d comment%s\" % (comments, \"s\" if comments > 1 else \"\"))\n                else: row.td(\"comments\").text()\n\n    def hidden(what):\n        new_show = \",\".join(filter(lambda item: item != what, showlist))\n        if new_show: return \"dashboard?show=%s\" % new_show\n        else: return \"dashboard\"\n\n    profiler.check(\"generate: prologue\")\n\n    def renderOwned():\n        owned_accepted = []\n        owned_open = []\n\n        cursor.execute(\"\"\"SELECT id, summary, branch\n                            FROM reviews\n                            JOIN reviewusers ON (review=id AND reviewusers.owner)\n                           WHERE state='open'\n                             AND uid=%s\n                        ORDER BY id DESC\"\"\",\n                       (user.id,))\n\n        owned = cursor.fetchall()\n\n        profiler.check(\"query: owned\")\n\n        is_accepted = isAccepted(list(review_id for review_id, _, _ in owned))\n\n        for review_id, summary, branch_id in owned:\n            if includeReview(review_id):\n                if is_accepted[review_id]:\n                    owned_accepted.append((review_id, (summary, branch_id, None, None)))\n                else:\n                    owned_open.append((review_id, (summary, branch_id, None, None)))\n\n        profiler.check(\"processing: owned\")\n\n        if owned_accepted or owned_open:\n            table = target.table(\"paleyellow reviews\", id=\"owned\", align=\"center\", cellspacing=0)\n            table.col(width=\"15%\")\n            table.col(width=\"55%\")\n            table.col(width=\"15%\")\n            table.col(width=\"15%\")\n            header = table.tr().td(\"h1\", 
colspan=4).h1()\n            header.text(\"Owned By You\")\n            header.span(\"right\").a(href=hidden(\"owned\")).text(\"[hide]\")\n\n            if owned_accepted:\n                table.tr(id=\"accepted\").td(\"h2\", colspan=4).h2().text(\"Accepted\")\n                renderReviews(table, owned_accepted)\n\n            if owned_open:\n                table.tr(id=\"open\").td(\"h2\", colspan=4).h2().text(\"Pending\")\n                renderReviews(table, owned_open)\n\n            profiler.check(\"generate: owned\")\n            return True\n\n    def renderDraft():\n        draft_changes = {}\n        draft_comments = {}\n        draft_both = {}\n\n        cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                            FROM reviews\n                            JOIN reviewfiles ON (reviewfiles.review=reviews.id)\n                            JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                           WHERE reviews.state='open'\n                             AND reviewfiles.state=reviewfilechanges.from_state\n                             AND reviewfilechanges.state='draft'\n                             AND reviewfilechanges.uid=%s\n                        GROUP BY reviews.id, reviews.summary, reviews.branch\"\"\",\n                       (user.id,))\n\n        profiler.check(\"query: draft lines\")\n\n        for review_id, summary, branch_id, deleted_count, inserted_count in cursor:\n            if includeReview(review_id):\n                draft_changes[review_id] = (summary, branch_id, (deleted_count, inserted_count), None)\n\n        profiler.check(\"processing: draft lines\")\n\n        cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch, COUNT(comments.id)\n                            FROM reviews\n                            JOIN commentchains ON (commentchains.review=reviews.id)\n                            
JOIN comments ON (comments.chain=commentchains.id)\n                           WHERE comments.state='draft'\n                             AND comments.uid=%s\n                        GROUP BY reviews.id, reviews.summary, reviews.branch\"\"\",\n                       [user.id])\n\n        profiler.check(\"query: draft comments\")\n\n        for review_id, summary, branch_id, comments_count in cursor:\n            if includeReview(review_id):\n                if draft_changes.has_key(review_id):\n                    draft_both[review_id] = (summary, branch_id, draft_changes[review_id][2], comments_count)\n                    del draft_changes[review_id]\n                else:\n                    draft_comments[review_id] = (summary, branch_id, None, comments_count)\n\n        profiler.check(\"processing: draft comments\")\n\n        if draft_both or draft_changes or draft_comments:\n            table = target.table(\"paleyellow reviews\", id=\"draft\", align=\"center\", cellspacing=0)\n            table.col(width=\"15%\")\n            table.col(width=\"55%\")\n            table.col(width=\"15%\")\n            table.col(width=\"15%\")\n            header = table.tr().td(\"h1\", colspan=4).h1()\n            header.text(\"Reviews With Unsubmitted Work\")\n            header.span(\"right\").a(href=hidden(\"draft\")).text(\"[hide]\")\n\n            if draft_both:\n                table.tr(id=\"draft-changes-comments\").td(\"h2\", colspan=4).h2().text(\"Draft Changes And Comments\")\n                renderReviews(table, sortedReviews(draft_both), links=False)\n\n            if draft_changes:\n                table.tr(id=\"draft-changes\").td(\"h2\", colspan=4).h2().text(\"Draft Changes\")\n                renderReviews(table, sortedReviews(draft_changes), links=False)\n\n            if draft_comments:\n                table.tr(id=\"draft-comments\").td(\"h2\", colspan=4).h2().text(\"Draft Comments\")\n                renderReviews(table, sortedReviews(draft_comments), 
links=False)\n\n            profiler.check(\"generate: draft\")\n            return True\n\n    active = {}\n\n    def fetchActive():\n        if not active:\n            with_changes = {}\n            with_comments = {}\n            with_both = {}\n\n            cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                                FROM reviews\n                                JOIN reviewusers ON (reviewusers.review=reviews.id\n                                                 AND reviewusers.uid=%s)\n                                JOIN reviewfiles ON (reviewfiles.review=reviews.id)\n                                JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id\n                                                     AND reviewuserfiles.uid=%s)\n                               WHERE reviews.state='open'\n                                 AND reviewfiles.state='pending'\n                            GROUP BY reviews.id, reviews.summary, reviews.branch\"\"\",\n                           (user.id, user.id))\n\n            profiler.check(\"query: active lines\")\n\n            for review_id, summary, branch_id, deleted_count, inserted_count in cursor:\n                if includeReview(review_id):\n                    with_changes[review_id] = (summary, branch_id, (deleted_count, inserted_count), None)\n\n            profiler.check(\"processing: active lines\")\n\n            cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch, unread.count\n                                FROM (SELECT commentchains.review AS review, COUNT(commentstoread.comment) AS count\n                                        FROM commentchains\n                                        JOIN comments ON (comments.chain=commentchains.id)\n                                        JOIN commentstoread ON (commentstoread.comment=comments.id\n                                                      
      AND commentstoread.uid=%s)\n                                    GROUP BY commentchains.review) AS unread\n                                JOIN reviews ON (reviews.id=unread.review)\n                               WHERE reviews.state='open'\"\"\",\n                           (user.id,))\n\n            profiler.check(\"query: active comments\")\n\n            for review_id, summary, branch_id, comments_count in cursor:\n                if includeReview(review_id):\n                    if with_changes.has_key(review_id):\n                        with_both[review_id] = (summary, branch_id, with_changes[review_id][2], comments_count)\n                        del with_changes[review_id]\n                    else:\n                        with_comments[review_id] = (summary, branch_id, None, comments_count)\n\n            profiler.check(\"processing: active comments\")\n\n            active[\"changes\"] = with_changes\n            active[\"comments\"] = with_comments\n            active[\"both\"] = with_both\n\n    def renderActive():\n        fetchActive()\n\n        if active[\"both\"] or active[\"changes\"] or active[\"comments\"]:\n            table = target.table(\"paleyellow reviews\", id=\"active\", align=\"center\", cellspacing=0)\n            table.col(width=\"15%\")\n            table.col(width=\"55%\")\n            table.col(width=\"15%\")\n            table.col(width=\"15%\")\n            header = table.tr().td(\"h1\", colspan=4).h1()\n            header.text(\"Active Reviews\")\n            header.span(\"right\").a(href=hidden(\"active\")).text(\"[hide]\")\n\n            if active[\"both\"]:\n                review_ids = \",\".join(map(str, active[\"both\"].keys()))\n                h2 = table.tr(id=\"active-changes-comments\").td(\"h2\", colspan=4).h2().text(\"Has Changes And Comments\")\n                h2.a(href=\"javascript:void(0);\", onclick=\"markChainsAsRead([%s]);\" % review_ids).text(\"[mark all as read]\")\n                
renderReviews(table, sortedReviews(active[\"both\"]))\n\n            if active[\"changes\"]:\n                table.tr(id=\"active-changes\").td(\"h2\", colspan=4).h2().text(\"Has Changes\")\n                renderReviews(table, sortedReviews(active[\"changes\"]))\n\n            if active[\"comments\"]:\n                review_ids = \",\".join(map(str, active[\"comments\"].keys()))\n                h2 = table.tr(id=\"active-comments\").td(\"h2\", colspan=4).h2().text(\"Has Comments\")\n                h2.a(href=\"javascript:void(0);\", onclick=\"markChainsAsRead([%s]);\" % review_ids).text(\"[mark all as read]\")\n                renderReviews(table, sortedReviews(active[\"comments\"]))\n\n            profiler.check(\"generate: active\")\n            return True\n\n    other = {}\n\n    def fetchWatchedAndClosed():\n        if not other:\n            if \"watched\" not in showset:\n                state_filter = \" WHERE reviews.state='closed'\"\n            elif \"closed\" not in showset:\n                state_filter = \" WHERE reviews.state='open'\"\n            else:\n                state_filter = \"\"\n\n            profiler.check(\"query: watched/closed\")\n\n            watched = {}\n            owned_closed = {}\n            other_closed = {}\n\n            if \"watched\" in showset: fetchActive()\n\n            cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch, reviews.state, reviewusers.owner, reviewusers.uid IS NULL\n                                FROM reviews\n                     LEFT OUTER JOIN reviewusers ON (reviewusers.review=reviews.id AND reviewusers.uid=%s)\"\"\" + state_filter,\n                           (user.id,))\n\n            for review_id, summary, branch_id, review_state, is_owner, not_associated in cursor:\n                if includeReview(review_id):\n                    if review_state == 'open':\n                        if is_owner or not_associated:\n                            continue\n\n                   
     fetchActive()\n\n                        if active[\"both\"].has_key(review_id) or active[\"changes\"].has_key(review_id) or active[\"comments\"].has_key(review_id):\n                            continue\n\n                        watched[review_id] = summary, branch_id, None, None\n                    elif is_owner:\n                        owned_closed[review_id] = summary, branch_id, None, None\n                    else:\n                        other_closed[review_id] = summary, branch_id, None, None\n\n            profiler.check(\"processing: watched/closed\")\n\n            other[\"watched\"] = watched\n            other[\"owned-closed\"] = owned_closed\n            other[\"other-closed\"] = other_closed\n\n    def renderWatched():\n        fetchWatchedAndClosed()\n\n        watched = other[\"watched\"]\n        accepted = []\n        pending = []\n\n        is_accepted = isAccepted(watched.keys())\n\n        for review_id, (summary, branch_id, lines, comments) in sortedReviews(watched):\n            if is_accepted[review_id]:\n                accepted.append((review_id, (summary, branch_id, lines, comments)))\n            else:\n                pending.append((review_id, (summary, branch_id, lines, comments)))\n\n        if accepted or pending:\n            table = target.table(\"paleyellow reviews\", id=\"watched\", align=\"center\", cellspacing=0)\n            table.col(width=\"30%\")\n            table.col(width=\"70%\")\n            header = table.tr().td(\"h1\", colspan=4).h1()\n            header.text(\"Watched Reviews\")\n            header.span(\"right\").a(href=hidden(\"watched\")).text(\"[hide]\")\n\n            if accepted:\n                table.tr(id=\"active-changes-comments\").td(\"h2\", colspan=4).h2().text(\"Accepted\")\n                renderReviews(table, accepted, False)\n\n            if pending:\n                table.tr(id=\"active-changes-comments\").td(\"h2\", colspan=4).h2().text(\"Pending\")\n                
renderReviews(table, pending, False)\n\n            profiler.check(\"generate: watched\")\n            return True\n\n    def renderClosed():\n        fetchWatchedAndClosed()\n\n        owned_closed = other[\"owned-closed\"]\n        other_closed = other[\"other-closed\"]\n\n        if owned_closed or other_closed:\n            table = target.table(\"paleyellow reviews\", id=\"closed\", align=\"center\", cellspacing=0)\n            table.col(width=\"30%\")\n            table.col(width=\"70%\")\n            header = table.tr().td(\"h1\", colspan=4).h1()\n            header.text(\"Closed Reviews\")\n            header.span(\"right\").a(href=hidden(\"closed\")).text(\"[hide]\")\n\n            if not user.isAnonymous():\n                if owned_closed:\n                    table.tr().td(\"h2\", colspan=4).h2().text(\"Owned\")\n                    renderReviews(table, sortedReviews(owned_closed), False)\n\n                if other_closed:\n                    table.tr().td(\"h2\", colspan=4).h2().text(\"Other\")\n                    renderReviews(table, sortedReviews(other_closed), False)\n            else:\n                renderReviews(table, sortedReviews(other_closed), False)\n\n            profiler.check(\"generate: closed\")\n            return True\n\n    def renderOpen():\n        other_open = {}\n\n        cursor.execute(\"\"\"SELECT reviews.id, reviews.summary, reviews.branch\n                            FROM reviews\n                 LEFT OUTER JOIN reviewusers ON (reviewusers.review=reviews.id AND reviewusers.uid=%s)\n                           WHERE reviews.state='open'\n                             AND reviewusers.uid IS NULL\"\"\",\n                       [user.id])\n\n        profiler.check(\"query: open\")\n\n        for review_id, summary, branch_id in cursor:\n            if includeReview(review_id):\n                other_open[review_id] = summary, branch_id, None, None\n\n        profiler.check(\"processing: open\")\n\n        if other_open:\n      
      accepted = []\n            pending = []\n\n            for review_id, (summary, branch_id, lines, comments) in sortedReviews(other_open):\n                if dbutils.Review.isAccepted(db, review_id):\n                    accepted.append((review_id, (summary, branch_id, lines, comments)))\n                else:\n                    pending.append((review_id, (summary, branch_id, lines, comments)))\n\n            table = target.table(\"paleyellow reviews\", id=\"open\", align=\"center\", cellspacing=0)\n            table.col(width=\"30%\")\n            table.col(width=\"70%\")\n            header = table.tr().td(\"h1\", colspan=4).h1()\n            header.text(\"Open Reviews\" if user.isAnonymous() else \"Other Open Reviews\")\n            header.span(\"right\").a(href=hidden(\"open\")).text(\"[hide]\")\n\n            if accepted:\n                table.tr().td(\"h2\", colspan=4).h2().text(\"Accepted\")\n                renderReviews(table, accepted, False)\n\n            if pending:\n                table.tr().td(\"h2\", colspan=4).h2().text(\"Pending\")\n                renderReviews(table, pending, False)\n\n            profiler.check(\"generate: open\")\n            return True\n\n    render = { \"owned\": renderOwned,\n               \"draft\": renderDraft,\n               \"active\": renderActive,\n               \"watched\": renderWatched,\n               \"closed\": renderClosed,\n               \"open\": renderOpen }\n\n    empty = True\n\n    for item in showlist:\n        if item in render:\n            target.comment(repr(item))\n            if render[item]():\n                empty = False\n                yield flush(target)\n\n    if empty:\n        document.addExternalStylesheet(\"resource/message.css\")\n        body.div(\"message paleyellow\").h1(\"center\").text(\"No reviews!\")\n\n    profiler.output(db=db, user=user, target=document)\n\n    yield flush(None)\n"
  },
  {
    "path": "src/page/editresource.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport page.utils\nimport configuration\n\ndef renderEditResource(req, db, user):\n    name = req.getParameter(\"name\", None)\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user)\n\n    document.addExternalStylesheet(\"resource/editresource.css\")\n    document.addExternalScript(\"resource/editresource.js\")\n\n    target = body.div(\"main\")\n\n    table = target.table('paleyellow', align='center')\n    table.col(width='10%')\n    table.col(width='60%')\n    table.tr().td('h1', colspan=2).h1().text(\"Resource Editor\")\n\n    select_row = table.tr('select')\n    select_row.td('heading').text('Resource:')\n    select = select_row.td('value').select()\n    if name is None: select.option(selected=\"selected\").text(\"Select resource\")\n    select.option(value=\"diff.css\", selected=\"selected\" if name==\"diff.css\" else None).text(\"Diff coloring\")\n    select.option(value=\"syntax.css\", selected=\"selected\" if name==\"syntax.css\" else None).text(\"Syntax highlighting\")\n\n    help_row = table.tr('help')\n    help_row.td('help', colspan=2).text(\"Select the resource to edit.\")\n\n    is_edited = False\n    is_reset = False\n    source = None\n\n   
 if name is None:\n        document.addInternalScript(\"var resource_name = null;\");\n        source = \"\"\n    else:\n        if name not in (\"diff.css\", \"syntax.css\"):\n            raise page.utils.DisplayMessage(\"Invalid resource name\", body=\"Must be one of 'diff.css' and 'syntax.css'.\")\n\n        document.addInternalScript(\"var resource_name = %s;\" % htmlutils.jsify(name));\n\n        cursor = db.cursor()\n        cursor.execute(\"SELECT source FROM userresources WHERE uid=%s AND name=%s ORDER BY revision DESC FETCH FIRST ROW ONLY\", (user.id, name))\n        row = cursor.fetchone()\n\n        if row:\n            is_edited = True\n            source = row[0]\n\n        if source is None:\n            is_reset = is_edited\n            source = open(configuration.paths.INSTALL_DIR + \"/resources/\" + name).read()\n\n        document.addInternalScript(\"var original_source = %s;\" % htmlutils.jsify(source));\n\n    table.tr('value').td('value', colspan=2).textarea(rows=source.count(\"\\n\") + 10).preformatted().text(source)\n\n    buttons = table.tr('buttons').td('buttons', colspan=2)\n    buttons.button('save').text(\"Save changes\")\n\n    if is_edited and not is_reset:\n        buttons.button('reset').text(\"Reset to built-in version\")\n\n    if is_reset:\n        buttons.button('restore').text(\"Restore last edited version\")\n\n    return document\n"
  },
  {
    "path": "src/page/filterchanges.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport gitutils\n\nimport page.utils\nimport reviewing.utils as review_utils\nimport log.commitset\n\ndef renderFilterChanges(req, db, user):\n    review_id = req.getParameter(\"review\", filter=int)\n    first_sha1 = req.getParameter(\"first\", None)\n    last_sha1 = req.getParameter(\"last\", None)\n\n    cursor = db.cursor()\n\n    review = dbutils.Review.fromId(db, review_id)\n\n    root_directories = {}\n    root_files = {}\n\n    def processFile(file_id):\n        components = dbutils.describe_file(db, file_id).split(\"/\")\n        directories, files = root_directories, root_files\n        for directory_name in components[:-1]:\n            directories, files = directories.setdefault(directory_name, ({}, {}))\n        files[components[-1]] = file_id\n\n    if first_sha1 and last_sha1:\n        cursor.execute(\"\"\"SELECT commits.sha1\n                            FROM commits\n                            JOIN changesets ON (changesets.child=commits.id)\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                           WHERE reviewchangesets.review=%s\"\"\",\n                       (review.id,))\n\n        first_commit = gitutils.Commit.fromSHA1(db, review.repository, first_sha1)\n        last_commit = 
gitutils.Commit.fromSHA1(db, review.repository, last_sha1)\n\n        if len(first_commit.parents) > 1:\n            raise page.utils.DisplayMessage(\n                title=\"Filtering failed!\",\n                body=(\"First selected commit is a merge commit.  Please go back \"\n                      \"and select a different range of commits.\"),\n                review=review)\n\n        if first_commit.parents:\n            from_commit = gitutils.Commit.fromSHA1(db, review.repository, first_commit.parents[0])\n        else:\n            from_commit = None\n\n        to_commit = last_commit\n        commits = log.commitset.CommitSet.fromRange(db, from_commit, to_commit)\n\n        if not commits:\n            raise page.utils.DisplayMessage(\n                title=\"Filtering failed!\",\n                body=(\"The range of commits selected includes merges with \"\n                      \"ancestors not included in the range.  Please go back \"\n                      \"and select a different range of commits.\"),\n                review=review)\n\n        cursor.execute(\"\"\"SELECT DISTINCT reviewfiles.file\n                            FROM reviewfiles\n                            JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                            JOIN commits ON (commits.id=changesets.child)\n                           WHERE reviewfiles.review=%s\n                             AND commits.sha1=ANY (%s)\"\"\",\n                       (review.id, [commit.sha1 for commit in commits]))\n    else:\n        cursor.execute(\"SELECT DISTINCT file FROM reviewfiles WHERE review=%s\", (review.id,))\n\n    for (file_id,) in cursor:\n        processFile(file_id)\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to 
Review\")])\n\n    document.addExternalStylesheet(\"resource/filterchanges.css\")\n    document.addExternalScript(\"resource/filterchanges.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(review.getJS())\n\n    if first_sha1 and last_sha1:\n        document.addInternalScript(\"var commitRange = { first: %s, last: %s };\" % (htmlutils.jsify(first_sha1), htmlutils.jsify(last_sha1)))\n    else:\n        document.addInternalScript(\"var commitRange = null;\")\n\n    target = body.div(\"main\")\n\n    basic = target.table('filter paleyellow', align='center', cellspacing=0)\n    basic.col(width='10%')\n    basic.col(width='60%')\n    basic.col(width='30%')\n    row = basic.tr(\"header\")\n    row.td('h1', colspan=2).h1().text(\"Filter Changes\")\n    row.td('h1 button').button(\"display\").text(\"Display Diff\")\n\n    row = basic.tr(\"headings\")\n    row.td(\"select\").text(\"Include\")\n    row.td(\"path\").text(\"Path\")\n    row.td().text()\n\n    def outputDirectory(base, name, directories, files):\n        if name:\n            level = base.count(\"/\")\n            row = basic.tr(\"directory\", critic_level=level)\n            row.td(\"select\").input(type=\"checkbox\")\n            if level > 1:\n                row.td(\"path\").preformatted().innerHTML((\" \" * (len(base) - 2)) + \"&#8230;/\" + name + \"/\")\n            else:\n                row.td(\"path\").preformatted().innerHTML(base + name + \"/\")\n            row.td().text()\n        else:\n            row = basic.tr(\"directory\", critic_level=-1)\n            row.td(\"select\").input(type=\"checkbox\")\n            row.td(\"path\").preformatted().i().text(\"Everything\")\n            row.td().text()\n            level = -1\n\n        for directory_name in sorted(directories.keys()):\n            outputDirectory(base + name + \"/\" if name else \"\", directory_name, directories[directory_name][0], directories[directory_name][1])\n\n        for file_name in 
sorted(files.keys()):\n            row = basic.tr(\"file\", critic_file_id=files[file_name], critic_level=level + 1)\n            row.td(\"select\").input(type=\"checkbox\")\n            if level > -1:\n                row.td(\"path\").preformatted().innerHTML((\" \" * (len(base + name) - 1)) + \"&#8230;/\" + htmlutils.htmlify(file_name))\n            else:\n                row.td(\"path\").preformatted().innerHTML(htmlutils.htmlify(file_name))\n            row.td().text()\n\n    outputDirectory(\"\", \"\", root_directories, root_files)\n\n    row = basic.tr(\"footer\")\n    row.td('spacer', colspan=3)\n\n    row = basic.tr(\"footer\")\n    row.td('button', colspan=3).button(\"display\").text(\"Display Diff\")\n\n    if user.getPreference(db, \"ui.keyboardShortcuts\"):\n        page.utils.renderShortcuts(body, \"filterchanges\", review=review)\n\n    return document\n"
  },
  {
    "path": "src/page/home.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport page.utils\nimport gitutils\nimport configuration\nimport reviewing.filters\nimport profiling\nimport auth\nimport extensions.role.filterhook\n\nfrom htmlutils import jsify\nfrom textutils import json_encode\n\ndef renderHome(req, db, user):\n    if user.isAnonymous(): raise page.utils.NeedLogin(req)\n\n    profiler = profiling.Profiler()\n\n    cursor = db.cursor()\n\n    readonly = req.getParameter(\"readonly\", \"yes\" if user.name != req.user else \"no\") == \"yes\"\n    repository = req.getParameter(\"repository\", None, gitutils.Repository.FromParameter(db))\n    verified_email_id = req.getParameter(\"email_verified\", None, int)\n\n    if not repository:\n        repository = user.getDefaultRepository(db)\n\n    title_fullname = user.fullname\n\n    if title_fullname[-1] == 's': title_fullname += \"'\"\n    else: title_fullname += \"'s\"\n\n    cursor.execute(\"SELECT email FROM usergitemails WHERE uid=%s ORDER BY email ASC\", (user.id,))\n    gitemails = \", \".join([email for (email,) in cursor])\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    if user == req.user:\n        actual_user = None\n    else:\n        actual_user = req.user\n\n    def renderHeaderItems(target):\n   
     if readonly and actual_user and actual_user.hasRole(db, \"administrator\"):\n            target.a(\"button\", href=\"/home?user=%s&readonly=no\" % user.name).text(\"Edit\")\n\n    page.utils.generateHeader(body, db, user, generate_right=renderHeaderItems, current_page=\"home\")\n\n    document.addExternalStylesheet(\"resource/home.css\")\n    document.addExternalScript(\"resource/home.js\")\n    document.addExternalScript(\"resource/autocomplete.js\")\n    if repository: document.addInternalScript(repository.getJS())\n    else: document.addInternalScript(\"var repository = null;\")\n    if actual_user and actual_user.hasRole(db, \"administrator\"):\n        document.addInternalScript(\"var administrator = true;\")\n    else:\n        document.addInternalScript(\"var administrator = false;\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(\"user.gitEmails = %s;\" % jsify(gitemails))\n    document.addInternalScript(\"var verifyEmailAddresses = %s;\"\n                               % jsify(configuration.base.VERIFY_EMAIL_ADDRESSES))\n    document.setTitle(\"%s Home\" % title_fullname)\n\n    target = body.div(\"main\")\n\n    basic = target.table('paleyellow basic', align='center')\n    basic.tr().td('h1', colspan=3).h1().text(\"%s Home\" % title_fullname)\n\n    def row(heading, value, help=None, extra_class=None):\n        if extra_class:\n            row_class = \"line \" + extra_class\n        else:\n            row_class = \"line\"\n        main_row = basic.tr(row_class)\n        main_row.td('heading').text(\"%s:\" % heading)\n        value_cell = main_row.td('value', colspan=2)\n        if callable(value): value(value_cell)\n        else: value_cell.text(value)\n        basic.tr('help').td('help', colspan=3).text(help)\n\n    def renderFullname(target):\n        if readonly: target.text(user.fullname)\n        else:\n            target.input(\"value\", id=\"user_fullname\", value=user.fullname)\n            
target.span(\"status\", id=\"status_fullname\")\n            buttons = target.span(\"buttons\")\n            buttons.button(onclick=\"saveFullname();\").text(\"Save\")\n            buttons.button(onclick=\"resetFullname();\").text(\"Reset\")\n\n    def renderEmail(target):\n        if not actual_user or actual_user.hasRole(db, \"administrator\"):\n            cursor.execute(\"\"\"SELECT id, email, verified\n                                FROM useremails\n                               WHERE uid=%s\n                            ORDER BY id ASC\"\"\",\n                           (user.id,))\n            rows = cursor.fetchall()\n            if rows:\n                if len(rows) > 1:\n                    target.addClass(\"multiple\")\n                addresses = target.div(\"addresses\")\n                for email_id, email, verified in rows:\n                    checked = \"checked\" if email == user.email else None\n                    selected = \" selected\" if email == user.email else \"\"\n\n                    label = addresses.label(\"address inset flex\" + selected,\n                                            data_email_id=email_id)\n                    if len(rows) > 1:\n                        label.input(name=\"email\", type=\"radio\", value=email,\n                                    checked=checked)\n                    label.span(\"value\").text(email)\n                    actions = label.span(\"actions\")\n\n                    if verified is False:\n                        actions.a(\"action unverified\", href=\"#\").text(\"[unverified]\")\n                    elif verified is True:\n                        now = \" now\" if email_id == verified_email_id else \"\"\n                        actions.span(\"action verified\" + now).text(\"[verified]\")\n                    actions.a(\"action delete\", href=\"#\").text(\"[delete]\")\n            else:\n                target.i().text(\"No email address\")\n            
target.span(\"buttons\").button(\"addemail\").text(\n                \"Add email address\")\n        elif user.email is None:\n            target.i().text(\"No email address\")\n        elif user.email_verified is False:\n            # Pending verification: don't show to other users.\n            target.i().text(\"Email address not verified\")\n        else:\n            target.span(\"inset\").text(user.email)\n\n    def renderGitEmails(target):\n        if readonly: target.text(gitemails)\n        else:\n            target.input(\"value\", id=\"user_gitemails\", value=gitemails)\n            target.span(\"status\", id=\"status_gitemails\")\n            buttons = target.span(\"buttons\")\n            buttons.button(onclick=\"saveGitEmails();\").text(\"Save\")\n            buttons.button(onclick=\"resetGitEmails();\").text(\"Reset\")\n\n    def renderPassword(target):\n        cursor.execute(\"SELECT password IS NOT NULL FROM users WHERE id=%s\", (user.id,))\n        has_password = cursor.fetchone()[0]\n        if not has_password:\n            target.text(\"not set\")\n        else:\n            target.text(\"****\")\n        if not readonly:\n            if not has_password or (actual_user and actual_user.hasRole(db, \"administrator\")):\n                target.span(\"buttons\").button(onclick=\"setPassword();\").text(\"Set password\")\n            else:\n                target.span(\"buttons\").button(onclick=\"changePassword();\").text(\"Change password\")\n\n    row(\"User ID\", str(user.id))\n    row(\"User Name\", user.name)\n    row(\"Display Name\", renderFullname, \"This is the name used when displaying commits or comments.\")\n    row(\"Primary Email\", renderEmail, \"This is the primary email address, to which emails are sent.\", extra_class=\"email\")\n    row(\"Git Emails\", renderGitEmails, \"These email addresses (comma-separated) are used to map Git commits to the user.\")\n\n    if configuration.base.AUTHENTICATION_MODE == \"critic\" \\\n           
 and auth.DATABASE.supportsPasswordChange():\n        row(\"Password\", renderPassword, extra_class=\"password\")\n\n    cursor.execute(\"\"\"SELECT provider, account\n                        FROM externalusers\n                       WHERE uid=%s\"\"\",\n                   (user.id,))\n\n    external_accounts = [(auth.PROVIDERS[provider_name], account)\n                         for provider_name, account in cursor\n                         if provider_name in auth.PROVIDERS]\n\n    if external_accounts:\n        basic.tr().td('h2', colspan=3).h2().text(\"External Accounts\")\n\n        for provider, account in external_accounts:\n            def renderExternalAccount(target):\n                url = provider.getAccountURL(account)\n                target.a(\"external\", href=url).text(account)\n\n            row(provider.getTitle(), renderExternalAccount)\n\n    profiler.check(\"user information\")\n\n    filters = page.utils.PaleYellowTable(body, \"Filters\")\n    filters.titleRight.a(\"button\", href=\"/tutorial?item=filters\").text(\"Tutorial\")\n\n    cursor.execute(\"\"\"SELECT repositories.id, repositories.name, repositories.path,\n                             filters.id, filters.type, filters.path, NULL, filters.delegate\n                        FROM repositories\n                        JOIN filters ON (filters.repository=repositories.id)\n                       WHERE filters.uid=%s\"\"\",\n                   (user.id,))\n\n    rows = cursor.fetchall()\n\n    if configuration.extensions.ENABLED:\n        cursor.execute(\"\"\"SELECT repositories.id, repositories.name, repositories.path,\n                                 filters.id, 'extensionhook', filters.path, filters.name, filters.data\n                            FROM repositories\n                            JOIN extensionhookfilters AS filters ON (filters.repository=repositories.id)\n                           WHERE filters.uid=%s\"\"\",\n                       (user.id,))\n\n        
rows.extend(cursor.fetchall())\n\n    FILTER_TYPES = [\"reviewer\", \"watcher\", \"ignored\", \"extensionhook\"]\n\n    def rowSortKey(row):\n        (repository_id, repository_name, repository_path,\n         filter_id, filter_type, filter_path, filter_name, filter_data) = row\n\n        # Rows are grouped by repository first and type second, so sort by\n        # repository name and filter type primarily.\n        #\n        # Secondarily sort by filter name (only for extension hook filters; is\n        # None for regular filters) and filter path.  This sorting is mostly to\n        # achieve a stable order; it has no greater meaning.\n\n        return (repository_name,\n                FILTER_TYPES.index(filter_type),\n                filter_name,\n                filter_path)\n\n    rows.sort(key=rowSortKey)\n\n    if rows:\n        repository = None\n        repository_filters = None\n        tbody_reviewer = None\n        tbody_watcher = None\n        tbody_ignored = None\n        tbody_extensionhook = None\n\n        count_matched_files = {}\n\n        for (repository_id, repository_name, repository_path,\n             filter_id, filter_type, filter_path, filter_name, filter_data) in rows:\n            if not repository or repository.id != repository_id:\n                try:\n                    repository = gitutils.Repository.fromId(db, repository_id)\n                except auth.AccessDenied:\n                    continue\n\n                repository_url = repository.getURL(db, user)\n                filters.addSection(repository_name, repository_url)\n                repository_filters = filters.addCentered().table(\"filters callout\")\n                tbody_reviewer = tbody_watcher = tbody_ignored = tbody_extensionhook = None\n\n            if filter_type == \"reviewer\":\n                if not tbody_reviewer:\n                    tbody_reviewer = repository_filters.tbody()\n                    tbody_reviewer.tr().th(colspan=5).text(\"Reviewer\")\n   
             tbody = tbody_reviewer\n            elif filter_type == \"watcher\":\n                if not tbody_watcher:\n                    tbody_watcher = repository_filters.tbody()\n                    tbody_watcher.tr().th(colspan=5).text(\"Watcher\")\n                tbody = tbody_watcher\n            elif filter_type == \"ignored\":\n                if not tbody_ignored:\n                    tbody_ignored = repository_filters.tbody()\n                    tbody_ignored.tr().th(colspan=5).text(\"Ignored\")\n                tbody = tbody_ignored\n            else:\n                if not tbody_extensionhook:\n                    tbody_extensionhook = repository_filters.tbody()\n                    tbody_extensionhook.tr().th(colspan=5).text(\"Extension hooks\")\n                tbody = tbody_extensionhook\n\n            row = tbody.tr()\n            row.td(\"path\").text(filter_path)\n\n            if filter_type != \"extensionhook\":\n                delegates = row.td(\"delegates\", colspan=2)\n                if filter_data:\n                    delegates.i().text(\"Delegates: \")\n                    delegates.span(\"names\").text(\", \".join(filter_data.split(\",\")))\n            else:\n                role = extensions.role.filterhook.getFilterHookRole(db, filter_id)\n                if role:\n                    title = row.td(\"title\")\n                    title.text(role.title)\n\n                    data = row.td(\"data\")\n                    data.text(filter_data)\n                else:\n                    row.td(colspan=2).i().text(\"Invalid filter\")\n\n            if filter_path == \"/\":\n                row.td(\"files\").text(\"all files\")\n            else:\n                href = \"javascript:void(showMatchedFiles(%s, %s));\" % (jsify(repository.name), jsify(filter_path))\n                row.td(\"files\").a(href=href, id=(\"f%d\" % filter_id)).text(\"? 
files\")\n                count_matched_files.setdefault(repository_id, []).append(filter_id)\n\n            links = row.td(\"links\")\n\n            arguments = (jsify(repository.name),\n                         filter_id,\n                         jsify(filter_type),\n                         jsify(filter_path),\n                         jsify(filter_data))\n            links.a(href=\"javascript:void(editFilter(%s, %d, %s, %s, %s));\" % arguments).text(\"[edit]\")\n\n            if filter_type != \"extensionhook\":\n                links.a(href=\"javascript:if (deleteFilterById(%d)) location.reload(); void(0);\" % filter_id).text(\"[delete]\")\n                links.a(href=\"javascript:location.href='/config?filter=%d';\" % filter_id).text(\"[preferences]\")\n            else:\n                links.a(href=\"javascript:if (deleteExtensionHookFilterById(%d)) location.reload(); void(0);\" % filter_id).text(\"[delete]\")\n\n        document.addInternalScript(\"var count_matched_files = %s;\" % json_encode(count_matched_files.values()))\n    else:\n        filters.addCentered().p().b().text(\"No filters\")\n\n        # Additionally check if there are in fact no repositories.\n        cursor.execute(\"SELECT 1 FROM repositories\")\n        if not cursor.fetchone():\n            document.addInternalScript(\"var no_repositories = true;\")\n\n    if not readonly:\n        filters.addSeparator()\n        filters.addCentered().button(onclick=\"editFilter();\").text(\"Add filter\")\n\n    profiler.check(\"filters\")\n\n    hidden = body.div(\"hidden\", style=\"display: none\")\n\n    if configuration.extensions.ENABLED:\n        filterhooks = extensions.role.filterhook.listFilterHooks(db, user)\n    else:\n        filterhooks = []\n\n    with hidden.div(\"filterdialog\") as dialog:\n        paragraph = dialog.p()\n        paragraph.b().text(\"Repository:\")\n        paragraph.br()\n        page.utils.generateRepositorySelect(db, user, paragraph, name=\"repository\")\n\n     
   paragraph = dialog.p()\n        paragraph.b().text(\"Filter type:\")\n        paragraph.br()\n        filter_type = paragraph.select(name=\"type\")\n        filter_type.option(value=\"reviewer\").text(\"Reviewer\")\n        filter_type.option(value=\"watcher\").text(\"Watcher\")\n        filter_type.option(value=\"ignored\").text(\"Ignored\")\n\n        for extension, manifest, roles in filterhooks:\n            optgroup = filter_type.optgroup(label=extension.getTitle(db))\n            for role in roles:\n                option = optgroup.option(\n                    value=\"extensionhook\",\n                    data_extension_id=extension.getExtensionID(db),\n                    data_filterhook_name=role.name)\n                option.text(role.title)\n\n        paragraph = dialog.p()\n        paragraph.b().text(\"Path:\")\n        paragraph.br()\n        paragraph.input(name=\"path\", type=\"text\")\n        paragraph.span(\"matchedfiles\")\n\n        regular_div = dialog.div(\"regular\")\n\n        paragraph = regular_div.p()\n        paragraph.b().text(\"Delegates:\")\n        paragraph.br()\n        paragraph.input(name=\"delegates\", type=\"text\")\n\n        paragraph = regular_div.p()\n        label = paragraph.label()\n        label.input(name=\"apply\", type=\"checkbox\", checked=\"checked\")\n        label.b().text(\"Apply to existing reviews\")\n\n        for extension, manifest, roles in filterhooks:\n            for role in roles:\n                if not role.data_description:\n                    continue\n\n                filterhook_id = \"%d_%s\" % (extension.getExtensionID(db), role.name)\n\n                extensionhook_div = dialog.div(\n                    \"extensionhook \" + filterhook_id,\n                    style=\"display: none\")\n                extensionhook_div.innerHTML(role.data_description)\n\n                paragraph = extensionhook_div.p()\n                paragraph.b().text(\"Data:\")\n                paragraph.br()\n        
        paragraph.input(type=\"text\")\n\n    profiler.output(db, user, document)\n\n    return document\n"
  },
  {
    "path": "src/page/loadmanifest.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport extensions\nimport page.utils\n\ndef renderLoadManifest(req, db, user):\n    key = req.getParameter(\"key\")\n\n    if \"/\" in key:\n        author_name, extension_name = key.split(\"/\", 1)\n    else:\n        author_name, extension_name = None, key\n\n    def load():\n        try:\n            extension = extensions.extension.Extension(author_name, extension_name)\n        except extensions.extension.ExtensionError as error:\n            return str(error)\n\n        try:\n            extension.getManifest()\n        except extensions.manifest.ManifestError as error:\n            return str(error)\n\n        return \"That's a valid manifest, friend.\"\n\n    return page.utils.ResponseBody(load(), content_type=\"text/plain\")\n"
  },
  {
    "path": "src/page/login.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport urllib\n\nimport auth\nimport page\nimport page.utils\nimport auth\nimport configuration\nimport request\n\nfrom page.parameters import Optional\n\nclass LoginHandler(page.Page.Handler):\n    def __init__(self, target=\"/\", optional=\"no\"):\n        super(LoginHandler, self).__init__()\n        self.target = target\n        self.optional = optional == \"yes\"\n\n    def generateHeader(self):\n        self.document.addExternalStylesheet(\"resource/login.css\")\n        self.document.addExternalScript(\"resource/login.js\")\n\n    def generateContent(self):\n        if not self.user.isAnonymous():\n            raise page.utils.MovedTemporarily(self.target, True)\n\n        self.request.ensureSecure()\n\n        if configuration.base.AUTHENTICATION_MODE != \"critic\":\n            raise request.DoExternalAuthentication(\n                configuration.base.AUTHENTICATION_MODE, self.target)\n\n        self.document.setTitle(\"Sign in\")\n\n        def render(target):\n            redirect_url = \"redirect?\" + urllib.urlencode(\n                { \"target\": self.target })\n\n            form = target.form(name=\"login\", method=\"POST\", action=redirect_url)\n            table = form.table(\"login callout\", align=\"center\")\n\n            row = table.tr(\"status disabled\")\n            
row.td(colspan=2).text()\n\n            autofocus = \"autofocus\"\n\n            for field in auth.DATABASE.getFields():\n                if len(field) == 3:\n                    hidden, identifier, label = field\n                    description = None\n                else:\n                    hidden, identifier, label, description = field\n\n                if hidden:\n                    field_type = \"password\"\n                else:\n                    field_type = None\n\n                row = table.tr(\"field\")\n                row.td(\"key\").text(label)\n                row.td(\"value\").input(\"field\",\n                                      name=identifier,\n                                      type=field_type,\n                                      autofocus=autofocus)\n\n                # Only autofocus the first field.\n                autofocus = None\n\n            row = table.tr(\"login\")\n            row.td(colspan=2).input(\"login\", type=\"submit\", value=\"Sign in\")\n\n            providers = []\n\n            for name, provider in auth.PROVIDERS.items():\n                providers.append((provider.getTitle(), name))\n\n            if providers:\n                table.tr(\"separator1\").td(colspan=2)\n                table.tr(\"separator2\").td(colspan=2)\n\n                external = table.tr(\"external\").td(colspan=2)\n                first = True\n\n                for title, name in sorted(providers):\n                    div = external.div(\"provider\")\n                    url = \"/externalauth/%s?%s\" % (name, urllib.urlencode(\n                            { \"target\": self.target }))\n                    if first:\n                        div.text(\"Sign in using your \")\n                        first = False\n                    else:\n                        div.text(\"or \")\n                    div.a(href=url).text(title)\n\n            if configuration.base.ALLOW_USER_REGISTRATION:\n                
table.tr(\"separator1\").td(colspan=2)\n                table.tr(\"separator2\").td(colspan=2)\n\n                register = table.tr(\"register\").td(colspan=2)\n\n                register.text(\"New to this system? \")\n                register.a(href=\"/createuser\").text(\"Create a user\")\n                register.text(\" to start using it.\")\n\n            if self.optional and configuration.base.ALLOW_ANONYMOUS_USER:\n                table.tr(\"separator1\").td(colspan=2)\n                table.tr(\"separator2\").td(colspan=2)\n\n                row = table.tr(\"continue\")\n                row.td(colspan=2).a(href=self.target).innerHTML(\n                    \"&#8230; or, continue anonymously\")\n\n        paleyellow = page.utils.PaleYellowTable(self.body, \"Sign in\")\n        paleyellow.addCentered(render)\n\nclass Login(page.Page):\n    def __init__(self):\n        super(Login, self).__init__(\"login\",\n                                    { \"target\": Optional(str),\n                                      \"optional\": Optional(str) },\n                                    LoginHandler)\n"
  },
  {
    "path": "src/page/manageextensions.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport htmlutils\nimport dbutils\nimport configuration\n\nfrom extensions.extension import Extension, ExtensionError\nfrom extensions.manifest import (ManifestError, PageRole, InjectRole,\n                                 ProcessCommitsRole, FilterHookRole,\n                                 ScheduledRole)\n\ndef renderManageExtensions(req, db, user):\n    if not configuration.extensions.ENABLED:\n        administrators = dbutils.getAdministratorContacts(db, as_html=True)\n        raise page.utils.DisplayMessage(\n            title=\"Extension support not enabled\",\n            body=((\"<p>This Critic system does not support extensions.</p>\"\n                   \"<p>Contact %s to have it enabled, or see the \"\n                   \"<a href='/tutorial?item=administration#extensions'>\"\n                   \"section on extensions</a> in the system administration \"\n                   \"tutorial for more information.</p>\")\n                  % administrators),\n            html=True)\n\n    cursor = db.cursor()\n\n    what = req.getParameter(\"what\", \"available\")\n    selected_versions = page.utils.json_decode(req.getParameter(\"select\", \"{}\"))\n    focused = req.getParameter(\"focus\", None)\n\n    if what == \"installed\":\n        title = \"Installed Extensions\"\n        
listed_extensions = []\n        for extension_id, _, _, _ in Extension.getInstalls(db, user):\n            try:\n                listed_extensions.append(Extension.fromId(db, extension_id))\n            except ExtensionError as error:\n                listed_extensions.append(error)\n    else:\n        title = \"Available Extensions\"\n        listed_extensions = Extension.find(db)\n\n    req.content_type = \"text/html; charset=utf-8\"\n\n    document = htmlutils.Document(req)\n    document.setTitle(\"Manage Extensions\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    def generateRight(target):\n        target.a(\"button\", href=\"tutorial?item=extensions\").text(\"Tutorial\")\n        target.text(\" \")\n        target.a(\"button\", href=\"tutorial?item=extensions-api\").text(\"API Documentation\")\n\n    page.utils.generateHeader(body, db, user, current_page=\"extensions\", generate_right=generateRight)\n\n    document.addExternalStylesheet(\"resource/manageextensions.css\")\n    document.addExternalScript(\"resource/manageextensions.js\")\n    document.addInternalScript(user.getJS())\n\n    table = page.utils.PaleYellowTable(body, title)\n\n    def addTitleRightLink(url, label):\n        if user.name != req.user:\n            url += \"&user=%s\" % user.name\n        table.titleRight.text(\" \")\n        table.titleRight.a(href=url).text(\"[\" + label + \" extensions]\")\n\n    if what != \"installed\" or focused:\n        addTitleRightLink(\"/manageextensions?what=installed\", \"installed\")\n    if what != \"available\" or focused:\n        addTitleRightLink(\"/manageextensions?what=available\", \"available\")\n\n    for item in listed_extensions:\n        if isinstance(item, ExtensionError):\n            extension_error = item\n            extension = item.extension\n        else:\n            extension_error = None\n            extension = item\n\n        if focused and extension.getKey() != focused:\n            
continue\n\n        extension_path = extension.getPath()\n\n        if extension.isSystemExtension():\n            hosting_user = None\n        else:\n            hosting_user = extension.getAuthor(db)\n\n        selected_version = selected_versions.get(extension.getKey(), False)\n        installed_sha1, installed_version = extension.getInstalledVersion(db, user)\n        universal_sha1, universal_version = extension.getInstalledVersion(db, None)\n        installed_upgradeable = universal_upgradeable = False\n\n        if extension_error is None:\n            if installed_sha1:\n                current_sha1 = extension.getCurrentSHA1(installed_version)\n                installed_upgradeable = installed_sha1 != current_sha1\n            if universal_sha1:\n                current_sha1 = extension.getCurrentSHA1(universal_version)\n                universal_upgradeable = universal_sha1 != current_sha1\n\n        def massage_version(version):\n            if version is None:\n                return \"live\"\n            elif version:\n                return \"version/%s\" % version\n            else:\n                return None\n\n        if selected_version is False:\n            selected_version = installed_version\n        if selected_version is False:\n            selected_version = universal_version\n\n        install_version = massage_version(selected_version)\n        installed_version = massage_version(installed_version)\n        universal_version = massage_version(universal_version)\n\n        manifest = None\n\n        if extension_error is None:\n            try:\n                if selected_version is False:\n                    manifest = extension.getManifest()\n                else:\n                    manifest = extension.getManifest(selected_version)\n            except ManifestError as error:\n                pass\n        elif installed_sha1:\n            manifest = extension.getManifest(installed_version, installed_sha1)\n        elif 
universal_sha1:\n            manifest = extension.getManifest(universal_version, universal_sha1)\n\n        if manifest:\n            if what == \"available\" and manifest.hidden:\n                # Hide from view unless the user is hosting the extension, or is\n                # an administrator and the extension is a system extension.\n                if extension.isSystemExtension():\n                    if not user.hasRole(db, \"administrator\"):\n                        continue\n                elif hosting_user != user:\n                    continue\n        else:\n            if hosting_user != user:\n                continue\n\n        extension_id = extension.getExtensionID(db, create=False)\n\n        if not user.isAnonymous():\n            buttons = []\n\n            if extension_id is not None:\n                cursor.execute(\"\"\"SELECT 1\n                                    FROM extensionstorage\n                                   WHERE extension=%s\n                                     AND uid=%s\"\"\",\n                               (extension_id, user.id))\n\n                if cursor.fetchone():\n                    buttons.append((\"Clear storage\",\n                                    (\"clearExtensionStorage(%s, %s)\"\n                                     % (htmlutils.jsify(extension.getAuthorName()),\n                                        htmlutils.jsify(extension.getName())))))\n\n            if not installed_version:\n                if manifest and install_version and install_version != universal_version:\n                    buttons.append((\"Install\",\n                                    (\"installExtension(%s, %s, %s)\"\n                                     % (htmlutils.jsify(extension.getAuthorName()),\n                                        htmlutils.jsify(extension.getName()),\n                                        htmlutils.jsify(install_version)))))\n            else:\n                buttons.append((\"Uninstall\",\n        
                        (\"uninstallExtension(%s, %s)\"\n                                 % (htmlutils.jsify(extension.getAuthorName()),\n                                    htmlutils.jsify(extension.getName())))))\n\n                if manifest and (install_version != installed_version\n                                 or (installed_sha1 and installed_upgradeable)):\n                    if install_version == installed_version:\n                        label = \"Upgrade\"\n                    else:\n                        label = \"Install\"\n\n                    buttons.append((label,\n                                    (\"reinstallExtension(%s, %s, %s)\"\n                                     % (htmlutils.jsify(extension.getAuthorName()),\n                                        htmlutils.jsify(extension.getName()),\n                                        htmlutils.jsify(install_version)))))\n\n            if user.hasRole(db, \"administrator\"):\n                if not universal_version:\n                    if manifest and install_version:\n                        buttons.append((\"Install (universal)\",\n                                        (\"installExtension(%s, %s, %s, true)\"\n                                         % (htmlutils.jsify(extension.getAuthorName()),\n                                            htmlutils.jsify(extension.getName()),\n                                            htmlutils.jsify(install_version)))))\n                else:\n                    buttons.append((\"Uninstall (universal)\",\n                                    (\"uninstallExtension(%s, %s, true)\"\n                                     % (htmlutils.jsify(extension.getAuthorName()),\n                                        htmlutils.jsify(extension.getName())))))\n\n                    if manifest and (install_version != universal_version\n                                     or (universal_sha1 and universal_upgradeable)):\n                        if 
install_version == universal_version:\n                            label = \"Upgrade (universal)\"\n                        else:\n                            label = \"Install (universal)\"\n\n                        buttons.append((label,\n                                        (\"reinstallExtension(%s, %s, %s, true)\"\n                                         % (htmlutils.jsify(extension.getAuthorName()),\n                                            htmlutils.jsify(extension.getName()),\n                                            htmlutils.jsify(install_version)))))\n        else:\n            buttons = None\n\n        def renderItem(target):\n            target.span(\"name\").innerHTML(extension.getTitle(db, html=True))\n\n            if hosting_user:\n                is_author = manifest and manifest.isAuthor(db, hosting_user)\n                is_sole_author = is_author and len(manifest.authors) == 1\n            else:\n                is_sole_author = False\n\n            if extension_error is None:\n                span = target.span(\"details\")\n                span.b().text(\"Details: \")\n                select = span.select(\"details\", critic_author=extension.getAuthorName(), critic_extension=extension.getName())\n                select.option(value='', selected=\"selected\" if selected_version is False else None).text(\"Select version\")\n                versions = extension.getVersions()\n                if versions:\n                    optgroup = select.optgroup(label=\"Official Versions\")\n                    for version in versions:\n                        optgroup.option(value=\"version/%s\" % version, selected=\"selected\" if selected_version == version else None).text(\"%s\" % version.upper())\n                optgroup = select.optgroup(label=\"Development\")\n                optgroup.option(value='live', selected=\"selected\" if selected_version is None else None).text(\"LIVE\")\n\n            if manifest:\n                is_installed 
= bool(installed_version)\n\n                if is_installed:\n                    target.span(\"installed\").text(\" [installed]\")\n                else:\n                    is_installed = bool(universal_version)\n\n                    if is_installed:\n                        target.span(\"installed\").text(\" [installed (universal)]\")\n\n                target.div(\"description\").preformatted().text(manifest.description, linkify=True)\n\n                if not is_sole_author:\n                    authors = target.div(\"authors\")\n                    authors.b().text(\"Author%s:\" % (\"s\" if len(manifest.authors) > 1 else \"\"))\n                    authors.text(\", \".join(author.name for author in manifest.getAuthors()))\n            else:\n                is_installed = False\n\n                div = target.div(\"description broken\").preformatted()\n\n                if extension_error is None:\n                    anchor = div.a(href=\"loadmanifest?key=%s\" % extension.getKey())\n                    anchor.text(\"[This extension has an invalid MANIFEST file]\")\n                else:\n                    div.text(\"[This extension has been deleted or has become inaccessible]\")\n\n            if selected_version is False:\n                return\n\n            pages = []\n            injects = []\n            processcommits = []\n            filterhooks = []\n            scheduled = []\n\n            if manifest:\n                for role in manifest.roles:\n                    if isinstance(role, PageRole):\n                        pages.append(role)\n                    elif isinstance(role, InjectRole):\n                        injects.append(role)\n                    elif isinstance(role, ProcessCommitsRole):\n                        processcommits.append(role)\n                    elif isinstance(role, FilterHookRole):\n                        filterhooks.append(role)\n                    elif isinstance(role, ScheduledRole):\n                    
    scheduled.append(role)\n\n            role_table = target.table(\"roles\")\n\n            if pages:\n                role_table.tr().th(colspan=2).text(\"Pages\")\n\n                for role in pages:\n                    row = role_table.tr()\n                    url = \"%s/%s\" % (dbutils.getURLPrefix(db, user), role.pattern)\n                    if is_installed and \"*\" not in url:\n                        row.td(\"pattern\").a(href=url).text(url)\n                    else:\n                        row.td(\"pattern\").text(url)\n                    td = row.td(\"description\")\n                    td.text(role.description)\n\n            if injects:\n                role_table.tr().th(colspan=2).text(\"Page Injections\")\n\n                for role in injects:\n                    row = role_table.tr()\n                    row.td(\"pattern\").text(\"%s/%s\" % (dbutils.getURLPrefix(db, user), role.pattern))\n                    td = row.td(\"description\")\n                    td.text(role.description)\n\n            if processcommits:\n                role_table.tr().th(colspan=2).text(\"ProcessCommits hooks\")\n                ul = role_table.tr().td(colspan=2).ul()\n\n                for role in processcommits:\n                    li = ul.li()\n                    li.text(role.description)\n\n            if filterhooks:\n                role_table.tr().th(colspan=2).text(\"FilterHook hooks\")\n\n                for role in filterhooks:\n                    row = role_table.tr()\n                    row.td(\"title\").text(role.title)\n                    row.td(\"description\").text(role.description)\n\n            if scheduled:\n                role_table.tr().th(colspan=2).text(\"Scheduled hooks\")\n\n                for role in scheduled:\n                    row = role_table.tr()\n                    row.td(\"pattern\").text(\"%s @ %s\" % (role.frequency, role.at))\n                    td = row.td(\"description\")\n                    
td.text(role.description)\n\n        installed_by = \"\"\n\n        if extension_id is not None:\n            cursor.execute(\"\"\"SELECT uid\n                                FROM extensioninstalls\n                                JOIN extensions ON (extensions.id=extensioninstalls.extension)\n                               WHERE extensions.id=%s\"\"\",\n                           (extension.getExtensionID(db, create=False),))\n\n            user_ids = set(user_id for user_id, in cursor.fetchall())\n            if user_ids:\n                installed_by = \" (installed\"\n                if None in user_ids:\n                    installed_by += \" universally\"\n                    user_ids.remove(None)\n                    if user_ids:\n                        installed_by += \" and\"\n                if user_ids:\n                    installed_by += (\" by %d user%s\"\n                                  % (len(user_ids),\n                                     \"s\" if len(user_ids) > 1 else \"\"))\n                installed_by += \")\"\n\n        table.addItem(\"Extension\", renderItem, extension_path + \"/\" + installed_by, buttons)\n\n    document.addInternalScript(\"var selected_versions = %s;\" % page.utils.json_encode(selected_versions))\n\n    return document\n"
  },
  {
    "path": "src/page/managereviewers.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\n\nimport page.utils\nimport reviewing.utils as review_utils\n\ndef renderManageReviewers(req, db, user):\n    review_id = req.getParameter(\"review\", filter=int)\n\n    cursor = db.cursor()\n\n    review = dbutils.Review.fromId(db, review_id)\n\n    root_directories = {}\n    root_files = {}\n\n    def processFile(file_id):\n        components = dbutils.describe_file(db, file_id).split(\"/\")\n        directories, files = root_directories, root_files\n        for directory_name in components[:-1]:\n            directories, files = directories.setdefault(directory_name, ({}, {}))\n        files[components[-1]] = file_id\n\n    cursor.execute(\"SELECT file FROM reviewfiles WHERE review=%s\", (review.id,))\n\n    for (file_id,) in cursor:\n        processFile(file_id)\n\n    cursor.execute(\"SELECT name FROM users WHERE name IS NOT NULL\")\n    users = [user_name for (user_name,) in cursor if user_name]\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    document.addExternalStylesheet(\"resource/managereviewers.css\")\n  
  document.addExternalScript(\"resource/managereviewers.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(review.getJS())\n    document.addInternalScript(\"var users = [ %s ];\" % \", \".join([htmlutils.jsify(user_name) for user_name in sorted(users)]))\n\n    target = body.div(\"main\")\n\n    basic = target.table('manage paleyellow', align='center')\n    basic.col(width='10%')\n    basic.col(width='60%')\n    basic.col(width='30%')\n    basic.tr().td('h1', colspan=3).h1().text(\"Manage Reviewers\")\n\n    row = basic.tr(\"current\")\n    row.td(\"select\").text(\"Current:\")\n    cell = row.td(\"value\")\n    for index, reviewer in enumerate(review.reviewers):\n        if index != 0: cell.text(\", \")\n        cell.span(\"reviewer\", critic_username=reviewer.name).innerHTML(htmlutils.htmlify(reviewer.fullname).replace(\" \", \"&nbsp;\"))\n    row.td(\"right\").text()\n\n    row = basic.tr(\"reviewer\")\n    row.td(\"select\").text(\"Reviewer:\")\n    row.td(\"value\").input(\"reviewer\").span(\"message\")\n    row.td(\"right\").button(\"save\").text(\"Save\")\n\n    row = basic.tr(\"help\")\n    row.td(\"help\", colspan=3).text(\"Enter the name of a current reviewer to edit assignments (or unassign.)  
Enter the name of another user to add a new reviewer.\")\n\n    row = basic.tr(\"headings\")\n    row.td(\"select\").text(\"Assigned\")\n    row.td(\"path\").text(\"Path\")\n    row.td().text()\n\n    def outputDirectory(base, name, directories, files):\n        if name:\n            level = base.count(\"/\")\n            row = basic.tr(\"directory\", critic_level=level)\n            row.td(\"select\").input(type=\"checkbox\")\n            if level > 1:\n                row.td(\"path\").preformatted().innerHTML((\" \" * (len(base) - 2)) + \"&#8230;/\" + name + \"/\")\n            else:\n                row.td(\"path\").preformatted().innerHTML(base + name + \"/\")\n            row.td().text()\n        else:\n            level = 0\n\n        for directory_name in sorted(directories.keys()):\n            outputDirectory(base + name + \"/\" if name else \"\", directory_name, directories[directory_name][0], directories[directory_name][1])\n\n        for file_name in sorted(files.keys()):\n            row = basic.tr(\"file\", critic_file_id=files[file_name], critic_level=level + 1)\n            row.td(\"select\").input(type=\"checkbox\")\n            row.td(\"path\").preformatted().innerHTML((\" \" * (len(base + name) - 1)) + \"&#8230;/\" + htmlutils.htmlify(file_name))\n            row.td().text()\n\n    outputDirectory(\"\", \"\", root_directories, root_files)\n\n    return document\n"
  },
  {
    "path": "src/page/news.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport textformatting\nimport htmlutils\n\ndef renderNewsItem(db, user, target, text, timestamp):\n    table = target.table(\"paleyellow\", align=\"center\")\n    textformatting.renderFormatted(db, user, table, text.splitlines(), toc=False,\n                                   title_right=timestamp)\n    table.tr(\"back\").td(\"back\").div().a(\"back\", href=\"news\").text(\"Back\")\n\ndef renderNewsItems(db, user, target, display_unread, display_read):\n    target.setTitle(\"News\")\n\n    table = target.table(\"paleyellow\", align=\"center\")\n    table.tr(\"h1\").td(\"h1\", colspan=3).h1().text(\"News\")\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT id, date, text, uid IS NULL\n                        FROM newsitems\n             LEFT OUTER JOIN newsread ON (item=id AND uid=%s)\n                    ORDER BY date DESC, id DESC\"\"\",\n                   (user.id,))\n\n    nothing = True\n\n    for item_id, date, text, unread in cursor:\n        if (unread and display_unread) or (not unread and display_read):\n            row = table.tr(\"item\", critic_item_id=item_id)\n            row.td(\"date\").text(date)\n            row.td(\"title\").text(text.split(\"\\n\", 1)[0])\n            row.td(\"status\").text(\"unread\" if unread else None)\n            nothing = 
False\n\n    if nothing:\n        row = table.tr(\"nothing\")\n        row.td(\"nothing\", colspan=3).text(\"No %s news!\" % (\"unread\" if display_unread else \"read\"))\n\n    if not display_unread or not display_read:\n        table.tr(\"show\").td(\"show\", colspan=3).div().a(\"show\", href=\"news?display=all\").text(\"Show All\")\n\ndef renderNews(req, db, user):\n    item_id = req.getParameter(\"item\", None, filter=int)\n    display = req.getParameter(\"display\", \"unread\")\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    cursor = db.cursor()\n\n    def renderButtons(target):\n        if user.hasRole(db, \"newswriter\"):\n            if item_id is not None:\n                target.button(\"editnewsitem\").text(\"Edit Item\")\n            target.button(\"addnewsitem\").text(\"Add News Item\")\n\n    page.utils.generateHeader(body, db, user, current_page=\"news\", generate_right=renderButtons)\n\n    document.addExternalStylesheet(\"resource/tutorial.css\")\n    document.addExternalStylesheet(\"resource/comment.css\")\n    document.addExternalStylesheet(\"resource/news.css\")\n    document.addExternalScript(\"resource/news.js\")\n    document.addInternalStylesheet(\"div.main table td.text { %s }\" % user.getPreference(db, \"style.tutorialFont\"))\n\n    target = body.div(\"main\")\n\n    if item_id:\n        cursor.execute(\"SELECT text, date FROM newsitems WHERE id=%s\", (item_id,))\n\n        text, date = cursor.fetchone()\n\n        document.addInternalScript(\"var news_item_id = %d;\" % item_id)\n        document.addInternalScript(\"var news_text = %s;\" % htmlutils.jsify(text))\n\n        renderNewsItem(db, user, target, text, date.isoformat())\n\n        if not user.isAnonymous() and user.name == req.user:\n            cursor.execute(\"SELECT 1 FROM newsread WHERE item=%s AND uid=%s\", (item_id, user.id))\n            if not cursor.fetchone():\n                
cursor.execute(\"INSERT INTO newsread (item, uid) VALUES (%s, %s)\", (item_id, user.id))\n                db.commit()\n    else:\n        renderNewsItems(db, user, target, display in (\"unread\", \"all\"), display in (\"read\", \"all\"))\n\n    return document\n"
  },
  {
    "path": "src/page/parameters.py",
    "content": "import base\nimport dbutils\n\nclass InvalidParameterValue(base.Error):\n    def __init__(self, expected):\n        self.expected = expected\n\nclass Optional(object):\n    def __init__(self, actual):\n        self.actual = actual\n\nclass ListOf(object):\n    def __init__(self, actual):\n        self.actual = actual\n\ndef check_integer(value, what=\"value\"):\n    try:\n        value = int(value)\n    except ValueError:\n        raise InvalidParameterValue(\"an integer %s\" % what)\n    else:\n        return value\n\nclass Stateful(object):\n    def __init__(self, req, db, user):\n        self.req = req\n        self.db = db\n        self.user = user\n\nclass ReviewId(Stateful):\n    def __call__(self, value):\n        review_id = check_integer(value, \"review id\")\n\n        try:\n            review = dbutils.Review.fromId(self.db, review_id)\n        except dbutils.NoSuchReview:\n            raise InvalidParameterValue(\"a valid review id\")\n\n        return review\n"
  },
  {
    "path": "src/page/processcommits.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport cStringIO\n\nimport dbutils\nimport extensions.role.processcommits\nimport gitutils\nimport log.commitset\nimport page.utils\n\ndef renderProcessCommits(req, db, user):\n    review_id = req.getParameter(\"review\", filter=int)\n    commit_ids = map(int, req.getParameter(\"commits\").split(\",\"))\n\n    review = dbutils.Review.fromId(db, review_id)\n    all_commits = [gitutils.Commit.fromId(db, review.repository, commit_id) for commit_id in commit_ids]\n    commitset = log.commitset.CommitSet(all_commits)\n\n    heads = commitset.getHeads()\n    tails = commitset.getTails()\n\n    def process():\n        if len(heads) != 1:\n            return \"invalid commit-set; multiple heads\"\n        if len(tails) != 1:\n            return \"invalid commit-set; multiple tails\"\n\n        old_head = gitutils.Commit.fromSHA1(db, review.repository, tails.pop())\n        new_head = heads.pop()\n\n        output = cStringIO.StringIO()\n\n        extensions.role.processcommits.execute(db, user, review, all_commits, old_head, new_head, output)\n\n        return output.getvalue()\n\n    return page.utils.ResponseBody(process(), content_type=\"text/plain\")\n"
  },
  {
    "path": "src/page/rebasetrackingreview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page\nimport htmlutils\nimport gitutils\nimport request\n\nfrom page.parameters import Optional, ReviewId\n\nclass RebaseTrackingReview(page.Page):\n    def __init__(self):\n        super(RebaseTrackingReview, self).__init__(\"rebasetrackingreview\",\n                                                   { \"review\": ReviewId,\n                                                     \"newbranch\": Optional(str),\n                                                     \"upstream\": Optional(str),\n                                                     \"newhead\": Optional(str),\n                                                     \"newupstream\": Optional(str) },\n                                                   RebaseTrackingReview.Handler)\n\n    class Handler(page.Page.Handler):\n        def __init__(self, review, newbranch=None, upstream=None, newhead=None, newupstream=None):\n            super(RebaseTrackingReview.Handler, self).__init__(review)\n            self.newbranch = newbranch\n            self.upstream = upstream\n            self.newhead = newhead\n            self.newupstream = newupstream\n\n        def generateHeader(self):\n            self.document.addExternalStylesheet(\"resource/rebasetrackingreview.css\")\n            
self.document.addExternalScript(\"resource/autocomplete.js\")\n            self.document.addExternalScript(\"resource/rebasetrackingreview.js\")\n\n        def generateContent(self):\n            trackedbranch = self.review.getTrackedBranch(self.db)\n\n            if not trackedbranch:\n                raise request.DisplayMessage(\"Not supported!\", \"The review r/%d is not tracking a remote branch.\" % self.review.id)\n\n            self.document.addInternalScript(self.review.repository.getJS())\n            self.document.addInternalScript(self.review.getJS())\n            self.document.addInternalScript(\"var trackedbranch = { remote: %s, name: %s };\"\n                                            % (htmlutils.jsify(trackedbranch.remote),\n                                               htmlutils.jsify(trackedbranch.name)))\n\n            table = page.utils.PaleYellowTable(self.body, \"Rebase tracking review\")\n\n            def renderRemote(target):\n                target.span(\"value inset\", id=\"remote\").text(trackedbranch.remote)\n            def renderCurrentBranch(target):\n                target.span(\"value inset\", id=\"currentbranch\").text(\"refs/heads/\" + trackedbranch.name)\n\n            table.addItem(\"Remote\", renderRemote)\n            table.addItem(\"Current branch\", renderCurrentBranch)\n            table.addSeparator()\n\n            if self.newbranch is not None and self.upstream is not None and self.newhead is not None and self.newupstream is not None:\n                import log.html\n                import log.commitset\n\n                sha1s = self.review.repository.revlist(included=[self.newhead], excluded=[self.newupstream])\n                new_commits = log.commitset.CommitSet(gitutils.Commit.fromSHA1(self.db, self.review.repository, sha1) for sha1 in sha1s)\n\n                new_heads = new_commits.getHeads()\n                if len(new_heads) != 1:\n                    raise page.utils.DisplayMessage(\"Invalid 
commit-set!\", \"Multiple heads.  (This ought to be impossible...)\")\n                new_upstreams = new_commits.getFilteredTails(self.review.repository)\n                if len(new_upstreams) != 1:\n                    raise page.utils.DisplayMessage(\"Invalid commit-set!\", \"Multiple upstreams.  (This ought to be impossible...)\")\n\n                new_head = new_heads.pop()\n                new_upstream_sha1 = new_upstreams.pop()\n\n                old_commits = log.commitset.CommitSet(self.review.branch.getCommits(self.db))\n                old_upstreams = old_commits.getFilteredTails(self.review.repository)\n\n                if len(old_upstreams) != 1:\n                    raise page.utils.DisplayMessage(\"Rebase not supported!\", \"The review has multiple upstreams and can't be rebased.\")\n\n                if len(old_upstreams) == 1 and new_upstream_sha1 in old_upstreams:\n                    # This appears to be a history rewrite.\n                    new_upstream = None\n                    new_upstream_sha1 = None\n                    rebase_type = \"history\"\n                else:\n                    old_upstream = gitutils.Commit.fromSHA1(self.db, self.review.repository, old_upstreams.pop())\n                    new_upstream = gitutils.Commit.fromSHA1(self.db, self.review.repository, new_upstream_sha1)\n\n                    if old_upstream.isAncestorOf(new_upstream):\n                        rebase_type = \"move:ff\"\n                    else:\n                        rebase_type = \"move\"\n\n                self.document.addInternalScript(\"var check = { rebase_type: %s, old_head_sha1: %s, new_head_sha1: %s, new_upstream_sha1: %s, new_trackedbranch: %s };\"\n                                                % (htmlutils.jsify(rebase_type),\n                                                   htmlutils.jsify(self.review.branch.head_sha1),\n                                                   htmlutils.jsify(new_head.sha1),\n                         
                          htmlutils.jsify(new_upstream_sha1),\n                                                   htmlutils.jsify(self.newbranch[len(\"refs/heads/\"):])))\n\n                def renderNewBranch(target):\n                    target.span(\"value inset\", id=\"newbranch\").text(self.newbranch)\n                    target.text(\" @ \")\n                    target.span(\"value inset\").text(new_head.sha1[:8] + \" \" + new_head.niceSummary())\n                def renderUpstream(target):\n                    target.span(\"value inset\", id=\"upstream\").text(self.upstream)\n                    target.text(\" @ \")\n                    target.span(\"value inset\").text(new_upstream.sha1[:8] + \" \" + new_upstream.niceSummary())\n\n                table.addItem(\"New branch\", renderNewBranch)\n\n                if new_upstream:\n                    table.addItem(\"New upstream\", renderUpstream)\n\n                table.addSeparator()\n\n                def renderMergeStatus(target):\n                    target.a(\"status\", id=\"status_merge\").text(\"N/A\")\n                def renderConflictsStatus(target):\n                    target.a(\"status\", id=\"status_conflicts\").text(\"N/A\")\n                def renderHistoryRewriteStatus(target):\n                    target.a(\"status\", id=\"status_historyrewrite\").text(\"N/A\")\n\n                table.addSection(\"Status\")\n\n                if rebase_type == \"history\":\n                    table.addItem(\"History rewrite\", renderHistoryRewriteStatus)\n                else:\n                    if rebase_type == \"move:ff\":\n                        table.addItem(\"Merge\", renderMergeStatus)\n                    table.addItem(\"Conflicts\", renderConflictsStatus)\n\n                def renderRebaseReview(target):\n                    target.button(id=\"rebasereview\", onclick=\"rebaseReview();\", disabled=\"disabled\").text(\"Rebase Review\")\n\n                table.addSeparator()\n                
table.addCentered(renderRebaseReview)\n\n                log.html.render(self.db, self.body, \"Rebased commits\", commits=list(new_commits))\n            else:\n                try:\n                    from customization.branches import getRebasedBranchPattern\n                except ImportError:\n                    def getRebasedBranchPattern(branch_name): return None\n\n                pattern = getRebasedBranchPattern(trackedbranch.name)\n\n                try:\n                    from customization.branches import isRebasedBranchCandidate\n                except ImportError:\n                    isRebasedBranchCandidate = None\n\n                if pattern or isRebasedBranchCandidate:\n                    candidates = [name[len(\"refs/heads/\"):]\n                                  for sha1, name in gitutils.Repository.lsremote(trackedbranch.remote, pattern=pattern, include_heads=True)\n                                  if name.startswith(\"refs/heads/\")]\n\n                    if isRebasedBranchCandidate is not None:\n                        def isCandidate(name):\n                            return isRebasedBranchCandidate(trackedbranch.name, name)\n\n                        candidates = filter(isCandidate, candidates)\n                else:\n                    candidates = []\n\n                if len(candidates) > 1:\n                    def renderCandidates(target):\n                        target.text(\"refs/heads/\")\n                        dropdown = target.select(id=\"newbranch\")\n                        for name in candidates:\n                            dropdown.option(value=name).text(name)\n\n                    table.addItem(\"New branch\", renderCandidates,\n                                    buttons=[(\"Edit\", \"editNewBranch(this);\")])\n                else:\n                    if len(candidates) == 1:\n                        default_value = candidates[0]\n                    else:\n                        default_value = 
trackedbranch.name\n\n                    def renderEdit(target):\n                        target.text(\"refs/heads/\")\n                        target.input(id=\"newbranch\", value=default_value)\n\n                    table.addItem(\"New branch\", renderEdit)\n\n                def renderUpstreamInput(target):\n                    target.input(id=\"upstream\", value=\"refs/heads/master\")\n\n                table.addItem(\"Upstream\", renderUpstreamInput)\n\n                def renderFetchBranch(target):\n                    target.button(onclick=\"fetchBranch();\").text(\"Fetch Branch\")\n\n                table.addSeparator()\n                table.addCentered(renderFetchBranch)\n\n"
  },
  {
    "path": "src/page/repositories.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport htmlutils\nimport dbutils\nimport gitutils\nimport configuration\n\ndef renderRepositories(req, db, user):\n    req.content_type = \"text/html; charset=utf-8\"\n\n    document = htmlutils.Document(req)\n    document.setTitle(\"Repositories\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    def generateRight(target):\n        if user.hasRole(db, \"repositories\"):\n            target.a(\"button\", href=\"newrepository\").text(\"Add Repository\")\n\n    page.utils.generateHeader(body, db, user, current_page=\"repositories\", generate_right=generateRight)\n\n    document.addExternalStylesheet(\"resource/repositories.css\")\n    document.addExternalScript(\"resource/repositories.js\")\n    document.addInternalScript(user.getJS())\n\n    if user.name == req.user and user.hasRole(db, \"administrator\"):\n        document.addInternalScript(\"user.administrator = true;\")\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT id, name, path, parent FROM repositories ORDER BY name ASC\")\n\n    rows = cursor.fetchall()\n\n    class Repository:\n        def __init__(self, repository_id, name, path, parent_id):\n            self.id = repository_id\n            self.name = name\n            self.path = path\n            self.parent_id = parent_id\n  
          self.default_remote = None\n            self.location = gitutils.Repository.constructURL(db, user, path)\n\n    repositories = list(Repository(*row) for row in rows)\n    repository_by_id = dict((repository.id, repository) for repository in repositories)\n\n    def render(target):\n        table = target.table(\"repositories callout\")\n\n        headings = table.tr(\"headings\")\n        headings.th(\"name\").text(\"Short name\")\n        headings.th(\"location\").text(\"Location\")\n        headings.th(\"upstream\").text(\"Upstream\")\n\n        table.tr(\"spacer\").td(\"spacer\", colspan=3)\n\n        for repository in repositories:\n            row = table.tr(\"repository %s\" % repository.name)\n            row.td(\"name\").text(repository.name)\n            row.td(\"location\").text(repository.location)\n\n            if repository.parent_id:\n                row.td(\"upstream\").text(repository_by_id[repository.parent_id].name)\n            else:\n                row.td(\"upstream\").text()\n\n            cursor.execute(\"\"\"SELECT id, local_name, remote, remote_name, disabled\n                                FROM trackedbranches\n                               WHERE repository=%s\n                            ORDER BY id ASC\"\"\",\n                           (repository.id,))\n\n            details = table.tr(\"details %s\" % repository.name).td(colspan=3)\n\n            branches = [(branch_id, local_name, remote, remote_name, disabled)\n                        for branch_id, local_name, remote, remote_name, disabled in cursor\n                        if not local_name.startswith(\"r/\")]\n\n            if branches:\n                trackedbranches = details.table(\"trackedbranches\", cellspacing=0)\n                trackedbranches.tr().th(\"title\", colspan=5).text(\"Tracked Branches\")\n\n                row = trackedbranches.tr(\"headings\")\n                row.th(\"localname\").text(\"Local branch\")\n                
row.th(\"remote\").text(\"Repository\")\n                row.th(\"remotename\").text(\"Remote branch\")\n                row.th(\"enabled\").text(\"Enabled\")\n                row.th(\"users\").text(\"Users\")\n\n                default_remote = \"\"\n\n                for branch_id, local_name, remote, remote_name, disabled in sorted(branches, key=lambda branch: branch[1]):\n                    cursor.execute(\"SELECT uid FROM trackedbranchusers WHERE branch=%s\", (branch_id,))\n\n                    user_ids = [user_id for (user_id,) in cursor.fetchall()]\n\n                    row = trackedbranches.tr(\"branch\", critic_branch_id=branch_id, critic_user_ids=\",\".join(map(str, user_ids)))\n\n                    if local_name == \"*\":\n                        row.td(\"localname\").i().text(\"Tags\")\n                        default_remote = remote\n                    else:\n                        row.td(\"localname\").text(local_name)\n                        if local_name == \"master\" and not default_remote:\n                            default_remote = remote\n                    row.td(\"remote\").text(remote)\n                    if remote_name == \"*\":\n                        row.td(\"remotename\").i().text(\"N/A\")\n                    else:\n                        row.td(\"remotename\").text(remote_name)\n                    row.td(\"enabled\").text(\"No\" if disabled else \"Yes\")\n\n                    cell = row.td(\"users\")\n\n                    for index, user_id in enumerate(user_ids):\n                        if index: cell.text(\", \")\n                        trackedbranch_user = dbutils.User.fromId(db, user_id)\n                        cell.span(\"user\").text(trackedbranch_user.name)\n\n                if default_remote:\n                    repository.default_remote = default_remote\n\n            buttons = details.div(\"buttons\")\n            buttons.button(onclick=\"addTrackedBranch(%d);\" % repository.id).text(\"Add Tracked 
Branch\")\n\n    paleyellow = page.utils.PaleYellowTable(body, \"Repositories\")\n    paleyellow.addCentered(render)\n\n    repositories_js = []\n\n    for repository in repositories:\n        name = htmlutils.jsify(repository.name)\n        path = htmlutils.jsify(repository.path)\n        location = htmlutils.jsify(repository.location)\n        default_remote = htmlutils.jsify(repository.default_remote)\n\n        repositories_js.append((\"%d: { name: %s, path: %s, location: %s, defaultRemoteLocation: %s }\"\n                                % (repository.id, name, path, location, default_remote)))\n\n    document.addInternalScript(\"var repositories = { %s };\" % \", \".join(repositories_js))\n\n    return document\n"
  },
  {
    "path": "src/page/search.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport urlparse\n\nimport htmlutils\nimport textutils\nimport page.utils\n\ndef renderSearch(req, db, user):\n    document = htmlutils.Document(req)\n    document.setTitle(\"Review Search\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, current_page=\"search\")\n\n    document.addExternalStylesheet(\"resource/search.css\")\n    document.addExternalScript(\"resource/search.js\")\n    document.addExternalScript(\"resource/autocomplete.js\")\n    document.addInternalScript(user.getJS())\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT name, fullname FROM users\")\n\n    users = dict(cursor)\n\n    document.addInternalScript(\"var users = %s;\" % textutils.json_encode(users))\n\n    def renderQuickSearch(target):\n        wrap = target.div(\"quicksearch callout\")\n        wrap.p().text(\"\"\"A Quick Search dialog can be opened from any page\n                         using the \"F\" keyboard shortcut.\"\"\")\n        wrap.p().a(href=\"/tutorial?item=search\").text(\"More information\")\n\n    def renderInput(target, label, name, placeholder=\"\"):\n        fieldset = target.fieldset(\"search-\" + name)\n        fieldset.label(\"input-label\").text(label)\n        fieldset.input(type=\"text\", name=name, 
placeholder=placeholder)\n\n    def renderInputWithOptions(target, label, name, options, placeholder=\"\"):\n        fieldset = target.fieldset(\"search-\" + name)\n        fieldset.label(\"input-label\").text(label)\n        checkGroup = fieldset.div(\"input-options checkbox-group\")\n        for option in options:\n            opt_label = checkGroup.label()\n            opt_label.input(type=\"checkbox\", name=option[\"name\"],\n                            checked=\"checked\" if \"checked\" in option else None)\n            opt_label.text(option[\"label\"])\n        fieldset.input(type=\"text\", name=name, placeholder=placeholder)\n\n    def renderFreetext(target):\n        options=[{ \"name\": \"freetextSummary\", \"label\": \"Summary\",\n                   \"checked\": True },\n                 { \"name\": \"freetextDescription\", \"label\": \"Description\",\n                   \"checked\": True }]\n        renderInputWithOptions(target, label=\"Search term\", name=\"freetext\",\n                               placeholder=\"free text search\", options=options)\n\n    def renderState(target):\n        state = target.fieldset(\"search-state\")\n        state.label(\"input-label\").text(\"State\")\n        select = state.select(name=\"state\")\n        select.option(value=\"\", selected=\"selected\").text(\"Any state\")\n        select.option(value=\"open\").text(\"Open\")\n        select.option(value=\"pending\").text(\"Pending\")\n        select.option(value=\"accepted\").text(\"Accepted\")\n        select.option(value=\"closed\").text(\"Finished\")\n        select.option(value=\"dropped\").text(\"Dropped\")\n\n    def renderUser(target):\n        options=[{ \"name\": \"userOwner\", \"label\": \"Owner\", \"checked\": True },\n                 { \"name\": \"userReviewer\", \"label\": \"Reviewer\" }]\n        renderInputWithOptions(target, label=\"User\", name=\"user\",\n                               placeholder=\"user name(s)\", options=options)\n\n    def 
renderRepository(target):\n        fieldset = target.fieldset(\"search-repository\")\n        fieldset.label(\"input-label\").text(\"Repository\")\n        page.utils.generateRepositorySelect(\n            db, user, fieldset, name=\"repository\", selected=False,\n            placeholder_text=\"Any repository\", allow_selecting_none=True)\n\n    section = body.section(\"paleyellow section\")\n    section.h1(\"section-heading\").text(\"Review Search\")\n\n    url_terms = []\n\n    for name, value in urlparse.parse_qsl(req.query):\n        if name == \"q\":\n            url_terms.append(value)\n        elif name.startswith(\"q\"):\n            url_terms.append(\"%s:%s\" % (name[1:], value))\n\n    wrap = section.div(\"flex\")\n    search = wrap.form(\"search\", name=\"search\")\n\n    if url_terms:\n        row = search.div(\"flex\")\n        query = row.fieldset(\"search-query\")\n        query.label(\"input-label\").text(\"Search query\")\n        query.input(type=\"text\", name=\"query\", value=\" \".join(url_terms))\n\n        result = section.div(\"search-result\", style=\"display: none\")\n        result.h2().text(\"Search result\")\n        result.div(\"callout\")\n    else:\n        row = search.div(\"flex\")\n        renderFreetext(row)\n        renderState(row)\n\n        renderUser(search)\n\n        row = search.div(\"flex\")\n        renderRepository(row)\n        renderInput(row, \"Branch\", \"branch\")\n\n        renderInput(search, \"Path\", \"path\")\n\n    buttons = search.div(\"search-buttons\")\n\n    if url_terms:\n        buttons.button(type=\"submit\").text(\"Search again\")\n        buttons.a(\"button\", href=\"/search\").text(\"Show full search form\")\n    else:\n        buttons.button(type=\"submit\").text(\"Search\")\n\n    renderQuickSearch(wrap)\n\n    return document\n"
  },
  {
    "path": "src/page/services.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport socket\nimport time\nimport errno\n\nimport page.utils\nimport htmlutils\nimport dbutils\nimport configuration\nimport textutils\n\ndef renderServices(req, db, user):\n    req.content_type = \"text/html; charset=utf-8\"\n\n    document = htmlutils.Document(req)\n    document.setTitle(\"Services\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, current_page=\"services\")\n\n    document.addExternalStylesheet(\"resource/services.css\")\n    document.addExternalScript(\"resource/services.js\")\n    document.addInternalScript(user.getJS())\n\n    delay = 0.5\n    connected = False\n\n    while not connected and delay <= 10:\n        connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n\n        # This loop is for the case where we just restarted the service manager\n        # via the /services UI.  The client-side script immediately reloads the\n        # page after restart, which typically leads to us trying to connect to\n        # the service manager while it's in the process of restarting.  
So just\n        # try a couple of times if at first the connection fails.\n\n        try:\n            connection.connect(configuration.services.SERVICEMANAGER[\"address\"])\n            connected = True\n        except socket.error as error:\n            if error[0] in (errno.ENOENT, errno.ECONNREFUSED):\n                time.sleep(delay)\n                delay += delay\n            else: raise\n\n    if not connected:\n        raise page.utils.DisplayMessage(\"Service manager not responding!\")\n\n    connection.send(textutils.json_encode({ \"query\": \"status\" }))\n    connection.shutdown(socket.SHUT_WR)\n\n    data = \"\"\n    while True:\n        received = connection.recv(4096)\n        if not received: break\n        data += received\n\n    result = textutils.json_decode(data)\n\n    if result[\"status\"] == \"error\":\n        raise page.utils.DisplayMessage(result[\"error\"])\n\n    paleyellow = page.utils.PaleYellowTable(body, \"Services\")\n\n    def render(target):\n        table = target.table(\"services callout\")\n\n        headings = table.tr(\"headings\")\n        headings.th(\"name\").text(\"Name\")\n        headings.th(\"module\").text(\"Module\")\n        headings.th(\"pid\").text(\"PID\")\n        headings.th(\"rss\").text(\"RSS\")\n        headings.th(\"cpu\").text(\"CPU\")\n        headings.th(\"uptime\").text(\"Uptime\")\n        headings.th(\"commands\").text()\n\n        # Spacer spans all seven columns of the table declared above.\n        table.tr(\"spacer\").td(\"spacer\", colspan=7)\n\n        def formatUptime(seconds):\n            def inner(seconds):\n                if seconds < 60: return \"%d seconds\" % seconds\n                elif seconds < 60 * 60: return \"%d minutes\" % (seconds / 60)\n                elif seconds < 60 * 60 * 24: return \"%d hours\" % (seconds / (60 * 60))\n                else: return \"%d days\" % (seconds / (60 * 60 * 24))\n            return inner(int(seconds)).replace(\" \", \"&nbsp;\")\n\n        def formatRSS(bytes):\n            if bytes < 1024: return \"%d B\" % 
bytes\n            elif bytes < 1024 ** 2: return \"%.1f kB\" % (float(bytes) / 1024)\n            elif bytes < 1024 ** 3: return \"%.1f MB\" % (float(bytes) / 1024 ** 2)\n            else: return \"%.1f GB\" % (float(bytes) / 1024 ** 3)\n\n        def formatCPU(seconds):\n            minutes = int(seconds / 60)\n            seconds = seconds - minutes * 60\n            seconds = \"%2.2f\" % seconds\n            if seconds.find(\".\") == 1: seconds = \"0\" + seconds\n            return \"%d:%s\" % (minutes, seconds)\n\n        def getProcessData(pid):\n            try:\n                items = open(\"/proc/%d/stat\" % pid).read().split()\n\n                return { \"cpu\": formatCPU(float(int(items[13]) + int(items[14])) / os.sysconf(\"SC_CLK_TCK\")),\n                         \"rss\": formatRSS(int(items[23]) * os.sysconf(\"SC_PAGE_SIZE\")) }\n            except:\n                return { \"cpu\": \"N/A\",\n                         \"rss\": \"N/A\" }\n\n        for service_name, service_data in sorted(result[\"services\"].items()):\n            process_data = getProcessData(service_data[\"pid\"])\n\n            row = table.tr(\"service\")\n            row.td(\"name\").text(service_name)\n            row.td(\"module\").text(service_data[\"module\"])\n            row.td(\"pid\").text(service_data[\"pid\"] if service_data[\"pid\"] != -1 else \"(not running)\")\n            row.td(\"rss\").text(process_data[\"rss\"])\n            row.td(\"cpu\").text(process_data[\"cpu\"])\n            row.td(\"uptime\").innerHTML(formatUptime(service_data[\"uptime\"]))\n\n            commands = row.td(\"commands\")\n            commands.a(href=\"javascript:void(restartService(%s));\" % htmlutils.jsify(service_name)).text(\"[restart]\")\n            commands.a(href=\"javascript:void(getServiceLog(%s));\" % htmlutils.jsify(service_name)).text(\"[log]\")\n\n        for index, pid in enumerate(os.listdir(configuration.paths.WSGI_PIDFILE_DIR)):\n            startup = 
float(open(os.path.join(configuration.paths.WSGI_PIDFILE_DIR, pid)).read())\n            uptime = time.time() - startup\n\n            process_data = getProcessData(int(pid))\n\n            row = table.tr(\"service\")\n            row.td(\"name\").text(\"wsgi:%d\" % index)\n            row.td(\"module\").text()\n            row.td(\"pid\").text(pid)\n            row.td(\"rss\").text(process_data[\"rss\"])\n            row.td(\"cpu\").text(process_data[\"cpu\"])\n            row.td(\"uptime\").innerHTML(formatUptime(uptime))\n\n            commands = row.td(\"commands\")\n            commands.a(href=\"javascript:void(restartService('wsgi'));\").text(\"[restart]\")\n\n    paleyellow.addCentered(render)\n\n    return document\n"
  },
  {
    "path": "src/page/showbatch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport dbutils\nimport reviewing.comment as review_comment\nimport reviewing.utils as review_utils\nimport htmlutils\nimport diff\n\nfrom htmlutils import jsify, htmlify\n\ndef renderShowBatch(req, db, user):\n    batch_id = req.getParameter(\"batch\", None, filter=int)\n    review_id = req.getParameter(\"review\", None, filter=int)\n\n    cursor = db.cursor()\n\n    if batch_id is None and review_id is None:\n        return page.utils.displayMessage(db, req, user, \"Missing argument: 'batch'\")\n\n    if batch_id:\n        cursor.execute(\"SELECT review, uid, comment FROM batches WHERE id=%s\", (batch_id,))\n\n        row = cursor.fetchone()\n        if not row:\n            raise page.utils.DisplayMessage(\"Invalid batch ID: %d\" % batch_id)\n\n        review_id, author_id, chain_id = row\n        author = dbutils.User.fromId(db, author_id)\n    else:\n        chain_id = None\n        author = user\n\n    review = dbutils.Review.fromId(db, review_id)\n\n    if chain_id:\n        batch_chain = review_comment.CommentChain.fromId(db, chain_id, user, review=review)\n        batch_chain.loadComments(db, user)\n    else:\n        batch_chain = None\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    
page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    document.addExternalStylesheet(\"resource/showreview.css\")\n    document.addExternalStylesheet(\"resource/showbatch.css\")\n    document.addExternalStylesheet(\"resource/review.css\")\n    document.addExternalStylesheet(\"resource/comment.css\")\n    document.addExternalScript(\"resource/review.js\")\n    document.addExternalScript(\"resource/comment.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(review.getJS())\n\n    if batch_chain:\n        document.addInternalScript(\"commentChainById[%d] = %s;\" % (batch_chain.id, batch_chain.getJSConstructor()))\n\n    target = body.div(\"main\")\n\n    table = target.table('paleyellow basic comments', align='center')\n    table.col(width='10%')\n    table.col(width='80%')\n    table.col(width='10%')\n    table.tr().td('h1', colspan=3).h1().text(\"Review by %s\" % htmlify(author.fullname))\n\n    if batch_chain:\n        batch_chain.loadComments(db, user)\n\n        row = table.tr(\"line\")\n        row.td(\"heading\").text(\"Comment:\")\n        row.td(\"value\").preformatted().div(\"text\").text(htmlify(batch_chain.comments[0].comment))\n        row.td(\"status\").text()\n\n    def renderFiles(title, cursor):\n        files = []\n\n        for file_id, delete_count, insert_count in cursor.fetchall():\n            files.append((dbutils.describe_file(db, file_id), delete_count, insert_count))\n\n        paths = []\n        deleted = []\n        inserted = []\n\n        for path, delete_count, insert_count in sorted(files):\n            paths.append(path)\n            deleted.append(delete_count)\n            inserted.append(insert_count)\n\n        if paths:\n            diff.File.eliminateCommonPrefixes(paths)\n\n            row = table.tr(\"line\")\n            row.td(\"heading\").text(title)\n\n            
files_table = row.td().table(\"files callout\")\n            headers = files_table.thead().tr()\n            headers.th(\"path\").text(\"Changed Files\")\n            headers.th(\"lines\", colspan=2).text(\"Lines\")\n\n            files = files_table.tbody()\n            for path, delete_count, insert_count in zip(paths, deleted, inserted):\n                file = files.tr()\n                file.td(\"path\").preformatted().innerHTML(path)\n                file.td(\"lines\").preformatted().text(\"-%d\" % delete_count if delete_count else None)\n                file.td(\"lines\").preformatted().text(\"+%d\" % insert_count if insert_count else None)\n\n            row.td(\"status\").text()\n\n    def condition(table_name):\n        if batch_id:\n            return \"%s.batch=%d\" % (table_name, batch_id)\n        else:\n            return \"review=%d AND %s.batch IS NULL AND %s.uid=%d\" % (review.id, table_name, table_name, author.id)\n\n    cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(deleted), SUM(inserted)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE %s\n                         AND reviewfilechanges.to_state='reviewed'\n                    GROUP BY reviewfiles.file\"\"\" % condition(\"reviewfilechanges\"))\n    renderFiles(\"Reviewed:\", cursor)\n\n    cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(deleted), SUM(inserted)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE %s\n                         AND reviewfilechanges.to_state='pending'\n                    GROUP BY reviewfiles.file\"\"\" % condition(\"reviewfilechanges\"))\n    renderFiles(\"Unreviewed:\", cursor)\n\n    def renderChains(title, cursor, replies):\n        all_chains = [review_comment.CommentChain.fromId(db, chain_id, user, review=review)\n                     
 for (chain_id,) in cursor]\n\n        if not all_chains:\n            return\n\n        for chain in all_chains:\n            chain.loadComments(db, user)\n\n        issue_chains = filter(lambda chain: chain.type == \"issue\", all_chains)\n        draft_issues = filter(lambda chain: chain.state == \"draft\", issue_chains)\n        open_issues = filter(lambda chain: chain.state == \"open\", issue_chains)\n        addressed_issues = filter(lambda chain: chain.state == \"addressed\", issue_chains)\n        closed_issues = filter(lambda chain: chain.state == \"closed\", issue_chains)\n        note_chains = filter(lambda chain: chain.type == \"note\", all_chains)\n        draft_notes = filter(lambda chain: chain.state == \"draft\" and chain != batch_chain, note_chains)\n        open_notes = filter(lambda chain: chain.state == \"open\" and chain != batch_chain, note_chains)\n\n        def renderChains(target, chains):\n            for chain in chains:\n                row = target.tr(\"comment\")\n                row.td(\"author\").text(chain.user.fullname)\n                row.td(\"title\").a(href=\"showcomment?chain=%d\" % chain.id).innerHTML(chain.leader())\n                row.td(\"when\").text(chain.when())\n\n        def showcomments(filter_param):\n            params = { \"review\": review.id, \"filter\": filter_param }\n            if batch_id:\n                params[\"batch\"] = batch_id\n            return htmlutils.URL(\"/showcomments\", **params)\n\n        if draft_issues or open_issues or addressed_issues or closed_issues:\n            h2 = table.tr().td(\"h2\", colspan=3).h2().text(title)\n            if len(draft_issues) + len(open_issues) + len(addressed_issues) + len(closed_issues) > 1:\n                h2.a(href=showcomments(\"issues\")).text(\"[display all]\")\n\n            if draft_issues:\n                h3 = table.tr(id=\"draft-issues\").td(\"h3\", colspan=3).h3().text(\"Draft issues\")\n                if len(draft_issues) > 1:\n               
     h3.a(href=showcomments(\"draft-issues\")).text(\"[display all]\")\n                renderChains(table, draft_issues)\n\n            if batch_id is not None or replies:\n                if open_issues:\n                    h3 = table.tr(id=\"open-issues\").td(\"h3\", colspan=3).h3().text(\"Still open issues\")\n                    if batch_id and len(open_issues) > 1:\n                        h3.a(href=showcomments(\"open-issues\")).text(\"[display all]\")\n                    renderChains(table, open_issues)\n\n                if addressed_issues:\n                    h3 = table.tr(id=\"addressed-issues\").td(\"h3\", colspan=3).h3().text(\"Now addressed issues\")\n                    if batch_id and len(addressed_issues) > 1:\n                        h3.a(href=showcomments(\"addressed-issues\")).text(\"[display all]\")\n                    renderChains(table, addressed_issues)\n\n                if closed_issues:\n                    h3 = table.tr(id=\"closed-issues\").td(\"h3\", colspan=3).h3().text(\"Now closed issues\")\n                    if batch_id and len(closed_issues) > 1:\n                        h3.a(href=showcomments(\"closed-issues\")).text(\"[display all]\")\n                    renderChains(table, closed_issues)\n\n        if draft_notes or open_notes:\n            h2 = table.tr().td(\"h2\", colspan=3).h2().text(title)\n            if len(draft_notes) + len(open_notes) > 1:\n                h2.a(href=showcomments(\"notes\")).text(\"[display all]\")\n\n            if draft_notes:\n                h3 = table.tr(id=\"draft-notes\").td(\"h3\", colspan=3).h3().text(\"Draft notes\")\n                if len(draft_notes) > 1:\n                    h3.a(href=showcomments(\"draft-notes\")).text(\"[display all]\")\n                renderChains(table, draft_notes)\n\n            if open_notes:\n                h3 = table.tr(id=\"notes\").td(\"h3\", colspan=3).h3().text(\"Notes\")\n                if batch_id and len(open_notes) > 1:\n                    
h3.a(href=showcomments(\"open-notes\")).text(\"[display all]\")\n                renderChains(table, open_notes)\n\n    cursor.execute(\"SELECT id FROM commentchains WHERE %s AND type='issue'\" % condition(\"commentchains\"))\n\n    renderChains(\"Raised issues\", cursor, False)\n\n    cursor.execute(\"\"\"SELECT commentchains.id\n                        FROM commentchains\n                        JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                       WHERE %s\n                         AND to_state='closed'\"\"\" % condition(\"commentchainchanges\"))\n\n    renderChains(\"Resolved issues\", cursor, False)\n\n    cursor.execute(\"\"\"SELECT commentchains.id\n                        FROM commentchains\n                        JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                       WHERE %s\n                         AND to_state='open'\"\"\" % condition(\"commentchainchanges\"))\n\n    renderChains(\"Reopened issues\", cursor, False)\n\n    cursor.execute(\"\"\"SELECT commentchains.id\n                        FROM commentchains\n                        JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                       WHERE %s\n                         AND to_type='issue'\"\"\" % condition(\"commentchainchanges\"))\n\n    renderChains(\"Converted into issues\", cursor, False)\n\n    cursor.execute(\"\"\"SELECT commentchains.id\n                        FROM commentchains\n                        JOIN commentchainchanges ON (commentchainchanges.chain=commentchains.id)\n                       WHERE %s\n                         AND to_type='note'\"\"\" % condition(\"commentchainchanges\"))\n\n    renderChains(\"Converted into notes\", cursor, False)\n\n    cursor.execute(\"SELECT id FROM commentchains WHERE %s AND type='note'\" % condition(\"commentchains\"))\n\n    renderChains(\"Written notes\", cursor, False)\n\n    cursor.execute(\"\"\"SELECT 
commentchains.id\n                        FROM commentchains\n                        JOIN comments ON (comments.chain=commentchains.id)\n                       WHERE %s\n                         AND comments.id!=commentchains.first_comment\"\"\" % condition(\"comments\"))\n\n    renderChains(\"Replied to\", cursor, True)\n\n    return document\n"
  },
  {
    "path": "src/page/showbranch.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport gitutils\nimport dbutils\nimport htmlutils\nimport configuration\nimport request\n\nimport log.html as log_html\n\ndef renderShowBranch(req, db, user):\n    branch_name = req.getParameter(\"branch\")\n    base_name = req.getParameter(\"base\", None)\n    review_id = req.getParameter(\"review\", None, filter=int)\n\n    repository = req.getParameter(\"repository\", user.getPreference(db, \"defaultRepository\"))\n    if not repository:\n        raise request.MissingParameter(\"repository\")\n    repository = gitutils.Repository.fromParameter(db, repository)\n\n    cursor = db.cursor()\n\n    cursor.execute(\"SELECT id, type, base, head, tail FROM branches WHERE name=%s AND repository=%s\", (branch_name, repository.id))\n\n    try:\n        branch_id, branch_type, base_id, head_id, tail_id = cursor.fetchone()\n    except:\n        return page.utils.displayMessage(db, req, user, \"'%s' doesn't name a branch!\" % branch_name)\n\n    branch = dbutils.Branch.fromName(db, repository, branch_name)\n    rebased = False\n\n    if base_name:\n        base = dbutils.Branch.fromName(db, repository, base_name)\n\n        if base is None:\n            return page.utils.displayMessage(db, req, user, \"'%s' doesn't name a branch!\" % base_name)\n\n        old_count, new_count, base_old_count, 
base_new_count = branch.rebase(db, base)\n\n        if base_old_count is not None:\n            new_base_base_name = base.base.name\n        else:\n            new_base_base_name = None\n\n        rebased = True\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    document.addExternalStylesheet(\"resource/showbranch.css\")\n\n    def renderCreateReview(target):\n        if not user.isAnonymous() and branch and branch.review is None and not rebased:\n            url = htmlutils.URL(\"/createreview\", repository=repository.id, branch=branch_name)\n            target.a(\"button\", href=url).text(\"Create Review\")\n\n    if review_id is not None:\n        review = dbutils.Review.fromId(db, review_id)\n    else:\n        review = None\n\n    if review:\n        extra_links = [(\"r/%d\" % review.id, \"Back to Review\")]\n        document.addInternalScript(review.getJS())\n    else:\n        extra_links = []\n\n    page.utils.generateHeader(body, db, user, renderCreateReview, extra_links=extra_links)\n\n    document.addInternalScript(branch.getJS())\n\n    title_right = None\n\n    if rebased:\n        def renderPerformRebase(db, target):\n            target.button(\"perform\", onclick=\"rebase(%s, %s, %s, %s, %s, %s, %s)\" % tuple(map(htmlutils.jsify, [branch_name, base_name, new_base_base_name, old_count, new_count, base_old_count, base_new_count]))).text(\"Perform Rebase\")\n\n        title_right = renderPerformRebase\n    elif base_id is not None:\n        bases = []\n        base = branch.base\n\n        if base:\n            if base.type == \"review\":\n                bases.append(\"master\")\n            else:\n                base = base.base\n                while base:\n                    bases.append(base.name)\n                    base = base.base\n\n        cursor.execute(\"SELECT name FROM branches WHERE base=%s\", (branch.id,))\n\n        for (name,) in cursor:\n            
bases.append(name)\n\n        def renderSelectBase(db, target):\n            select = target.select(\"base\")\n            select.option(value=\"*\").text(\"Select new base\")\n            select.option(value=\"*\").text(\"---------------\")\n\n            for name in bases:\n                select.option(\"base\", value=name.split(\" \")[0]).text(name)\n\n        if not bases and branch.base:\n            cursor.execute(\"SELECT commit FROM reachable WHERE branch=%s\", (branch.id,))\n\n            commit_ids = [commit_id for (commit_id,) in cursor]\n\n            for commit_id in commit_ids:\n                cursor.execute(\"SELECT 1 FROM reachable WHERE branch=%s AND commit=%s\", (branch.base.id, commit_id))\n                if cursor.fetchone():\n                    bases.append(\"%s (trim)\" % branch.base.name)\n                    break\n\n        if bases:\n            title_right = renderSelectBase\n\n    target = body.div(\"main\")\n\n    if branch_type == 'normal':\n        cursor.execute(\"SELECT COUNT(*) FROM reachable WHERE branch=%s\", (branch_id,))\n\n        commit_count = cursor.fetchone()[0]\n        if commit_count > configuration.limits.MAXIMUM_REACHABLE_COMMITS:\n            offset = req.getParameter(\"offset\", default=0, filter=int)\n            limit = req.getParameter(\"limit\", default=200, filter=int)\n\n            head = gitutils.Commit.fromId(db, repository, head_id)\n            tail = gitutils.Commit.fromId(db, repository, tail_id) if tail_id else None\n\n            sha1s = repository.revlist([head], [tail] if tail else [], \"--skip=%d\" % offset, \"--max-count=%d\" % limit)\n            commits = [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in sha1s]\n\n            def moreCommits(db, target):\n                target.a(href=\"/log?branch=%s&offset=%d&limit=%d\" % (branch_name, offset + limit, limit)).text(\"More commits...\")\n\n            log_html.renderList(db, target, branch.name, commits, title_right=title_right, 
bottom_right=moreCommits)\n\n            return document\n\n    log_html.render(db, target, branch.name, branch=branch, title_right=title_right)\n\n    return document\n"
  },
  {
    "path": "src/page/showcomment.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport page.utils\nimport profiling\nimport linkify\n\nimport reviewing.comment as review_comment\nimport reviewing.utils as review_utils\nimport reviewing.html as review_html\nimport changeset.utils as changeset_utils\n\nimport operation.blame\nimport log.commitset\n\ndef renderShowComment(req, db, user):\n    chain_id = req.getParameter(\"chain\", filter=int)\n    context_lines = req.getParameter(\"context\", user.getPreference(db, \"comment.diff.contextLines\"), filter=int)\n\n    default_compact = \"yes\" if user.getPreference(db, \"commit.diff.compactMode\") else \"no\"\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n\n    default_tabify = \"yes\" if user.getPreference(db, \"commit.diff.visualTabs\") else \"no\"\n    tabify = req.getParameter(\"tabify\", default_tabify) == \"yes\"\n\n    original = req.getParameter(\"original\", \"no\") == \"yes\"\n\n    chain = review_comment.CommentChain.fromId(db, chain_id, user)\n\n    if chain is None or chain.state == \"empty\":\n        raise page.utils.DisplayMessage(\"Invalid comment chain ID: %d\" % chain_id)\n\n    review = chain.review\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    document.setTitle(\"%s in review 
%s\" % (chain.title(False), review.branch.name))\n\n    def renderHeaderItems(target):\n        review_utils.renderDraftItems(db, user, review, target)\n        target.div(\"buttons\").span(\"buttonscope buttonscope-global\")\n\n    page.utils.generateHeader(body, db, user, renderHeaderItems, extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    document.addExternalScript(\"resource/showcomment.js\")\n    document.addInternalScript(user.getJS(db))\n    document.addInternalScript(review.repository.getJS())\n    document.addInternalScript(review.getJS())\n    document.addInternalScript(\"var contextLines = %d;\" % context_lines)\n    document.addInternalScript(\"var keyboardShortcuts = %s;\" % (user.getPreference(db, \"ui.keyboardShortcuts\") and \"true\" or \"false\"))\n\n    if not user.isAnonymous() and user.name == req.user:\n        document.addInternalScript(\"$(function () { markChainsAsRead([%d]); });\" % chain_id)\n\n    review_html.renderCommentChain(db, body.div(\"main\"), user, review, chain, context_lines=context_lines, compact=compact, tabify=tabify, original=original, linkify=linkify.Context(db=db, request=req, review=review))\n\n    if user.getPreference(db, \"ui.keyboardShortcuts\"):\n        page.utils.renderShortcuts(body, \"showcomment\", review=review)\n\n    yield document.render(pretty=not compact)\n\ndef renderShowComments(req, db, user):\n    context_lines = req.getParameter(\"context\", user.getPreference(db, \"comment.diff.contextLines\"), filter=int)\n\n    default_compact = \"yes\" if user.getPreference(db, \"commit.diff.compactMode\") else \"no\"\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n\n    default_tabify = \"yes\" if user.getPreference(db, \"commit.diff.visualTabs\") else \"no\"\n    tabify = req.getParameter(\"tabify\", default_tabify) == \"yes\"\n\n    original = req.getParameter(\"original\", \"no\") == \"yes\"\n\n    review_id = req.getParameter(\"review\", filter=int)\n    batch_id = 
req.getParameter(\"batch\", None, filter=int)\n    filter = req.getParameter(\"filter\", \"all\")\n    blame = req.getParameter(\"blame\", None)\n\n    profiler = profiling.Profiler()\n\n    review = dbutils.Review.fromId(db, review_id)\n    review.repository.enableBlobCache()\n\n    cursor = db.cursor()\n\n    profiler.check(\"create review\")\n\n    if blame is not None:\n        blame_user = dbutils.User.fromName(db, blame)\n\n        cursor.execute(\"\"\"SELECT commentchains.id\n                            FROM commentchains\n                            JOIN commentchainlines ON (commentchainlines.chain=commentchains.id)\n                            JOIN fileversions ON (fileversions.new_sha1=commentchainlines.sha1)\n                            JOIN changesets ON (changesets.id=fileversions.changeset)\n                            JOIN commits ON (commits.id=changesets.child)\n                            JOIN gitusers ON (gitusers.id=commits.author_gituser)\n                            JOIN usergitemails USING (email)\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id AND reviewchangesets.review=commentchains.review)\n                           WHERE commentchains.review=%s\n                             AND usergitemails.uid=%s\n                             AND commentchains.state!='empty'\n                             AND (commentchains.state!='draft' OR commentchains.uid=%s)\n                        ORDER BY commentchains.file, commentchainlines.first_line\"\"\",\n                       (review.id, blame_user.id, user.id))\n\n        include_chain_ids = set([chain_id for (chain_id,) in cursor])\n\n        profiler.check(\"initial blame filtering\")\n    else:\n        include_chain_ids = None\n\n    if filter == \"toread\":\n        query = \"\"\"SELECT commentchains.id\n                     FROM commentchains\n                     JOIN comments ON (comments.chain=commentchains.id)\n                     JOIN 
commentstoread ON (commentstoread.comment=comments.id)\n          LEFT OUTER JOIN commentchainlines ON (commentchainlines.chain=commentchains.id)\n                    WHERE review=%s\n                      AND commentstoread.uid=%s\n                 ORDER BY file, first_line\"\"\"\n\n        cursor.execute(query, (review.id, user.id))\n    else:\n        query = \"\"\"SELECT commentchains.id\n                     FROM commentchains\n          LEFT OUTER JOIN commentchainlines ON (chain=id)\n                    WHERE review=%s\n                      AND commentchains.state!='empty'\"\"\"\n\n        arguments = [review.id]\n\n        if filter == \"issues\":\n            query += \" AND type='issue' AND (commentchains.state!='draft' OR commentchains.uid=%s)\"\n            arguments.append(user.id)\n        elif filter == \"draft-issues\":\n            query += \" AND type='issue' AND commentchains.state='draft' AND commentchains.uid=%s\"\n            arguments.append(user.id)\n        elif filter == \"open-issues\":\n            query += \" AND type='issue' AND commentchains.state='open'\"\n        elif filter == \"addressed-issues\":\n            query += \" AND type='issue' AND commentchains.state='addressed'\"\n        elif filter == \"closed-issues\":\n            query += \" AND type='issue' AND commentchains.state='closed'\"\n        elif filter == \"notes\":\n            query += \" AND type='note' AND (commentchains.state!='draft' OR commentchains.uid=%s)\"\n            arguments.append(user.id)\n        elif filter == \"draft-notes\":\n            query += \" AND type='note' AND commentchains.state='draft' AND commentchains.uid=%s\"\n            arguments.append(user.id)\n        elif filter == \"open-notes\":\n            query += \" AND type='note' AND commentchains.state='open'\"\n        else:\n            query += \" AND (commentchains.state!='draft' OR commentchains.uid=%s)\"\n            arguments.append(user.id)\n\n        if batch_id is not None:\n  
          query += \" AND batch=%s\"\n            arguments.append(batch_id)\n\n        # This ordering is inaccurate if comments apply to the same file but\n        # different commits, but in that case there isn't really a\n        # well-defined natural order either.  Two comments that apply to the\n        # same file and commit will at least be ordered by line number, and that's\n        # better than nothing.\n        query += \" ORDER BY file, first_line\"\n\n        cursor.execute(query, arguments)\n\n    profiler.check(\"main query\")\n\n    if include_chain_ids is None:\n        chain_ids = [chain_id for (chain_id,) in cursor]\n    else:\n        chain_ids = [chain_id for (chain_id,) in cursor if chain_id in include_chain_ids]\n\n    profiler.check(\"query result\")\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    document.addInternalScript(user.getJS(db))\n    document.addInternalScript(review.getJS())\n\n    page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    profiler.check(\"page header\")\n\n    target = body.div(\"main\")\n\n    if chain_ids and not user.isAnonymous() and user.name == req.user:\n        document.addInternalScript(\"$(function () { markChainsAsRead([%s]); });\" % \", \".join(map(str, chain_ids)))\n\n    if chain_ids:\n        processed = set()\n\n        chains = []\n        file_ids = set()\n        changesets_files = {}\n        changesets = {}\n\n        if blame is not None:\n            annotators = {}\n            commits = log.commitset.CommitSet(review.branch.getCommits(db))\n\n        for chain_id in chain_ids:\n            if chain_id in processed:\n                continue\n            else:\n                processed.add(chain_id)\n\n                chain = review_comment.CommentChain.fromId(db, chain_id, user, 
review=review)\n                chains.append(chain)\n\n                if chain.file_id is not None:\n                    file_ids.add(chain.file_id)\n                    parent, child = review_html.getCodeCommentChainChangeset(db, chain, original)\n                    if parent and child:\n                        changesets_files.setdefault((parent, child), set()).add(chain.file_id)\n\n        profiler.check(\"load chains\")\n\n        changeset_cache = {}\n\n        for (from_commit, to_commit), filtered_file_ids in changesets_files.items():\n            changesets[(from_commit, to_commit)] = changeset_utils.createChangeset(db, user, review.repository, from_commit=from_commit, to_commit=to_commit, filtered_file_ids=filtered_file_ids)[0]\n            profiler.check(\"create changesets\")\n\n            if blame is not None:\n                annotators[(from_commit, to_commit)] = operation.blame.LineAnnotator(db, from_commit, to_commit, file_ids=file_ids, commits=commits, changeset_cache=changeset_cache)\n                profiler.check(\"create annotators\")\n\n        for chain in chains:\n            if blame is not None and chain.file_id is not None:\n                try:\n                    changeset = changesets[(chain.first_commit, chain.last_commit)]\n                    annotator = annotators[(chain.first_commit, chain.last_commit)]\n                except KeyError:\n                    # Most likely a comment created via /showfile.  
Such a\n                    # comment could be in code that 'blame_user' modified in the\n                    # review, but for now, let's skip the comment.\n                    continue\n                else:\n                    file_in_changeset = changeset.getFile(chain.file_id)\n\n                    if not file_in_changeset:\n                        continue\n\n                    try:\n                        offset, count = chain.lines_by_sha1[file_in_changeset.new_sha1]\n                    except KeyError:\n                        # Probably a chain raised against the \"old\" side of the diff.\n                        continue\n                    else:\n                        if not annotator.annotate(chain.file_id, offset, offset + count - 1, check_user=blame_user):\n                            continue\n\n            profiler.check(\"detailed blame filtering\")\n\n            if chain.file_id is not None:\n                from_commit, to_commit = review_html.getCodeCommentChainChangeset(db, chain, original)\n                changeset = changesets.get((from_commit, to_commit))\n            else:\n                changeset = None\n\n            review_html.renderCommentChain(db, target, user, review, chain,\n                                           context_lines=context_lines,\n                                           compact=compact,\n                                           tabify=tabify,\n                                           original=original,\n                                           changeset=changeset,\n                                           linkify=linkify.Context(db=db, request=req, review=review))\n\n            profiler.check(\"rendering\")\n\n            yield document.render(stop=target, pretty=not compact) + \"<script>console.log((new Date).toString());</script>\"\n\n            profiler.check(\"transfer\")\n\n        page.utils.renderShortcuts(target, \"showcomments\", review=review)\n    else:\n        
target.h1(align=\"center\").text(\"No comments.\")\n\n    profiler.output(db, user, document)\n\n    yield document.render(pretty=not compact)\n"
  },
  {
    "path": "src/page/showcommit.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport htmlutils\nimport page.utils\nimport dbutils\nimport gitutils\nimport diff\nimport changeset.html as changeset_html\nimport changeset.utils as changeset_utils\nimport changeset.detectmoves as changeset_detectmoves\nimport reviewing.utils as review_utils\nimport reviewing.comment as review_comment\nimport reviewing.filters as review_filters\nimport log.html as log_html\nfrom log.commitset import CommitSet\nimport profiling\nimport re\n\ndef renderCommitInfo(db, target, user, repository, review, commit, conflicts=False, minimal=False):\n    cursor = db.cursor()\n\n    msg = commit.message.splitlines()\n\n    commit_info = target.table(\"commit-info\")\n\n    def outputBranches(target, commit):\n        cursor.execute(\"\"\"SELECT branches.name, reviews.id\n                            FROM branches\n                            JOIN reachable ON (reachable.branch=branches.id)\n                            JOIN commits ON (commits.id=reachable.commit)\n                 LEFT OUTER JOIN reviews ON (reviews.branch=branches.id)\n                           WHERE branches.repository=%s\n                             AND commits.sha1=%s\"\"\",\n                       (repository.id, commit.sha1))\n\n        for branch, review_id in cursor:\n            span = cell.span(\"branch\")\n\n            if review_id 
is None:\n                url = htmlutils.URL(\"/log\", repository=repository.id, branch=branch)\n                title = branch\n            else:\n                url = \"/r/%d\" % review_id\n                title = url\n\n            span.text(\"[\")\n            span.a(\"branch\", href=url).text(title)\n            span.text(\"]\")\n\n    def outputTags(target, commit):\n        cursor.execute(\"SELECT name FROM tags WHERE repository=%s AND sha1=%s\",\n                       (repository.id, commit.sha1))\n\n        for (tag,) in cursor:\n            target.span(\"tag\").text(\"[%s]\" % tag)\n\n    if len(commit.parents) > 1:\n        row = commit_info.tr(\"commit-info\")\n        row.th(align='right').text(\"Alternate view:\")\n\n        review_arg = \"&review=%d\" % review.id if review else \"\"\n\n        if conflicts:\n            row.td(align='left').a(href=\"/showcommit?sha1=%s&repository=%d%s\" % (commit.sha1, repository.id, review_arg)).text(\"display changes relative to parents\")\n        else:\n            row.td(align='left').a(href=\"/showcommit?sha1=%s&repository=%d%s&conflicts=yes\" % (commit.sha1, repository.id, review_arg)).text(\"display conflict resolution changes\")\n\n    row = commit_info.tr(\"commit-info\")\n    row.th(align='right').text(\"SHA-1:\")\n    cell = row.td(align='left')\n\n    if minimal:\n        cell.a(href=\"/%s/%s?review=%d\" % (repository.name, commit.sha1, review.id)).text(commit.sha1)\n    else:\n        cell.text(commit.sha1)\n\n    if repository.name != user.getPreference(db, \"defaultRepository\"):\n        cell.text(\" in \")\n        cell.b().text(repository.getURL(db, user))\n\n    if not minimal:\n        if review: review_arg = \"&review=%d\" % review.id\n        else: review_arg = \"\"\n\n        span = cell.span(\"links\").span(\"link\")\n        span.text(\"[\")\n        span.a(\"link\", href=\"/showtree?sha1=%s%s\" % (commit.sha1, review_arg)).innerHTML(\"browse&nbsp;tree\")\n        span.text(\"]\")\n\n    
if not review:\n        outputBranches(cell.span(\"branches\"), commit)\n        outputTags(cell.span(\"tags\"), commit)\n\n    if minimal or commit.author.email != commit.committer.email or commit.author.time != commit.committer.time:\n        row = commit_info.tr(\"commit-info\")\n        row.th(align='right').text(\"Author:\")\n        row.td(align='left').text(str(commit.author))\n\n        if not minimal:\n            row = commit_info.tr(\"commit-info\")\n            row.th(align='right').text(\"Commit:\")\n            row.td(align='left').text(str(commit.committer))\n    else:\n        row = commit_info.tr(\"commit-info\")\n        row.th(align='right').text(\"Author/Commit:\")\n        row.td(align='left').text(str(commit.author))\n\n    if not minimal:\n        if review: review_url_contribution = \"?review=%d\" % review.id\n        else: review_url_contribution = \"\"\n\n        for parent_sha1 in commit.parents:\n            parent = gitutils.Commit.fromSHA1(db, repository, parent_sha1)\n\n            if not review or review.containsCommit(db, parent):\n                parent_href = \"/%s/%s%s\" % (repository.name, parent.sha1, review_url_contribution)\n\n                row = commit_info.tr(\"commit-info\")\n                row.th(align='right').text(\"Parent:\")\n                cell = row.td(align='left')\n                cell.a(href=parent_href, rel=\"previous\").text(\"%s\" % parent.niceSummary())\n                cell.setLink(\"previous\", parent_href)\n\n                if not review:\n                    outputBranches(cell.span(\"branches\"), parent)\n                    outputTags(cell.span(\"tags\"), parent)\n\n        cursor.execute(\"SELECT child FROM edges WHERE parent=%s\", [commit.id])\n        child_ids = cursor.fetchall()\n\n        for (child_id,) in child_ids:\n            if not review or review.containsCommit(db, child_id):\n                try: child = gitutils.Commit.fromId(db, repository, child_id)\n                except: 
continue\n\n                child_href = \"/%s/%s%s\" % (repository.name, child.sha1, review_url_contribution)\n\n                row = commit_info.tr(\"commit-info\")\n                row.th(align='right').text(\"Child:\")\n                cell = row.td(align='left')\n                cell.a(href=child_href).text(\"%s\" % child.niceSummary())\n\n                if len(child_ids) == 1: cell.setLink(\"next\", child_href)\n\n                if not review:\n                    outputBranches(cell.span(\"branches\"), child)\n                    outputTags(cell.span(\"tags\"), child)\n\n    def linkToCommit(commit):\n        if review:\n            cursor.execute(\"SELECT 1 FROM commits JOIN changesets ON (child=commits.id) JOIN reviewchangesets ON (changeset=changesets.id) WHERE sha1=%s AND review=%s\", (commit.sha1, review.id))\n            if cursor.fetchone():\n                return \"%s/%s?review=%d\" % (repository.name, commit.sha1, review.id)\n        return \"%s/%s\" % (repository.name, commit.sha1)\n\n    highlight_index = 0\n\n    if msg[0].startswith(\"fixup!\") or msg[0].startswith(\"squash!\"):\n        for candidate_index, line in enumerate(msg[1:]):\n            if line.strip():\n                highlight_index = candidate_index + 1\n                break\n\n    commit_msg = commit_info.tr(\"commit-msg\").td(colspan=2).table(\"commit-msg\", cellspacing=0)\n    for index, text in enumerate(msg):\n        className = \"line single\"\n        if index == 0: className += \" first\"\n        elif index == len(msg) - 1: className += \" last\"\n        if index < highlight_index or len(commit.parents) > 1:\n            lengthLimit = None\n        elif index == highlight_index:\n            lengthLimit = \"60-80\"\n        else:\n            lengthLimit = \"70-90\"\n        if index == highlight_index:\n            className += \" highlight\"\n        row = commit_msg.tr(className)\n        row.td(\"edge\").text()\n        cell = row.td(\"line single 
commit-msg\", id=\"msg%d\" % index, critic_length_limit=lengthLimit)\n        if text: cell.preformatted().text(text, linkify=linkToCommit, repository=repository)\n        else: cell.text()\n        row.td(\"edge\").text()\n\n    commit_msg.script(type=\"text/javascript\").text(\"applyLengthLimit($(\\\"table.commit-msg td.line.commit-msg\\\"))\");\n\n    if review:\n        chains = review_comment.loadCommentChains(db, review, user, commit=commit)\n\n        for chain in chains:\n            commit_info.addInternalScript(\"commentChains.push(%s);\" % chain.getJSConstructor(commit.sha1))\n\ndef renderCommitFiles(db, target, user, repository, review, changeset=None, changesets=None, file_id=\"f%d\", approve_file_id=\"a%d\", parent_index=None, nparents=1, conflicts=False, files=None):\n    def countChanges(file):\n        delete_count = 0\n        insert_count = 0\n        if file.chunks:\n            for chunk in file.chunks:\n                delete_count += chunk.delete_count\n                insert_count += chunk.insert_count\n        return delete_count, insert_count\n\n    commit_files = target.table(\"commit-files\", cellspacing=0)\n\n    if nparents > 1:\n        def getpath(x): return x[1]\n        def setpath(x, p): x[1] = p\n\n        diff.File.eliminateCommonPrefixes(files, getpath=getpath, setpath=setpath)\n\n        for data in files:\n            in_parent = data[2]\n            for index in range(len(in_parent)):\n                file_in_parent = in_parent[index]\n                if file_in_parent:\n                    in_parent[index] = (file_in_parent, countChanges(file_in_parent))\n                else:\n                    in_parent[index] = (None, None)\n\n        section = commit_files.thead()\n        row = section.tr(\"parents\")\n\n        if review:\n            row.th(\"approve\").text(\"Reviewed\")\n\n        row.th().text(\"Changed Files\")\n\n        review_files = []\n\n        for index in range(nparents):\n            if conflicts and 
index + 1 == nparents: text = \"Conflicts\"\n            else: text = \"Parent %d\" % (index + 1)\n\n            row.th(\"parent\", colspan=2).text(text)\n\n            if review:\n                review_files.append(changesets[index].getReviewFiles(db, user, review))\n\n        if review:\n            row.th(\"reviewed-by\").text(\"Reviewed By\")\n\n        section = commit_files.tbody()\n\n        for file_id, file_path, in_parent in files:\n            row = section.tr(critic_file_id=file_id)\n            fully_approved = True\n\n            if review:\n                approve = row.td(\"approve file\")\n                reviewers = {}\n\n                for index, (file, lines) in enumerate(in_parent):\n                    if file:\n                        span = approve.span(\"parent%d\" % index)\n\n                        if review_files[index].has_key(file.id):\n                            review_file = review_files[index][file.id]\n                            can_approve = review_file[0]\n                            is_approved = review_file[1] == \"reviewed\"\n\n                            for user_id in review_file[2]:\n                                reviewers[user_id] = dbutils.User.fromId(db, user_id)\n\n                            if not is_approved: fully_approved = False\n                        else:\n                            can_approve = False\n                            is_approved = True\n\n                        if can_approve:\n                            if is_approved: checked = \"checked\"\n                            else: checked = None\n                            input = span.input(type=\"checkbox\", critic_parent_index=index, id=\"p%da%d\" % (index, file.id), checked=checked)\n                        elif not is_approved:\n                            span.text(\"pending\")\n\n            row.td(\"path\").a(href=\"#f%d\" % file_id).innerHTML(file_path)\n\n            for index, (file, lines) in enumerate(in_parent):\n               
 if file:\n                    if file.isBinaryChanges():\n                        row.td(\"parent\", colspan=2, critic_parent_index=index).i().text(\"binary\")\n                    elif file.isEmptyFile():\n                        row.td(\"parent\", colspan=2, critic_parent_index=index).i().text(\"empty\")\n                    elif file.old_mode == \"160000\" and file.new_mode == \"160000\":\n                        if conflicts and index + 1 == nparents:\n                            row.td(colspan=2).text()\n                        else:\n                            module_repository = repository.getModuleRepository(db, changesets[index].child, file.path)\n                            if module_repository:\n                                url = \"/showcommit?repository=%d&from=%s&to=%s\" % (module_repository.id, file.old_sha1, file.new_sha1)\n                                row.td(\"parent\", critic_parent_index=index, colspan=2).i().a(href=url).text(\"updated submodule\")\n                            else:\n                                row.td(\"parent\", critic_parent_index=index, colspan=2).i().text(\"updated submodule\")\n                    else:\n                        row.td(\"parent\", critic_parent_index=index).text(lines[0] and \"-%d\" % lines[0] or \"\")\n                        row.td(\"parent\", critic_parent_index=index).text(lines[1] and \"+%d\" % lines[1] or \"\")\n                else:\n                    row.td(colspan=2).text()\n\n            if review:\n                cell = row.td(\"reviewed-by\")\n                names = sorted([user.fullname for user in reviewers.values()])\n                if names:\n                    if fully_approved: cell.text(\", \".join(names))\n                    else:  cell.text(\"( \" + \", \".join(names) + \" )\")\n                else: cell.text()\n\n        return\n\n    paths = diff.File.eliminateCommonPrefixes([file.path for file in changeset.files])\n    changes = map(countChanges, changeset.files)\n\n 
   if review:\n        review_files = changeset.getReviewFiles(db, user, review)\n\n    additional = False\n    for file in changeset.files:\n        if (file.old_mode and file.new_mode and file.old_mode != file.new_mode) or (file.wasRemoved() and file.old_mode) or (file.wasAdded() and file.new_mode):\n            additional = True\n            break\n        elif (file.old_mode and file.old_mode == \"160000\") or (file.new_mode and file.new_mode == \"160000\"):\n            additional = True\n\n    section = commit_files.thead()\n    row = section.tr()\n    ncolumns = 3\n\n    if review:\n        row.th(\"approve\").text(\"Reviewed\")\n        ncolumns += 1\n    row.th().text(\"Changed Files\")\n    row.th(colspan=2).text(\"Lines\")\n    if additional:\n        row.th().text(\"Additional\")\n        ncolumns += 1\n    if review:\n        row.th().text(\"Reviewed By\")\n        ncolumns += 1\n\n    if review:\n        for is_reviewer, state, reviewers in review_files.values():\n            if is_reviewer:\n                can_approve_anything = True\n                break\n        else:\n            can_approve_anything = False\n\n    section = commit_files.tbody()\n    if review and can_approve_anything:\n        row = section.tr()\n        checkbox_everything = row.td(\"approve everything\").input(type=\"checkbox\", __generator__=True)\n        row.td(colspan=ncolumns - 1).i().text(\"Everything\")\n    else:\n        checkbox_everything = None\n\n    all_reviewed = True\n\n    for file, path, lines in zip(changeset.files, paths, changes):\n        row = section.tr(critic_file_id=file.id)\n        fully_reviewed = True\n        if parent_index is not None:\n            row.setAttribute(\"critic-parent-index\", parent_index)\n        if review:\n            if file.id in review_files:\n                review_file = review_files[file.id]\n                can_review = review_file[0]\n                is_reviewed = review_file[1] == \"reviewed\"\n                
reviewers = [dbutils.User.fromId(db, user_id) for user_id in review_file[2]]\n            else:\n                can_review = False\n                is_reviewed = True\n                reviewers = []\n\n            if not is_reviewed: fully_reviewed = False\n\n            if can_review:\n                if is_reviewed: checked = \"checked\"\n                else:\n                    checked = None\n                    all_reviewed = False\n                input = row.td(\"approve file\").input(type=\"checkbox\", critic_parent_index=parent_index, id=approve_file_id % file.id, checked=checked)\n            else:\n                if is_reviewed:\n                    cell = row.td()\n                    cell.text()\n                else:\n                    row.td(\"approve file\").text(\"pending\")\n\n        row.td(\"path\").a(href=(\"#\" + file_id) % file.id).innerHTML(path)\n        if file.hasChanges():\n            if file.isBinaryChanges():\n                row.td(colspan=2).i().text(\"binary\")\n            elif file.isEmptyFile():\n                row.td(colspan=2).i().text(\"empty\")\n            else:\n                row.td().text(lines[0] and \"-%d\" % lines[0] or \"\")\n                row.td().text(lines[1] and \"+%d\" % lines[1] or \"\")\n        else:\n            row.td(colspan=2).i().text(\"no changes\")\n\n        if file.old_mode is not None and file.new_mode is not None and file.old_mode != file.new_mode:\n            cell = row.td()\n            cell.i().text(\"mode: \")\n            cell.text(\"%s => %s\" % (file.old_mode, file.new_mode))\n        elif (file.wasRemoved() and file.old_mode) or (file.wasAdded() and file.new_mode):\n            cell = row.td()\n            if file.old_mode == \"160000\" or file.new_mode == \"160000\":\n                cell.i().text(\"added submodule\" if file.wasAdded() else \"removed submodule\")\n            else:\n                cell.i().text(\"added: \" if file.wasAdded() else \"removed: \")\n                
cell.text(\"%s\" % file.new_mode if file.wasAdded() else file.old_mode)\n        elif file.old_mode == \"160000\" and file.new_mode == \"160000\":\n            module_repository = repository.getModuleRepository(db, changeset.child, file.path)\n            if module_repository:\n                url = \"/showcommit?repository=%d&from=%s&to=%s\" % (module_repository.id, file.old_sha1, file.new_sha1)\n                row.td().i().a(href=url).text(\"updated submodule\")\n            else:\n                row.td().i().text(\"updated submodule\")\n        elif additional:\n            row.td().text()\n\n        if review:\n            cell = row.td(\"reviewed-by\")\n            names = sorted([user.fullname for user in reviewers])\n            if names:\n                if fully_reviewed: cell.text(\", \".join(names))\n                else: cell.text(\"( \" + \", \".join(names) + \" )\")\n            else: cell.text()\n\n    if all_reviewed and checkbox_everything:\n        checkbox_everything.setAttribute(\"checked\", \"checked\")\n\n    target.script().text(\"registerPathHandlers();\")\n\ndef render(db, target, user, repository, review, changesets, commits, listed_commits=None, context_lines=3, is_merge=False, conflicts=False, moves=False, compact=False, wrap=True, tabify=False, profiler=None, rebases=None):\n    cursor = db.cursor()\n\n    main = target.div(\"main\")\n\n    options = {}\n\n    if not user.getPreference(db, \"ui.keyboardShortcuts\"):\n        options['show'] = True\n\n    if user.getPreference(db, \"commit.expandAllFiles\"):\n        options['show'] = True\n        options['expand'] = True\n\n    if compact:\n        options['compact'] = True\n\n    if tabify:\n        options['tabify'] = True\n\n    options['commit'] = changesets[0].child\n\n    if len(changesets) == 1:\n        if commits and len(commits) > 1:\n            columns = [(10, log_html.WhenColumn()),\n                       (5, log_html.TypeColumn()),\n                       (65, 
log_html.SummaryColumn()),\n                       (20, log_html.AuthorColumn())]\n\n            log_html.render(db, main, \"Squashed History\", commits=commits, listed_commits=listed_commits, rebases=rebases, review=review, columns=columns, collapsable=True)\n        elif changesets[0].parent is None or (changesets[0].parent.sha1 in changesets[0].child.parents) or conflicts:\n            if conflicts and len(changesets[0].child.parents) == 1:\n                commit = changesets[0].parent\n            else:\n                commit = changesets[0].child\n            renderCommitInfo(db, main, user, repository, review, commit, conflicts)\n        else:\n            main.setAttribute(\"style\", \"margin-bottom: 20px; padding-bottom: 10px\")\n\n        if moves:\n            def renderMoveHeaderLeft(db, target, file):\n                target.text(file.move_source_file.path)\n            def renderMoveHeaderRight(db, target, file):\n                target.text(file.move_target_file.path)\n\n            options['show'] = True\n            options['expand'] = True\n            options['support_expand'] = False\n            options['header_left'] = renderMoveHeaderLeft\n            options['header_right'] = renderMoveHeaderRight\n\n            context_lines = 0\n        else:\n            renderCommitFiles(db, target, user, repository, review, changeset=changesets[0])\n\n        yield target\n\n        for stop in changeset_html.render(db, target, user, repository, changesets[0], review, context_lines=context_lines, options=options, wrap=wrap):\n            yield stop\n    else:\n        commit = changesets[0].child\n\n        renderCommitInfo(db, main, user, repository, review, commit)\n\n        if profiler: profiler.check(\"render commit info\")\n\n        nparents = len(changesets)\n        target.addInternalScript(\"var parentsCount = %d;\" % nparents)\n\n        files = {}\n\n        for index, changeset in enumerate(changesets):\n            for file in 
changeset.files:\n                files.setdefault(file.id, [file.id, file.path, [None] * nparents])[2][index] = file\n\n        renderCommitFiles(db, target, user, repository, review, changesets=changesets, file_id=\"p%df%%d\" % index, approve_file_id=\"p%da%%d\" % index, nparents=nparents, conflicts=changesets[-1].conflicts, files=diff.File.sorted(files.values(), key=lambda x: x[1]))\n\n        if profiler: profiler.check(\"render commit files\")\n\n        mergebase = repository.mergebase(commit, db=db)\n\n        if profiler: profiler.check(\"merge base\")\n\n        yield target\n\n        relevant_commits = []\n\n        cursor.execute(\"SELECT parent, file, sha1 FROM relevantcommits JOIN commits ON (relevant=id) WHERE commit=%s\", (commit.getId(db),))\n\n        rows = cursor.fetchall()\n\n        if rows:\n            for index in range(len(changesets)):\n                relevant_commits.append({})\n\n            commits_by_sha1 = {}\n\n            for parent_index, file_id, sha1 in rows:\n                if sha1 not in commits_by_sha1:\n                    commits_by_sha1[sha1] = gitutils.Commit.fromSHA1(db, repository, sha1)\n                relevant_commits[parent_index].setdefault(file_id, []).append(commits_by_sha1[sha1])\n        else:\n            values = []\n            commits_by_sha1 = {}\n\n            for index, changeset in enumerate(changesets):\n                relevant_files = set([file.path for file in changeset.files])\n                files = {}\n\n                if not changeset.conflicts:\n                    commit_range = \"%s..%s\" % (mergebase, changeset.parent.sha1)\n                    relevant_lines = repository.run(\"log\", \"--name-only\", \"--full-history\", \"--format=sha1:%H\", commit_range, \"--\", *relevant_files).splitlines()\n\n                    for line in relevant_lines:\n                        if line.startswith(\"sha1:\"):\n                            sha1 = line[5:]\n                        elif line in 
relevant_files:\n                            if sha1 not in commits_by_sha1:\n                                commits_by_sha1[sha1] = gitutils.Commit.fromSHA1(db, repository, sha1)\n\n                            relevant_commit = commits_by_sha1[sha1]\n                            file_id = dbutils.find_file(db, path=line)\n                            values.append((commit.getId(db), index, file_id, relevant_commit.getId(db)))\n                            files.setdefault(file_id, []).append(relevant_commit)\n\n                relevant_commits.append(files)\n\n            cursor.executemany(\"INSERT INTO relevantcommits (commit, parent, file, relevant) VALUES (%s, %s, %s, %s)\", values)\n\n        if profiler: profiler.check(\"collecting relevant commits\")\n\n        if tabify:\n            target.script(type=\"text/javascript\").text(\"calculateTabWidth();\")\n\n        for index, changeset in enumerate(changesets):\n            parent = target.div(\"parent\", id=\"p%d\" % index)\n\n            options['support_expand'] = bool(changeset.conflicts)\n            options['file_id'] = lambda base: \"p%d%s\" % (index, base)\n            options['line_id'] = lambda base: \"p%d%s\" % (index, base)\n            options['line_cell_id'] = lambda base: base is not None and \"p%d%s\" % (index, base) or None\n            options['file_id_format'] = \"p%df%%d\" % index\n\n            relevant_commits_per_file = {}\n            for file in changeset.files:\n                relevant_commits_per_file[file.id] = []\n                for index1, changeset1 in enumerate(changesets):\n                    if index1 != index:\n                        relevant_commits_per_file[file.id].extend(relevant_commits[index1].get(file.id, []))\n\n            if changeset.conflicts: text = \"Merge conflict resolutions\"\n            else: text = \"Changes relative to %s parent\" % (\"first\", \"second\", \"third\", \"fourth\", \"fifth\", \"sixth\", \"seventh\", \"eighth\", \"ninth\")[index]\n\n            
parent.h1().text(text)\n\n            def renderRelevantCommits(db, target, file):\n                commits = relevant_commits_per_file.get(file.id)\n                if commits:\n                    def linkToCommit(commit, overrides={}):\n                        return \"%s/%s?file=%d\" % (commit.repository.name, commit.sha1, file.id)\n\n                    columns = [(70, log_html.SummaryColumn(linkToCommit=linkToCommit)),\n                               (30, log_html.AuthorColumn())]\n\n                    log_html.renderList(db, target, \"Relevant Commits\", commits, columns=columns, hide_merges=True, className=\"log relevant\")\n\n            options['content_after'] = renderRelevantCommits\n            options['parent_index'] = index\n            options['merge'] = True\n\n            for stop in changeset_html.render(db, parent, user, repository, changeset,\n                                              review, context_lines=context_lines,\n                                              options=options, wrap=wrap, parent_index=index):\n                yield stop\n\n        if profiler: profiler.check(\"render diff\")\n\n    if user.getPreference(db, \"ui.keyboardShortcuts\"):\n        page.utils.renderShortcuts(target, \"showcommit\",\n                                   review=review,\n                                   merge_parents=len(changesets),\n                                   squashed_diff=commits and len(commits) > 1)\n\nclass NoChangesFound(Exception):\n    pass\n\ndef commitRangeFromReview(db, user, review, filter_value, file_ids):\n    edges = cursor = db.cursor()\n\n    if filter_value == \"everything\":\n        cursor.execute(\"\"\"SELECT DISTINCT changesets.parent, changesets.child\n                            FROM changesets\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                           WHERE reviewchangesets.review=%s\"\"\",\n                       (review.id,))\n    elif 
filter_value == \"pending\":\n        cursor.execute(\"\"\"SELECT DISTINCT changesets.parent, changesets.child\n                            FROM changesets\n                            JOIN reviewfiles ON (reviewfiles.changeset=changesets.id)\n                            JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                           WHERE reviewfiles.review=%s\n                             AND reviewuserfiles.uid=%s\n                             AND reviewfiles.state='pending'\"\"\",\n                       (review.id, user.id))\n    elif filter_value == \"reviewable\":\n        cursor.execute(\"\"\"SELECT DISTINCT changesets.parent, changesets.child\n                            FROM changesets\n                            JOIN reviewfiles ON (reviewfiles.changeset=changesets.id)\n                            JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                           WHERE reviewfiles.review=%s\n                             AND reviewuserfiles.uid=%s\"\"\",\n                       (review.id, user.id))\n    elif filter_value == \"relevant\":\n        filters = review_filters.Filters()\n        filters.setFiles(db, review=review)\n        filters.load(db, review=review, user=user)\n\n        cursor.execute(\"\"\"SELECT DISTINCT changesets.parent, changesets.child, reviewfiles.file, reviewuserfiles.uid IS NOT NULL\n                            FROM changesets\n                            JOIN reviewfiles ON (reviewfiles.changeset=changesets.id)\n                 LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id\n                                                 AND reviewuserfiles.uid=%s)\n                           WHERE reviewfiles.review=%s\"\"\",\n                       (user.id, review.id))\n\n        edges = set()\n\n        for parent_id, child_id, file_id, is_reviewer in cursor:\n            if is_reviewer or filters.isRelevant(user, file_id):\n                edges.add((parent_id, 
child_id))\n    elif filter_value == \"files\":\n        assert len(file_ids) != 0\n\n        cursor.execute(\"\"\"SELECT DISTINCT changesets.parent, changesets.child\n                            FROM changesets\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                            JOIN fileversions ON (fileversions.changeset=changesets.id)\n                           WHERE reviewchangesets.review=%s\n                             AND fileversions.file=ANY (%s)\"\"\",\n                       (review.id, list(file_ids)))\n    else:\n        raise page.utils.InvalidParameterValue(\n            name=\"filter\",\n            value=filter_value,\n            expected=\"one of 'everything', 'pending', 'reviewable', 'relevant' and 'files'\")\n\n    listed_commits = set()\n    with_pending = set()\n\n    for parent_id, child_id in edges:\n        listed_commits.add(child_id)\n        with_pending.add((parent_id, child_id))\n\n    if len(listed_commits) == 1:\n        commit = gitutils.Commit.fromId(db, review.repository, child_id)\n        return commit.sha1, commit.sha1, list(listed_commits), listed_commits\n\n    if filter_value in (\"everything\", \"reviewable\", \"relevant\", \"files\"):\n        cursor.execute(\"SELECT child FROM changesets JOIN reviewchangesets ON (changeset=id) WHERE review=%s\", (review.id,))\n        all_commits = [gitutils.Commit.fromId(db, review.repository, commit_id) for (commit_id,) in cursor]\n\n        commitset = CommitSet(review.branch.getCommits(db))\n        tails = commitset.getFilteredTails(review.repository)\n\n        if len(commitset) == 0:\n            raise page.utils.DisplayMessage(\n                title=\"Empty review\",\n                body=(\"This review contains no commits.  
It is thus not \"\n                      \"meaningful to display a filtered view of the changes \"\n                      \"in it.\"),\n                review=review)\n        elif len(tails) > 1:\n            ancestor = review.repository.getCommonAncestor(tails)\n            paths = []\n\n            cursor.execute(\"SELECT DISTINCT file FROM reviewfiles WHERE review=%s\", (review.id,))\n            files_in_review = set(file_id for (file_id,) in cursor)\n\n            if filter_value == \"files\":\n                files_in_review &= file_ids\n\n            paths_in_review = set(dbutils.describe_file(db, file_id) for file_id in files_in_review)\n            paths_in_upstreams = set()\n\n            for tail in tails:\n                paths_in_upstream = set(review.repository.run(\"diff\", \"--name-only\", \"%s..%s\" % (ancestor, tail)).splitlines())\n                paths_in_upstreams |= paths_in_upstream\n\n                paths.append((tail, paths_in_upstream))\n\n            overlapping_changes = paths_in_review & paths_in_upstreams\n\n            if overlapping_changes:\n                candidates = []\n\n                for index1, data in enumerate(paths):\n                    for index2, (tail, paths_in_upstream) in enumerate(paths):\n                        if index1 != index2 and paths_in_upstream & paths_in_review:\n                            break\n                    else:\n                        candidates.append(data)\n            else:\n                candidates = paths\n\n            if not candidates:\n                paths.sort(cmp=lambda a, b: cmp(len(a[1]), len(b[1])))\n\n                url = \"/%s/%s..%s?file=%s\" % (review.repository.name, paths[0][0][:8], review.branch.head_sha1[:8], \",\".join(map(str, sorted(files_in_review))))\n\n                message = \"\"\"\\\n<p>It is not possible to generate a diff of the requested set of\ncommits that contains only changes from those commits.</p>\n\n<p>The following files would contain 
unrelated changes:</p>\n<pre style='padding-left: 2em'>%s</pre>\n\n<p>You can use the URL below if you want to view this diff anyway,\nincluding the unrelated changes.</p>\n<pre style='padding-left: 2em'><a href='%s'>%s%s</a></pre>\"\"\" % (\"\\n\".join(sorted(overlapping_changes)), url, dbutils.getURLPrefix(db, user), url)\n\n                raise page.utils.DisplayMessage(title=\"Impossible Diff\",\n                                                body=message,\n                                                review=review,\n                                                html=True)\n            else:\n                candidates.sort(cmp=lambda a, b: cmp(len(b[1]), len(a[1])))\n\n                return candidates[0][0], review.branch.head_sha1, all_commits, listed_commits\n\n        if tails:\n            from_sha1 = tails.pop()\n        else:\n            # Review starts with the initial commit.\n            from_sha1 = None\n\n        return from_sha1, review.branch.head_sha1, all_commits, listed_commits\n\n    if not with_pending:\n        raise NoChangesFound()\n\n    cursor.execute(\"\"\"SELECT parent, child\n                        FROM changesets\n                        JOIN reviewchangesets ON (id=changeset)\n                       WHERE review=%s\"\"\", (review.id,))\n\n    children = set()\n    parents = set()\n    edges = {}\n\n    for parent_id, child_id in cursor.fetchall():\n        children.add(child_id)\n        parents.add(parent_id)\n        edges.setdefault(child_id, set()).add(parent_id)\n\n    def isAncestorOf(ancestor_id, descendant_id):\n        ancestors = edges.get(descendant_id, set()).copy()\n        pending = ancestors.copy()\n        while pending and ancestor_id not in ancestors:\n            commit_id = pending.pop()\n            parents = edges.get(commit_id, set())\n            pending.update(parents - ancestors)\n            ancestors.update(parents)\n        return ancestor_id in ancestors\n\n    candidates = 
listed_commits.copy()\n    heads = set()\n    tails = set()\n\n    for candidate_id in listed_commits:\n        for other_id in candidates:\n            if other_id != candidate_id and isAncestorOf(candidate_id, other_id):\n                break\n        else:\n            heads.add(candidate_id)\n\n        for other_id in candidates:\n            if other_id != candidate_id and isAncestorOf(other_id, candidate_id):\n                break\n        else:\n            tails.add(candidate_id)\n\n    if len(heads) != 1 or len(tails) != 1:\n        raise page.utils.DisplayMessage(\"Filtered view not possible since it includes a merge commit.\")\n\n    head = gitutils.Commit.fromId(db, review.repository, heads.pop())\n    tail = gitutils.Commit.fromId(db, review.repository, tails.pop())\n\n    # If the tail commit has more than one parent, it is a merge commit.\n    if len(tail.parents) > 1:\n        raise page.utils.DisplayMessage(\"Filtered view not possible since it includes a merge commit.\")\n\n    if len(tail.parents) == 0:\n        tail = None\n    else:\n        tail = gitutils.Commit.fromSHA1(db, review.repository, tail.parents[0])\n\n    commits = getCommitList(db, review.repository, tail, head)\n\n    if not commits:\n        raise page.utils.DisplayMessage(\"Filtered view not possible since it includes a merge commit.\")\n\n    tail_to_return = tail.sha1 if tail is not None else None\n    return tail_to_return, head.sha1, commits, listed_commits\n\ndef getCommitList(db, repository,\n                  from_commit=None, to_commit=None,\n                  first_commit=None, last_commit=None):\n    # This function should be called with either from_commit/to_commit *or*\n    # first_commit/last_commit.  The other two should be None.  
When called\n    # for a range that starts with a root commit, from_commit will be None\n    # (and to_commit not.)\n    assert (from_commit is None) or (to_commit is not None)\n    assert (first_commit is None) == (last_commit is None)\n    assert (to_commit is None) != (last_commit is None)\n\n    if first_commit is not None:\n        sha1s = repository.revlist(\n            [last_commit], [first_commit], \"--ancestry-path\")\n        commits = [first_commit]\n        commits.extend(gitutils.Commit.fromSHA1(db, repository, sha1)\n                       for sha1 in sha1s)\n        return commits\n\n    commits = set()\n\n    class NotPossible(Exception): pass\n\n    def process(iter_commit):\n        while iter_commit != from_commit and iter_commit not in commits:\n            commits.add(iter_commit)\n\n            if len(iter_commit.parents) > 1:\n                try:\n                    mergebase = repository.mergebase(iter_commit)\n                    # Include the parents of the merge commit (and their\n                    # ancestors) if 'from_commit' is an ancestor of the merge\n                    # base (but isn't the merge base.)\n                    include_parents = (from_commit != mergebase and\n                                       from_commit.isAncestorOf(mergebase))\n                except gitutils.GitCommandError:\n                    raise NotPossible\n\n                if include_parents:\n                    map(process, [gitutils.Commit.fromSHA1(db, repository, sha1) for sha1 in iter_commit.parents])\n                    return\n                else:\n                    raise NotPossible\n            else:\n                if from_commit is None and len(iter_commit.parents) == 0:\n                    return\n\n                iter_commit = gitutils.Commit.fromSHA1(db, repository, iter_commit.parents[0])\n\n    if from_commit == to_commit:\n        return [to_commit]\n\n    try:\n        process(to_commit)\n        return list(commits)\n    
except NotPossible:\n        return []\n\ndef getApproximativeCommitList(db, repository, from_commit, to_commit, paths):\n    try:\n        ancestor = repository.getCommonAncestor([from_commit, to_commit])\n    except gitutils.GitCommandError:\n        return [], []\n\n    return ([gitutils.Commit.fromSHA1(db, repository, sha1)\n             for sha1 in repository.revlist([to_commit], [ancestor])],\n            [gitutils.Commit.fromSHA1(db, repository, sha1).getId(db)\n             for sha1 in repository.revlist([to_commit], [ancestor], paths=paths)])\n\ndef renderShowCommit(req, db, user):\n    profiler = profiling.Profiler()\n\n    files_arg = req.getParameter(\"file\", None)\n    if files_arg is None:\n        file_ids = None\n    else:\n        file_ids = set()\n        for file_arg in files_arg.split(\",\"):\n            try:\n                file_id = int(file_arg)\n            except ValueError:\n                try:\n                    file_id = dbutils.find_file(db, file_arg.strip(), insert=False)\n                except dbutils.InvalidPath:\n                    file_id = None\n            if file_id is not None:\n                file_ids.add(file_id)\n\n    review_id = req.getParameter(\"review\", None, filter=int)\n    review_filter = req.getParameter(\"filter\", None)\n    context = req.getParameter(\"context\", None, int)\n    style = req.getParameter(\"style\", \"horizontal\", str)\n    rescan = req.getParameter(\"rescan\", \"no\", str) == \"yes\"\n    reanalyze = req.getParameter(\"reanalyze\", None)\n    wrap = req.getParameter(\"wrap\", \"yes\", str) == \"yes\"\n    conflicts = req.getParameter(\"conflicts\", \"no\") == \"yes\"\n    moves = req.getParameter(\"moves\", \"no\") == \"yes\"\n    full = req.getParameter(\"full\", \"no\") == \"yes\"\n\n    default_tabify = \"yes\" if user.getPreference(db, \"commit.diff.visualTabs\") else \"no\"\n    tabify = req.getParameter(\"tabify\", default_tabify) == \"yes\"\n\n    if user.getPreference(db, 
\"commit.diff.compactMode\"): default_compact = \"yes\"\n    else: default_compact = \"no\"\n\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n\n    if moves:\n        move_source_file_ids = req.getParameter(\"sourcefiles\", None)\n        move_target_file_ids = req.getParameter(\"targetfiles\", None)\n\n        if move_source_file_ids:\n            move_source_file_ids = set(map(int, move_source_file_ids.split(\",\")))\n        if move_target_file_ids:\n            move_target_file_ids = set(map(int, move_target_file_ids.split(\",\")))\n\n    all_commits = None\n    listed_commits = None\n    first_sha1 = None\n    last_sha1 = None\n\n    repository = None\n\n    document = htmlutils.Document(req)\n    document.setBase(None)\n\n    if review_id is None:\n        review = None\n    else:\n        review = dbutils.Review.fromId(db, review_id)\n        if not review: raise page.utils.DisplayMessage(\"Invalid review ID: %d\" % review_id)\n        branch = review.branch\n        repository = review.repository\n\n    title = \"\"\n\n    if review:\n        title += \"[r/%d] \" % review.id\n\n        if review_filter == \"pending\":\n            title += \"Pending: \"\n        elif review_filter == \"reviewable\":\n            title += \"Reviewable: \"\n        elif review_filter == \"relevant\":\n            title += \"Relevant: \"\n        elif review_filter == \"everything\":\n            title += \"Everything: \"\n\n    if not repository:\n        parameter = req.getParameter(\"repository\", None)\n        if parameter:\n            repository = gitutils.Repository.fromParameter(db, parameter)\n            if not repository:\n                raise page.utils.DisplayMessage(\"'%s' is not a valid repository!\" % repository.name, review=review)\n\n    cursor = db.cursor()\n\n    def expand_sha1(sha1):\n        if review and re.match(\"^[0-9a-f]+$\", sha1):\n            cursor.execute(\"\"\"SELECT sha1\n                                FROM 
commits\n                                JOIN changesets ON (changesets.child=commits.id)\n                                JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                               WHERE reviewchangesets.review=%s\n                                 AND commits.sha1 LIKE %s\"\"\",\n                           (review.id, sha1 + \"%\"))\n            row = cursor.fetchone()\n            if row: return row[0]\n\n        if len(sha1) == 40: return sha1\n        else: return repository.revparse(sha1)\n\n    sha1 = req.getParameter(\"sha1\", None, filter=expand_sha1)\n\n    if sha1 is None:\n        from_sha1 = req.getParameter(\"from\", None, filter=expand_sha1)\n        to_sha1 = req.getParameter(\"to\", None, filter=expand_sha1)\n\n        if (from_sha1 is None) != (to_sha1 is None):\n            raise page.utils.DisplayMessage(\"invalid parameters; one of 'from'/'to' specified but not both\")\n\n        if from_sha1 is None:\n            first_sha1 = req.getParameter(\"first\", None, filter=expand_sha1)\n            last_sha1 = req.getParameter(\"last\", None, filter=expand_sha1)\n\n            if (first_sha1 is None) != (last_sha1 is None):\n                raise page.utils.DisplayMessage(\"invalid parameters; one of 'first'/'last' specified but not both\")\n\n            if first_sha1 is None:\n                if review_id and review_filter:\n                    try:\n                        from_sha1, to_sha1, all_commits, listed_commits = commitRangeFromReview(db, user, review, review_filter, file_ids)\n                    except NoChangesFound:\n                        if review_filter == \"pending\":\n                            raise page.utils.DisplayMessage(\"Your work here is done!\", None, review)\n                        else:\n                            assert review_filter != \"files\"\n                            raise page.utils.DisplayMessage(\"No %s changes found.\" % review_filter, None, review)\n             
       if from_sha1 == to_sha1:\n                        sha1 = to_sha1\n                        to_sha1 = None\n                else:\n                    raise page.utils.DisplayMessage(\"invalid parameters; need 'sha1', 'from'/'to' or 'first'/'last'\")\n    else:\n        from_sha1 = None\n        to_sha1 = None\n\n    if context is None: context = user.getPreference(db, \"commit.diff.contextLines\")\n\n    one_sha1 = filter(None, (sha1, from_sha1, to_sha1, first_sha1, last_sha1))[0]\n\n    if repository:\n        if not repository.iscommit(one_sha1):\n            raise page.utils.DisplayMessage(\"'%s' is not a valid commit in the repository '%s'!\" % (one_sha1, repository.name), review=review)\n    else:\n        repository = user.getDefaultRepository(db)\n        if repository and not repository.iscommit(one_sha1):\n            repository = None\n        if not repository:\n            repository = gitutils.Repository.fromSHA1(db, one_sha1)\n\n    if first_sha1 is not None:\n        try:\n            first_commit = gitutils.Commit.fromSHA1(db, repository, first_sha1)\n        except gitutils.GitReferenceError as error:\n            raise page.utils.DisplayMessage(\"Invalid SHA-1\", \"%s is not a commit in %s\" % (error.sha1, repository.path))\n\n        if len(first_commit.parents) > 1:\n            raise page.utils.DisplayMessage(\"Invalid parameters; 'first' can not be a merge commit.\", review=review)\n\n        from_sha1 = first_commit.parents[0] if first_commit.parents else None\n        to_sha1 = last_sha1\n\n    try:\n        commit = gitutils.Commit.fromSHA1(db, repository, sha1) if sha1 else None\n        from_commit = gitutils.Commit.fromSHA1(db, repository, from_sha1) if from_sha1 else None\n        to_commit = gitutils.Commit.fromSHA1(db, repository, to_sha1) if to_sha1 else None\n    except gitutils.GitReferenceError as error:\n        raise page.utils.DisplayMessage(\"Invalid SHA-1\", \"%s is not a commit in %s\" % (error.sha1, 
repository.path))\n\n    if commit:\n        title += \"%s (%s)\" % (commit.niceSummary(), commit.describe(db))\n    elif from_commit:\n        title += \"%s..%s\" % (from_commit.describe(db), to_commit.describe(db))\n    else:\n        title += \"..%s\" % to_commit.describe(db)\n\n    document.setTitle(title)\n\n    if review_filter == \"pending\":\n        document.setLink(\"next\", \"javascript:submitChanges();\")\n\n    commits = None\n    rebases = None\n\n    profiler.check(\"prologue\")\n\n    if to_commit:\n        changesets = changeset_utils.createChangeset(db, user, repository, from_commit=from_commit, to_commit=to_commit, conflicts=conflicts, rescan=rescan, reanalyze=reanalyze, filtered_file_ids=file_ids)\n        assert len(changesets) == 1\n\n        if not conflicts:\n            if review and (review_filter in (\"everything\", \"reviewable\", \"relevant\")\n                           or (review_filter == \"files\" and all_commits)):\n                # We're displaying the full changes in the review (possibly\n                # filtered by file) => include rebase information when rendering\n                # the \"Squashed History\" log.\n\n                cursor.execute(\"\"\"SELECT id, old_head, new_head, new_upstream, equivalent_merge, replayed_rebase, uid, branch\n                                    FROM reviewrebases\n                                   WHERE review=%s AND new_head IS NOT NULL\"\"\",\n                               (review.id,))\n\n                rebases = [(rebase_id,\n                            gitutils.Commit.fromId(db, repository, old_head),\n                            gitutils.Commit.fromId(db, repository, new_head),\n                            dbutils.User.fromId(db, user_id),\n                            gitutils.Commit.fromId(db, repository, new_upstream) if new_upstream is not None else None,\n                            gitutils.Commit.fromId(db, repository, equivalent_merge) if equivalent_merge is not None else 
None,\n                            gitutils.Commit.fromId(db, repository, replayed_rebase) if replayed_rebase is not None else None,\n                            branch_name)\n                           for rebase_id, old_head, new_head, new_upstream, equivalent_merge, replayed_rebase, user_id, branch_name in cursor]\n\n            if all_commits:\n                commits = all_commits\n            else:\n                if first_sha1:\n                    commits = getCommitList(\n                        db, repository, first_commit=first_commit, last_commit=to_commit)\n                else:\n                    commits = getCommitList(\n                        db, repository, from_commit=from_commit, to_commit=to_commit)\n\n                if not commits and not review:\n                    paths = [changed_file.path for changed_file in changesets[0].files]\n                    commits, listed_commits = getApproximativeCommitList(db, repository, from_commit, to_commit, paths)\n            if commits:\n                changesets[0].setCommits(commits)\n    else:\n        if len(commit.parents) > 1:\n            if review:\n                cursor.execute(\"SELECT COUNT(changeset) FROM reviewchangesets JOIN changesets ON (changeset=id) WHERE review=%s AND child=%s\", (review.id, commit.getId(db)))\n                if cursor.fetchone()[0] > len(commit.parents):\n                    full = True\n        else:\n            full = False\n\n        if full:\n            changesets = changeset_utils.createFullMergeChangeset(db, user, repository, commit, review=review)\n            commits = [commit]\n        else:\n            changesets = changeset_utils.createChangeset(db, user, repository, commit=commit, rescan=rescan, reanalyze=reanalyze, conflicts=conflicts, filtered_file_ids=file_ids, review=review)\n            commits = [commit]\n\n    profiler.check(\"create changeset\")\n\n    if review and commits:\n        all_files = set()\n        pending_files = set()\n     
   reviewable_files = set()\n\n        cursor.execute(\"\"\"SELECT reviewfiles.file, reviewfiles.state, reviewuserfiles.uid IS NOT NULL\n                            FROM commits\n                            JOIN changesets ON (changesets.child=commits.id)\n                            JOIN reviewfiles ON (reviewfiles.changeset=changesets.id)\n                 LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id AND reviewuserfiles.uid=%s)\n                           WHERE commits.sha1=ANY (%s)\n                             AND reviewfiles.review=%s\"\"\",\n                       (user.id, [commit.sha1 for commit in commits], review.id))\n\n        for file_id, current_state, is_reviewer in cursor:\n            all_files.add(file_id)\n            if is_reviewer:\n                if current_state == 'pending':\n                    pending_files.add(file_id)\n                reviewable_files.add(file_id)\n\n        profiler.check(\"reviewfiles query\")\n\n        for changeset in changesets:\n            all_files_local = all_files.copy()\n\n            for file in changeset.files:\n                if file.id in all_files_local:\n                    all_files_local.remove(file.id)\n\n            for file_id in all_files_local:\n                if not file_ids or file_id in file_ids:\n                    changeset.files.append(diff.File(file_id, dbutils.describe_file(db, file_id), None, None, repository))\n\n            if review_filter == \"pending\":\n                def isPending(file): return file.id in pending_files\n                changeset.files = filter(isPending, changeset.files)\n\n            elif review_filter == \"reviewable\":\n                def isReviewable(file): return file.id in reviewable_files\n                changeset.files = filter(isReviewable, changeset.files)\n\n            elif review_filter == \"relevant\":\n                filters = review_filters.Filters()\n                filters.setFiles(db, review=review)\n          
      filters.load(db, review=review, user=user)\n\n                def isRelevant(file):\n                    if file.id in reviewable_files: return True\n                    elif filters.isRelevant(user, file): return True\n                    else: return False\n\n                changeset.files = filter(isRelevant, changeset.files)\n\n            elif review_filter == \"files\":\n                def isFiltered(file): return file.id in file_ids\n                changeset.files = filter(isFiltered, changeset.files)\n\n        profiler.check(\"review filtering\")\n\n    if moves:\n        if len(changesets) != 1:\n            raise page.utils.DisplayMessage(\"Can't detect moves in a merge commit!\", review=review)\n\n        move_changeset = changeset_detectmoves.detectMoves(db, changesets[0], move_source_file_ids, move_target_file_ids)\n\n        if not move_changeset:\n            raise page.utils.DisplayMessage(\"No moved code found!\", review=review)\n\n        changesets = [move_changeset]\n\n        profiler.check(\"moves detection\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    if review:\n        def generateButtons(target):\n            review_utils.renderDraftItems(db, user, review, target)\n            buttons = target.div(\"buttons\")\n            if user.getPreference(db, \"debug.extensions.customProcessCommits\"):\n                buttons.button(onclick='customProcessCommits();').text(\"Process Commits\")\n            buttons.span(\"buttonscope buttonscope-global\")\n        page.utils.generateHeader(body, db, user, generateButtons, extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n    else:\n        def generateButtons(target):\n            buttons = target.div(\"buttons\")\n            if not user.isAnonymous() and (commit or commits):\n                buttons.button(onclick='createReview();').text('Create Review')\n            buttons.span(\"buttonscope buttonscope-global\")\n        
page.utils.generateHeader(body, db, user, generateButtons)\n\n    log_html.addResources(document)\n    changeset_html.addResources(db, user, repository, review, compact, tabify, document)\n\n    document.addInternalScript(user.getJS(db))\n    document.addInternalScript(repository.getJS())\n    document.addInternalScript(\"var keyboardShortcuts = %s;\" % (user.getPreference(db, \"ui.keyboardShortcuts\") and \"true\" or \"false\"))\n\n    for stop in render(db, body, user, repository, review, changesets, commits, listed_commits, context_lines=context, conflicts=conflicts, moves=moves, compact=compact, wrap=wrap, tabify=tabify, profiler=profiler, rebases=rebases):\n        yield document.render(stop=stop, pretty=not compact)\n\n    profiler.check(\"rendering\")\n    profiler.output(db, user, document)\n\n    db.commit()\n\n    yield document.render(pretty=not compact)\n"
  },
  {
    "path": "src/page/showfile.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport urllib\n\nimport dbutils\nimport gitutils\nimport page.utils\nimport htmlutils\nimport textutils\nimport diff\nimport reviewing.utils as review_utils\nimport reviewing.comment as review_comment\n\nfrom syntaxhighlight.request import requestHighlights\n\ndef renderShowFile(req, db, user):\n    cursor = db.cursor()\n\n    sha1 = req.getParameter(\"sha1\")\n    path = req.getParameter(\"path\")\n    line = req.getParameter(\"line\", None)\n    review_id = req.getParameter(\"review\", None, filter=int)\n\n    default_tabify = \"yes\" if user.getPreference(db, \"commit.diff.visualTabs\") else \"no\"\n    tabify = req.getParameter(\"tabify\", default_tabify) == \"yes\"\n\n    if line is None:\n        first, last = None, None\n    else:\n        if \"-\" in line:\n            first, last = map(int, line.split(\"-\"))\n        else:\n            first = last = int(line)\n\n        context = req.getParameter(\"context\", user.getPreference(db, \"commit.diff.contextLines\"), int)\n\n        first_with_context = max(1, first - context)\n        last_with_context = last + context\n\n    if user.getPreference(db, \"commit.diff.compactMode\"): default_compact = \"yes\"\n    else: default_compact = \"no\"\n\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n\n    if 
len(path) == 0 or path[-1:] == \"/\":\n        raise page.utils.DisplayMessage(\n            title=\"Invalid path parameter\",\n            body=\"<p>The path must be non-empty and must not end with a <code>/</code>.</p>\",\n            html=True)\n    if path[0] == '/':\n        full_path = path\n        if path != \"/\": path = path[1:]\n    else:\n        full_path = \"/\" + path\n        if not path: path = \"/\"\n\n    if review_id is None:\n        review = None\n        repository_arg = req.getParameter(\"repository\", \"\")\n        if repository_arg:\n            repository = gitutils.Repository.fromParameter(db, repository_arg)\n        else:\n            repository = gitutils.Repository.fromSHA1(db, sha1)\n    else:\n        review = dbutils.Review.fromId(db, review_id)\n        repository = review.repository\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    if review:\n        page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n    else:\n        page.utils.generateHeader(body, db, user)\n\n    document.addExternalStylesheet(\"resource/showfile.css\")\n    document.addInternalStylesheet(htmlutils.stripStylesheet(user.getResource(db, \"syntax.css\")[1], compact))\n\n    commit = gitutils.Commit.fromSHA1(db, repository, sha1)\n    file_sha1 = commit.getFileSHA1(full_path)\n    file_id = dbutils.find_file(db, path=path)\n\n    if file_sha1 is None:\n        raise page.utils.DisplayMessage(\n            title=\"File does not exist\",\n            body=(\"<p>There is no file named <code>%s</code> in the commit \"\n                  \"<a href='/showcommit?repository=%s&amp;sha1=%s'>\"\n                  \"<code>%s</code></a>.</p>\"\n                  % (htmlutils.htmlify(textutils.escape(full_path)),\n                     htmlutils.htmlify(repository.name),\n       
              htmlutils.htmlify(sha1), htmlutils.htmlify(sha1[:8]))),\n            html=True)\n\n    file = diff.File(file_id, path, None, file_sha1, repository)\n\n    # A new file ID might have been added to the database, so need to commit.\n    db.commit()\n\n    if file.canHighlight():\n        requestHighlights(repository, { file.new_sha1: (file.path, file.getLanguage()) }, \"legacy\")\n\n    file.loadNewLines(True, request_highlight=True)\n\n    if review:\n        document.addInternalScript(user.getJS())\n        document.addInternalScript(review.getJS())\n        document.addInternalScript(\"var changeset = { parent: { id: %(id)d, sha1: %(sha1)r }, child: { id: %(id)d, sha1: %(sha1)r } };\" % { 'id': commit.getId(db), 'sha1': commit.sha1 })\n        document.addInternalScript(\"var files = { %(id)d: { new_sha1: %(sha1)r }, %(sha1)r: { id: %(id)d, side: 'n' } };\" % { 'id': file_id, 'sha1': file_sha1 })\n        document.addExternalStylesheet(\"resource/review.css\")\n        document.addExternalScript(\"resource/review.js\")\n\n        cursor.execute(\"\"\"SELECT DISTINCT commentchains.id\n                            FROM commentchains\n                            JOIN commentchainlines ON (commentchainlines.chain=commentchains.id)\n                           WHERE commentchains.review=%s\n                             AND commentchains.file=%s\n                             AND commentchainlines.sha1=%s\n                             AND ((commentchains.state!='draft' OR commentchains.uid=%s)\n                              AND commentchains.state!='empty')\n                        GROUP BY commentchains.id\"\"\",\n                       (review.id, file_id, file_sha1, user.id))\n\n        comment_chain_script = \"\"\n\n        for (chain_id,) in cursor.fetchall():\n            chain = review_comment.CommentChain.fromId(db, chain_id, user, review=review)\n            chain.loadComments(db, user)\n\n            comment_chain_script += 
\"commentChains.push(%s);\\n\" % chain.getJSConstructor(file_sha1)\n\n        if comment_chain_script:\n            document.addInternalScript(comment_chain_script)\n\n    document.addExternalStylesheet(\"resource/comment.css\")\n    document.addExternalScript(\"resource/comment.js\")\n    document.addExternalScript(\"resource/showfile.js\")\n\n    if tabify:\n        document.addExternalStylesheet(\"resource/tabify.css\")\n        document.addExternalScript(\"resource/tabify.js\")\n        tabwidth = file.getTabWidth()\n        indenttabsmode = file.getIndentTabsMode()\n\n    if user.getPreference(db, \"commit.diff.highlightIllegalWhitespace\"):\n        document.addInternalStylesheet(user.getResource(db, \"whitespace.css\")[1], compact)\n\n    if first is not None:\n        document.addInternalScript(\"var firstSelectedLine = %d, lastSelectedLine = %d;\" % (first, last))\n\n    target = body.div(\"main\")\n\n    if tabify:\n        target.script(type=\"text/javascript\").text(\"calculateTabWidth();\")\n\n    table = target.table('file show expanded paleyellow', align='center', cellspacing=0)\n\n    columns = table.colgroup()\n    columns.col('edge')\n    columns.col('linenr')\n    columns.col('line')\n    columns.col('middle')\n    columns.col('middle')\n    columns.col('line')\n    columns.col('linenr')\n    columns.col('edge')\n\n    thead = table.thead()\n    cell = thead.tr().td('h1', colspan=8)\n    h1 = cell.h1()\n\n    def make_url(url_path, path):\n        params = { \"sha1\": sha1,\n                   \"path\": path }\n        if review is None:\n            params[\"repository\"] = str(repository.id)\n        else:\n            params[\"review\"] = str(review.id)\n        return \"%s?%s\" % (url_path, urllib.urlencode(params))\n\n    h1.a(\"root\", href=make_url(\"showtree\", \"/\")).text(\"root\")\n    h1.span().text('/')\n\n    components = path.split(\"/\")\n    for index, component in enumerate(components[:-1]):\n        
h1.a(href=make_url(\"showtree\", \"/\".join(components[:index + 1]))).text(component, escape=True)\n        h1.span().text('/')\n\n    if first is not None:\n        h1.a(href=make_url(\"showfile\", \"/\".join(components))).text(components[-1], escape=True)\n    else:\n        h1.text(components[-1], escape=True)\n\n    h1.span(\"right\").a(href=(\"/download/%s?repository=%s&sha1=%s\"\n                             % (urllib.quote(path), repository.name, file_sha1)),\n                       download=urllib.quote(path)).text(\"[download]\")\n    h1.span(\"right\").a(href=(\"/download/%s?repository=%s&sha1=%s\"\n                             % (urllib.quote(path), repository.name, file_sha1))).text(\"[view]\")\n\n    table.tbody('spacer top').tr('spacer top').td(colspan=8).text()\n\n    tbody = table.tbody(\"lines\")\n\n    yield document.render(stop=tbody, pretty=not compact)\n\n    for linenr, line in enumerate(file.newLines(True)):\n        linenr = linenr + 1\n        highlight_class = \"\"\n\n        if first is not None:\n            if not (first_with_context <= linenr <= last_with_context): continue\n            if linenr == first:\n                highlight_class += \" first-selected\"\n            if linenr == last:\n                highlight_class += \" last-selected\"\n\n        if tabify:\n            line = htmlutils.tabify(line, tabwidth, indenttabsmode)\n\n        line = line.replace(\"\\r\", \"<i class='cr'></i>\")\n\n        row = tbody.tr(\"line context single\", id=\"f%do%dn%d\" % (file.id, linenr, linenr))\n        row.td(\"edge\").text()\n        row.td(\"linenr old\").text(linenr)\n        row.td(\"line single whole%s\" % highlight_class, id=\"f%dn%d\" % (file.id, linenr), colspan=4).innerHTML(line)\n        row.td(\"linenr new\").text(linenr)\n        row.td(\"edge\").text()\n\n        # Flush a partial render every 500 lines to stream output incrementally.\n        if linenr % 500 == 0:\n            yield document.render(stop=tbody, pretty=not compact)\n\n    table.tbody('spacer bottom').tr('spacer bottom').td(colspan=8).text()\n\n    yield document.render(pretty=not compact)\n"
  },
  {
    "path": "src/page/showfilters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2015 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport request\nimport reviewing.filters\nimport page.utils\n\ndef renderShowFilters(req, db, user):\n    path = req.getParameter(\"path\", \"/\")\n    repo_name = req.getParameter(\n        \"repository\", user.getPreference(db, \"defaultRepository\"))\n    repository = gitutils.Repository.fromParameter(db, repo_name)\n\n    show_path = path\n\n    if path.endswith(\"/\") or repository.getHead(db).isDirectory(path):\n        path = path.rstrip(\"/\") + \"/dummy.txt\"\n\n    file_id = dbutils.find_file(db, path=path)\n\n    filters = reviewing.filters.Filters()\n    filters.setFiles(db, [file_id])\n    filters.load(db, repository=repository, recursive=True)\n\n    reviewers = []\n    watchers = []\n\n    for user_id, (filter_type, _delegate) in filters.listUsers(file_id).items():\n        if filter_type == 'reviewer': reviewers.append(user_id)\n        else: watchers.append(user_id)\n\n    result = \"Path: %s\\n\" % show_path\n\n    reviewers_found = False\n    watchers_found = False\n\n    for reviewer_id in sorted(reviewers):\n        if not reviewers_found:\n            result += \"\\nReviewers:\\n\"\n            reviewers_found = True\n\n        reviewer = dbutils.User.fromId(db, reviewer_id)\n        result += \"  %s <%s>\\n\" % (reviewer.fullname, 
reviewer.email)\n\n    for watcher_id in sorted(watchers):\n        if not watchers_found:\n            result += \"\\nWatchers:\\n\"\n            watchers_found = True\n\n        watcher = dbutils.User.fromId(db, watcher_id)\n        result += \"  %s <%s>\\n\" % (watcher.fullname, watcher.email)\n\n    if not reviewers_found and not watchers_found:\n        result += \"\\nNo matching filters found.\\n\"\n\n    return page.utils.ResponseBody(result, content_type=\"text/plain\")\n"
  },
  {
    "path": "src/page/showreview.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport datetime\nimport calendar\nimport traceback\n\nimport dbutils\nimport gitutils\nimport htmlutils\nimport page.utils\nimport log.html\nimport reviewing.utils as review_utils\nimport reviewing.html as review_html\nimport reviewing.comment as review_comment\nimport configuration\nimport diff\nimport profiling\nimport linkify\nimport textutils\n\ntry:\n    from customization.paths import getModuleFromFile\nexcept:\n    def getModuleFromFile(repository, filename):\n        try:\n            base, rest = filename.split(\"/\", 1)\n            return base + \"/\"\n        except:\n            return None\n\nclass SummaryColumn(log.html.SummaryColumn):\n    def __init__(self, review):\n        log.html.SummaryColumn.__init__(self)\n        self.__review = review\n        self.__cache = {}\n\n    def fillCache(self, db, review):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT DISTINCT assignee, child\n                            FROM fullreviewuserfiles\n                            JOIN changesets ON (changesets.id=changeset)\n                           WHERE review=%s\n                             AND state='pending'\"\"\",\n                       (review.id,))\n        for user_id, commit_id in cursor:\n            self.__cache.setdefault(commit_id, set()).add(user_id)\n\n    def 
render(self, db, commit, target, overrides={}):\n        user_ids = self.__cache.get(commit.getId(db))\n        if user_ids:\n            users = [\"%s:%s\" % (user.fullname, user.status) for user in dbutils.User.fromIds(db, [user_id for user_id in user_ids])]\n            target.setAttribute(\"critic-reviewers\", \",\".join(sorted(users)))\n        log.html.SummaryColumn.render(self, db, commit, target, overrides=overrides)\n\nclass ApprovalColumn:\n    APPROVED = 1\n    TOTAL = 2\n\n    def __init__(self, user, review, type, cache):\n        self.__user = user\n        self.__review = review\n        self.__type = type\n        self.__cache = cache\n\n    @staticmethod\n    def fillCache(db, user, review, cache, profiler):\n        cursor = db.cursor()\n\n        profiler.check(\"fillCache\")\n\n        cursor.execute(\"\"\"SELECT child, state, COUNT(*), SUM(deleted), SUM(inserted)\n                            FROM changesets\n                            JOIN reviewfiles ON (changeset=changesets.id)\n                           WHERE review=%s\n                        GROUP BY child, state\"\"\",\n                       (review.id,))\n\n        for commit_id, state, nfiles, deleted, inserted in cursor:\n            data = cache.get(commit_id)\n            if not data: data = cache[commit_id] = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]\n            if state == 'reviewed':\n                data[3] += nfiles\n                data[4] += deleted\n                data[5] += inserted\n            data[0] += nfiles\n            data[1] += deleted\n            data[2] += inserted\n\n        profiler.check(\"fillCache: total\")\n\n        cursor.execute(\"\"\"SELECT child, COALESCE(reviewfilechanges.to_state, reviewfiles.state) AS effective_state, COUNT(*), SUM(deleted), SUM(inserted)\n                            FROM changesets\n                            JOIN reviewfiles ON (changeset=changesets.id)\n                            JOIN reviewuserfiles ON 
(reviewuserfiles.file=reviewfiles.id)\n                 LEFT OUTER JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id\n                                                   AND reviewfilechanges.uid=reviewuserfiles.uid\n                                                   AND reviewfilechanges.state='draft')\n                           WHERE review=%s\n                             AND reviewuserfiles.uid=%s\n                        GROUP BY child, effective_state\"\"\",\n                       (review.id, user.id))\n\n        for commit_id, state, nfiles, deleted, inserted in cursor:\n            data = cache.get(commit_id)\n            if state == 'reviewed':\n                data[9] += nfiles\n                data[10] += deleted\n                data[11] += inserted\n            data[6] += nfiles\n            data[7] += deleted\n            data[8] += inserted\n\n        profiler.check(\"fillCache: user\")\n\n    def __calculate(self, db, commit):\n        return self.__cache.get(commit.id, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])\n\n    def className(self, db, commit):\n        if commit:\n            (total_nfiles, total_deleted, total_inserted,\n             approved_nfiles, approved_deleted, approved_inserted,\n             user_total_nfiles, user_total_deleted, user_total_inserted,\n             user_approved_nfiles, user_approved_deleted, user_approved_inserted) = self.__calculate(db, commit)\n\n            if user_approved_nfiles == user_total_nfiles:\n                category = \"\"\n            else:\n                category = \" user\"\n        else:\n            category = \"\"\n\n        if self.__type == ApprovalColumn.APPROVED:\n            return \"approval\" + category\n        else:\n            return \"total\" + category\n\n    def heading(self, target):\n        if self.__type == ApprovalColumn.APPROVED:\n            target.text(\"Pending\")\n        else:\n            target.text(\"Total\")\n\n    def render(self, db, commit, target, 
overrides={}):\n        (total_nfiles, total_deleted, total_inserted,\n         approved_nfiles, approved_deleted, approved_inserted,\n         user_total_nfiles, user_total_deleted, user_total_inserted,\n         user_approved_nfiles, user_approved_deleted, user_approved_inserted) = self.__calculate(db, commit)\n\n        if self.__type == ApprovalColumn.APPROVED:\n            if user_approved_nfiles == user_total_nfiles:\n                if approved_nfiles == total_nfiles:\n                    target.text()\n                elif approved_deleted == total_deleted and approved_inserted == total_inserted:\n                    target.span().text(\"?? %\")\n                else:\n                    target.span().text(\"%d %%\" % int(100.0 * ((total_deleted + total_inserted) - (approved_deleted + approved_inserted)) / (total_deleted + total_inserted)))\n            elif user_approved_deleted == user_total_deleted and user_approved_inserted == user_total_inserted:\n                target.span().text(\"?? 
%\")\n            else:\n                target.span().text(\"%d %%\" % int(100.0 * ((user_total_deleted + user_total_inserted) - (user_approved_deleted + user_approved_inserted)) / (user_total_deleted + user_total_inserted)))\n        else:\n            if user_approved_deleted == user_total_deleted and user_approved_inserted == user_total_inserted:\n                target.span().text(\"-%d/+%d\" % (total_deleted, total_inserted))\n            else:\n                target.span().text(\"-%d/+%d\" % (user_total_deleted, user_total_inserted))\n\ndef notModified(req, db, user, review):\n    value = req.getRequestHeader(\"If-None-Match\")\n    return review.getETag(db, user) == value\n\ndef renderShowReview(req, db, user):\n    profiler = profiling.Profiler()\n\n    cursor = db.cursor()\n\n    if user.getPreference(db, \"commit.diff.compactMode\"): default_compact = \"yes\"\n    else: default_compact = \"no\"\n\n    compact = req.getParameter(\"compact\", default_compact) == \"yes\"\n    highlight = req.getParameter(\"highlight\", None)\n\n    review_id = req.getParameter(\"id\", filter=int)\n    review = dbutils.Review.fromId(db, review_id, profiler=profiler)\n\n    profiler.check(\"create review\")\n\n    if not review:\n        raise page.utils.DisplayMessage(\"Invalid Review ID\", \"%d is not a valid review ID.\" % review_id)\n\n    if review.getETag(db, user) == req.getRequestHeader(\"If-None-Match\"):\n        raise page.utils.NotModified\n\n    profiler.check(\"ETag\")\n\n    repository = review.repository\n\n    prefetch_commits = {}\n\n    cursor.execute(\"\"\"SELECT DISTINCT sha1, child\n                        FROM changesets\n                        JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                        JOIN commits ON (commits.id=changesets.child)\n                       WHERE review=%s\"\"\",\n                   (review.id,))\n\n    prefetch_commits.update(dict(cursor))\n\n    profiler.check(\"commits (query)\")\n\n  
  cursor.execute(\"\"\"SELECT old_head, old_head_commit.sha1,\n                             new_head, new_head_commit.sha1,\n                             new_upstream, new_upstream_commit.sha1,\n                             equivalent_merge, equivalent_merge_commit.sha1,\n                             replayed_rebase, replayed_rebase_commit.sha1\n                        FROM reviewrebases\n             LEFT OUTER JOIN commits AS old_head_commit ON (old_head_commit.id=old_head)\n             LEFT OUTER JOIN commits AS new_head_commit ON (new_head_commit.id=new_head)\n             LEFT OUTER JOIN commits AS new_upstream_commit ON (new_upstream_commit.id=new_upstream)\n             LEFT OUTER JOIN commits AS equivalent_merge_commit ON (equivalent_merge_commit.id=equivalent_merge)\n             LEFT OUTER JOIN commits AS replayed_rebase_commit ON (replayed_rebase_commit.id=replayed_rebase)\n                       WHERE review=%s\"\"\",\n                   (review.id,))\n\n    rebases = cursor.fetchall()\n\n    if rebases:\n        has_finished_rebases = False\n\n        for (old_head_id, old_head_sha1,\n             new_head_id, new_head_sha1,\n             new_upstream_id, new_upstream_sha1,\n             equivalent_merge_id, equivalent_merge_sha1,\n             replayed_rebase_id, replayed_rebase_sha1) in rebases:\n            if old_head_id:\n                prefetch_commits[old_head_sha1] = old_head_id\n            if new_head_id:\n                prefetch_commits[new_head_sha1] = new_head_id\n                has_finished_rebases = True\n            if new_upstream_id:\n                prefetch_commits[new_upstream_sha1] = new_upstream_id\n            if equivalent_merge_id:\n                prefetch_commits[equivalent_merge_sha1] = equivalent_merge_id\n            if replayed_rebase_id:\n                prefetch_commits[replayed_rebase_sha1] = replayed_rebase_id\n\n        profiler.check(\"auxiliary commits (query)\")\n\n        if has_finished_rebases:\n           
 cursor.execute(\"\"\"SELECT commits.sha1, commits.id\n                                FROM commits\n                                JOIN reachable ON (reachable.commit=commits.id)\n                               WHERE branch=%s\"\"\",\n                           (review.branch.id,))\n\n            prefetch_commits.update(dict(cursor))\n\n            profiler.check(\"actual commits (query)\")\n\n    prefetch_commits = gitutils.FetchCommits(repository, prefetch_commits)\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body(onunload=\"void(0);\")\n\n    def flush(target=None):\n        return document.render(stop=target, pretty=not compact)\n\n    def renderHeaderItems(target):\n        has_draft_items = review_utils.renderDraftItems(db, user, review, target)\n\n        target = target.div(\"buttons\")\n\n        if not has_draft_items:\n            if review.state == \"open\":\n                if review.accepted(db):\n                    target.button(id=\"closeReview\", onclick=\"closeReview();\").text(\"Close Review\")\n                else:\n                    if user in review.owners or user.getPreference(db, \"review.pingAnyReview\"):\n                        target.button(id=\"pingReview\", onclick=\"pingReview();\").text(\"Ping Review\")\n                    if user in review.owners or user.getPreference(db, \"review.dropAnyReview\"):\n                        target.button(id=\"dropReview\", onclick=\"dropReview();\").text(\"Drop Review\")\n\n                if user in review.owners and not review.description:\n                    target.button(id=\"writeDescription\", onclick=\"editDescription();\").text(\"Write Description\")\n            else:\n                target.button(id=\"reopenReview\", onclick=\"reopenReview();\").text(\"Reopen Review\")\n\n        target.span(\"buttonscope buttonscope-global\")\n\n    profiler.check(\"prologue\")\n\n    page.utils.generateHeader(body, db, user, 
renderHeaderItems, profiler=profiler)\n\n    cursor.execute(\"SELECT 1 FROM fullreviewuserfiles WHERE review=%s AND state='pending' AND assignee=%s\", (review.id, user.id))\n    hasPendingChanges = bool(cursor.fetchone())\n\n    if hasPendingChanges:\n        head.setLink(\"next\", \"showcommit?review=%d&filter=pending\" % review.id)\n\n    profiler.check(\"header\")\n\n    document.addExternalStylesheet(\"resource/showreview.css\")\n    document.addExternalStylesheet(\"resource/review.css\")\n    document.addExternalStylesheet(\"resource/comment.css\")\n    document.addExternalScript(\"resource/showreview.js\")\n    document.addExternalScript(\"resource/review.js\")\n    document.addExternalScript(\"resource/comment.js\")\n    document.addExternalScript(\"resource/reviewfilters.js\")\n    document.addExternalScript(\"resource/autocomplete.js\")\n    document.addInternalScript(user.getJS())\n    document.addInternalScript(\"var isReviewFrontpage = true;\")\n    document.addInternalScript(\"var owners = [ %s ];\" % \", \".join(owner.getJSConstructor() for owner in review.owners))\n    document.addInternalScript(\"var updateCheckInterval = %d;\" % user.getPreference(db, \"review.updateCheckInterval\"))\n\n    log.html.addResources(document)\n\n    document.addInternalScript(review.getJS())\n\n    target = body.div(\"main\")\n\n    basic = target.table('paleyellow basic', align='center')\n    basic.col(width='10%')\n    basic.col(width='60%')\n    basic.col(width='30%')\n    h1 = basic.tr().td('h1', colspan=3).h1()\n    h1.text(\"r/%d: \" % review.id)\n    h1.span(id=\"summary\").text(\"%s\" % review.summary, linkify=linkify.Context(db=db, review=review))\n    h1.a(\"edit\", href=\"javascript:editSummary();\").text(\"[edit]\")\n\n    def linkToCommit(commit):\n        cursor.execute(\"SELECT 1 FROM commits JOIN changesets ON (child=commits.id) JOIN reviewchangesets ON (changeset=changesets.id) WHERE sha1=%s AND review=%s\", (commit.sha1, review.id))\n        if 
cursor.fetchone():\n            return \"%s/%s?review=%d\" % (review.repository.name, commit.sha1, review.id)\n        return \"%s/%s\" % (review.repository.name, commit.sha1)\n\n    def row(heading, value, help, right=None, linkify=False, cellId=None):\n        main_row = basic.tr('line')\n        main_row.td('heading').text(\"%s:\" % heading)\n        if right is False: colspan = 2\n        else: colspan = None\n        if callable(value): value(main_row.td('value', id=cellId, colspan=colspan).preformatted())\n        else: main_row.td('value', id=cellId, colspan=colspan).preformatted().text(value, linkify=linkify, repository=review.repository)\n        if right is False: pass\n        elif callable(right): right(main_row.td('right', valign='bottom'))\n        else: main_row.td('right').text()\n        if help: basic.tr('help').td('help', colspan=3).text(help)\n\n    def renderBranchName(target):\n        classes = \"branch inset\"\n        if review.branch.archived:\n            classes += \" archived\"\n        target.code(classes).text(review.branch.name, linkify=linkify.Context())\n\n        if repository.name != user.getPreference(db, \"defaultRepository\"):\n            target.text(\" in \")\n            target.code(\"repository inset\").text(repository.getURL(db, user))\n\n        buttons = target.div(\"buttons\")\n\n        cursor.execute(\"\"\"SELECT id, remote, remote_name, disabled, previous\n                            FROM trackedbranches\n                           WHERE repository=%s\n                             AND local_name=%s\"\"\",\n                       (repository.id, review.branch.name))\n\n        row = cursor.fetchone()\n        if row:\n            trackedbranch_id, remote, remote_name, disabled, previous = row\n\n            target.p(\"tracking disabled\" if disabled else \"tracking\").text(\"tracking\")\n\n            target.code(\"branch inset\").text(remote_name, linkify=linkify.Context(remote=remote))\n            target.text(\" 
in \")\n            target.code(\"repository inset\").text(remote, linkify=linkify.Context())\n\n            if previous:\n                target.span(\"lastupdate\").script(type=\"text/javascript\").text(\"document.write('(last fetched: ' + shortDate(new Date(%d)) + ')');\" % (calendar.timegm(previous.utctimetuple()) * 1000))\n\n            if user in review.owners or user.hasRole(db, \"administrator\"):\n                if review.state == \"open\":\n                    if disabled:\n                        button = buttons.button(\"enabletracking\",\n                                                onclick=(\"enableTracking(%d, %s, %s);\"\n                                                         % (trackedbranch_id,\n                                                            htmlutils.jsify(remote),\n                                                            htmlutils.jsify(remote_name))))\n                        button.text(\"Enable Tracking\")\n                    else:\n                        buttons.button(\"disabletracking\", onclick=\"triggerUpdate(%d);\" % trackedbranch_id).text(\"Update Now\")\n                        buttons.button(\"disabletracking\", onclick=\"disableTracking(%d);\" % trackedbranch_id).text(\"Disable Tracking\")\n\n                    buttons.button(\"rebasereview\", onclick=\"location.assign('/rebasetrackingreview?review=%d');\" % review.id).text(\"Rebase Review\")\n\n        if review.state != \"open\" and review.branch.archived:\n            buttons.button(\"resurrect\").text(\"Resurrect Branch\")\n\n    def renderPeople(target, list):\n        for index, person in enumerate(list):\n            if index != 0: target.text(\", \")\n            span = target.span(\"user %s\" % person.status)\n            span.span(\"name\").text(person.fullname)\n            if person.status == 'absent':\n                span.span(\"status\").text(\" (%s)\" % person.getAbsence(db))\n            elif person.status == 'retired':\n                
span.span(\"status\").text(\" (retired)\")\n\n    def renderOwners(target):\n        renderPeople(target, review.owners)\n\n    def renderReviewers(target):\n        if review.reviewers:\n            renderPeople(target, review.reviewers)\n        else:\n            target.i().text(\"No reviewers.\")\n\n        cursor.execute(\"\"\"SELECT reviewfilters.id, reviewfilters.uid, reviewfilters.path\n                            FROM reviewfilters\n                            JOIN users ON (reviewfilters.uid=users.id)\n                           WHERE reviewfilters.review=%s\n                             AND reviewfilters.type='reviewer'\n                             AND users.status!='retired'\"\"\",\n                       (review.id,))\n\n        rows = cursor.fetchall()\n        reviewer_filters_hidden = []\n\n        if rows:\n            table = target.table(\"reviewfilters reviewers\")\n\n            row = table.thead().tr(\"h1\")\n            row.th(\"h1\", colspan=4).text(\"Custom filters:\")\n\n            filter_data = {}\n            reviewfilters = {}\n\n            for filter_id, user_id, path in rows:\n                filter_user = dbutils.User.fromId(db, user_id)\n                path = path or '/'\n                reviewfilters.setdefault(filter_user.fullname, []).append(path)\n                filter_data[(filter_user.fullname, path)] = (filter_id, filter_user)\n\n            count = 0\n            tbody = table.tbody()\n\n            for fullname in sorted(reviewfilters.keys()):\n                original_paths = sorted(reviewfilters[fullname])\n                trimmed_paths = diff.File.eliminateCommonPrefixes(original_paths[:])\n\n                first = True\n\n                for original_path, trimmed_path in zip(original_paths, trimmed_paths):\n                    row = tbody.tr(\"filter\")\n\n                    if first:\n                        row.td(\"username\", rowspan=len(original_paths)).text(fullname)\n                        
row.td(\"reviews\", rowspan=len(original_paths)).text(\"reviews\")\n                        first = False\n\n                    row.td(\"path\").span().innerHTML(trimmed_path)\n\n                    filter_id, filter_user = filter_data[(fullname, original_path)]\n\n                    href = \"javascript:removeReviewFilter(%d, %s, 'reviewer', %s, %s);\" % (filter_id, filter_user.getJSConstructor(), htmlutils.jsify(original_path), \"true\" if filter_user != user else \"false\")\n                    row.td(\"remove\").a(href=href).text(\"[remove]\")\n\n                    count += 1\n\n            tfoot = table.tfoot()\n            tfoot.tr().td(colspan=4).text(\"%d line%s hidden\" % (count, \"s\" if count > 1 else \"\"))\n\n            if count > 10:\n                tbody.setAttribute(\"class\", \"hidden\")\n                reviewer_filters_hidden.append(True)\n            else:\n                tfoot.setAttribute(\"class\", \"hidden\")\n                reviewer_filters_hidden.append(False)\n\n        buttons = target.div(\"buttons\")\n\n        if reviewer_filters_hidden:\n            buttons.button(\"showfilters\", onclick=\"toggleReviewFilters('reviewers', $(this));\").text(\"%s Custom Filters\" % (\"Show\" if reviewer_filters_hidden[0] else \"Hide\"))\n\n        if not review.applyfilters:\n            buttons.button(\"applyfilters\", onclick=\"applyFilters('global');\").text(\"Apply Global Filters\")\n\n        if review.applyfilters and review.repository.parent and not review.applyparentfilters:\n            buttons.button(\"applyparentfilters\", onclick=\"applyFilters('upstream');\").text(\"Apply Upstream Filters\")\n\n        buttons.button(\"addreviewer\", onclick=\"addReviewer();\").text(\"Add Reviewer\")\n        buttons.button(\"manage\", onclick=\"location.href='managereviewers?review=%d';\" % review.id).text(\"Manage Assignments\")\n\n    def renderWatchers(target):\n        if review.watchers:\n            renderPeople(target, review.watchers)\n     
   else:\n            target.i().text(\"No watchers.\")\n\n        cursor.execute(\"\"\"SELECT reviewfilters.id, reviewfilters.uid, reviewfilters.path\n                            FROM reviewfilters\n                            JOIN users ON (reviewfilters.uid=users.id)\n                           WHERE reviewfilters.review=%s\n                             AND reviewfilters.type='watcher'\n                             AND users.status!='retired'\"\"\",\n                       (review.id,))\n\n        rows = cursor.fetchall()\n        watcher_filters_hidden = []\n\n        if rows:\n            table = target.table(\"reviewfilters watchers\")\n\n            row = table.thead().tr(\"h1\")\n            row.th(\"h1\", colspan=4).text(\"Custom filters:\")\n\n            filter_data = {}\n            reviewfilters = {}\n\n            for filter_id, user_id, path in rows:\n                filter_user = dbutils.User.fromId(db, user_id)\n                path = path or '/'\n                reviewfilters.setdefault(filter_user.fullname, []).append(path)\n                filter_data[(filter_user.fullname, path)] = (filter_id, filter_user)\n\n            count = 0\n            tbody = table.tbody()\n\n            for fullname in sorted(reviewfilters.keys()):\n                original_paths = sorted(reviewfilters[fullname])\n                trimmed_paths = diff.File.eliminateCommonPrefixes(original_paths[:])\n\n                first = True\n\n                for original_path, trimmed_path in zip(original_paths, trimmed_paths):\n                    row = tbody.tr(\"filter\")\n\n                    if first:\n                        row.td(\"username\", rowspan=len(original_paths)).text(fullname)\n                        row.td(\"reviews\", rowspan=len(original_paths)).text(\"watches\")\n                        first = False\n\n                    row.td(\"path\").span().innerHTML(trimmed_path)\n\n                    filter_id, filter_user = filter_data[(fullname, 
original_path)]\n\n                    href = \"javascript:removeReviewFilter(%d, %s, 'watcher', %s, %s);\" % (filter_id, filter_user.getJSConstructor(), htmlutils.jsify(original_path), \"true\" if filter_user != user else \"false\")\n                    row.td(\"remove\").a(href=href).text(\"[remove]\")\n\n                    count += 1\n\n            tfoot = table.tfoot()\n            tfoot.tr().td(colspan=4).text(\"%d line%s hidden\" % (count, \"s\" if count > 1 else \"\"))\n\n            if count > 10:\n                tbody.setAttribute(\"class\", \"hidden\")\n                watcher_filters_hidden.append(True)\n            else:\n                tfoot.setAttribute(\"class\", \"hidden\")\n                watcher_filters_hidden.append(False)\n\n        buttons = target.div(\"buttons\")\n\n        if watcher_filters_hidden:\n            buttons.button(\"showfilters\", onclick=\"toggleReviewFilters('watchers', $(this));\").text(\"%s Custom Filters\" % (\"Show\" if watcher_filters_hidden[0] else \"Hide\"))\n\n        buttons.button(\"addwatcher\", onclick=\"addWatcher();\").text(\"Add Watcher\")\n\n        if user not in review.reviewers and user not in review.owners:\n            if user not in review.watchers:\n                buttons.button(\"watch\", onclick=\"watchReview();\").text(\"Watch Review\")\n            elif review.watchers[user] == \"manual\":\n                buttons.button(\"watch\", onclick=\"unwatchReview();\").text(\"Stop Watching Review\")\n\n    def renderEditOwners(target):\n        target.button(\"description\", onclick=\"editOwners();\").text(\"Edit Owners\")\n\n    def renderEditDescription(target):\n        target.button(\"description\", onclick=\"editDescription();\").text(\"Edit Description\")\n\n    def renderRecipientList(target):\n        cursor.execute(\"\"\"SELECT uid, fullname, include\n                            FROM reviewrecipientfilters\n                 LEFT OUTER JOIN users ON (uid=id)\n                           WHERE 
review=%s\"\"\",\n                       (review.id,))\n\n        default_include = True\n        included = dict((owner.fullname, owner.id) for owner in review.owners)\n        excluded = {}\n\n        for user_id, fullname, include in cursor:\n            if user_id is None: default_include = include\n            elif include: included[fullname] = user_id\n            elif user_id not in review.owners: excluded[fullname] = user_id\n\n        mode = None\n        users = None\n\n        buttons = []\n        opt_in_button = False\n        opt_out_button = False\n\n        if default_include:\n            if excluded:\n                mode = \"Everyone except \"\n                users = excluded\n                opt_out_button = user.fullname not in excluded\n                opt_in_button = not opt_out_button\n            else:\n                mode = \"Everyone.\"\n                opt_out_button = True\n        else:\n            if included:\n                mode = \"No-one except \"\n                users = included\n                opt_in_button = user.fullname not in included\n                opt_out_button = not opt_in_button\n            else:\n                mode = \"No-one at all.\"\n                opt_in_button = True\n\n        if user not in review.owners and (user in review.reviewers or user in review.watchers):\n            if opt_in_button:\n                buttons.append((\"Include me, please!\", \"includeRecipient(%d);\" % user.id))\n            if opt_out_button:\n                buttons.append((\"Exclude me, please!\", \"excludeRecipient(%d);\" % user.id))\n\n        target.span(\"mode\").text(mode)\n\n        if users:\n            container = target.span(\"users\")\n\n            first = True\n            for fullname in sorted(users.keys()):\n                if first: first = False\n                else: container.text(\", \")\n\n                container.span(\"user\", critic_user_id=users[fullname]).text(fullname)\n\n            
container.text(\".\")\n\n        if buttons:\n            container = target.div(\"buttons\")\n\n            for label, onclick in buttons:\n                container.button(onclick=onclick).text(label)\n\n    row(\"Branch\", renderBranchName, \"The branch containing the commits to review.\", right=False)\n    row(\"Owner%s\" % (\"s\" if len(review.owners) > 1 else \"\"), renderOwners, \"The users who created and/or own the review.\", right=renderEditOwners)\n    if review.description:\n        row(\"Description\", review.description, \"A longer description of the changes to be reviewed.\", linkify=linkToCommit, cellId=\"description\", right=renderEditDescription)\n    row(\"Reviewers\", renderReviewers, \"Users responsible for reviewing the changes in this review.\", right=False)\n    row(\"Watchers\", renderWatchers, \"Additional users who receive e-mails about updates to this review.\", right=False)\n    row(\"Recipient List\", renderRecipientList, \"Users (among the reviewers and watchers) who will receive any e-mails about the review.\", right=False)\n\n    profiler.check(\"basic\")\n\n    review_state = review.getReviewState(db)\n\n    profiler.check(\"review state\")\n\n    progress = target.table('paleyellow progress', align='center')\n    progress_header = progress.tr().td('h1', colspan=3).h1()\n    progress_header.text(\"Review Progress\")\n    progress_header_right = progress_header.span(\"right\")\n    progress_header_right.text(\"Display log: \")\n    progress_header_right.a(href=\"showreviewlog?review=%d&granularity=module\" % review.id).text(\"[per module]\")\n    progress_header_right.text()\n    progress_header_right.a(href=\"showreviewlog?review=%d&granularity=file\" % review.id).text(\"[per file]\")\n    progress_h1 = progress.tr().td('percent', colspan=3).h1()\n\n    title_data = { 'id': 'r/%d' % review.id,\n                   'summary': review.summary,\n                   'progress': str(review_state) }\n\n    if review.state == \"closed\":\n  
      progress_h1.img(src=htmlutils.getStaticResourceURI(\"seal-of-approval-left.png\"),\n                        style=\"position: absolute; margin-left: -80px; margin-top: -100px\")\n        progress_h1.text(\"Finished!\")\n\n        for branch in review.repository.getSignificantBranches(db):\n            if review.branch.getHead(db).isAncestorOf(branch.head_sha1):\n                remark = progress_h1.div().span(\"remark\")\n                remark.text(\"Merged to \")\n                remark.a(href=\"/log?repository=%s&branch=%s\" % (review.repository.name, branch.name)).text(branch.name)\n                remark.text(\".\")\n    elif review.state == \"dropped\":\n        progress_h1.text(\"Dropped...\")\n    elif review.state == \"open\" and review_state.accepted:\n        progress_h1.img(src=htmlutils.getStaticResourceURI(\"seal-of-approval-left.png\"),\n                        style=\"position: absolute; margin-left: -80px; margin-top: -100px\")\n        progress_h1.text(\"Accepted!\")\n        progress_h1.div().span(\"remark\").text(\"Hurry up and close it before anyone has a change of heart.\")\n    else:\n        progress_h1.text(review_state.getProgress())\n\n        if review_state.issues:\n            progress_h1.span(\"comments\").text(\" and \")\n            progress_h1.text(\"%d\" % review_state.issues)\n            progress_h1.span(\"comments\").text(\" issue%s\" % (review_state.issues > 1 and \"s\" or \"\"))\n\n        if review_state.getPercentReviewed() != 100.0:\n            cursor = db.cursor()\n            cursor.execute(\"\"\"SELECT 1\n                                FROM reviewfiles\n                     LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                               WHERE reviewfiles.review=%s\n                                 AND reviewfiles.state='pending'\n                                 AND reviewuserfiles.uid IS NULL\"\"\",\n                           (review.id,))\n\n            if 
cursor.fetchone():\n                progress.tr().td('stuck', colspan=3).a(href=\"showreviewlog?review=%d&granularity=file&unassigned=yes\" % review.id).text(\"Not all changes have a reviewer assigned!\")\n\n            cursor.execute(\"\"\"SELECT uid, MIN(reviewuserfiles.time)\n                                FROM reviewfiles\n                                JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                               WHERE reviewfiles.review=%s\n                                 AND reviewfiles.state='pending'\n                            GROUP BY reviewuserfiles.uid\"\"\",\n                           (review.id,))\n\n            now = datetime.datetime.now()\n\n            def seconds_since(timestamp):\n                if isinstance(timestamp, int):\n                    # We tell sqlite3 to convert TIMESTAMP values into datetime\n                    # objects, but apparently MIN(timestamp) as used in the\n                    # query above isn't typed as TIMESTAMP by SQLite, so doesn't\n                    # get converted automatically.\n                    timestamp = datetime.datetime.fromtimestamp(timestamp)\n                delta = now - timestamp\n                return delta.days * 60 * 60 * 24 + delta.seconds\n\n            pending_reviewers = [(dbutils.User.fromId(db, user_id), seconds_since(timestamp)) for (user_id, timestamp) in cursor.fetchall() if seconds_since(timestamp) > 60 * 60 * 8]\n\n            if pending_reviewers:\n                progress.tr().td('stragglers', colspan=3).text(\"Needs review from\")\n                for reviewer, seconds in pending_reviewers:\n                    if reviewer.status == 'retired': continue\n                    elif reviewer.status == 'absent': warning = \" absent\"\n                    elif not reviewer.getPreference(db, \"email.activated\"): warning = \" no-email\"\n                    else: warning = \"\"\n\n                    if seconds < 60 * 60 * 24:\n                       
 hours = seconds / (60 * 60)\n                        duration = \" (%d hour%s)\" % (hours, \"s\" if hours > 1 else \"\")\n                    elif seconds < 60 * 60 * 24 * 7:\n                        days = seconds / (60 * 60 * 24)\n                        duration = \" (%d day%s)\" % (days, \"s\" if days > 1 else \"\")\n                    elif seconds < 60 * 60 * 24 * 30:\n                        weeks = seconds / (60 * 60 * 24 * 7)\n                        duration = \" (%d week%s)\" % (weeks, \"s\" if weeks > 1 else \"\")\n                    else:\n                        duration = \" (wake up!)\"\n\n                    progress.tr().td('straggler' + warning, colspan=3).text(\"%s%s\" % (reviewer.fullname, duration))\n                if user in review.owners:\n                    progress.tr().td('pinging', colspan=3).span().text(\"Send a message to these users by pinging the review.\")\n\n    title_format = user.getPreference(db, 'ui.title.showReview')\n\n    try:\n        document.setTitle(title_format % title_data)\n    except Exception as exc:\n        document.setTitle(traceback.format_exception_only(type(exc), exc)[0].strip())\n\n    profiler.check(\"progress\")\n\n    check = profiler.start(\"ApprovalColumn.fillCache\")\n\n    approval_cache = {}\n\n    ApprovalColumn.fillCache(db, user, review, approval_cache, profiler)\n\n    check.stop()\n\n    summary_column = SummaryColumn(review)\n    summary_column.fillCache(db, review)\n\n    profiler.check(\"SummaryColumn.fillCache\")\n\n    columns = [(10, log.html.WhenColumn()),\n               (60, summary_column),\n               (16, log.html.AuthorColumn()),\n               (7, ApprovalColumn(user, review, ApprovalColumn.APPROVED, approval_cache)),\n               (7, ApprovalColumn(user, review, ApprovalColumn.TOTAL, approval_cache))]\n\n    def renderReviewPending(db, target):\n        if not user.isAnonymous():\n            target.text(\"Filter: \")\n\n            if hasPendingChanges:\n               
 target.a(href=\"showcommit?review=%d&filter=pending\" % review.id, title=\"All changes you need to review.\").text(\"[pending]\")\n                target.text()\n            if user in review.reviewers:\n                target.a(href=\"showcommit?review=%d&filter=reviewable\" % review.id, title=\"All changes you can review, including what you've already reviewed.\").text(\"[reviewable]\")\n                target.text()\n\n            target.a(href=\"showcommit?review=%d&filter=relevant\" % review.id, title=\"All changes that match your filters.\").text(\"[relevant]\")\n            target.text()\n\n        target.text(\"Manual: \")\n        target.a(href=\"filterchanges?review=%d\" % review.id, title=\"Manually select what files to display of the changes from all commits.\").text(\"[full]\")\n        target.text()\n        target.a(href=\"javascript:void(filterPartialChanges());\", title=\"Manually select what files to display of the changes in a selection of commits.\").text(\"[partial]\")\n\n    req.addResponseHeader(\"ETag\", review.getETag(db, user))\n\n    if user.getPreference(db, \"review.useMustRevalidate\"):\n        req.addResponseHeader(\"Cache-Control\", \"must-revalidate\")\n\n    yield flush(target)\n\n    try:\n        if prefetch_commits.error is None:\n            prefetch_commits.getCommits(db)\n\n        profiler.check(\"FetchCommits.getCommits()\")\n\n        cursor.execute(\"\"\"SELECT DISTINCT parent, child\n                            FROM changesets\n                            JOIN reviewchangesets ON (reviewchangesets.changeset=changesets.id)\n                            JOIN commits ON (commits.id=changesets.child)\n                           WHERE review=%s\n                             AND type!='conflicts'\"\"\",\n                       (review.id,))\n\n        commits = set()\n\n        for parent_id, child_id in cursor:\n            commits.add(gitutils.Commit.fromId(db, repository, child_id))\n\n        commits = list(commits)\n\n   
     cursor.execute(\"\"\"SELECT id, old_head, new_head, new_upstream, equivalent_merge, replayed_rebase, uid, branch\n                            FROM reviewrebases\n                           WHERE review=%s\"\"\",\n                       (review.id,))\n\n        all_rebases = [(rebase_id,\n                        gitutils.Commit.fromId(db, repository, old_head),\n                        gitutils.Commit.fromId(db, repository, new_head) if new_head else None,\n                        dbutils.User.fromId(db, user_id),\n                        gitutils.Commit.fromId(db, repository, new_upstream) if new_upstream is not None else None,\n                        gitutils.Commit.fromId(db, repository, equivalent_merge) if equivalent_merge is not None else None,\n                        gitutils.Commit.fromId(db, repository, replayed_rebase) if replayed_rebase is not None else None,\n                        branch_name)\n                       for rebase_id, old_head, new_head, new_upstream, equivalent_merge, replayed_rebase, user_id, branch_name in cursor]\n\n        bottom_right = None\n\n        finished_rebases = filter(lambda item: item[2] is not None, all_rebases)\n        current_rebases = filter(lambda item: item[2] is None, all_rebases)\n\n        if current_rebases:\n            assert len(current_rebases) == 1\n\n            def renderCancelRebase(db, target):\n                target.button(\"cancelrebase\").text(\"Cancel Rebase\")\n\n            if user == current_rebases[0][3]:\n                bottom_right = renderCancelRebase\n        else:\n            def renderPrepareRebase(db, target):\n                target.button(\"preparerebase\").text(\"Prepare Rebase\")\n\n            bottom_right = renderPrepareRebase\n\n        if finished_rebases:\n            cursor.execute(\"\"\"SELECT commit\n                                FROM reachable\n                               WHERE branch=%s\"\"\",\n                           (review.branch.id,))\n\n            
actual_commits = [gitutils.Commit.fromId(db, repository, commit_id) for (commit_id,) in cursor]\n        else:\n            actual_commits = []\n\n        log.html.render(db, target, \"Commits (%d)\", commits=commits, columns=columns, title_right=renderReviewPending, rebases=finished_rebases, branch_name=review.branch.name, bottom_right=bottom_right, review=review, highlight=highlight, profiler=profiler, user=user, extra_commits=actual_commits)\n\n        yield flush(target)\n\n        profiler.check(\"log\")\n    except gitutils.GitReferenceError as error:\n        div = target.div(\"error\")\n        div.h1().text(\"Error!\")\n        div.text(\"The commit %s is missing from the repository.\" % error.sha1)\n    except gitutils.GitError as error:\n        div = target.div(\"error\")\n        div.h1().text(\"Error!\")\n        div.text(\"Failed to read commits from the repository: %s\" % error.message)\n\n    all_chains = review_comment.CommentChain.fromReview(db, review, user)\n\n    profiler.check(\"chains (load)\")\n\n    if all_chains:\n        issue_chains = filter(lambda chain: chain.type == \"issue\", all_chains)\n        draft_issues = filter(lambda chain: chain.state == \"draft\", issue_chains)\n        open_issues = filter(lambda chain: chain.state == \"open\", issue_chains)\n        addressed_issues = filter(lambda chain: chain.state == \"addressed\", issue_chains)\n        closed_issues = filter(lambda chain: chain.state == \"closed\", issue_chains)\n        note_chains = filter(lambda chain: chain.type == \"note\", all_chains)\n        draft_notes = filter(lambda chain: chain.state == \"draft\", note_chains)\n        open_notes = filter(lambda chain: chain.state != \"draft\" and chain.state != \"empty\", note_chains)\n    else:\n        open_issues = []\n        open_notes = []\n\n    chains = target.table(\"paleyellow comments\", align=\"center\", cellspacing=0)\n    h1 = chains.tr(\"h1\").td(\"h1\", colspan=3).h1().text(\"Comments\")\n    links = 
h1.span(\"links\")\n\n    if all_chains:\n        links.a(href=\"showcomments?review=%d&filter=all\" % review.id).text(\"[display all]\")\n\n        if not user.isAnonymous():\n            links.a(href=\"showcomments?review=%d&filter=all&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n\n            cursor.execute(\"\"\"SELECT count(commentstoread.comment) > 0\n                                FROM commentchains\n                                JOIN comments ON (comments.chain=commentchains.id)\n                                JOIN commentstoread ON (commentstoread.comment=comments.id)\n                               WHERE commentchains.review=%s\n                                 AND commentstoread.uid=%s\"\"\",\n                           [review.id, user.id])\n\n            if cursor.fetchone()[0]:\n                links.a(href=\"showcomments?review=%d&filter=toread\" % review.id).text(\"[display unread]\")\n\n        def renderChains(target, chains):\n            for chain in chains:\n                row = target.tr(\"comment %s %s\" % (chain.type, chain.state))\n                row.td(\"author\").text(chain.user.fullname)\n                row.td(\"title\").a(href=\"showcomment?chain=%d\" % chain.id).innerHTML(chain.leader())\n\n                ncomments = chain.countComments()\n                nunread = chain.countUnread()\n\n                cell = row.td(\"when\")\n                if ncomments <= 1:\n                    if nunread: cell.b().text(\"Unread\")\n                    else: cell.text(\"No replies\")\n                else:\n                    if nunread: cell.b().text(\"%d of %d unread\" % (nunread, ncomments))\n                    else: cell.text(\"%d repl%s\" % (ncomments - 1, \"ies\" if ncomments > 2 else \"y\"))\n\n        if draft_issues:\n            h2 = chains.tr(\"h2\", id=\"draft-issues\").td(\"h2\", colspan=3).h2().text(\"Draft Issues\")\n            h2.a(href=\"showcomments?review=%d&filter=draft-issues\" % 
review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=draft-issues&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, draft_issues)\n\n        if open_issues:\n            h2 = chains.tr(\"h2\", id=\"open-issues\").td(\"h2\", colspan=3).h2().text(\"Open Issues\")\n            h2.a(href=\"showcomments?review=%d&filter=open-issues\" % review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=open-issues&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, open_issues)\n\n        if addressed_issues:\n            h2 = chains.tr(\"h2\", id=\"addressed-issues\").td(\"h2\", colspan=3).h2().text(\"Addressed Issues\")\n            h2.a(href=\"showcomments?review=%d&filter=addressed-issues\" % review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=addressed-issues&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, addressed_issues)\n\n        if closed_issues:\n            h2 = chains.tr(\"h2\", id=\"closed-issues\").td(\"h2\", colspan=3).h2().text(\"Resolved Issues\")\n            h2.a(href=\"showcomments?review=%d&filter=closed-issues\" % review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=closed-issues&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, closed_issues)\n\n        if draft_notes:\n            h2 = chains.tr(\"h2\", id=\"draft-notes\").td(\"h2\", colspan=3).h2().text(\"Draft Notes\")\n            h2.a(href=\"showcomments?review=%d&filter=draft-notes\" % review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=draft-notes&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, draft_notes)\n\n        if open_notes:\n            h2 = chains.tr(\"h2\", 
id=\"notes\").td(\"h2\", colspan=3).h2().text(\"Notes\")\n            h2.a(href=\"showcomments?review=%d&filter=open-notes\" % review.id).text(\"[display all]\")\n            h2.a(href=\"showcomments?review=%d&filter=open-notes&blame=%s\" % (review.id, user.name)).text(\"[in my commits]\")\n            renderChains(chains, open_notes)\n\n    buttons = chains.tr(\"buttons\").td(\"buttons\", colspan=3)\n    buttons.button(onclick=\"CommentChain.create('issue');\").text(\"Raise Issue\")\n    buttons.button(onclick=\"CommentChain.create('note');\").text(\"Write Note\")\n\n    profiler.check(\"chains (render)\")\n\n    yield flush(target)\n\n    cursor.execute(\"\"\"SELECT DISTINCT reviewfiles.file, theirs.uid\n                        FROM reviewfiles\n                        JOIN reviewuserfiles AS yours ON (yours.file=reviewfiles.id)\n                        JOIN reviewuserfiles AS theirs ON (theirs.file=yours.file AND theirs.uid!=yours.uid)\n                       WHERE reviewfiles.review=%s\n                         AND yours.uid=%s\"\"\",\n                   (review.id, user.id))\n    rows = cursor.fetchall()\n\n    profiler.check(\"shared assignments (query)\")\n\n    if rows:\n        reviewers = {}\n\n        for file_id, user_id in rows:\n            reviewers.setdefault(file_id, {})[user_id] = set()\n\n        shared = target.table('paleyellow shared', align='center', cellspacing=0)\n        row = shared.tr('h1')\n        shared_header = row.td('h1', colspan=2).h1()\n        shared_header.text(\"Shared Assignments\")\n        shared_buttons = row.td('buttons', colspan=2).span(style=\"display: none\")\n        shared_buttons.button(\"confirm\").text(\"Confirm\")\n        shared_buttons.button(\"cancel\").text(\"Cancel\")\n\n        granularity = \"module\"\n\n        def moduleFromFile(file_id):\n            filename = dbutils.describe_file(db, file_id)\n            return getModuleFromFile(repository, filename) or filename\n\n        def formatFiles(files):\n  
          paths = sorted([dbutils.describe_file(db, file_id) for file_id in files])\n            if granularity == \"file\":\n                return diff.File.eliminateCommonPrefixes(paths)\n            else:\n                modules = set()\n                files = []\n                for path in paths:\n                    module = getModuleFromFile(repository, path)\n                    if module: modules.add(module)\n                    else: files.append(path)\n                return sorted(modules) + diff.File.eliminateCommonPrefixes(files)\n\n        files_per_team = review_utils.collectReviewTeams(reviewers)\n        teams_per_modules = {}\n\n        profiler.check(\"shared assignments (collect teams)\")\n\n        for team, files in files_per_team.items():\n            modules = set()\n            for file_id in files:\n                modules.add(moduleFromFile(file_id))\n            teams_per_modules.setdefault(frozenset(modules), set()).update(team)\n\n        for modules, team in teams_per_modules.items():\n            row = shared.tr(\"reviewers\")\n\n            cell = row.td(\"reviewers\")\n            members = sorted([dbutils.User.fromId(db, user_id).fullname for user_id in team])\n            for member in members: cell.text(member).br()\n            row.td(\"willreview\").innerHTML(\"<span class='also'>also</span>&nbsp;review&nbsp;changes&nbsp;in\")\n\n            cell = row.td(\"files\")\n            for path in diff.File.eliminateCommonPrefixes(sorted(modules)):\n                cell.span(\"file\").innerHTML(path).br()\n\n            paths = textutils.json_encode(list(modules))\n            user_ids = textutils.json_encode(list(team))\n\n            cell = row.td(\"buttons\")\n            cell.button(\"accept\", critic_paths=paths, critic_user_ids=user_ids).text(\"I will review this!\")\n            cell.button(\"deny\", critic_paths=paths, critic_user_ids=user_ids).text(\"They will review this!\")\n\n    yield flush(target)\n\n    profiler.check(\"shared 
assignments\")\n\n    cursor.execute(\"\"\"SELECT batches.id, batches.time,\n                             users.fullname,\n                             comments.comment\n                        FROM batches\n                        JOIN users ON (users.id=batches.uid)\n             LEFT OUTER JOIN commentchains ON (commentchains.id=batches.comment)\n             LEFT OUTER JOIN comments ON (comments.id=commentchains.first_comment)\n                       WHERE batches.review=%s\n                    ORDER BY batches.id DESC\"\"\",\n                   (review.id,))\n\n    rows = cursor.fetchall()\n\n    if rows:\n        numbers = {}\n\n        cursor.execute(\"\"\"SELECT batches.id, reviewfilechanges.to_state, SUM(deleted), SUM(inserted)\n                            FROM batches\n                            JOIN reviewfilechanges ON (reviewfilechanges.batch=batches.id)\n                            JOIN reviewfiles ON (reviewfiles.id=reviewfilechanges.file)\n                           WHERE batches.review=%s\n                        GROUP BY batches.id, reviewfilechanges.to_state\"\"\",\n                       (review.id,))\n\n        for batch_id, state, deleted, inserted in cursor:\n            per_batch = numbers.setdefault(batch_id, {})\n            per_batch[state] = (deleted, inserted)\n\n        cursor.execute(\"\"\"SELECT batches.id, commentchains.type, COUNT(commentchains.id)\n                            FROM batches\n                            JOIN commentchains ON (commentchains.batch=batches.id)\n                           WHERE batches.review=%s\n                        GROUP BY batches.id, commentchains.type\"\"\",\n                       (review.id,))\n\n        for batch_id, commentchain_type, count in cursor:\n            per_batch = numbers.setdefault(batch_id, {})\n            per_batch[commentchain_type] = count\n\n        cursor.execute(\"\"\"SELECT batches.id, commentchainchanges.to_state, COUNT(commentchainchanges.chain)\n                      
      FROM batches\n                            JOIN commentchainchanges ON (commentchainchanges.batch=batches.id)\n                           WHERE batches.review=%s\n                             AND commentchainchanges.to_state IS NOT NULL\n                        GROUP BY batches.id, commentchainchanges.to_state\"\"\",\n                       (review.id,))\n\n        for batch_id, commentchainchange_state, count in cursor:\n            per_batch = numbers.setdefault(batch_id, {})\n            per_batch[commentchainchange_state] = count\n\n        cursor.execute(\"\"\"SELECT batches.id, COUNT(commentchainchanges.chain)\n                            FROM batches\n                            JOIN commentchainchanges ON (commentchainchanges.batch=batches.id)\n                           WHERE batches.review=%s\n                             AND commentchainchanges.to_type IS NOT NULL\n                        GROUP BY batches.id\"\"\",\n                       (review.id,))\n\n        for batch_id, count in cursor:\n            per_batch = numbers.setdefault(batch_id, {})\n            per_batch[\"morph\"] = count\n\n        cursor.execute(\"\"\"SELECT batches.id, COUNT(comments.id)\n                            FROM batches\n                            JOIN commentchains ON (commentchains.batch!=batches.id)\n                            JOIN comments ON (comments.batch=batches.id\n                                          AND comments.chain=commentchains.id)\n                           WHERE batches.review=%s\n                        GROUP BY batches.id\"\"\",\n                       (review.id,))\n\n        for batch_id, count in cursor:\n            per_batch = numbers.setdefault(batch_id, {})\n            per_batch[\"reply\"] = count\n\n        batches = target.table(\"paleyellow batches\", align=\"center\", cellspacing=0)\n        batches.tr().td(\"h1\", colspan=3).h1().text(\"Work Log\")\n\n        def format_lines(deleted, inserted):\n            if deleted and 
inserted:\n                return \"-%d/+%d\" % (deleted, inserted)\n            elif deleted:\n                return \"-%d\" % deleted\n            else:\n                return \"+%d\" % inserted\n        def with_plural(count, one, many):\n            return (count, many if count > 1 else one)\n        def with_plural_s(count):\n            return with_plural(count, \"\", \"s\")\n\n        for batch_id, timestamp, user_fullname, comment in rows:\n            row = batches.tr(\"batch\")\n            row.td(\"author\").text(user_fullname)\n\n            title = row.td(\"title clickable\").a(\n                \"clickable-target\", href=\"showbatch?batch=%d\" % batch_id)\n            if comment:\n                title.innerHTML(textutils.summarize(comment, as_html=True))\n            else:\n                title.i().text(\"No comment\")\n\n            per_batch = numbers.get(batch_id)\n            if per_batch:\n                items = []\n                reviewed = per_batch.get(\"reviewed\")\n                if reviewed:\n                    items.append(\"reviewed %s lines\" % format_lines(*reviewed))\n                unreviewed = per_batch.get(\"pending\")\n                if unreviewed:\n                    items.append(\"unreviewed %s lines\" % format_lines(*unreviewed))\n                issues = per_batch.get(\"issue\")\n                if issues:\n                    items.append(\"raised %d issue%s\" % with_plural_s(issues))\n                notes = per_batch.get(\"note\")\n                if notes:\n                    items.append(\"wrote %d note%s\" % with_plural_s(notes))\n                resolved = per_batch.get(\"closed\")\n                if resolved:\n                    items.append(\"resolved %d issue%s\" % with_plural_s(resolved))\n                reopened = per_batch.get(\"open\")\n                if reopened:\n                    items.append(\"reopened %d issue%s\" % with_plural_s(reopened))\n                morphed = 
per_batch.get(\"morph\")\n                if morphed:\n                    items.append(\"morphed %d comment%s\" % with_plural_s(morphed))\n                replies = per_batch.get(\"reply\")\n                if replies:\n                    items.append(\"wrote %d %s\" % with_plural(replies, \"reply\", \"replies\"))\n                if items:\n                    if len(items) == 1:\n                        items = items[0]\n                    else:\n                        items = \"%s and %s\" % (\", \".join(items[:-1]), items[-1])\n                else:\n                    items = \"nothing\"\n                title.span(\"numbers\").text(items)\n\n            row.td(\"when\").text(user.formatTimestamp(db, timestamp))\n\n    profiler.check(\"batches\")\n    profiler.output(db, user, target)\n\n    yield flush()\n\n    try:\n        head = review.branch.getHead(db)\n    except gitutils.GitReferenceError:\n        # Commit missing from repository.\n        pass\n    else:\n        try:\n            head_according_to_git = repository.revparse(review.branch.name)\n        except gitutils.GitReferenceError:\n            # Branch missing from repository.\n            head_according_to_git = None\n\n        head_according_to_us = head.sha1\n\n        if head_according_to_git != head_according_to_us:\n            # The git repository disagrees with us.  Potentially harmful updates\n            # to the branch will be rejected by the git hook while this is the\n            # case, but this means that \"our\" head might not be referenced at\n            # all and thus that it might be GC:ed by the git repository at some\n            # point.  To avoid that, add a keepalive reference.\n            repository.keepalive(head_according_to_us)\n\n            yield \"\\n<!-- branch head mismatch: git=%s, us=%s (corrected) -->\" % (head_according_to_git[:8] if head_according_to_git else \"N/A\", head_according_to_us[:8])\n"
  },
  {
    "path": "src/page/showreviewlog.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport htmlutils\nimport page.utils\nimport log.html\nimport reviewing.utils as review_utils\nimport reviewing.html as review_html\nimport reviewing.comment as review_comment\nimport re\nimport diff\n\nre_module = re.compile(\"^((?:data|modules|platforms)/[^/]+)/.*$\")\n\ndef renderShowReviewLog(req, db, user):\n    review_id = req.getParameter(\"review\", filter=int)\n    granularity = req.getParameter(\"granularity\")\n    unassigned = req.getParameter(\"unassigned\", \"no\") == \"yes\"\n\n    cursor = db.cursor()\n\n    review = dbutils.Review.fromId(db, review_id)\n\n    document = htmlutils.Document(req)\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    head.title(\"Review Log: %s\" % review.branch.name)\n\n    page.utils.generateHeader(body, db, user, lambda target: review_utils.renderDraftItems(db, user, review, target), extra_links=[(\"r/%d\" % review.id, \"Back to Review\")])\n\n    document.addExternalStylesheet(\"resource/showreviewlog.css\")\n    document.addExternalStylesheet(\"resource/review.css\")\n    document.addExternalScript(\"resource/review.js\")\n    document.addInternalScript(review.getJS())\n\n    target = body.div(\"main\")\n\n    reviewed_reviewers = review_utils.getReviewedReviewers(db, review)\n\n    def 
formatFiles(files):\n        paths = sorted([dbutils.describe_file(db, file_id) for file_id in files])\n        if granularity == \"file\":\n            return diff.File.eliminateCommonPrefixes(paths)\n        else:\n            modules = set()\n            files = []\n            for path in paths:\n                match = re_module.match(path)\n                if match: modules.add(match.group(1))\n                else: files.append(path)\n            return sorted(modules) + diff.File.eliminateCommonPrefixes(files)\n\n    if reviewed_reviewers and not unassigned:\n        reviewed = target.table(\"paleyellow\")\n        reviewed.col(width=\"30%\")\n        reviewed.col(width=\"10%\")\n        reviewed.col(width=\"60%\")\n        reviewed.tr().td('h1', colspan=3).h1().text(\"Reviewed Changes\")\n\n        teams = review_utils.collectReviewTeams(reviewed_reviewers)\n\n        for team in teams:\n            row = reviewed.tr(\"reviewers\")\n\n            cell = row.td(\"reviewers\")\n            fullnames = sorted([dbutils.User.fromId(db, user_id).fullname for user_id in team])\n            for fullname in fullnames: cell.text(fullname).br()\n            row.td(\"willreview\").innerHTML(\"reviewed\")\n\n            cell = row.td(\"files\")\n            for file in formatFiles(teams[team]):\n                cell.span(\"file\").innerHTML(file).br()\n\n    pending_reviewers = review_utils.getPendingReviewers(db, review)\n\n    if pending_reviewers:\n        pending = target.table(\"paleyellow\")\n        pending.col(width=\"30%\")\n        pending.col(width=\"10%\")\n        pending.col(width=\"60%\")\n        pending.tr().td('h1', colspan=3).h1().text(\"Pending Changes\")\n\n        teams = review_utils.collectReviewTeams(pending_reviewers)\n\n        if not unassigned:\n            for team in teams:\n                if team is not None:\n                    row = pending.tr(\"reviewers\")\n\n                    cell = row.td(\"reviewers\")\n                    fullnames = 
sorted([dbutils.User.fromId(db, user_id).fullname for user_id in team])\n                    for fullname in fullnames: cell.text(fullname).br()\n                    row.td(\"willreview\").innerHTML(\"should&nbsp;review\")\n\n                    cell = row.td(\"files\")\n                    for file in formatFiles(teams[team]):\n                        cell.span(\"file\").innerHTML(file).br()\n\n        if None in teams:\n            row = pending.tr(\"reviewers\")\n            row.td(\"no-one\", colspan=2).text(\"No reviewers found for changes in\")\n\n            cell = row.td(\"files\")\n            for file in formatFiles(teams[None]):\n                cell.span(\"file\").innerHTML(file).br()\n\n    return document\n
  },
  {
    "path": "src/page/showtree.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport stat\nimport urllib\n\nimport dbutils\nimport gitutils\nimport page.utils\nimport htmlutils\nimport textutils\n\ndef renderShowTree(req, db, user):\n    cursor = db.cursor()\n\n    sha1 = req.getParameter(\"sha1\")\n    path = req.getParameter(\"path\", \"/\")\n    review_id = req.getParameter(\"review\", None, filter=int)\n\n    if path[0] == '/':\n        full_path = path\n        if path != \"/\": path = path[1:]\n    else:\n        full_path = \"/\" + path\n        if not path: path = \"/\"\n\n    if review_id is None:\n        review = None\n        repository_arg = req.getParameter(\"repository\", \"\")\n        if repository_arg:\n            repository = gitutils.Repository.fromParameter(db, repository_arg)\n        else:\n            repository = gitutils.Repository.fromSHA1(db, sha1)\n    else:\n        review = dbutils.Review.fromId(db, review_id)\n        repository = review.repository\n\n    document = htmlutils.Document(req)\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    extra_links = []\n\n    if review:\n        extra_links.append((\"r/%d\" % review.id, \"Back to Review\"))\n\n    page.utils.generateHeader(body, db, user, extra_links=extra_links)\n\n    document.addExternalStylesheet(\"resource/showtree.css\")\n\n    if review:\n  
      document.addInternalScript(review.getJS())\n\n    target = body.div(\"main\")\n\n    table = target.table(\"tree paleyellow\", align=\"center\", cellspacing=0)\n    table.col(width=\"10%\")\n    table.col(width=\"60%\")\n    table.col(width=\"20%\")\n\n    thead = table.thead()\n    h1 = thead.tr().td('h1', colspan=3).h1()\n\n    def make_url(url_path, path):\n        params = { \"sha1\": sha1,\n                   \"path\": path }\n        if review is None:\n            params[\"repository\"] = str(repository.id)\n        else:\n            params[\"review\"] = str(review.id)\n        return \"%s?%s\" % (url_path, urllib.urlencode(params))\n\n    if path == \"/\":\n        h1.text(\"/\")\n    else:\n        h1.a(\"root\", href=make_url(\"showtree\", \"/\")).text(\"root\")\n        h1.span().text('/')\n\n        components = path.split(\"/\")\n        for index, component in enumerate(components[:-1]):\n            h1.a(href=make_url(\"showtree\", \"/\".join(components[:index + 1]))).text(component, escape=True)\n            h1.span().text('/')\n\n        h1.text(components[-1], escape=True)\n\n    row = thead.tr()\n    row.td('mode').text(\"Mode\")\n    row.td('name').text(\"Name\")\n    row.td('size').text(\"Size\")\n\n    tree = gitutils.Tree.fromPath(gitutils.Commit.fromSHA1(db, repository, sha1), full_path)\n\n    if tree is None:\n        raise page.utils.DisplayMessage(\n            title=\"Directory does not exist\",\n            body=(\"<p>There is no directory named <code>%s</code> in the commit \"\n                  \"<a href='/showcommit?repository=%s&amp;sha1=%s'>\"\n                  \"<code>%s</code></a>.</p>\"\n                  % (htmlutils.htmlify(textutils.escape(full_path)),\n                     htmlutils.htmlify(repository.name),\n                     htmlutils.htmlify(sha1), htmlutils.htmlify(sha1[:8]))),\n            html=True)\n\n    def compareEntries(a, b):\n        if a.type != b.type:\n            if a.type == \"tree\": return 
-1\n            else: return 1\n        else:\n            return cmp(a.name, b.name)\n\n    tbody = table.tbody()\n\n    for entry in sorted(tree, cmp=compareEntries):\n        if entry.type in (\"blob\", \"tree\"):\n            if entry.type == \"blob\":\n                url_path = \"showfile\"\n            else:\n                url_path = \"showtree\"\n\n            url = make_url(url_path, os.path.join(path, entry.name))\n        else:\n            url = None\n\n        row = tbody.tr(entry.type)\n        row.td('mode').text(str(entry.mode))\n\n        if stat.S_ISLNK(entry.mode):\n            cell = row.td('link', colspan=2)\n            cell.span('name').text(entry.name, escape=True)\n            cell.text(' -> ')\n            cell.span('target').text(repository.fetch(entry.sha1).data)\n        elif entry.type == \"commit\":\n            row.td('name').text(\"%s (%s)\" % (entry.name, entry.sha1), escape=True)\n            row.td('size').text(entry.size)\n        else:\n            row.td('name').a(href=url).text(entry.name, escape=True)\n            row.td('size').text(entry.size)\n\n    return document\n"
  },
  {
    "path": "src/page/statistics.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport htmlutils\nimport dbutils\n\ndef renderStatistics(req, db, user):\n    document = htmlutils.Document(req)\n    document.setTitle(\"Statistics\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    def flush(stop):\n        return document.render(stop=stop)\n\n    page.utils.generateHeader(body, db, user, current_page=\"statistics\")\n\n    document.addExternalStylesheet(\"resource/statistics.css\")\n\n    table = body.div(\"main\").table(\"paleyellow\", align=\"center\", cellspacing=0)\n\n    def commas(number):\n        as_string = str(number)\n        if number >= 1000000000:\n            as_string = as_string[:-9] + \",\" + as_string[-9:]\n        if number >= 1000000:\n            as_string = as_string[:-6] + \",\" + as_string[-6:]\n        if number >= 1000:\n            as_string = as_string[:-3] + \",\" + as_string[-3:]\n        return as_string\n\n    table.tr(\"h1\").td(\"h1\", colspan=4).h1().text(\"Most Lines Reviewed\")\n    table.tr(\"space\").td(colspan=4)\n\n    cursor = db.cursor()\n\n    cursor.execute(\"CREATE TEMPORARY TABLE reviewers (uid INTEGER, lines INTEGER)\")\n    cursor.execute(\"INSERT INTO reviewers (uid, lines) SELECT reviewer, SUM(deleted) + SUM(inserted) FROM reviewfiles WHERE state='reviewed' GROUP BY 
reviewer\")\n\n    cursor.execute(\"SELECT uid, lines FROM reviewers ORDER BY lines DESC LIMIT 10\")\n\n    self_included = False\n    for user_id, lines in cursor:\n        if user_id == user.id:\n            row = table.tr(\"line self\")\n            self_included = True\n        else:\n            row = table.tr(\"line\")\n\n        row.td(\"left\")\n        row.td(\"user\").text(dbutils.User.fromId(db, user_id).fullname)\n        row.td(\"value\").text(\"%s lines\" % commas(lines))\n        row.td(\"right\")\n\n    if not self_included:\n        cursor.execute(\"SELECT lines FROM reviewers WHERE uid=%s\", (user.id,))\n\n        data = cursor.fetchone()\n        if data and data[0]:\n            lines = data[0]\n\n            cursor.execute(\"SELECT COUNT(*) + 1 FROM reviewers WHERE lines > %s\", (lines,))\n\n            table.tr(\"space\").td(colspan=4)\n\n            row = table.tr(\"line self extra\")\n            row.td(\"left\")\n            row.td(\"user\").text(user.fullname)\n            row.td(\"value\").innerHTML(\"%s lines\" % commas(lines))\n            row.td(\"right\").text(\"(your position: %d)\" % cursor.fetchone()[0])\n\n    table.tr(\"space\").td(colspan=4)\n    table.tr(\"space\").td(colspan=4)\n\n    table.tr(\"h1\").td(\"h1\", colspan=4).h1().text(\"Most Lines in Owned Reviews\")\n    table.tr(\"space\").td(colspan=4)\n\n    cursor.execute(\"CREATE TEMPORARY TABLE owners (uid INTEGER, lines INTEGER)\")\n    cursor.execute(\"INSERT INTO owners (uid, lines) SELECT uid, SUM(deleted) + SUM(inserted) FROM reviewfiles JOIN reviewusers USING (review) JOIN reviews ON (reviewfiles.review=reviews.id) WHERE reviews.state IN ('open', 'closed') AND reviewusers.owner GROUP BY uid\")\n\n    cursor.execute(\"SELECT uid, lines FROM owners ORDER BY lines DESC LIMIT 10\")\n\n    self_included = False\n    for user_id, lines in cursor:\n        if user_id == user.id:\n            row = table.tr(\"line self\")\n            self_included = True\n        else:\n   
         row = table.tr(\"line\")\n\n        row.td(\"left\")\n        row.td(\"user\").text(dbutils.User.fromId(db, user_id).fullname)\n        row.td(\"value\").innerHTML(\"%s lines\" % commas(lines))\n        row.td(\"right\")\n\n    if not self_included:\n        cursor.execute(\"SELECT lines FROM owners WHERE uid=%s\", (user.id,))\n\n        data = cursor.fetchone()\n        if data and data[0]:\n            lines = data[0]\n\n            cursor.execute(\"SELECT COUNT(*) + 1 FROM owners WHERE lines > %s\", (lines,))\n\n            table.tr(\"space\").td(colspan=4)\n\n            row = table.tr(\"line self extra\")\n            row.td(\"left\")\n            row.td(\"user\").text(user.fullname)\n            row.td(\"value\").innerHTML(\"%s lines\" % commas(lines))\n            row.td(\"right\").text(\"(your position: %d)\" % cursor.fetchone()[0])\n\n    table.tr(\"space\").td(colspan=4)\n    table.tr(\"space\").td(colspan=4)\n\n    table.tr(\"h1\").td(\"h1\", colspan=4).h1().text(\"Most Issues Raised\")\n    table.tr(\"space\").td(colspan=4)\n\n    cursor.execute(\"\"\"SELECT uid, COUNT(type) AS issues\n                        FROM commentchains\n                       WHERE state IN ('open', 'addressed', 'closed')\n                         AND type='issue'\n                    GROUP BY uid\n                    ORDER BY issues DESC\n                       LIMIT 10\"\"\")\n\n    def calculateRatio(user_id, issues):\n        cursor.execute(\"\"\"SELECT lines FROM reviewers WHERE uid=%s\"\"\", (user_id,))\n\n        row = cursor.fetchone()\n        lines = row[0] if row else 0\n\n        return float(issues * 1000) / float(lines) if lines else 0\n\n    self_included = False\n    for user_id, issues in cursor.fetchall():\n        if user_id == user.id:\n            row = table.tr(\"line self\")\n            self_included = True\n        else:\n            row = table.tr(\"line\")\n\n        row.td(\"left\")\n        row.td(\"user\").text(dbutils.User.fromId(db, 
user_id).fullname)\n        row.td(\"value\").text(\"%s issues\" % commas(issues))\n\n        ratio = \"%.2f\" % calculateRatio(user_id, issues)\n\n        if ratio != \"0.00\": row.td(\"right\").text(\"(%s issues/kloc)\" % ratio)\n        else: row.td(\"right\")\n\n    if not self_included:\n        cursor.execute(\"\"\"SELECT COUNT(type)\n                            FROM commentchains\n                           WHERE state IN ('open', 'addressed', 'closed')\n                             AND type='issue'\n                             AND uid=%s\"\"\",\n                       (user.id,))\n\n        data = cursor.fetchone()\n        if data and data[0]:\n            issues = data[0]\n\n            cursor.execute(\"\"\"SELECT count(*) + 1\n                                FROM (SELECT uid, COUNT(type) AS issues\n                                        FROM commentchains\n                                       WHERE state IN ('open', 'addressed', 'closed')\n                                         AND type='issue'\n                                    GROUP BY uid\n                                    ORDER BY issues DESC) AS stats\n                               WHERE stats.issues > %s\"\"\",\n                           (issues,))\n\n            table.tr(\"space\").td(colspan=4)\n\n            row = table.tr(\"line self extra\")\n            row.td(\"left\")\n            row.td(\"user\").text(user.fullname)\n            row.td(\"value\").innerHTML(\"%s issues\" % commas(issues))\n\n            right = row.td(\"right\")\n            right.text(\"(your position: %d)\" % cursor.fetchone()[0])\n\n            ratio = \"%.2f\" % calculateRatio(user.id, issues)\n            if ratio != \"0.00\": right.text(\" (%s issues/kloc)\" % ratio)\n\n    table.tr(\"space\").td(colspan=4)\n    table.tr(\"space\").td(colspan=4)\n\n    table.tr(\"h1\").td(\"h1\", colspan=4).h1().text(\"Most Comments (and Replies) Written\")\n    table.tr(\"space\").td(colspan=4)\n\n    
cursor.execute(\"\"\"SELECT uid, COUNT(state) AS comments\n                        FROM comments\n                       WHERE state='current'\n                    GROUP BY uid\n                    ORDER BY comments DESC\n                       LIMIT 10\"\"\")\n\n    self_included = False\n    for user_id, comments in cursor:\n        if user_id == user.id:\n            row = table.tr(\"line self\")\n            self_included = True\n        else:\n            row = table.tr(\"line\")\n\n        row.td(\"left\")\n        row.td(\"user\").text(dbutils.User.fromId(db, user_id).fullname)\n        row.td(\"value\").innerHTML(\"%s comments\" % commas(comments))\n        row.td(\"right\")\n\n    if not self_included:\n        cursor.execute(\"\"\"SELECT COUNT(state)\n                            FROM comments\n                           WHERE state='current'\n                             AND uid=%s\"\"\",\n                       (user.id,))\n\n        data = cursor.fetchone()\n        if data and data[0]:\n            comments = data[0]\n\n            cursor.execute(\"\"\"SELECT count(*) + 1\n                                FROM (SELECT uid, COUNT(state) AS comments\n                                        FROM comments\n                                       WHERE state='current'\n                                    GROUP BY uid\n                                    ORDER BY comments DESC) AS stats\n                               WHERE stats.comments > %s\"\"\",\n                           (comments,))\n\n            table.tr(\"space\").td(colspan=4)\n\n            row = table.tr(\"line self extra\")\n            row.td(\"left\")\n            row.td(\"user\").text(user.fullname)\n            row.td(\"value\").innerHTML(\"%s comments\" % commas(comments))\n            row.td(\"right\").text(\"(your position: %d)\" % cursor.fetchone()[0])\n\n    table.tr(\"space\").td(colspan=4)\n    table.tr(\"space\").td(colspan=4)\n\n    table.tr(\"h1\").td(\"h1\", colspan=4).h1().text(\"Most 
Characters Written\")\n    table.tr(\"space\").td(colspan=4)\n\n    cursor.execute(\"\"\"SELECT uid, SUM(CHARACTER_LENGTH(comment)) AS characters\n                        FROM comments\n                       WHERE state='current'\n                    GROUP BY uid\n                    ORDER BY characters DESC\n                       LIMIT 10\"\"\")\n\n    self_included = False\n    for user_id, characters in cursor:\n        if user_id == user.id:\n            row = table.tr(\"line self\")\n            self_included = True\n        else:\n            row = table.tr(\"line\")\n\n        row.td(\"left\")\n        row.td(\"user\").text(dbutils.User.fromId(db, user_id).fullname)\n        row.td(\"value\").innerHTML(\"%s characters\" % commas(characters))\n        row.td(\"right\")\n\n    if not self_included:\n        cursor.execute(\"\"\"SELECT SUM(CHARACTER_LENGTH(comment))\n                            FROM comments\n                           WHERE state='current'\n                             AND uid=%s\"\"\",\n                       (user.id,))\n\n        data = cursor.fetchone()\n        if data and data[0]:\n            characters = data[0]\n\n            cursor.execute(\"\"\"SELECT count(*) + 1\n                                FROM (SELECT uid, SUM(CHARACTER_LENGTH(comment)) AS characters\n                                        FROM comments\n                                       WHERE state='current'\n                                    GROUP BY uid\n                                    ORDER BY characters DESC) AS stats\n                               WHERE stats.characters > %s\"\"\",\n                           (characters,))\n\n            table.tr(\"space\").td(colspan=4)\n\n            row = table.tr(\"line self extra\")\n            row.td(\"left\")\n            row.td(\"user\").text(user.fullname)\n            row.td(\"value\").innerHTML(\"%s characters\" % commas(characters))\n            row.td(\"right\").text(\"(your position: %d)\" % 
cursor.fetchone()[0])\n\n    db.rollback()\n\n    return document\n"
  },
  {
    "path": "src/page/tutorial.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\nimport dbutils\nimport htmlutils\nimport configuration\nimport textformatting\n\ndef renderFromFile(db, user, target, name):\n    lines = open(\"%s/tutorials/%s.txt\" % (configuration.paths.INSTALL_DIR, name)).read().splitlines()\n\n    table = target.table(\"paleyellow\", align=\"center\")\n    textformatting.renderFormatted(db, user, table, lines, toc=True)\n    table.tr(\"back\").td(\"back\").div().a(href=\"tutorial\").text(\"Back\")\n\ndef renderSections(db, user, target):\n    table = target.table(\"paleyellow\", align=\"center\")\n    table.tr(\"h1\").td(\"h1\").h1().text(\"Tutorials\")\n\n    def section(name, title, description):\n        table.tr(\"h2\").td(\"h2\").div().h2().text(title)\n        table.tr(\"text\").td(\"text\").div().text(description, cdata=True)\n        table.tr(\"goto\").td(\"goto\").div().a(href=\"tutorial?item=%s\" % name).text(\"Learn More\")\n\n    section(\"request\", \"Requesting a Review\", \"\"\"\\\nIntroduction to the different ways of requesting a review of changes in\nCritic.  You'll be able to request a review of your bug fix in 10 seconds,\nusing your favorite git client!  
(Though it'll take you more than 10\nseconds to read all the text&#8230;)\"\"\")\n\n    section(\"review\", \"Reviewing Changes\", \"\"\"\\\nIntroduction to the process of reviewing changes in Critic.  Covers the\nbasic concepts, marking changes as reviewed and raising issues, and some\nother things.  Useful information both for reviewers and for those\nrequesting the reviews.\"\"\")\n\n    section(\"filters\", \"Filters\", \"\"\"\\\nInformation about the Filters mechanism.\"\"\")\n\n    section(\"archival\", \"Review Branch Archival\", \"\"\"\\\nInformation about the automatic review branch archival mechanism, which\ndeletes review branches some time after the review is finished.\"\"\")\n\n    section(\"viewer\", \"Repository Viewer\", \"\"\"\\\nSome information about Critic's repository viewer and its peculiarities\ncompared to \\\"normal\\\" git repository viewers such as gitk and cgit.\"\"\")\n\n    section(\"reconfigure\", \"Reconfiguring Critic\", \"\"\"\\\nInformation about the various per-user configuration options that Critic\nsupports.\"\"\")\n\n    section(\"rebase\", \"Rebasing Reviews\", \"\"\"\\\nDetails on what kind of rebase operations are supported on review\nbranches, how to convince Critic to accept non-fast-forward updates, and\nsome things you really should make sure not to do.\"\"\")\n\n    section(\"search\", \"Review Quick Search\", \"\"\"\\\nInformation about the review search facility and the search query syntax.\"\"\")\n\n    if configuration.extensions.ENABLED:\n        section(\"extensions\", \"Critic Extensions\", \"\"\"\\\nDescription of the Critic Extensions mechanism.\"\"\")\n\n        section(\"extensions-api\", \"Critic Extensions API\", \"\"\"\\\nDescription of the script API available to Critic Extensions.\"\"\")\n\n    if user.hasRole(db, \"administrator\"):\n        section(\"administration\", \"System Administration\", \"\"\"\\\nInformation about various Critic system administration tasks.\"\"\")\n\n        
section(\"customization\", \"System Customization\", \"\"\"\\\nInformation about Critic system customization hooks.\"\"\")\n\ndef renderTutorial(req, db, user):\n    item = req.getParameter(\"item\", None)\n\n    document = htmlutils.Document(req)\n    document.setBase(None)\n    document.setTitle(\"Tutorials\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    page.utils.generateHeader(body, db, user, current_page=None if item else \"tutorial\")\n\n    document.addExternalStylesheet(\"resource/tutorial.css\")\n    document.addExternalScript(\"resource/tutorial.js\")\n    document.addInternalStylesheet(\"div.main table td.text { %s }\" % user.getPreference(db, \"style.tutorialFont\"))\n\n    target = body.div(\"main\")\n\n    items = { \"request\": \"requesting\",\n              \"review\": \"reviewing\",\n              \"filters\": \"filters\",\n              \"archival\": \"archival\",\n              \"viewer\": \"repository\",\n              \"rebase\": \"rebasing\",\n              \"reconfigure\": \"reconfiguring\",\n              \"checkbranch\": \"checkbranch\",\n              \"administration\": \"administration\",\n              \"customization\": \"customization\",\n              \"search\": \"search\",\n              \"external-authentication\": \"external-authentication\",\n              \"extensions\": \"extensions\",\n              \"extensions-api\": \"extensions-api\" }\n\n    if item in items:\n        renderFromFile(db, user, target, items[item])\n    else:\n        renderSections(db, user, target)\n\n    return document\n"
  },
  {
    "path": "src/page/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport auth\nimport htmlutils\nimport configuration\n\nfrom request import (NoDefault, MovedTemporarily, DisplayMessage,\n                     InvalidParameterValue, decodeURIComponent, Request,\n                     NeedLogin, NotModified)\n\nfrom textutils import json_encode, json_decode\n\nLINK_RELS = { \"Home\": \"home\",\n              \"Dashboard\": \"contents\",\n              \"Branches\": \"index\",\n              \"Tutorial\": \"help\",\n              \"Back to Review\": \"up\" }\n\ndef YesOrNo(value):\n    if value == \"yes\": return True\n    elif value == \"no\": return False\n    else: raise DisplayMessage(\"invalid parameter value; expected 'yes' or 'no'\")\n\ndef generateEmpty(target):\n    pass\n\ndef generateHeader(target, db, user, generate_right=None, current_page=None, extra_links=[], profiler=None):\n    target.addExternalStylesheet(\"resource/third-party/jquery-ui.css\")\n    target.addExternalStylesheet(\"resource/third-party/chosen.css\")\n    target.addExternalStylesheet(\"resource/overrides.css\")\n    target.addExternalStylesheet(\"resource/basic.css\")\n    target.addInternalStylesheet(\".defaultfont, body { %s }\" % user.getPreference(db, \"style.defaultFont\"))\n    target.addInternalStylesheet(\".sourcefont { %s }\" % user.getPreference(db, 
\"style.sourceFont\"))\n    target.addExternalScript(\"resource/third-party/jquery.js\")\n    target.addExternalScript(\"resource/third-party/jquery-ui.js\")\n    target.addExternalScript(\"resource/third-party/jquery-ui-autocomplete-html.js\")\n    target.addExternalScript(\"resource/third-party/chosen.js\")\n    target.addExternalScript(\"resource/basic.js\")\n\n    target.noscript().h1(\"noscript\").blink().text(\"Please enable scripting support!\")\n\n    row = target.table(\"pageheader\", width='100%').tr()\n    left = row.td(\"left\", valign='bottom', align='left')\n    b = left.b()\n\n    opera_class = \"opera\"\n\n    if configuration.debug.IS_DEVELOPMENT:\n        opera_class += \" development\"\n\n    b.b(opera_class, onclick=\"location.href='/';\").text(\"Opera\")\n    b.b(\"critic\", onclick=\"location.href='/';\").text(\"Critic\")\n\n    links = []\n\n    if not user.isAnonymous():\n        links.append([\"home\", \"Home\", None, None])\n\n    links.append([\"dashboard\", \"Dashboard\", None, None])\n    links.append([\"branches\", \"Branches\", None, None])\n    links.append([\"search\", \"Search\", None, None])\n\n    if user.hasRole(db, \"administrator\"):\n        links.append([\"services\", \"Services\", None, None])\n    if user.hasRole(db, \"repositories\"):\n        links.append([\"repositories\", \"Repositories\", None, None])\n\n    if profiler:\n        profiler.check(\"generateHeader (basic)\")\n\n    if configuration.extensions.ENABLED:\n        from extensions.extension import Extension\n\n        updated = Extension.getUpdatedExtensions(db, user)\n\n        if updated:\n            link_title = \"\\n\".join([(\"%s by %s can be updated!\" % (extension_name, author_fullname)) for author_fullname, extension_name in updated])\n            links.append([\"manageextensions\", \"Extensions (%d)\" % len(updated), \"color: red\", link_title])\n        else:\n            links.append([\"manageextensions\", \"Extensions\", None, None])\n\n        
if profiler:\n            profiler.check(\"generateHeader (updated extensions)\")\n\n    links.append([\"config\", \"Config\", None, None])\n    links.append([\"tutorial\", \"Tutorial\", None, None])\n\n    if user.isAnonymous():\n        count = 0\n    else:\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT COUNT(*)\n                            FROM newsitems\n                 LEFT OUTER JOIN newsread ON (item=id AND uid=%s)\n                           WHERE uid IS NULL\"\"\",\n                       (user.id,))\n        count = cursor.fetchone()[0]\n\n    if count:\n        links.append([\"news\", \"News (%d)\" % count, \"color: red\", \"There are %d unread news items!\" % count])\n    else:\n        links.append([\"news\", \"News\", None, None])\n\n    if profiler:\n        profiler.check(\"generateHeader (news)\")\n\n    req = target.getRequest()\n\n    if configuration.base.AUTHENTICATION_MODE != \"host\" \\\n           and configuration.base.SESSION_TYPE == \"cookie\":\n        if user.isAnonymous():\n            links.append([\"javascript:void(location.href='/login?target='+encodeURIComponent(location.href));\", \"Sign in\", None, None])\n        elif not req or (req.user == user.name and req.session_type == \"cookie\"):\n            links.append([\"javascript:signOut();\", \"Sign out\", None, None])\n\n    for url, label in extra_links:\n        links.append([url, label, None, None])\n\n    if req and configuration.extensions.ENABLED:\n        import extensions.role.inject\n\n        injected = {}\n\n        extensions.role.inject.execute(db, req, user, target, links, injected, profiler=profiler)\n\n        for url in injected.get(\"stylesheets\", []):\n            target.addExternalStylesheet(url, use_static=False, order=1)\n\n        for url in injected.get(\"scripts\", []):\n            target.addExternalScript(url, use_static=False, order=1)\n    else:\n        injected = None\n\n    ul = left.ul()\n\n    for index, (url, label, style, 
title) in enumerate(links):\n        if not re.match(\"[-.a-z]+:|/\", url):\n            url = \"/\" + url\n        ul.li().a(href=url, style=style, title=title).text(label)\n\n        rel = LINK_RELS.get(label)\n        if rel: target.setLink(rel, url)\n\n    right = row.td(\"right\", valign='bottom', align='right')\n    if generate_right:\n        generate_right(right)\n    else:\n        right.div(\"buttons\").span(\"buttonscope buttonscope-global\")\n\n    if profiler:\n        profiler.check(\"generateHeader (finish)\")\n\n    return injected\n\ndef renderShortcuts(target, page, **kwargs):\n    shortcuts = []\n\n    def addShortcut(keyCode, keyName, description):\n        shortcuts.append((keyCode, keyName, description))\n\n    if kwargs.get(\"review\"):\n        addShortcut(ord(\"u\"), \"u\", \"back to review\")\n\n    if page == \"showcommit\":\n        what = \"files\"\n\n        merge_parents = kwargs.get(\"merge_parents\")\n        if merge_parents > 1:\n            for index in range(min(9, merge_parents)):\n                order = (\"first\", \"second\", \"third\", \"fourth\", \"fifth\", \"sixth\", \"seventh\", \"eighth\", \"ninth\")[index]\n                addShortcut(ord('1') + index, \"%d\" % (index + 1), \"changes relative to %s parent\" % order)\n    elif page == \"showcomments\":\n        what = \"comments\"\n\n    if page == \"showcommit\" or page == \"showcomments\":\n        addShortcut(ord(\"e\"), \"e\", \"expand all %s\" % what)\n        addShortcut(ord(\"c\"), \"c\", \"collapse all %s\" % what)\n        addShortcut(ord(\"s\"), \"s\", \"show all %s\" % what)\n        addShortcut(ord(\"h\"), \"h\", \"hide all %s\" % what)\n\n        if page == \"showcommit\":\n            addShortcut(ord(\"m\"), \"m\", \"detect moved code\")\n\n            if kwargs.get(\"squashed_diff\"):\n                addShortcut(ord(\"b\"), \"b\", \"blame\")\n\n            addShortcut(32, \"SPACE\", \"scroll or show/expand next file\")\n\n    if page == \"showcomment\":\n        
addShortcut(ord(\"m\"), \"m\", \"show more context\")\n        addShortcut(ord(\"l\"), \"l\", \"show less context\")\n\n    if page == \"filterchanges\":\n        addShortcut(ord(\"a\"), \"a\", \"select everything\")\n        addShortcut(ord(\"g\"), \"g\", \"go / display diff\")\n\n    container = target.div(\"pagefooter shortcuts\")\n\n    if shortcuts:\n        container.text(\"Shortcuts: \")\n\n        def renderShortcut(keyCode, ch, text, is_last=False):\n            a = container.a(\"shortcut\", href=\"javascript:void(handleKeyboardShortcut(%d));\" % keyCode)\n            a.b().text(\"(%s)\" % ch)\n            a.text(\" %s\" % text)\n            if not is_last:\n                container.text(\", \")\n\n        for index, (keyCode, keyName, description) in enumerate(shortcuts):\n            renderShortcut(keyCode, keyName, description, index == len(shortcuts) - 1)\n\ndef generateFooter(target, db, user, current_page=None):\n    renderShortcuts(target, current_page)\n\ndef displayMessage(db, req, user, title, review=None, message=None, page_title=None, is_html=False):\n    document = htmlutils.Document(req)\n\n    if page_title:\n        document.setTitle(page_title)\n\n    document.addExternalStylesheet(\"resource/message.css\")\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    if review:\n        import reviewing.utils as review_utils\n\n        def generateRight(target):\n            review_utils.renderDraftItems(db, user, review, target)\n\n        back_to_review = (\"r/%d\" % review.id, \"Back to Review\")\n\n        document.addInternalScript(review.getJS())\n\n        generateHeader(body, db, user, generate_right=generateRight, extra_links=[back_to_review])\n    else:\n        generateHeader(body, db, user)\n\n    target = body.div(\"message paleyellow\")\n\n    if message:\n        target.h1(\"title\").text(title)\n\n        if callable(message): message(target)\n        elif is_html: target.innerHTML(message)\n        
else: target.p().text(message)\n    else:\n        target.h1(\"center\").text(title)\n\n    return document\n\nclass PaleYellowTable:\n    def __init__(self, target, title, columns=[10, 60, 30]):\n        if not target.hasTitle():\n            target.setTitle(title)\n\n        self.table = target.div(\"main\").table(\"paleyellow\", align=\"center\").tbody()\n        self.columns = columns\n\n        colgroup = self.table.colgroup()\n        for column in columns: colgroup.col(width=\"%d%%\" % column)\n\n        h1 = self.table.tr().td(\"h1\", colspan=len(columns)).h1()\n        h1.text(title)\n        self.titleRight = h1.span(\"right\")\n\n        self.table.tr(\"spacer\").td(colspan=len(self.columns))\n\n    def addSection(self, title, extra=None):\n        h2 = self.table.tr().td(\"h2\", colspan=len(self.columns)).h2()\n        h2.text(title)\n        if extra is not None:\n            h2.span().text(extra)\n\n    def addItem(self, heading, value, description=None, buttons=None):\n        row = self.table.tr(\"item\")\n        row.td(\"name\").innerHTML(htmlutils.htmlify(heading).replace(\" \", \"&nbsp;\") + \":\")\n        cell = row.td(\"value\", colspan=2).preformatted()\n        if callable(value): value(cell)\n        else: cell.text(str(value))\n        if buttons:\n            div = cell.div(\"buttons\")\n            for label, onclick in buttons:\n                div.button(onclick=onclick).text(label)\n        if description is not None:\n            self.table.tr(\"help\").td(colspan=len(self.columns)).text(description)\n\n    def addCentered(self, content=None):\n        row = self.table.tr(\"centered\")\n        cell = row.td(colspan=len(self.columns))\n        if callable(content): content(cell)\n        elif content: cell.text(str(content))\n        return cell\n\n    def addSeparator(self):\n        self.table.tr(\"separator\").td(colspan=len(self.columns)).div()\n\ndef generateRepositorySelect(db, user, target, allow_selecting_none=False,\n       
                      placeholder_text=None, selected=None,\n                             access_type=\"read\", **attributes):\n    select = target.select(\"repository-select\", **attributes)\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT id, name, path\n                        FROM repositories\n                    ORDER BY name\"\"\")\n\n    rows = cursor.fetchall()\n\n    if not rows:\n        # Note: not honoring 'placeholder_text' here; callers typically don't\n        # take into account the possibility that there are no repositories.\n        select.setAttribute(\"data-placeholder\", \"No repositories\")\n        select.option(value=\"\", selected=\"selected\")\n        return\n\n    if selected is None:\n        default_repository = user.getDefaultRepository(db)\n        if default_repository:\n            selected = default_repository.name\n\n    if not selected or allow_selecting_none:\n        if placeholder_text is None:\n            placeholder_text = \"Select a repository\"\n        select.setAttribute(\"data-placeholder\", placeholder_text)\n        select.option(value=\"\", selected=\"selected\")\n\n    highlighted_ids = set()\n\n    cursor.execute(\"\"\"SELECT DISTINCT repository\n                        FROM filters\n                       WHERE uid=%s\"\"\",\n                   (user.id,))\n    highlighted_ids.update(repository_id for (repository_id,) in cursor)\n\n    cursor.execute(\"\"\"SELECT DISTINCT repository\n                        FROM branches\n                        JOIN reviews ON (reviews.branch=branches.id)\n                        JOIN reviewusers ON (reviewusers.review=reviews.id)\n                       WHERE reviewusers.uid=%s\n                         AND reviewusers.owner\"\"\",\n                   (user.id,))\n    highlighted_ids.update(repository_id for (repository_id,) in cursor)\n\n    if not highlighted_ids or len(highlighted_ids) == len(rows):\n        # Do not group options when there will be only one 
group.\n        highlighted = select\n        other = select\n    else:\n        highlighted = select.optgroup(label=\"Highlighted\")\n        other = select.optgroup(label=\"Other\")\n\n    html_format = (\"<span class=repository-name>%s</span>\"\n                   \"<span class=repository-path>%s</span>\")\n\n    for repository_id, name, path in rows:\n        try:\n            repository = auth.AccessControl.Repository(repository_id, path)\n            auth.AccessControl.accessRepository(db, access_type, repository)\n        except auth.AccessDenied:\n            # Skip repositories the user doesn't have access to.\n            continue\n\n        if repository_id in highlighted_ids:\n            optgroup = highlighted\n        else:\n            optgroup = other\n\n        if repository_id == selected or name == selected:\n            is_selected = \"selected\"\n        else:\n            is_selected = None\n\n        html = html_format % (name, path)\n\n        option = optgroup.option(\"repository flex\",\n                                 value=name, selected=is_selected,\n                                 data_text=name, data_html=html)\n        option.text(name)\n\ndef displayFormattedText(db, req, user, source):\n    document = htmlutils.Document(req)\n    document.setBase(None)\n    document.addExternalStylesheet(\"resource/tutorial.css\")\n    document.addInternalStylesheet(\"div.main table td.text { %s }\"\n                                   % user.getPreference(db, \"style.tutorialFont\"))\n\n    html = document.html()\n    head = html.head()\n    body = html.body()\n\n    generateHeader(body, db, user)\n\n    if isinstance(source, basestring):\n        lines = source.splitlines()\n    else:\n        lines = source\n\n    import textformatting\n\n    textformatting.renderFormatted(\n        db, user, body.div(\"main\").table(\"paleyellow\"), lines, toc=True)\n\n    generateFooter(body, db, user)\n\n    return document\n\nclass 
DisplayFormattedText(Exception):\n    def __init__(self, source):\n        self.source = source\n\nclass ResponseBody(object):\n    def __init__(self, data, content_type=\"text/html\"):\n        self.data = data\n        self.content_type = content_type\n"
  },
  {
    "path": "src/page/verifyemail.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport page.utils\n\ndef renderVerifyEmail(req, db, user):\n    if user.isAnonymous():\n        raise page.utils.NeedLogin(req)\n    elif req.user != user.name:\n        raise page.utils.DisplayMessage(\"Invalid use!\")\n\n    email = req.getParameter(\"email\")\n    verification_token = req.getParameter(\"token\")\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT id\n                        FROM useremails\n                       WHERE uid=%s\n                         AND email=%s\n                         AND verification_token=%s\"\"\",\n                   (user.id, email, verification_token))\n\n    row = cursor.fetchone()\n\n    if not row:\n        raise page.utils.DisplayMessage(\"Invalid verification token!\")\n\n    email_id = row[0]\n\n    cursor.execute(\"\"\"UPDATE useremails\n                         SET verified=TRUE\n                       WHERE id=%s\"\"\",\n                   (email_id,))\n\n    db.commit()\n\n    raise page.utils.MovedTemporarily(\"/home?email_verified=%d\" % email_id)\n"
  },
  {
    "path": "src/profiling.py",
"content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\nimport re\n\nclass Profiler:\n    class Check:\n        def __init__(self, profiler, title):\n            self.__profiler = profiler\n            self.__title = title\n            self.__begin = time.time()\n\n        def stop(self):\n            self.__profiler.add(self.__title, self.__begin, time.time())\n\n    def __init__(self):\n        self.__previous = time.time()\n        self.__checks = []\n        self.__table = {}\n\n    def add(self, title, begin, end):\n        if title not in self.__table:\n            self.__checks.append(title)\n            self.__table[title] = 0\n\n        self.__table[title] += end - begin\n        self.__previous = end\n\n    def start(self, title):\n        return Profiler.Check(self, title)\n\n    def check(self, title):\n        self.add(title, self.__previous, time.time())\n\n    def output(self, db=None, user=None, target=None):\n        log = \"\"\n        total = 0.0\n\n        title_width = max(map(len, self.__checks))\n        format = \"  %%-%ds : %%8.2f\\n\" % title_width\n\n        for title, duration in sorted(self.__table.items(), cmp=lambda a, b: cmp(a[1], b[1]), reverse=True):\n            log += format % (title, duration * 1000)\n            total += duration\n\n        log += \"\\n\" + format % (\"TOTAL\", total 
* 1000)\n\n        if db and user and target and user.getPreference(db, 'debug.profiling.pageGeneration'):\n            target.comment(\"\\n\\n\" + log + \"\\n\")\n\n        return log\n\ndef formatDBProfiling(db):\n    lines = [\"         | TIME (milliseconds)    | ROWS                   |\",\n             \"   Count | Accumulated |  Maximum | Accumulated |  Maximum | Query\",\n             \"  -------|-------------|----------|-------------|----------|-------\"]\n    items = sorted(db.profiling.items(), key=lambda item: item[1][1], reverse=True)\n\n    total_count = 0\n    total_accumulated_ms = 0.0\n    total_accumulated_rows = 0\n\n    for item, (count, accumulated_ms, maximum_ms, accumulated_rows, maximum_rows) in items:\n        total_count += count\n        total_accumulated_ms += accumulated_ms\n\n        if accumulated_rows is None:\n            lines.append(\"  %6d | %11.4f | %8.4f |             |          | %s\" %\n                         (count, accumulated_ms, maximum_ms, re.sub(r\"\\s+\", \" \", item)))\n        else:\n            total_accumulated_rows += accumulated_rows\n\n            lines.append(\"  %6d | %11.4f | %8.4f | %11d | %8d | %s\" %\n                         (count, accumulated_ms, maximum_ms, accumulated_rows, maximum_rows, re.sub(r\"\\s+\", \" \", item)))\n\n\n    lines.insert(3, (\"  %6d | %11.4f |          | %11d |          | TOTAL\" %\n                     (total_count, total_accumulated_ms, total_accumulated_rows)))\n\n    return \"\\n\".join(lines)\n"
  },
  {
    "path": "src/request.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport urllib\nimport urlparse\nimport httplib\nimport wsgiref.util\n\nimport base\nimport auth\nimport configuration\nimport dbutils\n\n# Paths to which access should be allowed without authentication even if\n# anonymous users are not allowed in general.\nINSECURE_PATHS = set([\"login\", \"validatelogin\",\n                      \"createuser\", \"registeruser\"])\n\ndef decodeURIComponent(text):\n    \"\"\"\\\n    Replace %HH escape sequences and return the resulting string.\n    \"\"\"\n\n    return urllib.unquote_plus(text)\n\nclass NoDefault:\n    \"\"\"\\\n    Placeholder class to signal that a parameter has no default value.\n\n    An instance of this class is provided to Request.getParameter() as the\n    'default' argument to signal that it is an error if the parameter is\n    not present.\n    \"\"\"\n\n    pass\n\nclass HTTPResponse(Exception):\n    def __init__(self, status):\n        self.status = status\n        self.body = []\n        self.content_type = \"text/plain\"\n\n    def execute(self, db, req):\n        req.setStatus(self.status)\n        if self.body:\n            req.setContentType(self.content_type)\n        req.start()\n        return self.body\n\nclass NoContent(HTTPResponse):\n    def __init__(self):\n        super(NoContent, self).__init__(204)\n\nclass 
NotModified(HTTPResponse):\n    def __init__(self):\n        super(NotModified, self).__init__(304)\n\nclass Forbidden(HTTPResponse):\n    def __init__(self, message=\"Forbidden\"):\n        super(Forbidden, self).__init__(403)\n        self.body = [message]\n\nclass NotFound(HTTPResponse):\n    def __init__(self, message=\"Not found\"):\n        super(NotFound, self).__init__(404)\n        self.body = [message]\n\nclass Redirect(HTTPResponse):\n    def __init__(self, status, location, no_cache=False):\n        super(Redirect, self).__init__(status)\n        self.location = location\n        self.no_cache = no_cache\n\n    def execute(self, db, req):\n        from htmlutils import htmlify\n        if not req.allowRedirect(self.status):\n            self.status = 403\n            self.body = [\"Cowardly refusing to redirect %s request.\"\n                         % req.method]\n        else:\n            req.addResponseHeader(\"Location\", self.location)\n            self.body = [\"<p>Please go here: <a href=%s>%s</a>.\"\n                         % (htmlify(self.location, attributeValue=True),\n                            htmlify(self.location))]\n            self.content_type = \"text/html\"\n        return super(Redirect, self).execute(db, req)\n\nclass Found(Redirect):\n    def __init__(self, location):\n        super(Found, self).__init__(302, location)\n\nclass SeeOther(Redirect):\n    def __init__(self, location):\n        super(SeeOther, self).__init__(303, location)\n\nclass MovedTemporarily(Redirect):\n    def __init__(self, location, no_cache=False):\n        super(MovedTemporarily, self).__init__(307, location)\n        self.no_cache = no_cache\n\n    def execute(self, db, req):\n        if self.no_cache:\n            req.addResponseHeader(\"Cache-Control\", \"no-cache\")\n        return super(MovedTemporarily, self).execute(db, req)\n\nclass NeedLogin(MovedTemporarily):\n    def __init__(self, source, optional=False):\n        if isinstance(source, 
Request):\n            target = source.getTargetURL()\n        else:\n            target = str(source)\n        location = \"/login?target=%s\" % urllib.quote(target)\n        if optional:\n            location += \"&optional=yes\"\n        return super(NeedLogin, self).__init__(location, no_cache=True)\n\nclass RequestHTTPAuthentication(HTTPResponse):\n    def __init__(self):\n        super(RequestHTTPAuthentication, self).__init__(401)\n\n    def execute(self, db, req):\n        import page.utils\n\n        self.body = str(page.utils.displayMessage(\n            db, req, dbutils.User.makeAnonymous(),\n            title=\"Authentication required\",\n            message=(\"You must provide valid HTTP authentication to access \"\n                     \"this system.\")))\n        self.content_type = \"text/html\"\n\n        req.addResponseHeader(\"WWW-Authenticate\", \"Basic realm=\\\"Critic\\\"\")\n        return super(RequestHTTPAuthentication, self).execute(db, req)\n\nclass DisplayMessage(base.Error):\n    \"\"\"\\\n    Utility exception raised by pages to display a simple message.\n    \"\"\"\n\n    def __init__(self, title, body=None, review=None, html=False, status=200):\n        self.title = title\n        self.body = body\n        self.review = review\n        self.html = html\n        self.status = status\n\nclass InvalidParameterValue(DisplayMessage):\n    \"\"\"\\\n    Exception raised by pages when a query parameter has an invalid value.\n\n    This exception is automatically raised by Request.getParameter() if the\n    parameter's value can't be converted as requested.\n    \"\"\"\n\n    def __init__(self, name, value, expected):\n        DisplayMessage.__init__(self, \"Invalid URI Parameter Value!\", \"Got '%s=%s', expected %s.\" % (name, value, expected), status=400)\n\nclass MissingParameter(DisplayMessage):\n    \"\"\"\\\n    Exception raised by pages when a required query parameter is missing.\n\n    This exception is automatically raised by 
Request.getParameter() if the\n    parameter is required and missing.\n    \"\"\"\n\n    def __init__(self, name):\n        DisplayMessage.__init__(self, \"Missing URI Parameter!\", \"Expected '%s' parameter.\" % name, status=400)\n\nclass MissingWSGIRemoteUser(Exception):\n    \"\"\"\\\n    Exception raised if WSGI environ \"REMOTE_USER\" is missing.\n\n    This error happens when Critic is running in \"host\" authentication mode but no\n    REMOTE_USER variable was present in the WSGI environ dict provided by the\n    web server.\n    \"\"\"\n    pass\n\nclass Request:\n    \"\"\"\\\n    WSGI request wrapper class.\n\n    Pages and operations should typically only need to access request parameters\n    (via getParameter()) and headers (via getRequestHeader()), and set response\n    status (using setStatus()) if not \"200 OK\" and content-type (using\n    setContentType()) if not \"text/html\".  The start() method must be called\n    before any content is returned to the WSGI layer, but this is taken care of\n    by the main request handling function (critic.py::main).\n\n    In the case of POST requests, the request body is retrieved using the read()\n    method.\n\n    Properties:\n\n    user -- user name from HTTP authentication\n    method -- HTTP method (\"GET\" or \"POST\", typically)\n    path -- URI path component, without leading forward slash\n    original_path -- same as 'path', unless the path is a short-hand for another\n                     path, in which case 'path' is the resolved path\n    query -- URI query component\n    original_query -- same as 'query', unless the path is a short-hand for\n                      another path, in which case 'query' is typically extended\n                      with parameters derived from the short-hand path\n\n    Primary methods:\n\n    getParameter(name, default, filter) -- get URI query parameter\n    getRequestHeader(name) -- get HTTP request header\n    getRequestHeaders() -- get all HTTP request 
headers\n    read() -- read HTTP request body\n    setStatus(code, message) -- set HTTP response status\n    setContentType(content_type) -- set Content-Type response header\n    addResponseHeader(name, value) -- add HTTP response header\n\n    Methods used by framework code:\n\n    start() -- call the WSGI layers start_response() callback\n    isStarted() -- check if start() has been called\n    getContentType() -- get response content type\n    \"\"\"\n\n    def __init__(self, db, environ, start_response):\n        \"\"\"\\\n        Construct request wrapper.\n\n        The environ and start_response arguments should be the arguments to the\n        WSGI application object.\n        \"\"\"\n\n        self.__db = db\n        self.__environ = environ\n        self.__start_response = start_response\n        self.__status = None\n        self.__content_type = None\n        self.__response_headers = []\n        self.__started = False\n\n        content_length = environ.get(\"CONTENT_LENGTH\")\n        self.__request_body_length = int(content_length) if content_length else 0\n        self.__request_body_read = 0\n\n        self.server_name = \\\n            self.getRequestHeader(\"X-Forwarded-Host\") \\\n            or environ.get(\"SERVER_NAME\") \\\n            or configuration.base.HOSTNAME\n\n        self.method = environ.get(\"REQUEST_METHOD\", \"\")\n        self.path = environ.get(\"PATH_INFO\", \"\").lstrip(\"/\")\n        self.original_path = self.path\n        self.query = environ.get(\"QUERY_STRING\", \"\")\n        self.parsed_query = urlparse.parse_qs(self.query, keep_blank_values=True)\n        self.original_query = self.query\n        self.cookies = {}\n\n        header = self.getRequestHeader(\"Cookie\")\n        if header:\n            for cookie in map(str.strip, header.split(\";\")):\n                name, _, value = cookie.partition(\"=\")\n                if name and value:\n                    self.cookies[name] = value\n\n        
self.session_type = configuration.base.SESSION_TYPE\n\n    def updateQuery(self, items):\n        self.parsed_query.update(items)\n        self.query = urllib.urlencode(\n            sorted(self.parsed_query.items()), doseq=True)\n\n    @property\n    def user(self):\n        return self.__db.user\n\n    def getTargetURL(self):\n        target = \"/\" + self.path\n        if self.query:\n            target += \"?\" + self.query\n        return target\n\n    def getRequestURI(self):\n        return wsgiref.util.request_uri(self.__environ)\n\n    def getEnvironment(self):\n        return self.__environ\n\n    def getParameter(self, name, default=NoDefault, filter=lambda value: value):\n        \"\"\"\\\n        Get URI query parameter.\n\n        If the requested parameter was not present in the URI query component,\n        the supplied default value is returned instead, or, if the supplied\n        default value is the NoDefault class, a MissingParameter exception is\n        raised.\n\n        If a filter function is supplied, it is called with a single argument,\n        the string value of the URI parameter, and its return value is returned\n        from getParameter().  If the filter function raises an exception (other\n        than DisplayMessage or sub-classes thereof) an InvalidParameterValue\n        exception is raised.  
Note: the filter function is not applied to\n        default values, meaning that the default value can be of a different\n        type than actual parameter values.\n        \"\"\"\n\n        value = self.parsed_query.get(name)\n\n        if value is None:\n            if default is NoDefault:\n                raise MissingParameter(name)\n            return default\n\n        def filter_value(value):\n            try:\n                return filter(value)\n            except (base.Error, auth.AccessDenied):\n                raise\n            except Exception:\n                if filter is int:\n                    expected = \"integer\"\n                else:\n                    expected = \"something else\"\n                raise InvalidParameterValue(name, value, expected)\n\n        value = [filter_value(element) for element in value]\n\n        if len(value) == 1:\n            return value[0]\n\n        return value\n\n    def getParameters(self):\n        return { name: value[0] if len(value) == 1 else value\n                 for name, value in self.parsed_query.items() }\n\n    def getRequestHeader(self, name, default=None):\n        \"\"\"\\\n        Get HTTP request header by name.\n\n        The name is case-insensitive.  If the request header was not present in\n        the request, the default value is returned (or None if no default value\n        is provided.)  If the request header was present, its value is returned\n        as a string.\n        \"\"\"\n\n        return self.__environ.get(\"HTTP_\" + name.upper().replace(\"-\", \"_\"), default)\n\n    def getRequestHeaders(self):\n        \"\"\"\\\n        Get a dictionary containing all HTTP request headers.\n\n        The header names are converted to all lower-case, and any underscores\n        ('_') in the header name are replaced with a dash ('-').  
The reason for\n        this name transformation is that the header names are already\n        transformed in the WSGI layer from their original form to all\n        upper-case, with dashes replaced by underscores, so the original name is\n        not available.\n\n        The returned dictionary is a copy of the underlying storage, so the\n        caller can modify it without the modifications having any side-effects.\n        \"\"\"\n\n        headers = {}\n        for name, value in self.__environ.items():\n            if name.startswith(\"HTTP_\"):\n                headers[name[5:].lower().replace(\"_\", \"-\")] = value\n        return headers\n\n    def getReferrer(self):\n        try: return self.getRequestHeader(\"Referer\")\n        except: return \"N/A\"\n\n    def read(self, bufsize=None):\n        \"\"\"\\\n        Return the HTTP request body, or an empty string if there is none.\n        \"\"\"\n\n        if self.__request_body_length:\n            max_bufsize = self.__request_body_length - self.__request_body_read\n\n            if bufsize is None:\n                bufsize = max_bufsize\n            else:\n                bufsize = min(bufsize, max_bufsize)\n\n        if \"wsgi.input\" not in self.__environ or not bufsize:\n            return \"\"\n\n        data = self.__environ[\"wsgi.input\"].read(bufsize)\n        self.__request_body_read += len(data)\n        return data\n\n    def write(self, data):\n        \"\"\"\n        Write HTTP response body chunk.\n        \"\"\"\n\n        self.__write(data)\n\n    def setStatus(self, code, message=None):\n        \"\"\"\\\n        Set the HTTP status code, and optionally the status message.\n\n        If the message argument is None, a default status message for the\n        specified HTTP status code is used.  
If the specified status code is not\n        one included in httplib.responses, a KeyError exception is raised.\n\n        If this method is not called, the HTTP status will be \"200 OK\".\n\n        This method must be called before the response is started.  (This really\n        only matters for incremental pages that return the response body in\n        chunks; they can't call this method once they've yielded the first body\n        chunk.)\n        \"\"\"\n\n        assert not self.__started, \"Response already started!\"\n        if message is None: message = httplib.responses[code]\n        self.__status = \"%d %s\" % (code, message)\n\n    def hasContentType(self):\n        return self.__content_type is not None\n\n    def setContentType(self, content_type):\n        \"\"\"\\\n        Set the response content type (the \"Content-Type\" header).\n\n        If the specified content type doesn't have a \"charset=X\" addition, the\n        string \"; charset=utf-8\" is appended to the content type.\n\n        If this method is not called, the Content-Type header's value will be\n        \"text/html; charset=utf-8\".\n\n        This function must be used rather than addResponseHeader() to set the\n        Content-Type header, and must be called before the response is started.\n        \"\"\"\n\n        assert not self.__started, \"Response already started!\"\n        if content_type.startswith(\"text/\") and \"charset=\" not in content_type: content_type += \"; charset=utf-8\"\n        self.__content_type = content_type\n\n    def addResponseHeader(self, name, value):\n        \"\"\"\\\n        Add HTTP response header.\n\n        Append a response header to the list of response headers passed to the\n        WSGI start_response() callback when the response is started.\n\n        Note: This function does not replace existing headers or merge headers\n        with the same name; calling code has to handle such things.  
No headers\n        (except Content-Type) are added automatically.\n\n        This function must not be used to add a Content-Type header, and must be\n        called before the response is started.\n        \"\"\"\n\n        assert not self.__started, \"Response already started!\"\n        assert name.lower() != \"content-type\", \"Use Request.setContentType() instead!\"\n        self.__response_headers.append((name, value))\n\n    def setCookie(self, name, value, secure=False):\n        if secure and configuration.base.ACCESS_SCHEME != \"http\":\n            modifier = \"Secure\"\n        else:\n            modifier = \"HttpOnly\"\n        self.addResponseHeader(\n            \"Set-Cookie\",\n            \"%s=%s; Max-Age=31536000; Path=/; %s\" % (name, value, modifier))\n\n    def deleteCookie(self, name):\n        if self.cookies.has_key(name):\n            self.addResponseHeader(\n                \"Set-Cookie\",\n                \"%s=invalid; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT\" % name)\n\n    def start(self):\n        \"\"\"\\\n        Start the response by calling the WSGI start_response() callback.\n\n        This function is called automatically by the main request handling\n        function (critic.py::main) and should typically not be called from any\n        other code.\n\n        This function can be called multiple times; repeated calls do nothing.\n        \"\"\"\n\n        if not self.__started:\n            if self.__status is None:\n                self.setStatus(200)\n            if self.__content_type is None:\n                self.setContentType(\"text/plain\")\n\n            headers = [(\"Content-Type\", self.__content_type)]\n            headers.extend(self.__response_headers)\n\n            self.__write = self.__start_response(self.__status, headers)\n            self.__started = True\n\n    def isStarted(self):\n        \"\"\"\\\n        Check if the response has been started.\n        \"\"\"\n\n        return 
self.__started\n\n    def getContentType(self):\n        \"\"\"\\\n        Return the currently set response content type.\n\n        The returned value includes the automatically added \"charset=utf-8\".  If\n        the response hasn't been started yet, and setContentType() hasn't been\n        called, None is returned.\n        \"\"\"\n\n        return self.__content_type\n\n    def ensureSecure(self):\n        if configuration.base.ACCESS_SCHEME != \"http\":\n            current_url = self.getRequestURI()\n            secure_url = re.sub(\"^http:\", \"https:\", current_url)\n\n            if current_url != secure_url:\n                raise MovedTemporarily(secure_url, True)\n\n    def requestHTTPAuthentication(self, realm=\"Critic\"):\n        self.setStatus(401)\n        self.addResponseHeader(\"WWW-Authenticate\", \"Basic realm=\\\"%s\\\"\" % realm)\n        self.start()\n\n    def allowRedirect(self, status):\n        \"\"\"Return true if it is safe to redirect this request\"\"\"\n        return self.method in (\"GET\", \"HEAD\") or status == 303\n"
  },
  {
    "path": "src/resources/.gitattributes",
    "content": "*.png   -text\n"
  },
  {
    "path": "src/resources/autocomplete.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction AutoCompleteUsers(users)\n{\n  function autocomplete(request, response)\n  {\n    var match = /^(.*?)([^\\s,]*)$/.exec(request.term);\n    var source = match[1];\n    var term = match[2].toLowerCase();\n\n    if (!term)\n    {\n      response([]);\n      return;\n    }\n\n    var matches = [];\n\n    for (var username in users)\n    {\n      var fullname = users[username];\n      if (username.substring(0, term.length).toLowerCase() == term ||\n          fullname.substring(0, term.length).toLowerCase() == term)\n        matches.push({ label: fullname + \" (\" + username + \")\", value: source + username });\n    }\n\n    matches.sort(function (a, b) { switch (true) { case a.label < b.label: return -1; case a.label > b.label: return 1; default: return 0; } });\n\n    response(matches);\n  }\n\n  return autocomplete;\n}\n\nfunction AutoCompletePath(paths)\n{\n  var pending_response;\n  var pending_operation;\n\n  function autocomplete(request, response)\n  {\n    function hasPrefix(full, prefix)\n    {\n      return full.substring(0, prefix.length) == prefix;\n    }\n    function repeat(what, count)\n    {\n      return Array(count + 1).join(what);\n    }\n\n    function callResponse(paths, prefiltered)\n    {\n      var pathnames = Object.keys(paths), previous, matches = [], additional = 0;\n\n      
pathnames.sort();\n\n      for (var index = 0; index < pathnames.length; ++index)\n      {\n        var pathname = pathnames[index], shortened = pathname;\n\n        if (prefiltered || hasPrefix(pathname, request.term))\n        {\n          if (matches.length == 20)\n          {\n            if (prefiltered)\n            {\n              additional = pathnames.length - matches.length;\n              break;\n            }\n            else\n            {\n              ++additional;\n              continue;\n            }\n          }\n\n          if (previous)\n          {\n            var components = pathname.split(\"/\"), count = 0, prefix = \"\", checked_prefix;\n\n            while (count < components.length && hasPrefix(previous, checked_prefix = components.slice(0, count).join(\"/\")))\n            {\n              ++count;\n              prefix = checked_prefix;\n            }\n\n            if (prefix.length > 3)\n              shortened = repeat(\" \", prefix.length - 3) + \"...\" + pathname.substring(prefix.length);\n          }\n\n          var counts = paths[pathname];\n\n          if (\"deleted\" in counts && \"inserted\" in counts)\n          {\n            if (counts.files == 0)\n              counts = \"-\" + counts.deleted + \"/+\" + counts.inserted;\n            else\n              counts = \"(\" + counts.files + \" files) -\" + counts.deleted + \"/+\" + counts.inserted;\n          }\n          else if (\"files\" in counts)\n            counts = \"(\" + counts.files + \" files)\"\n          else\n            counts = \"\";\n\n          if (counts)\n            counts = \"<span style='float:right;font-size:smaller'>\" + counts + \"</span>\";\n\n          matches.push({ label: (\"<div class=sourcefont style='padding:0;margin:0;white-space:pre'>\" + htmlify(shortened) +\n                                 counts + \"</div>\"),\n                         value: pathname });\n\n          previous = pathname;\n        }\n        else if 
(matches.length)\n          break;\n      }\n\n      if (additional)\n        matches.push({ label: \"<i>\" + matches.length + \" more matching paths</i>\",\n                       value: request.term });\n\n      response(matches);\n\n      pending_response = null;\n      pending_operation = null;\n    }\n\n    if (pending_response)\n    {\n      pending_response([]);\n      pending_response = null;\n    }\n\n    if (pending_operation)\n    {\n      pending_operation.abort();\n      pending_operation = null;\n    }\n\n    pending_response = response;\n\n    if (typeof paths == \"function\")\n    {\n      pending_operation = paths(request.term, callResponse);\n      if (pending_operation)\n        pending_response = response;\n      else\n        response([]);\n    }\n    else\n      callResponse(paths);\n  }\n\n  return autocomplete;\n}\n\nfunction AutoCompleteRef(remote, prefix)\n{\n  var branches_remote = null;\n  var branches = null;\n  var branches_sha1 = null;\n  var branches_request = null;\n  var branches_response = null;\n\n  prefix = prefix || \"\";\n\n  function getCurrentRemote()\n  {\n    if (typeof remote == \"function\")\n      return remote();\n    else\n      return remote;\n  }\n\n  function autocomplete(request, response)\n  {\n    function callResponse()\n    {\n      function match(name)\n      {\n        return name.substring(0, branches_request.term.length) == branches_request.term;\n      }\n\n      var matches = branches.filter(match);\n\n      if (matches.length < 20)\n      {\n        matches.sort();\n\n        function formatMatch(name)\n        {\n          return { label: (\"<div class=sourcefont style='padding:0;margin:0;white-space:pre'>\" + htmlify(name) +\n                           \"<span style='float:right;font-size:smaller'>\" + branches_sha1[name].substring(0, 8) + \"</span></div>\"),\n                   value: name };\n        }\n\n        branches_response(matches.map(formatMatch));\n      }\n      else\n        
branches_response([{ label: matches.length + \" matching branches\", value: branches_request.term }]);\n\n      branches_request = branches_response = null;\n    }\n\n    function handleResult(result)\n    {\n      branches = [];\n      branches_sha1 = {};\n\n      if (result)\n      {\n        for (var name in result.branches)\n        {\n          var use_name = name.substring(prefix.length);\n\n          branches.push(use_name);\n          branches_sha1[use_name] = result.branches[name];\n        }\n      }\n\n      callResponse();\n    }\n\n    if (branches_response)\n      branches_response([]);\n\n    branches_request = request;\n    branches_response = response;\n\n    var current_remote = getCurrentRemote();\n\n    if (branches_remote != current_remote)\n    {\n      branches_remote = current_remote;\n      branches = null;\n\n      var operation = new Operation({ action: \"fetch remote branches\",\n                                      url: \"fetchremotebranches\",\n                                      data: { remote: branches_remote,\n                                              pattern: \"refs/heads/*\" },\n                                      callback: handleResult });\n\n      operation.execute();\n    }\n    else if (branches)\n      return callResponse();\n  }\n\n  return autocomplete;\n}\n"
  },
  {
    "path": "src/resources/basic.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody {\n    color: #222;\n}\n\nh1.noscript {\n    color: red;\n    text-align: center\n}\n\ndiv.main > table {\n    width: 95%\n}\n\n.pagefooter {\n    margin-top: 10px;\n    border-top: 2px solid #A0A092;\n    padding: 10px 1em 0 1em;\n}\n\ndiv.shortcuts {\n    text-align: right;\n}\n\ndiv.shortcuts a.shortcut {\n    text-decoration: none;\n    color: #222\n}\n\ntable.pageheader {\n    border-bottom: 2px solid #A0A092;\n}\n\ntable.pageheader td.left {\n    vertical-align: bottom\n}\n\ntable.pageheader td.left b {\n    font-family: sans-serif\n}\n\ntable.pageheader td.left b.opera {\n    font-size: 40px;\n    color: #d32226;\n    cursor: pointer\n}\n\ntable.pageheader td.left b.opera.development {\n    color: #222\n}\n\ntable.pageheader td.left b.critic {\n    font-size: 50px;\n    color: #666666;\n    cursor: pointer\n}\n\ntable.pageheader td.right {\n    padding-bottom: 10px;\n    text-align: right;\n    vertical-align: bottom\n}\n\ntable.pageheader td.right div {\n    display: inline-block\n}\n\ntable.pageheader td.right div span.buttonscope-global > :first-child {\n    margin-left: 0.5em\n}\n\n.paleyellow {\n    margin: 1rem auto;\n    padding: 1rem;\n    box-sizing: border-box;\n    background: #ffffe6;\n    border: 1px solid #cca;\n    border-bottom-color: #bb9;\n    border-right-color: #bb9;\n    
border-radius: 4px;\n}\n\n.paleyellow > h1 {\n    margin: 0;\n    padding: .25rem .5rem .5rem;\n    border-bottom: 1px solid #cca;\n    font-size: 1.9rem;\n    font-weight: normal;\n}\n\n\ntable.paleyellow {\n    margin-top: 1.5em;\n    margin-bottom: 2em;\n    padding: 1em 1em 1.5em 1em;\n}\n\ntable.paleyellow > tbody > tr > td {\n    padding-left: 1em\n}\n\ntable.paleyellow > * > tr > td.h1 {\n    border-bottom: 1px solid #cca\n}\n\ntable.paleyellow > * > tr > td.h1 > h1 {\n    margin: 0 0 0.75rem;\n    font-size: 1.9rem;\n    font-weight: normal;\n}\n\ntable.paleyellow > * > tr > td.h1 > h1 > * {\n    text-shadow: none\n}\n\ntable.paleyellow > * > tr > td.h1 > h1 > span.right {\n    font-size: 50%;\n    float: right\n}\n\ntable.paleyellow > tbody > tr > td.h2 {\n    border-bottom: 1px solid #cca\n}\n\ntable.paleyellow > tbody > tr > td.h2 > h2 {\n    font-size: 1rem;\n    margin: 0.75rem 0;\n}\n\ntable.paleyellow > tbody > tr > td.h2 > h2 > span {\n    font-size: 50%;\n    margin-left: 1em\n}\n\ntable.paleyellow > tbody > tr > td.h2 > h2 > span.right {\n    float: right\n}\n\ntable.paleyellow > tbody > tr.item > td {\n    padding-top: 0.5em\n}\n\ntable.paleyellow > tbody > tr.item > td.name {\n    font-family: serif;\n    font-weight: bold;\n    text-align: right;\n}\n\ntable.paleyellow > tbody > tr.item > td.value {\n    font-family: monospace;\n    white-space: pre-wrap;\n    padding-left: 1em\n}\n\ntable.paleyellow > tbody > tr.item > td.value > div.buttons {\n    float: right\n}\n\ntable.paleyellow > tbody > tr.item > td.value > div.buttons button {\n    margin-left: 3px\n}\n\ntable.paleyellow > tbody > tr.help > td {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n\ntable.paleyellow > tbody > tr.spacer > td {\n    padding-top: 0.3em;\n}\n\ntable.paleyellow > tbody > tr.separator > td {\n    padding: 1em 0 0.5em 0\n}\n\ntable.paleyellow > tbody > tr.separator > td > div {\n    border-bottom: 1px solid 
#cca\n}\n\ntable.paleyellow > tbody > tr.centered > td {\n    text-align: center\n}\n\ntable.paleyellow > tbody > tr.centered > td > button {\n    margin-top: 1em\n}\n\n@media (min-width: 1348px) {\n    .section,\n    div.main > table,\n    table.paleyellow {\n        width: 1280px\n    }\n}\n\n.callout {\n    padding: 0 0.75rem;\n    background: #fffff4;\n    border: 1px solid #ddb;\n    border-bottom-color: #cca;\n    border-right-color: #cca;\n    border-radius: 4px;\n}\n\ntable.callout {\n    margin: 0 auto;\n    padding-top: 0.75rem;\n    padding-bottom: 0.75rem;\n    border-spacing: 0;\n}\ntable.callout th {\n    border-bottom: 1px solid #cca;\n}\ntable.callout th,\ntable.callout td {\n    padding: 0.25em 0.5em;\n    text-align: left;\n}\n\n.inset {\n    padding: 0.25em 0.5em;\n    background: #fffff0;\n    border: 1px solid #ddb;\n}\n\n.ui-dialog .inset {\n    padding: 0.1rem 0.5rem;\n    background: #fbfaf9;\n    border: 1px solid #e0cfc2;\n}\n\n.ui-dialog { box-shadow: 0 5px 5px rgba(0, 0, 0, 0.5), 0 7px 25px rgba(0, 0, 0, 0.5) }\n\ntable.pageheader td.left > ul {\n  display: inline-block;\n  margin: 0 0 0 1em;\n  padding: 0;\n  font-weight: bold\n}\n\ntable.pageheader td.left > ul > li {\n  display: inline;\n  padding: 0\n}\n\ntable.pageheader td.left > ul > li > a {\n  text-decoration: none;\n  color: #222\n}\n\ntable.pageheader td.left > ul > li:before {\n  content: \" | \"\n}\n\ntable.pageheader td.left > ul > li:first-child:before {\n  content: none\n}\n\n.message-dialog pre {\n    margin-left: 1em\n}\n\n.error-dialog code, .message-dialog code {\n    display: inline-block;\n    border: 1px solid #cca;\n    background-color: #fff;\n    padding: 2px 4px\n}\n\n.repository-select {\n    background-color: white;\n    text-align: left;\n    white-space: nowrap\n}\n\n.repository-select .repository {\n    flex-flow: row wrap;\n}\n\n.repository-select .repository .repository-name {\n    font-weight: bold\n}\n\n.repository-select .repository .repository-path 
{\n    font-family: monospace;\n    padding-left: 0.5rem;\n    margin-left: auto;\n}\n\n.notifications {\n    position: fixed;\n    width: 400px;\n    margin: 0\n}\n\n.notification {\n    border-radius: 5px;\n    border: 2px solid #8a8;\n    margin: 10px auto;\n    padding: 5px 2em;\n    background-color: #dfd;\n}\n\ntable.searchresults {\n    width: 100%\n}\n\ntable.searchresults .id {\n    text-align: right;\n    padding-right: 0.5em\n}\n\ntable.searchresults .summary {\n    padding-left: 0.5em\n}\n\ntable.searchresults td.link {\n    text-align: right;\n    vertical-align: top;\n}\n\ntable.searchresults tr.review td {\n    font-family: monospace;\n    padding-top: 3px;\n    padding-bottom: 3px;\n    background-color: #fff\n}\n\ndiv.searchdialog .help {\n    float: right\n}\n\ndiv.searchdialog input {\n    width: 100%;\n    font-family: monospace;\n    margin-top: 0.5em\n}\n\n.flex {\n    display: -ms-flexbox !important;\n    display: -webkit-flex !important;\n    display: flex !important;\n}\n\n.ui-dialog > .flex {\n    -ms-flex-direction: column;\n    -webkit-flex-direction: column;\n    flex-direction: column;\n}\n\n.ui-dialog > .flex > * {\n    -ms-flex: none;\n    -webkit-flex: none;\n    flex: none;\n}\n.ui-dialog > .flex > .flexible {\n    -ms-flex: auto;\n    -webkit-flex: auto;\n    flex: auto;\n    min-height: 0;\n}\n\ninput:not([type]),\ninput[type=\"text\"],\ninput[type=\"password\"],\ninput[type=\"date\"],\ninput[type=\"email\"],\ninput[type=\"number\"],\ninput[type=\"search\"],\ninput[type=\"tel\"],\ninput[type=\"time\"],\ninput[type=\"url\"],\ntextarea {\n    margin: 0;\n    -moz-box-sizing: border-box;\n    box-sizing: border-box;\n    padding: .3rem;\n    background: #fff;\n    opacity: .95;\n    border: 1px solid #efefcf;\n    border-top-color: #e5e6b3;\n    border-left-color: #e5e6b3;\n    border-radius: 2px;\n    font-size: 
inherit;\n}\ninput:not([type]):focus,\ninput[type=\"text\"]:focus,\ninput[type=\"password\"]:focus,\ninput[type=\"date\"]:focus,\ninput[type=\"email\"]:focus,\ninput[type=\"number\"]:focus,\ninput[type=\"search\"]:focus,\ninput[type=\"tel\"]:focus,\ninput[type=\"time\"]:focus,\ninput[type=\"url\"]:focus,\ntextarea:focus {\n    outline: none;\n    opacity: 1;\n    border-color: #ddb;\n    border-top-color: #cca;\n    border-left-color: #cca;\n    box-shadow: 0 0 1px 1px #fff;\n}\n\nselect {\n    font-size: inherit;\n}\n\ninput:disabled,\nselect:disabled {\n    opacity: .7;\n}\n\nfieldset {\n    border: 0;\n    padding: 0;\n}\n\n.input-label {\n    font-size: .9rem;\n    font-weight: bold;\n}\n\n.checkbox-group label {\n    padding-right: .5rem;\n    white-space: nowrap;\n}\n.checkbox-group label:last-child {\n    padding-right: 0;\n}\n\n.clickable {\n    cursor: pointer\n}\n"
  },
  {
    "path": "src/resources/basic.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- Mode: js; js-indent-level: 2; indent-tabs-mode: nil -*- */\n\nfunction User(id, name, email, displayName, status, options)\n{\n  this.id = id;\n  this.name = name;\n  this.email = email;\n  this.displayName = displayName;\n  this.status = status;\n  this.options = options || {};\n}\n\nUser.prototype.toString = function () { return this.displayName + \" <\" + this.email + \">\"; };\n\nfunction Repository(id, name, path)\n{\n  this.id = id;\n  this.name = name;\n  this.path = path;\n}\n\nfunction Branch(id, name, base)\n{\n  this.id = id;\n  this.name = name;\n  this.base = base;\n}\n\nfunction reportError(what, specifics, title, callback)\n{\n  if (!title)\n    title = \"Communication Error!\";\n\n  var content = $(\"<div class=error-dialog title='\" + title + \"'><h1>Failed to \" + what + \".</h1><p>\" + specifics + \"</p></div>\");\n\n  content.dialog({ width: 800,\n                   height: 400,\n                   modal: true,\n                   buttons: { OK: function () { content.dialog(\"close\"); if (callback) callback(); }}});\n}\n\nfunction showMessage(title, heading, message, callback)\n{\n  var content = $(\"<div class=message-dialog title='\" + title + \"'><h1>\" + heading + \"</h1>\" + (message || \"\") + \"</div>\");\n\n  content.dialog({ width: 600, modal: true, buttons: { OK: 
function () { content.dialog(\"close\"); if (callback) callback(); }}});\n}\n\nfunction htmlify(text, attribute)\n{\n  text = String(text).replace(/&/g, \"&amp;\").replace(/</g, \"&lt;\");\n  if (attribute)\n    text = text.replace(/'/g, \"&apos;\").replace(/\"/g, \"&quot;\");\n  return text;\n}\n\nfunction Operation(data)\n{\n  this.action = data.action;\n  this.url = data.url;\n  this.data = data.data;\n  this.wait = data.wait;\n  this.cancelable = data.cancelable;\n  this.failure = data.failure || {};\n  this.callback = data.callback;\n  this.id = ++Operation.counter;\n\n  if (this.callback)\n    Operation.current[this.id] = true;\n}\n\nOperation.SUPPORTS_CALLBACK = true;\nOperation.current = {};\nOperation.counter = 0;\nOperation.idleCallbacks = [];\n\nOperation.isBusy = function ()\n  {\n    return Object.keys(Operation.current).length != 0;\n  };\n\nOperation.whenIdle = function (callback)\n  {\n    if (Operation.isBusy())\n      Operation.idleCallbacks.push(callback);\n    else\n      callback();\n  };\n\nOperation.finished = function (id)\n  {\n    delete Operation.current[id];\n  };\n\nOperation.checkIdle = function ()\n  {\n    if (!Operation.isBusy())\n    {\n      Operation.idleCallbacks.forEach(function (fn) { fn(); });\n      Operation.idleCallbacks = [];\n    }\n  };\n\nwindow.addEventListener(\"beforeunload\", function (ev)\n                        {\n                          if (Operation.isBusy())\n                          {\n                            ev.returnValue = \"There are pending requests to the server.  
You probably want to let them finish before you leave the page.\";\n                            ev.preventDefault();\n                          }\n                        });\n\nOperation.prototype.execute = function ()\n  {\n    var self = this;\n    var result = null;\n    var wait = null;\n\n    function handleResult(result, callback)\n    {\n      callback = callback || function (result) { return result; };\n\n      if (result.status == \"failure\")\n      {\n        var handler = self.failure[result.code];\n\n        if (!handler || !handler(result))\n          showMessage(\"Oops...\", result.title, result.message, function () { callback(null); });\n\n        return null;\n      }\n      else if (result.status == \"error\")\n      {\n        if (result.error.indexOf(\"\\n\") != -1)\n          reportError(self.action, \"Server reply:<pre>\" + htmlify(result.error) + \"</pre>\", null, function () { callback(null); });\n        else\n          reportError(self.action, \"Server reply: <i>\" + htmlify(result.error) + \"</i>\", null, function () { callback(null); });\n\n        return null;\n      }\n      else\n        return callback(result);\n    }\n\n    function success(data)\n    {\n      self.ajax = null;\n      result = data;\n      if (data.__profiling__)\n        console.log(self.url + \"\\n\" +\n                    Array(self.url.length + 1).join(\"=\") + \"\\n\" +\n                    \"Total: \" + data.__time__.toPrecision(3) + \" seconds\\n\" +\n                    data.__profiling__);\n      if (wait)\n        wait.dialog(\"close\");\n      if (self.callback)\n      {\n        Operation.finished(self.id);\n        handleResult(result, self.callback);\n        Operation.checkIdle();\n      }\n    }\n\n    function error(xhr)\n    {\n      self.ajax = null;\n      if (wait)\n        wait.dialog(\"close\");\n      if (!self.aborted)\n      {\n        if (xhr.status == 404)\n          reportError(self.action,\n                      \"<p>The operation 
<code>\" + self.url + \"</code> is not supported by the server.<p>\" +\n                      \"<p>Simply reloading the page and then trying again might help.  \" +\n                      \"If that doesn't help, and you think an extension might be \" +\n                      \"involved, try reinstalling (or uninstalling) it.</p>\",\n                      null, self.callback);\n        else\n          reportError(self.action, \"Server reply:<pre>\" + (xhr.responseText ? htmlify(xhr.responseText) : \"N/A\") + \"</pre>\", null, self.callback);\n        if (self.callback)\n          Operation.finished(self.id);\n      }\n    }\n\n    if (this.wait)\n    {\n      wait = $(\"<div title='Please Wait' style='text-align: center; padding-top: 2em'>\" + this.wait + \"</div>\");\n      var data = { modal: true };\n      if (this.cancelable)\n        data.buttons = { \"Cancel\": function () { wait.dialog(\"close\"); self.ajax.abort(); }};\n      wait.dialog(data);\n    }\n\n    this.ajax = $.ajax({ async: !!this.callback,\n                         type: \"POST\",\n                         url: \"/\" + this.url,\n                         contentType: \"text/json\",\n                         data: JSON.stringify(this.data),\n                         dataType: \"json\",\n                         success: success,\n                         error: error });\n\n    if (!this.callback)\n    {\n      if (wait)\n        wait.dialog(\"close\");\n\n      if (result)\n        return handleResult(result);\n      else\n        return null;\n    }\n  };\n\nOperation.prototype.abort = function ()\n  {\n    this.aborted = true;\n    if (this.ajax)\n      this.ajax.abort();\n  };\n\n$(document).ready(function ()\n  {\n    $(\"button\").button();\n    $(\"a.button\").button();\n\n    /* Element (e.g. a table-cell) containing a link and with a 'click' handler\n       that clicks the link. 
*/\n    $(\".clickable\").click(function (ev)\n      {\n        if (ev.button == 0 && !$(ev.target).closest(\"a, button, .clickable-target\").size())\n          /* The '.get(0)' means we call the browser's native click() instead of\n             jQuery's.  For some reason, the latter doesn't trigger the default\n             action of the click event on the link (i.e. navigation). */\n          $(ev.currentTarget).find(\".clickable-target\").get(0).click();\n      });\n  });\n\nvar keyboardShortcutHandlers = [];\n\nfunction handleKeyboardShortcut(key)\n{\n  for (var index = 0; index < keyboardShortcutHandlers.length; ++index)\n    if (keyboardShortcutHandlers[index](key))\n      return true;\n  return false;\n}\n\n$(document).ready(function ()\n  {\n    if (typeof keyboardShortcuts == \"undefined\" || keyboardShortcuts)\n      $(document).keypress(function (ev)\n        {\n          if (ev.ctrlKey || ev.shiftKey || ev.altKey || ev.metaKey)\n            return;\n\n          if (/^(?:input|textarea)$/i.test(ev.target.nodeName))\n            if (ev.which == 32 || /textarea/i.test(ev.target.nodeName) || !/^(?:checkbox|radio)$/i.test(ev.target.type))\n              return;\n\n          /* Handling non-printable keys. 
*/\n          if (ev.which)\n          {\n            if (handleKeyboardShortcut(ev.which))\n              ev.preventDefault();\n          }\n        });\n  });\n\nif (!Object.create)\n{\n  Object.create =\n    function (proto, props)\n    {\n      var object = {};\n\n      try { object.__proto__ = proto; }\n      catch (e) {}\n\n      for (var name in props)\n      {\n        if (\"value\" in props[name])\n          object[name] = props[name].value;\n        else\n        {\n          if (props[name].get)\n            object.__defineGetter__(name, props[name].get);\n          if (props[name].set)\n            object.__defineSetter__(name, props[name].set);\n        }\n      }\n\n      return object;\n    };\n}\n\nvar hooks = Object.create(null, {\n  \"create-comment\": { value: [] },\n  \"display-comment\": { value: [] }\n});\n\nvar critic = {\n  Operation: Operation,\n\n  buttons: {\n    add: function (data)\n      {\n        if (!data.title || !(data.href || data.onclick) || !data.scope)\n          throw new TypeError(\"invalid data; should have 'title', 'scope' and 'href'/'onclick' properties\");\n\n        /* Declare locally; previously assigned without 'var', leaking a global. */\n        var html;\n\n        if (data.href)\n          html = \"<a href='\" + htmlify(data.href, true) + \"'>\" + htmlify(data.title) + \"</a>\";\n        else if (typeof data.onclick == \"function\")\n          html = \"<button>\" + htmlify(data.title) + \"</button>\";\n        else\n          html = \"<button onclick='\" + htmlify(data.onclick, true) + \"'>\" + htmlify(data.title) + \"</button>\";\n\n        var button = $(html);\n\n        if (typeof data.onclick == \"function\")\n          button.click(data.onclick);\n\n        button.button();\n\n        $(\"span.buttonscope-\" + data.scope).append(button);\n      },\n\n    remove: function (data)\n      {\n        if (!data.title || !data.scope)\n          throw new TypeError(\"invalid data; should have 'title' and 'scope' properties\");\n\n        $(\"span.buttonscope-\" + data.scope + \" button\").filter(function () { return 
$(this).text() == data.title }).detach();\n      }\n  },\n\n  hooks: {\n    add: function (name, callback)\n      {\n        if (!(name in hooks))\n          throw new TypeError(\"invalid hook; valid alternatives are: \" + Object.keys(hooks));\n\n        hooks[name].push(callback);\n      }\n  },\n\n  html: {\n    escape: htmlify\n  }\n};\n\nfunction signOut()\n{\n  var operation = new Operation({ action: \"sign out\",\n                                  url: \"endsession\",\n                                  data: {}});\n  var result = operation.execute();\n\n  if (result)\n  {\n    if (result.target_url)\n      location.href = result.target_url;\n    else\n      location.reload();\n  }\n}\n\nfunction repositionNotifications()\n{\n  $(\"body > div.notifications\").position({ my: \"center top\",\n                                           at: \"center top\",\n                                           of: window });\n}\n\nfunction showNotification(content, data)\n{\n  data = data || {};\n\n  var notifications = $(\"body > div.notifications\");\n\n  if (notifications.size() == 0)\n  {\n    notifications = $(\"<div class=notifications></div>\");\n    $(\"body\").append(notifications);\n    repositionNotifications();\n  }\n\n  var notification = $(\"<div class=notification></div>\");\n\n  function displayed()\n  {\n    setTimeout(hide, (data.duration || 3) * 1000);\n  }\n\n  function hide()\n  {\n    if (notification.next(\"div.notification\").size())\n      remove();\n    else\n    {\n      /* Using .animate({ opacity: 0 }) instead of .fadeOut() since the latter\n         \"helpfully\" sets display:none at the end of the animation.  We want to\n         also do .slideUp(), and that only works if the element is still there. 
*/\n      notification.animate(\n        { opacity: 0 }, { duration: 600, complete: remove });\n    }\n  }\n\n  function remove()\n  {\n    if (data.callback)\n      data.callback();\n\n    notification.slideUp(400, finalize);\n  }\n\n  function finalize()\n  {\n    notification.remove();\n  }\n\n  if (data.className)\n    notification.addClass(data.className);\n\n  notification.append(content);\n  notification.fadeIn(400, displayed);\n  notification.click(hide);\n\n  notifications.append(notification);\n\n  return { hide: hide, remove: remove };\n}\n\nvar previous_query = \"\";\n\nif (typeof localStorage != \"undefined\")\n  previous_query = localStorage.getItem(\"previous_query\");\n\nfunction quickSearch(external_query, callback)\n{\n  function finish(result)\n  {\n    if (!result)\n    {\n      if (external_query === void 0)\n        setTimeout(quickSearch, 0);\n      return;\n    }\n\n    if (result.reviews.length == 0)\n    {\n      showMessage(\"Search results\", \"No reviews found!\");\n      return;\n    }\n\n    var html = (\"<table class=searchresults>\" +\n                \"<tr><th class=id>Review</th><th class=summary>Summary</th>\");\n\n    html += \"</tr>\";\n\n    result.reviews.forEach(function (review, index)\n      {\n        html += (\"<tr class=review critic-review-id=\" + review.id + \">\" +\n                 \"<td class=id>r/\" + review.id + \"</td>\" +\n                 \"<td class=summary><a href=/r/\" + review.id + \">\" + htmlify(review.summary) + \"</a></td>\" +\n                 \"</tr>\");\n      });\n\n    html += \"</table></div>\";\n\n    var content = $(html);\n\n    content.find(\"tr\").click(function (ev)\n      {\n        var target = $(ev.target);\n        if (!target.is(\"a\"))\n          target.parents(\"tr\").find(\"a\").get(0).click();\n      });\n\n    if (callback)\n      callback(content, result);\n    else\n    {\n      content.wrap(\"<div title='Search results'></div>\");\n      content.find(\"tr\").first().append(\n   
     \"<td class=link><a>Link to this search</a></td>\");\n      content.find(\"td.link a\").attr(\"href\", \"/search?\" + result.query_string);\n      content.find(\"td.summary\").attr(\"colspan\", \"2\");\n\n      content = content.parent();\n      content.dialog(\n        { width: 800,\n          buttons: { \"Close\": function () { content.dialog(\"close\"); }}});\n\n      if (content.closest(\".ui-dialog\").height() > innerHeight)\n        content.dialog(\"option\", \"height\", innerHeight - 10);\n    }\n  }\n\n  function search(query)\n  {\n    var operation = new Operation({ action: \"search\",\n                                    url: \"searchreview\",\n                                    data: { query: query },\n                                    wait: \"Searching...\",\n                                    callback: finish });\n\n    operation.execute();\n  }\n\n  if (external_query !== void 0)\n  {\n    search(external_query);\n    return;\n  }\n\n  function start()\n  {\n    previous_query = content.find(\"input\").val().trim();\n\n    if (typeof localStorage != \"undefined\")\n      localStorage.setItem(\"previous_query\", previous_query);\n\n    content.dialog(\"close\");\n\n    if (previous_query)\n      search(previous_query);\n  }\n\n  function cancel()\n  {\n    content.dialog(\"close\");\n  }\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      start();\n  }\n\n  var content = $(\"<div title='Review Quick Search' class=searchdialog>\" +\n                  \"<div><b>Search query:</b>\" +\n                  \"<span class=help><a href=/tutorial?item=search>Help</a></span></div>\" +\n                  \"<div><input></div>\" +\n                  \"</div>\");\n\n  content.find(\"input\")\n    .val(previous_query)\n    .keypress(handleKeypress);\n\n  content.dialog({ width: 800,\n                   buttons: { \"Search\": start, \"Cancel\": cancel }});\n\n  setTimeout(function () { content.find(\"input\").select(); 
content.find(\"input\").focus(); }, 0);\n}\n\nkeyboardShortcutHandlers.push(function (key)\n  {\n    if (key == \"f\".charCodeAt(0))\n    {\n      quickSearch();\n      return true;\n    }\n    if (key == \"u\".charCodeAt(0))\n    {\n      if (window.review)\n      {\n        if (window.isReviewFrontpage)\n          location.href = \"/dashboard\";\n        else\n          location.href = \"/r/\" + review.id;\n        return true;\n      }\n    }\n  });\n\n$(window).resize(repositionNotifications);\n"
  },
  {
    "path": "src/resources/branches.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main table tr.title td.description { text-align: right; border-bottom: 1px solid #cca; margin-left: 0; margin-top: 0; margin-bottom: 0.5em; font-style: italic }\ndiv.main table tr.title td.repositories { text-align: right; border-bottom: 1px solid #cca }\n\ndiv.main table.branches tr.headings td { padding-top: 0.5em; font-weight: bold; text-decoration: underline }\ndiv.main table.branches tr.headings td.when { text-align: right; width: 10% }\ndiv.main table.branches tr.headings td.name { width: 40% }\ndiv.main table.branches tr.headings td.base { width: 40% }\ndiv.main table.branches tr.headings td.commits { text-align: right; width: 10% }\n\ndiv.main table.branches tr.nothing td.nothing { text-align: center; font-weight: bold; padding-top: 1em }\n\ndiv.main table.branches tr.branch:hover { background: #eec }\ndiv.main table.branches tr.branch td.when { text-align: right; font-weight: bold }\ndiv.main table.branches tr.branch td.name { font-family: monospace; font-size: 10pt }\ndiv.main table.branches tr.branch td.name span.review { padding-left: 1em; font-size: smaller }\ndiv.main table.branches tr.branch td.name span.check { padding-left: 1em; font-size: smaller }\ndiv.main table.branches tr.branch td.base { font-family: monospace; font-size: 10pt 
}\ndiv.main table.branches tr.branch td.commits { text-align: right; font-weight: bold }\n"
  },
  {
    "path": "src/resources/branches.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n$(function ()\n  {\n    $(\"td.repositories select\").change(function (ev)\n      {\n        if (typeof repository == \"undefined\" || ev.target.value != repository.id)\n          location.href = \"/branches?repository=\" + encodeURIComponent(ev.target.value);\n      });\n\n    $(\".repository-select\").chosen({\n      inherit_select_classes: true,\n      generate_selected_value: function (item)\n        {\n          return { html: \"Repository: <b>\" + htmlify(item.text) + \"</b>\" };\n        },\n      collapsed_width: \"auto\",\n      expanded_width: \"600px\"\n    });\n  });\n"
  },
  {
    "path": "src/resources/changeset.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main {\n    margin-bottom: 20px;\n    padding-bottom: 10px;\n    border-bottom: 2px solid #A0A092;\n}\n\ntable.commit-files {\n    margin-left: 0.5em;\n    margin-bottom: 15px\n}\n\ntable.commit-files > thead > tr > th {\n    text-align: left;\n    padding-left: 10px;\n    padding-right: 10px\n}\n\ntable.commit-files > tbody > tr > td {\n    font-family: monospace;\n    white-space: pre;\n    padding-left: 10px;\n    padding-right: 10px\n}\n\ntable.commit-files > tbody > tr > td.approve {\n    text-align: right;\n    font-style: italic\n}\n\ntable.commit-files > tbody > tr > td.approve > span {\n    display: none\n}\n\ntable.commit-files > tbody > tr > td.approve > span.show {\n    display: inline\n}\n\ntable.commit-files > tbody > tr > th.parent {\n    text-align: center;\n    font-style: italic\n}\n\ntable.commit-files > tbody > tr > td > a {\n    color: #222;\n    text-decoration: none\n}\n\ntable.commit-files > tbody > tr > td.path:hover {\n    background-color: #ccc\n}\n\ntable.commit-files > tbody > tr > td.parent.hover {\n    background-color: #ccc\n}\n\ntable.file {\n    border: 1px solid #cca;\n    border-radius: 2px;\n    margin-bottom: 1em;\n    table-layout: fixed\n}\n\ntable.file col.edge {\n    width: 1em\n}\n\ntable.file col.linenr {\n    width: 3em\n}\n\ntable.file col.middle {\n    
width: 1em\n}\n\ntable.file col.line {\n    width: 50%\n}\n\ntable.file {\n    display: none\n}\n\ntable.file.show {\n    display: table\n}\n\ntable.file > tbody, table.file > tfoot {\n    display: none\n}\n\ntable.file.expanded > tbody, table.file.expanded > tfoot {\n    display: table-row-group\n}\n\ntable.file > thead > tr > td,\ntable.file > tfoot > tr > td {\n    padding: 3px 2em;\n    width: 50%;\n    background-color: #ffffe6\n}\n\ntable.file > thead > tr > td.left,\ntable.file > tfoot > tr > td.left {\n    width: 50%;\n    text-align: left\n}\n\ntable.file > thead > tr > td.right,\ntable.file > tfoot > tr > td.right {\n    width: 50%;\n    text-align: right\n}\n\ntable.file > thead + tbody > tr:first-child > td,\ntable.file > tbody + tfoot > tr:first-child > td {\n    border-top: 1px solid #cca\n}\n\ntable.file thead td.left a.showtree,\ntable.file tfoot td.left a.showtree {\n    text-decoration: none;\n    color: #222\n}\ntable.file thead td.left a.showtree.root,\ntable.file tfoot td.left a.showtree.root {\n    font-size: 80%\n}\ntable.file thead td.left a.showtree:hover,\ntable.file tfoot td.left a.showtree:hover {\n    text-decoration: underline;\n    color: #222\n}\n\ntable.file > tbody.spacer > tr > td {\n    background-color: #eee;\n    text-align: center\n}\n\ntable.file > tbody.spacer > tr > td {\n    padding: 0\n}\n\ntable.file > tbody.spacer > tr + tr.spacer {\n    display: none\n}\n\ntable.file > tbody.spacer > tr.expand > td,\ntable.file > tbody.spacer > tr.context > td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ntable.file > tbody.spacer > tr.expand > td > select {\n    background-color: #eee;\n    border: 0\n}\n\ntable.file > tbody.lines > tr > td.edge,\ntable.file > tbody.lines > tr > td.middle,\ntable.file > tbody.lines > tr > td.linenr {\n    background-color: #eee\n}\n\ntable.file > tbody.lines > tr > td.middle {\n    cursor: col-resize;\n    background-image: -webkit-radial-gradient(circle, #eee 4px, transparent 4px), 
-webkit-radial-gradient(circle, rgba(0, 0, 0, 0.3) 4px, transparent 4px);\n    background-image: -moz-radial-gradient(circle, #eee 4px, transparent 4px), -moz-radial-gradient(circle, rgba(0, 0, 0, 0.3) 4px, transparent 4px);\n    background-image: radial-gradient(circle, #eee 4px, transparent 4px), radial-gradient(circle, rgba(0, 0, 0, 0.3) 4px, transparent 4px);\n    background-position: 0 0, -1px -1px\n}\n\ntable.file > tbody.lines > tr > td.linenr.old {\n    text-align: right;\n    padding-right: 3px\n}\n\ntable.file > tbody.lines > tr > td.linenr.new {\n    padding-left: 3px\n}\n\ntable.file > tbody.lines > tr > td.line {\n    border-left: 1px solid #888;\n    border-right: 1px solid #888;\n    white-space: pre-wrap;\n    overflow: hidden;\n    word-wrap: break-word\n}\n\ntable.file.resized.old-narrower > tbody.lines > tr > td.line.old,\ntable.file.resized.new-narrower > tbody.lines > tr > td.line.new {\n    overflow-wrap: normal;\n    white-space: pre\n}\n\ntable.file > tbody.lines > tr > td.comment {\n    white-space: pre-wrap;\n    border-left: 1px solid #888;\n    border-right: 1px solid #888\n}\n\ntable.file > tbody.lines > tr:first-child > td.line {\n    border-top: 1px solid #888;\n    padding-top: 3px\n}\n\ntable.file > tbody.lines > tr:last-child > td.line {\n    border-bottom: 1px solid #888;\n    padding-bottom: 3px\n}\n\ntable.file > tbody.content > tr > td {\n    background-color: #eee\n}\n\ntable.file > tbody.deleted > tr > td {\n    background-color: #eee;\n    text-align: center\n}\n\ntable.file > tbody.deleted > tr > td > h2 {\n    margin-top: 0\n}\n\ntable.file > tbody.binary > tr > td {\n    background-color: #eee;\n    text-align: center;\n    vertical-align: middle\n}\n\ntable.file > tbody.binary > tr > td > h2 {\n    margin-top: 0\n}\n\ntable.file > tbody.binary > tr.download > td > a {\n    padding-left: 1em;\n    padding-right: 1em\n}\n\ntable.file > tbody.binary > tr.download > td > a > img {\n    vertical-align: 
middle\n}\n\ntable.commit-info { margin-top: 1em; width: auto !important }\ntable.commit-info tr.commit-info td { white-space: nowrap }\n\ntable.commit-info span.links,\ntable.commit-info span.branches,\ntable.commit-info span.tags {\n    margin-left: 1em\n}\n\ntable.commit-info span.link,\ntable.commit-info span.branch,\ntable.commit-info span.tag {\n    margin-right: 0.2em\n}\n\npre.commit-msg { padding: 0.5em 1em  }\n\ntr.modified b.t {\n    color: #cca\n}\n\ntr.replaced > td.old b.t,\ntr.deleted > td.old b.t {\n    color: #caa\n}\n\ntr.replaced > td.new b.t,\ntr.inserted > td.new b.t {\n    color: #aca\n}\n\ninput.approve {\n    background-color: #eec;\n    font-weight: bold\n}\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.parent {\n    display: none\n}\n\ndiv.parent.show {\n    display: block\n}\n\ndiv.parent > h1 {\n    padding-left: 1em;\n    font-size: 150%;\n    font-weight: bold\n}\n\ndiv.detectmoves select {\n    width: 100%;\n    padding: 3px\n}\n\ntable.commit-info tr.commit-msg > td {\n    padding-top: 1.5em;\n    padding-bottom: 1.5em\n}\n\ntable.commit-msg tr.line:hover {\n    background-color: #eee\n}\n\ntable.commit-msg tr.line td.edge {\n    padding: 0 1em 0 1em\n}\n\ntable.commit-msg tr.line td.line {\n    white-space: pre;\n    font-family: monospace;\n    padding: 0\n}\n\ntable.commit-msg tr.line.highlight td.line {\n    font-weight: bold\n}\n\n.blame-tooltip {\n    opacity: 1 !important;\n    max-width: none !important\n}\n\n.blame-tooltip pre {\n    margin: 5px;\n    padding: 5px;\n    background-color: white;\n    border: 1px solid #888\n}\n"
  },
  {
    "path": "src/resources/changeset.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- Mode: js; js-indent-level: 2; indent-tabs-mode: nil -*- */\n\nvar files = [];\nvar blocks = [];\n\nfunction makeLine(fileId, oldOffset, oldLine, newLine, newOffset)\n{\n  var row = document.createElement('tr');\n  row.className = 'line context';\n  row.id = 'f' + fileId + 'o' + oldOffset + 'n' + newOffset;\n\n  var edge1 = row.insertCell(-1);\n  edge1.className = 'edge';\n  var oldOffsetCell = row.insertCell(-1);\n  oldOffsetCell.textContent = oldOffset;\n  oldOffsetCell.className = 'linenr old';\n  oldOffsetCell.align = 'right';\n  var oldLineCell = row.insertCell(-1);\n  oldLineCell.innerHTML = oldLine ? oldLine : \"&nbsp;\";\n  oldLineCell.className = 'line old';\n  oldLineCell.id = 'f' + fileId + 'o' + oldOffset;\n  var middle = row.insertCell(-1);\n  middle.innerHTML = '&nbsp;';\n  middle.className = 'middle';\n  middle.colSpan = 2;\n  var newLineCell = row.insertCell(-1);\n  newLineCell.innerHTML = newLine ? 
newLine : \"&nbsp;\";\n  newLineCell.className = 'line new';\n  newLineCell.id = 'f' + fileId + 'n' + newOffset;\n  var newOffsetCell = row.insertCell(-1);\n  newOffsetCell.textContent = newOffset;\n  newOffsetCell.className = 'linenr new';\n  var edge2 = row.insertCell(-1);\n  edge2.className = 'edge';\n\n  if (typeof startCommentMarking != \"undefined\")\n  {\n    var lineCells = $(row).children(\"td.line\");\n    lineCells.mousedown(startCommentMarking);\n    lineCells.mouseover(continueCommentMarking);\n    lineCells.mouseup(endCommentMarking);\n  }\n\n  return row;\n}\n\nfunction previousTableSection(node)\n{\n  while (node.parentNode.nodeName.toLowerCase() != \"table\")\n    node = node.parentNode;\n  do\n    node = node.previousSibling;\n  while (node.nodeName.toLowerCase() != \"tbody\");\n  return node;\n}\n\nfunction nextTableSection(node)\n{\n  while (node.parentNode.nodeName.toLowerCase() != \"table\")\n    node = node.parentNode;\n  do\n    node = node.nextSibling;\n  while (node.nodeName.toLowerCase() != \"tbody\");\n  return node;\n}\n\nfunction hasClass(element, cls)\n{\n  return new RegExp(\"(^|\\\\s)\" + cls + \"($|\\\\s)\").test(element.className)\n}\nfunction addClass(element, cls)\n{\n  if (!hasClass(element, cls))\n    element.className += \" \" + cls;\n}\nfunction removeClass(element, cls)\n{\n  if (hasClass(element, cls))\n    element.className = element.className.replace(new RegExp(\"(^|\\\\s+)\" + cls + \"($|(?=\\\\s))\"), \"\");\n}\n\nvar extractedFiles = {};\n\nvar HIDE   = 1;\nvar SHOW   = 2;\nvar EXPAND = 3;\n\nvar CONTEXT    = 1;\nvar DELETED    = 2;\nvar MODIFIED   = 3;\nvar REPLACED   = 4;\nvar INSERTED   = 5;\nvar WHITESPACE = 6;\nvar CONFLICT   = 7;\n\nvar line_classes = [null, \"context\", \"deleted\", \"modified\", \"replaced\", \"inserted\", \"modified whitespace\", \"conflict\"];\n\nfunction recompact(id)\n{\n  var table = fileById(id), count = 0;\n\n  table.each(function (index, table)\n    {\n      if 
(!table.disableCompact)\n        for (index = 0; index < table.tBodies.length; ++index)\n        {\n          var tbody = table.tBodies.item(index);\n          if (tbody.firstChild.nodeType == Node.COMMENT_NODE)\n          {\n            var comment = tbody.firstChild;\n            while (comment.nextSibling)\n            {\n              tbody.removeChild(comment.nextSibling);\n              ++count;\n            }\n          }\n        }\n    });\n}\n\nfunction decompact(id)\n{\n  var table = fileById(id);\n\n  if (!table.children(\"colgroup\").size())\n    table.prepend(\"<colgroup><col class=edge><col class=linenr><col class=line><col class=middle><col class=middle><col class=line><col class=linenr><col class=edge></colgroup>\");\n\n  table.each(function (index, table)\n    {\n      var parent;\n\n      if (table.hasAttribute(\"critic-parent-index\"))\n        parent = \"p\" + table.getAttribute(\"critic-parent-index\");\n      else\n        parent = \"\";\n\n      if (table.disableCompact)\n        return;\n\n      function unpack(line)\n      {\n        return line.replace(/<([bi])([a-z]+)>/g, \"<$1 class=$2>\");\n      }\n\n      for (index = 0; index < table.tBodies.length; ++index)\n      {\n        var tbody = table.tBodies.item(index);\n        if (tbody.firstChild.nodeType == Node.COMMENT_NODE && !tbody.firstChild.nextSibling)\n        {\n          var data = JSON.parse(tbody.firstChild.nodeValue);\n\n          var file_id = data[0];\n          var sides = data[1];\n          var old_offset = data[2];\n          var new_offset = data[3];\n          var lines = data[4];\n          var html = \"\";\n\n          for (var line_index = 0; line_index < lines.length; ++line_index)\n          {\n            var line = lines[line_index];\n\n            var line_type = line[0];\n            var item_index = 1;\n\n            var line_old_offset = 0, line_new_offset = 0;\n\n            if (line_type != INSERTED)\n              line_old_offset = old_offset++;\n     
       if (line_type != DELETED && line_type != CONFLICT)\n              line_new_offset = new_offset++;\n\n            var line_id = parent + \"f\" + file_id + \"o\" + line_old_offset + \"n\" + line_new_offset;\n\n            html += \"<tr class='line \" + (sides != 2 ? \"single \" : \"\") +\n                    line_classes[line_type] + \"' id='\" + line_id + \"'>\" +\n                    \"<td class=edge>&nbsp;</td>\" +\n                    \"<td class='linenr old'>\";\n\n            if (sides == 2)\n            {\n              if (line_type != INSERTED)\n                html += line_old_offset + \"</td><td class='line old' id=\" + parent + \"f\" +\n                        file_id + \"o\" + line_old_offset + \">\" + unpack(line[item_index++]);\n              else\n                html += \"&nbsp;</td><td class='line old'>&nbsp;\";\n\n              html += \"</td><td class='middle' colspan=2>&nbsp;</td>\" +\n                      \"<td class='line new'\";\n\n              if (line_type != DELETED && line_type != CONFLICT)\n                html += \" id=\" + parent + \"f\" + file_id + \"n\" + line_new_offset + \">\" +\n                        unpack(line[item_index++]) + \"</td><td class='linenr new'>\" + line_new_offset;\n              else\n                html += \">&nbsp;</td><td class='linenr new'>&nbsp;\";\n            }\n            else\n            {\n              if (line_type == DELETED)\n                html += line_old_offset + \"</td><td class='line single old' id=\" + parent + \"f\" +\n                        file_id + \"o\" + line_old_offset + \" colspan=4>\" + unpack(line[item_index++]) +\n                        \"</td><td class='linenr new'>\" + line_old_offset;\n              else\n                html += line_new_offset + \"</td><td class='line single new' id=\" + parent + \"f\" +\n                        file_id + \"n\" + line_new_offset + \" colspan=4>\" + unpack(line[item_index++]) +\n                        \"</td><td class='linenr new'>\" + 
line_new_offset;\n            }\n\n            html += \"</td><td class=edge>&nbsp;</td></tr>\";\n          }\n\n          tbody = $(tbody);\n          tbody.append(html);\n\n          if (typeof review != \"undefined\")\n          {\n            tbody.find(\"td.line\").mousedown(startCommentMarking);\n            tbody.find(\"td.line\").mouseover(continueCommentMarking);\n            tbody.find(\"td.line\").mouseup(endCommentMarking);\n          }\n\n          updateBlame(parseInt(id));\n        }\n      }\n    });\n}\n\nfunction restoreFile(id)\n{\n  if (id in extractedFiles)\n  {\n    var table = extractedFiles[id][0];\n    var placeholder = extractedFiles[id][1];\n\n    delete extractedFiles[id];\n\n    placeholder.replaceWith(table);\n\n    if (typeof review != \"undefined\")\n    {\n      table.find(\"td.line\").mousedown(startCommentMarking);\n      table.find(\"td.line\").mouseover(continueCommentMarking);\n      table.find(\"td.line\").mouseup(endCommentMarking);\n    }\n  }\n}\n\nfunction restoreAllFiles()\n{\n  for (var id in extractedFiles)\n    restoreFile(id);\n}\n\nfunction toggleFile(table)\n{\n  table = $(table);\n\n  if (table.hasClass(\"expanded\"))\n    collapseFile(table.attr(\"critic-file-id\"));\n  else\n    expandFile(table.attr(\"critic-file-id\"));\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n}\n\nfunction fileById(id)\n{\n  if (typeof parentsCount != \"undefined\")\n  {\n    var selector = [];\n    for (index = 0; index < parentsCount; ++index)\n      selector.push(\"#p\" + index + \"f\" + id);\n    return $(selector.join(\", \"));\n  }\n  else\n    return $(\"#f\" + id);\n}\n\nfunction collapseFile(id, implicit)\n{\n  var table = fileById(id);\n\n  table.removeClass(\"expanded\");\n  recompact(id);\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n\n  if (!implicit)\n    saveState();\n}\n\nfunction expandFile(id, scroll)\n{\n  if (typeof parentsCount != 
\"undefined\")\n    if (selectedParent == null || !document.getElementById(\"p\" + selectedParent + \"f\" + id))\n      for (var index = 0; index < parentsCount; ++index)\n        if (document.getElementById(\"p\" + index + \"f\" + id))\n        {\n          selectParent(index);\n          break;\n        }\n\n  restoreFile(id);\n\n  var table = currentFile = fileById(id);\n\n  decompact(id);\n  table.addClass(\"show expanded\");\n\n  if (scroll)\n  {\n    if (table.offset().top + table.height() > pageYOffset + innerHeight || table.offset().top < scrollY)\n      scrollTo(pageXOffset, table.offset().top);\n  }\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n\n  saveState();\n}\n\nfunction hideFile(id)\n{\n  var table = fileById(id);\n\n  table.removeClass(\"show\");\n\n  recompact(id);\n}\n\nfunction showFile(id)\n{\n  if (typeof parentsCount != \"undefined\")\n    if (selectedParent == null || !document.getElementById(\"p\" + selectedParent + \"f\" + id))\n      for (var index = 0; index < parentsCount; ++index)\n        if (document.getElementById(\"p\" + index + \"f\" + id))\n        {\n          selectParent(index);\n          break;\n        }\n\n  restoreFile(id);\n\n  var table = fileById(id);\n\n  table.addClass(\"show expanded\");\n}\n\nfunction collapseAll(implicit)\n{\n  var changed = false;\n\n  $(\"table.file.expanded\").each(function (index, table)\n    {\n      changed = true;\n\n      table = $(table);\n      table.removeClass(\"expanded\");\n\n      var id = table.attr(\"critic-file-id\");\n\n      recompact(id);\n    });\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n\n  if (!implicit && changed)\n    saveState();\n}\n\nfunction expandAll()\n{\n  showAll(true);\n\n  var changed = false;\n\n  $(\"table.file\").each(function (index, table)\n    {\n      table = $(table);\n\n      var id = table.attr(\"critic-file-id\");\n\n      if (!table.hasClass(\"expanded\"))\n      {\n        
decompact(id);\n        changed = true;\n        table.addClass(\"expanded\");\n      }\n    });\n\n  if (changed)\n    saveState();\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n}\n\nvar mode = \"hide\";\n\nfunction hideAll(implicit)\n{\n  if (/showcomments?$/.test(location.pathname))\n    return;\n\n  mode = \"hide\";\n  $(\"table.file\").each(function (index, table)\n    {\n      hideFile($(table).attr(\"critic-file-id\"));\n    });\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n\n  if (!implicit)\n    saveState();\n}\n\nfunction showAll(implicit)\n{\n  mode = \"show\";\n\n  restoreAllFiles();\n\n  var changed = false;\n\n  $(\"table.file\").each(function (index, table)\n    {\n      table = $(table);\n\n      var id = table.attr(\"critic-file-id\");\n\n      if (!table.hasClass(\"show\"))\n      {\n        changed = true;\n        if (table.hasClass(\"expanded\"))\n          decompact(id);\n        table.addClass(\"show\");\n      }\n    });\n\n  if (!implicit && changed)\n    saveState();\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n}\n\nvar isRestoringState = false;\nvar saveStateTimer = null;\nvar previousFilesView = {};\n\nfunction isFilesViewEqual(first, second)\n{\n  for (var id in first)\n    if (first[id] != second[id])\n      return false;\n\n  return true;\n}\n\nfunction queueSaveState(replace)\n{\n  if (saveStateTimer)\n    clearTimeout(saveStateTimer);\n\n  saveStateTimer = setTimeout(function () { saveState(replace); }, 1500);\n}\n\nfunction saveState(replace)\n{\n  if (!isRestoringState)\n  {\n    var filesView = {};\n\n    $(\"table.file\").each(function (index, table)\n      {\n        table = $(table);\n\n        var id = table.attr(\"critic-file-id\");\n\n        if (table.hasClass(\"show\"))\n          filesView[id] = table.hasClass(\"expanded\") ? 
EXPAND : SHOW;\n        else\n          filesView[id] = HIDE;\n      });\n\n    var state = { filesView: filesView, scrollLeft: pageXOffset, scrollTop: pageYOffset };\n\n    if (isFilesViewEqual(filesView, previousFilesView))\n      replace = true;\n    else\n      previousFilesView = filesView;\n\n    if (!replace)\n    {\n      if (typeof history.pushState == \"function\")\n        history.pushState(state, document.title, location.href);\n    }\n    else\n    {\n      if (typeof history.replaceState == \"function\")\n        history.replaceState(state, document.title, location.href);\n    }\n  }\n\n  clearTimeout(saveStateTimer);\n  saveStateTimer = null;\n}\n\nfunction restoreState(state)\n{\n  isRestoringState = true;\n\n  hideAll(true);\n\n  if (state)\n  {\n    for (var id in state.filesView)\n      switch (state.filesView[id])\n      {\n      case EXPAND:\n        expandFile(id);\n        break;\n      case SHOW:\n        showFile(id);\n        collapseFile(id);\n        break;\n      case HIDE:\n        hideFile(id);\n        break;\n      }\n\n    if (typeof state.scrollTop == \"number\")\n      window.scrollTo(state.scrollLeft, state.scrollTop);\n  }\n\n  isRestoringState = false;\n}\n\nvar selectedParent = null;\n\nfunction selectParent(index)\n{\n  for (var other = 0; other < parentsCount; ++other)\n    if (other != index)\n      $(\".parent\" + other).removeClass(\"show\");\n\n  $(\".parent\").removeClass(\"show\");\n  $(\"#p\" + index).addClass(\"show\");\n\n  $(\".parent\" + index).addClass(\"show\");\n\n  selectedParent = index;\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n}\n\ndocument.addEventListener(\"click\", function (ev)\n  {\n    var node = ev.target;\n    while (node)\n    {\n      if (node.nodeName.toLowerCase() == \"thead\" || node.nodeName.toLowerCase() == \"tfoot\" || hasClass(node, \"file-summary\"))\n      {\n        toggleFile($(node).parents(\"table\"));\n      }\n      else if 
(node.nodeName.toLowerCase() == \"a\")\n        return;\n\n      node = node.parentNode;\n    }\n  }, false);\n\nvar currentFile = null;\n\nkeyboardShortcutHandlers.push(function (key)\n  {\n    switch (key)\n    {\n    case 32:\n      if (!currentFile)\n        if (mode == \"hide\")\n          hideAll(true);\n        else\n          collapseAll(true);\n\n      if (pageYOffset + innerHeight >= (currentFile ? (currentFile.offset().top + currentFile.height()) : document.documentElement.scrollHeight))\n      {\n        var nextFile = currentFile ? currentFile.nextAll(\"table.file\").first() : $(\"table.file.first\");\n\n        if (currentFile && currentFile.length)\n        {\n          var id = currentFile.first().attr(\"critic-file-id\");\n\n          if (mode == \"hide\")\n          {\n            $(currentFile).removeClass(\"show\");\n\n            if (typeof CommentMarkers != \"undefined\")\n              CommentMarkers.updateAll();\n          }\n          else\n            collapseFile(id, true);\n\n          if (typeof markFile != \"undefined\")\n          {\n            var parent_index = currentFile.first().attr(\"critic-parent-index\");\n            if (parent_index)\n              parent_index = parseInt(parent_index);\n            else\n              parent_index = null;\n            markFile(\"reviewed\", parseInt(currentFile.first().attr(\"critic-file-id\")), parent_index);\n          }\n        }\n\n        if (nextFile.length)\n        {\n          expandFile(nextFile.first().attr(\"critic-file-id\"), true);\n          return true;\n        }\n        else\n          currentFile = null;\n      }\n      saveState();\n      return false;\n\n    case \"e\".charCodeAt(0):\n      expandAll();\n      return true;\n\n    case \"c\".charCodeAt(0):\n      collapseAll();\n      return true;\n\n    case \"s\".charCodeAt(0):\n      showAll();\n      return true;\n\n    case \"h\".charCodeAt(0):\n      hideAll();\n      return true;\n\n    case 
\"m\".charCodeAt(0):\n      detectMoves();\n      return true;\n\n    case \"b\".charCodeAt(0):\n      blame();\n      return true;\n\n    default:\n      if (typeof parentsCount != \"undefined\")\n        if (key >= \"1\".charCodeAt(0) && key <= \"0\".charCodeAt(0) + parentsCount)\n        {\n          selectParent(key - \"1\".charCodeAt(0));\n          return true;\n        }\n    }\n  });\n\nfunction setSpacerContext(spacer, context)\n{\n  var target = $(spacer).nextAll(\"tr.context\").find(\"td\");\n  if (!target.size())\n  {\n    var row = $(\"<tr class=context><td class=context colspan=8></td></tr>\");\n    $(spacer).after(row);\n    target = row.find(\"td\");\n  }\n  target.text(context);\n}\n\nfunction expand(select, file_id, path, sha1, where, oldOffset, newOffset, total)\n{\n  if (select.value == \"none\")\n    return;\n\n  var spacerCell = select.parentNode;\n  var spacerRow = spacerCell.parentNode;\n  var table = spacerRow.parentNode.parentNode;\n  var count = parseInt(select.value);\n  var deltaOffset = 0, deltaTotal = count, deltaFactor;\n\n  table.disableCompact = true;\n\n  if (where != 'top')\n    deltaOffset = count;\n  if (where == 'middle')\n    deltaFactor = 2;\n  else\n    deltaFactor = 1;\n  deltaTotal *= deltaFactor;\n\n  if (count == total)\n    spacerCell.innerHTML = \"&nbsp;\";\n  else\n  {\n    select.selectedIndex = 0;\n\n    var newTotal = total - deltaTotal;\n\n    select.onchange = function () { expand(this, file_id, path, sha1, where, oldOffset + deltaOffset, newOffset + deltaOffset, total - deltaTotal); };\n    select.options[0].textContent = (total - deltaTotal) + ' lines not shown';\n    select.lastChild.value = newTotal;\n\n    if (select.options.length == 5 && newTotal < 50 * deltaFactor)\n      select.options[3] = null;\n    if (select.options.length == 4 && newTotal < 25 * deltaFactor)\n      select.options[2] = null;\n    if (select.options.length == 3 && newTotal < 10 * deltaFactor)\n      select.options[1] = null;\n\n    
select.blur();\n  }\n\n  var ranges = [];\n\n  /* Request lines above the spacer. */\n  if (where != \"top\")\n    ranges.push({ offset: newOffset, count: count, context: false });\n\n  /* Request lines below the spacer. */\n  if (where != \"bottom\" && (where == 'top' || count < total))\n    ranges.push({ offset: newOffset + total - count, count: count, context: true });\n\n  var data = { repository_id: repository.id,\n               path: path,\n               sha1: sha1,\n               ranges: ranges,\n               tabify: typeof tabified != \"undefined\" };\n\n  var operation = new Operation({ action: \"fetch lines\",\n                                  url: \"fetchlines\",\n                                  data: data });\n  var result = operation.execute();\n\n  /* Add lines below the spacer. */\n  if (where != 'bottom' && (where == 'top' || count < total))\n  {\n    var range = result.ranges.pop();\n    var lines = range.lines;\n    var tbody = nextTableSection(spacerRow);\n    var anchor = tbody.firstChild;\n\n    for (var index = 0; index < lines.length; ++index)\n      tbody.insertBefore(makeLine(file_id,\n                                  oldOffset + total - count + index,\n                                  lines[index],\n                                  lines[index],\n                                  newOffset + total - count + index),\n                         anchor);\n\n    setSpacerContext(spacerRow, range.context || \"\");\n\n    if (typeof CommentMarkers != \"undefined\")\n      CommentMarkers.updateAll();\n  }\n\n  if (where != \"top\")\n  {\n    var lines = result.ranges.pop().lines;\n    var tbody = previousTableSection(spacerRow);\n\n    for (var index = 0; index < lines.length; ++index)\n      tbody.appendChild(makeLine(file_id,\n                                 oldOffset + index,\n                                 lines[index],\n                                 lines[index],\n                                 newOffset + index));\n\n    if 
(count == total && where == 'middle')\n    {\n      var next = nextTableSection(spacerRow);\n      var spacerSection = spacerRow.parentNode;\n\n      spacerSection.parentNode.removeChild(spacerSection);\n      while (next.firstChild)\n        tbody.appendChild(next.firstChild);\n      next.parentNode.removeChild(next);\n    }\n  }\n\n  if (typeof CommentMarkers != \"undefined\")\n    CommentMarkers.updateAll();\n}\n\nfunction createReview()\n{\n  location.href = \"/createreview?repository=\" + repository.id + \"&commits=\" + changeset.commits.join(\",\");\n}\n\nfunction customProcessCommits()\n{\n  location.href = \"/processcommits?review=\" + review.id + \"&commits=\" + changeset.commits.join(\",\");\n}\n\nfunction fetchFile(fileset, file_id, side, replace_tbody)\n{\n  var data = { repository_id: repository.id,\n               path: fileset[file_id].path,\n               sha1: fileset[file_id][side + \"_sha1\"],\n               ranges: [{ offset: 1, count: -1, context: false }],\n               tabify: typeof tabified != \"undefined\" };\n\n  var operation = new Operation({ action: \"fetch lines\",\n                                  url: \"fetchlines\",\n                                  data: data });\n  var result = operation.execute();\n\n  var lines = result.ranges[0].lines;\n  var html = \"<tbody class=lines>\";\n  var deleted = side == \"old\";\n  var row_class = deleted ? \"deleted\" : \"inserted\";\n\n  for (var offset = 1; offset <= lines.length; ++offset)\n  {\n    var line = lines[offset - 1] || \"&nbsp;\";\n    var row_id = \"f\" + file_id + (deleted ? \"o\" + offset + \"n0\" : \"o0n\" + offset);\n    var cell_id = \"f\" + file_id + (deleted ? 
\"o\" + offset : \"n\" + offset);\n\n    html += \"<tr class='line single \" + row_class + \"' id=\" + row_id + \">\"\n          +   \"<td class=edge></td>\"\n          +   \"<td class='linenr old'>\" + offset + \"</td>\"\n          +   \"<td class='line single \" + side + \"' id=\" + cell_id + \" colspan=4>\" + line + \"</td>\"\n          +   \"<td class='linenr new'>\" + offset + \"</td>\"\n          +   \"<td class=edge></td>\"\n          + \"</tr>\";\n  }\n\n  html += \"</tbody>\";\n\n  var tbody = $(html);\n\n  if (typeof review != \"undefined\")\n  {\n    tbody.find(\"td.line\").mousedown(startCommentMarking);\n    tbody.find(\"td.line\").mouseover(continueCommentMarking);\n    tbody.find(\"td.line\").mouseup(endCommentMarking);\n  }\n\n  tbody.replaceAll($(replace_tbody));\n}\n\nfunction detectMoves()\n{\n  var content = $(\"<div title='Detect Moved Code' class='detectmoves'><p>Source file:<br><select class='source'><option value='any'></option></select></p><p>Target file:<br><select class='target'><option value='any'></option></select></p></div>\");\n\n  var selects = content.find(\"select\");\n  var source = selects.filter(\".source\");\n  var target = selects.filter(\".target\");\n  var fileids = {};\n  var paths = [];\n  var expanded_files = [];\n\n  for (var name in files)\n    if (/^\\d+$/.test(name))\n    {\n      var fileid = parseInt(name);\n      var path = files[fileid].path;\n\n      fileids[path] = fileid;\n      paths.push(path);\n\n      if ($(\"#f\" + fileid).is(\".expanded\"))\n        expanded_files.push(fileid);\n    }\n\n  paths.sort();\n\n  for (var index = 0; index < paths.length; ++index)\n  {\n    var path = paths[index];\n    var fileid = fileids[path];\n    var selected;\n\n    if (expanded_files.length == 1 && expanded_files[0] == fileid)\n      selected = \" selected\";\n    else\n      selected = \"\";\n\n    selects.append(\"<option value='\" + fileid + \"'\" + selected + \">\" + htmlify(path) + \"</option>\");\n  }\n\n  
function finish()\n  {\n    var source_arg = source.val() == \"any\" ? \"\" : \"&sourcefiles=\" + source.val();\n    var target_arg = target.val() == \"any\" ? \"\" : \"&targetfiles=\" + target.val();\n\n    if (typeof review != \"undefined\")\n      location.href = \"/\" + changeset.parent.sha1 + \"..\" + changeset.child.sha1 + \"?review=\" + review.id + \"&moves=yes\" + source_arg + target_arg;\n    else\n      location.href = \"/\" + repository.name + \"/\" + changeset.parent.sha1 + \"..\" + changeset.child.sha1 + \"?moves=yes\" + source_arg + target_arg;\n  }\n\n  content.dialog({ width: 600,\n                   buttons: { Search: function () { finish(); content.dialog(\"close\"); },\n                              Cancel: function () { content.dialog(\"close\"); } } });\n\n  selects.chosen({ placeholder_text: \"Any\",\n                   allow_single_deselect: true });\n}\n\nvar BLAME = null;\n\nfunction fetchBlame()\n{\n  if (BLAME === null)\n  {\n    var files = [];\n\n    for (var file_id in blocks)\n    {\n      var raw_blocks = blocks[file_id];\n      var fine_blocks = new Array(raw_blocks.length);\n\n      for (var index = 0; index < raw_blocks.length; ++index)\n        fine_blocks[index] = { first: raw_blocks[index][0], last: raw_blocks[index][1] };\n\n      files.push({ id: ~~file_id, blocks: fine_blocks });\n    }\n\n    var operation = new Operation({ action: \"blame lines\",\n                                    url: \"blame\",\n                                    data: { repository_id: repository.id,\n                                            changeset_id: changeset.id,\n                                            files: files }\n                                  });\n    var result = operation.execute();\n\n    if (result)\n    {\n      BLAME = result;\n      BLAME.color_index = 0;\n\n      for (var index = 0; index < BLAME.commits.length; ++index)\n      {\n        var commit = BLAME.commits[index];\n\n        if (commit.original)\n          
BLAME.original = commit;\n        if (commit.current)\n          BLAME.current = commit;\n      }\n\n      BLAME.file_by_id = {};\n\n      for (var index = 0; index < BLAME.files.length; ++index)\n      {\n        var file = BLAME.files[index];\n        BLAME.file_by_id[file.id] = file;\n      }\n    }\n  }\n}\n\nfunction updateBlame(file_id)\n{\n  function getColor(index)\n  {\n    var compentvalues = [0xff, 0x80, 0xc0, 0x40, 0xe0, 0xa0, 0x60, 0x20];\n    var cv1 = compentvalues[parseInt(index / 6) % 8], cv2 = parseInt(cv1 / 2), pattern;\n\n    cv1 = cv1.toString(16);\n    if (cv1.length == 1)\n      cv1 = \"0\" + cv1;\n\n    cv2 = cv2.toString(16);\n    if (cv2.length == 1)\n      cv2 = \"0\" + cv2;\n\n    switch (index % 6)\n    {\n    case 0: pattern = \"hhllll\"; break;\n    case 1: pattern = \"llhhll\"; break;\n    case 2: pattern = \"llllhh\"; break;\n    case 3: pattern = \"hhhhll\"; break;\n    case 4: pattern = \"hhllhh\"; break;\n    case 5: pattern = \"llhhhh\"; break;\n    }\n\n    return pattern.replace(/hh/g, cv1).replace(/ll/g, cv2);\n  }\n\n  function generateTooltip()\n  {\n    return $(this).attr(\"critic-blame-tooltip\");\n  }\n\n  if (BLAME)\n  {\n    for (var file_index = 0; file_index < BLAME.files.length; ++file_index)\n    {\n      var file = BLAME.files[file_index];\n\n      if (!file_id || file.id === file_id)\n      {\n        for (var block_index = 0; block_index < file.blocks.length; ++block_index)\n        {\n          var lines = file.blocks[block_index].lines;\n\n          for (var line_index = 0; line_index < lines.length; ++line_index)\n          {\n            var line = lines[line_index];\n            var commit = BLAME.commits[line.commit];\n            var row = $(\"#f\" + file.id + \"n\" + line.offset).parent(), color_selector, tooltip_selector;\n\n            function addTooltip(element, commit)\n            {\n              element.addClass(\"with-blame-tooltip\");\n              element.attr(\"critic-blame-tooltip\",\n     
                      (\"<div><b><u>\" + htmlify(commit.author_name) + \" &lt;\" + htmlify(commit.author_email) + \"></u></b>\" +\n                            \"<pre>\" + htmlify(commit.message) + \"</pre></div>\"));\n            }\n\n            if (commit.original)\n              addTooltip(row.children(\"td.line\"), commit);\n            else\n            {\n              if (!commit.color)\n                commit.color = getColor(BLAME.color_index++);\n\n              if (row.children(\"td.line.single\").size())\n                row.children(\"td.linenr\").css(\"background-color\", \"#\" + commit.color);\n              else\n                row.children(\"td.middle, td.linenr.new\").css(\"background-color\", \"#\" + commit.color);\n\n              if (!row.hasClass(\"inserted\"))\n                addTooltip(row.children(\"td.line.old\"), BLAME.original);\n\n              addTooltip(row.children(\"td.line.new\"), commit);\n            }\n          }\n        }\n      }\n    }\n\n    /* This is a workaround for an issue where a tooltip isn't always removed\n       when the mouse pointer is moved to a different element, leading to\n       multiple tooltips on-top of each other. */\n    var current_tooltip = null;\n    function tooltipOpened(event, ui)\n    {\n      if (current_tooltip !== null)\n        $(current_tooltip.tooltip).remove();\n      current_tooltip = ui;\n    }\n    function tooltipClosed(event, ui)\n    {\n      current_tooltip = null;\n    }\n    $(document).mouseover(\n      function (ev)\n      {\n        if (current_tooltip &&\n            !$(ev.target).closest(\"td.with-blame-tooltip\").size() &&\n            !$(ev.target).is(\"td.with-blame-tooltip\"))\n          $(\"td.with-blame-tooltip\").tooltip(\"close\");\n      });\n    /* End of workaround. 
*/\n\n    $(\"td.with-blame-tooltip\").tooltip({\n      content: generateTooltip,\n      items: \"td.with-blame-tooltip\",\n      tooltipClass: \"blame-tooltip\",\n      track: true,\n      hide: false,\n      open: tooltipOpened,\n      close: tooltipClosed\n    });\n  }\n}\n\nfunction blame()\n{\n  fetchBlame();\n  updateBlame();\n}\n\nfunction registerPathHandlers()\n{\n  $(\"table.commit-files td.path\").click(function (ev)\n    {\n      try\n      {\n        if (mode == \"hide\")\n          hideAll(true);\n        else\n          collapseAll(true);\n\n        file_id = ev.currentTarget.parentNode.getAttribute(\"critic-file-id\");\n\n        expandFile(file_id, true);\n      }\n      catch (e)\n      {\n        console.log(e.message + \"\\n\" + e.stacktrace);\n      }\n\n      ev.preventDefault();\n      ev.target.blur();\n    });\n}\n\n$(document).ready(function ()\n  {\n    var match = /#f(\\d+)([on])(\\d+)/.exec(location.hash);\n    if (match)\n    {\n      expandFile(parseInt(match[1]));\n      location.hash = location.hash;\n    }\n\n    $(\"table.commit-files td.parent\").mouseover(function (ev)\n      {\n        var target = $(ev.currentTarget);\n\n        target.addClass(\"hover\");\n\n        if (target.prev(\"td.parent\").first().attr(\"critic-parent-index\") == target.attr(\"critic-parent-index\"))\n          target.prev(\"td.parent\").first().addClass(\"hover\");\n        if (target.next(\"td.parent\").first().attr(\"critic-parent-index\") == target.attr(\"critic-parent-index\"))\n          target.next(\"td.parent\").first().addClass(\"hover\");\n      });\n\n    $(\"table.commit-files td.parent\").mouseout(function (ev)\n      {\n        var target = $(ev.currentTarget);\n\n        target.removeClass(\"hover\");\n\n        if (target.prev(\"td.parent\").first().attr(\"critic-parent-index\") == target.attr(\"critic-parent-index\"))\n          target.prev(\"td.parent\").first().removeClass(\"hover\");\n        if 
(target.next(\"td.parent\").first().attr(\"critic-parent-index\") == target.attr(\"critic-parent-index\"))\n          target.next(\"td.parent\").first().removeClass(\"hover\");\n      });\n\n    $(\"table.commit-files td.parent\").click(function (ev)\n      {\n        var target = $(ev.currentTarget);\n        var file_id = target.parentsUntil(\"table\").filter(\"tr\").attr(\"critic-file-id\");\n        var parent = target.attr(\"critic-parent-index\");\n\n        if (mode == \"hide\")\n          hideAll(true);\n        else\n          collapseAll(true);\n\n        selectParent(parent);\n        expandFile(file_id, true);\n\n        ev.preventDefault();\n      });\n  });\n\nfunction applyLengthLimit(lines)\n{\n  lines.each(\n    function (index, element)\n    {\n      var limit = element.getAttribute(\"critic-length-limit\");\n      if (limit)\n      {\n        var match = /(\\d+)-(\\d+)/.exec(limit);\n        var low_limit = parseInt(match[1]);\n        var high_limit = parseInt(match[2]);\n\n        if (element.textContent.length > low_limit)\n        {\n          var iterator = document.createNodeIterator(element, NodeFilter.SHOW_TEXT, null, false);\n          var texts = [], text, seen = 0;\n\n          while (text = iterator.nextNode())\n            texts.push(text);\n\n          for (var index = 0; index < texts.length; ++index)\n          {\n            var text = texts[index];\n            var html = \"\";\n            var offset = Math.min(Math.max(0, low_limit - seen), text.length), end = Math.min(text.length, Math.max(0, high_limit - seen));\n\n            if (offset > 0)\n            {\n              html += htmlify(text.data.substring(0, offset));\n              seen += offset;\n            }\n\n            for (; offset < end; ++offset)\n            {\n              var redness = Math.min(100, 100 * (seen - low_limit) / (high_limit - low_limit)).toFixed(1);\n              html += \"<span style='color: rgb(\" + redness + \"%, 0%, 0%)'>\" + 
htmlify(text.data.substring(offset, offset + 1)) + \"</span>\";\n              ++seen;\n            }\n\n            if (offset < text.length)\n              html += \"<span style='color: rgb(100%, 0%, 0%)'>\" + htmlify(text.data.substring(offset)) + \"</span>\"\n\n            $(\"<div>\" + html + \"</div>\").contents().replaceAll($(text));\n          }\n        }\n      }\n    });\n}\n\n(function() {\n  /*Handle resizing of the left and right diff views\n    by dragging divider between them. */\n\n  var currentTable = null; ///< cached reference to the table whose panes are being resized\n  var currentCols = null; ///< cached reference to the col elements that are being resized\n  var tableCoord = { left: 0, width: 0 }\n  var HALF_DIVIDER_WIDTH = 15; ///< half of the width of the divider between diff views (somewhat arbitrary)\n\n  document.addEventListener('mousedown', handleMouseDown);\n  document.addEventListener('dblclick', handleDblClick);\n\n  function handleMouseDown(e)\n  {\n    if (e.button != 0)\n      return;\n\n    var mid_cell = $(e.target);\n    if (!mid_cell.is('td.middle'))\n      return;\n\n    var table = mid_cell.parents('table');\n    if (!table.length)\n      return;\n\n    currentTable = table;\n    currentCols = table.find('colgroup col.line');\n    if (currentCols.length != 2)\n      return;\n\n    /* Store clicked element's offset relative to the table. It can change\n       during wrapping and we want to restore previous position the element\n       had on screen. */\n    var offset_before = mid_cell.offset().top;\n    table.addClass(\"resized\");\n    window.scrollBy(0, mid_cell.offset().top - offset_before);\n\n    /* Calculate offsets from the sibling cells of the clicked one.\n       WebKit is unable to get dimensions from the col elements. 
*/\n    var panes = mid_cell.parent().find(\"td.line\");\n    tableCoord.left = $(panes[0]).offset().left;\n    tableCoord.width = $(panes[0]).width() + $(panes[1]).width();\n\n    document.addEventListener('mousemove', handleMouseMove);\n    document.addEventListener('mouseup', handleMouseUp);\n    e.preventDefault();\n  }\n\n  function handleMouseUp(e)\n  {\n    currentTable = currentCols = null;\n    document.removeEventListener('mouseup', handleMouseUp);\n    document.removeEventListener('mousemove', handleMouseMove);\n  }\n\n  function handleMouseMove(e)\n  {\n    if (currentCols)\n    {\n      var leftDiffPaneWidth = e.pageX - tableCoord.left - HALF_DIVIDER_WIDTH;\n      leftDiffPaneWidth = Math.min(tableCoord.width, Math.max(0, leftDiffPaneWidth));\n      var rightDiffPaneWidth = (tableCoord.width - leftDiffPaneWidth);\n      $(currentCols[0]).css('width', leftDiffPaneWidth + 'px');\n      $(currentCols[1]).css('width', rightDiffPaneWidth + 'px');\n      if (typeof CommentMarkers != \"undefined\")\n        CommentMarkers.updateAll();\n\n      if (leftDiffPaneWidth < rightDiffPaneWidth)\n        currentTable.removeClass(\"new-narrower\").addClass(\"old-narrower\");\n      else if (leftDiffPaneWidth > rightDiffPaneWidth)\n        currentTable.removeClass(\"old-narrower\").addClass(\"new-narrower\");\n      else\n        currentTable.removeClass(\"old-narrower new-narrower\");\n    }\n  }\n\n  function handleDblClick(e)\n  {\n    var mid_cell = $(e.target);\n    if (!mid_cell.is('td.middle'))\n      return;\n\n    var table = mid_cell.parents('table');\n    var cols = table.find('colgroup col.line');\n    if (cols.length == 2)\n    {\n      /* Center diff division (reset to default). 
*/\n      table.removeClass(\"resized old-narrower new-narrower\");\n      cols.removeAttr('style');\n      if (typeof CommentMarkers != \"undefined\")\n        CommentMarkers.updateAll();\n    }\n  }\n})();\n\nwindow.addEventListener(\"popstate\", function (ev)\n  {\n    if (ev.state)\n      restoreState(ev.state);\n  }, false);\n\nif (typeof history.replaceState == \"function\")\n{\n  document.addEventListener(\"DOMContentLoaded\", function (ev)\n    {\n      saveState(true);\n    });\n  window.addEventListener(\"scroll\", function (ev)\n    {\n      queueSaveState(true);\n    });\n}\n\n"
  },
  {
    "path": "src/resources/checkbranch.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main table.branchstatus,\ndiv.main table.data {\n    table-layout: fixed\n}\n\ndiv.main table td.h1 p {\n    padding-left: 1em\n}\n\ndiv.main table td.h1 p span.command {\n    font-family: monospace;\n    background: #eec;\n    padding: 3px 6px\n}\n\ndiv.main table.branchstatus thead tr.headings th {\n    padding-top: 0.5em;\n    text-decoration: underline\n}\ndiv.main table.branchstatus thead tr.headings th.sha1 {\n    text-align: center\n}\ndiv.main table.branchstatus thead tr.headings th.user,\ndiv.main table.branchstatus thead tr.headings th.summary {\n    padding-left: 1em;\n    text-align: left\n}\ndiv.main table.branchstatus thead tr.headings th.review {\n    padding-left: 0;\n    padding-right: 0;\n    text-align: right\n}\n\ndiv.main table.branchstatus tbody tr.commit td.sha1 {\n    padding-left: 0\n}\n\ndiv.main table.branchstatus tbody tr.commit td.sha1 div {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ndiv.main table.branchstatus tbody.reviewed tr.commit td.sha1 div {\n    background-color: lime\n}\ndiv.main table.branchstatus tbody.pending tr.commit td.sha1 div {\n    background-color: red\n}\ndiv.main table.branchstatus tbody.note tr.commit td.sha1 div {\n    background-color: yellow\n}\ndiv.main table.branchstatus 
tbody.unknown tr.commit td.sha1 div {\n    background-color: red\n}\n\ndiv.main table.branchstatus tbody.unknown tr.commit.own td.summary a {\n    color: red;\n    font-weight: bold\n}\n\ndiv.main table.branchstatus tbody tr.commit td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\ndiv.main table.branchstatus tbody tr.commit td.sha1 {\n    padding-top: 0;\n    padding-bottom: 0\n}\n\ndiv.main table.branchstatus tbody tr.empty td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ndiv.main table.branchstatus tbody tr.commit td.sha1 {\n    font-family: monospace;\n    text-align: center;\n    vertical-align: top\n}\ndiv.main table.branchstatus tbody tr.commit td.user {\n    font-family: monospace;\n    font-weight: bold;\n    vertical-align: top\n}\ndiv.main table.branchstatus tbody tr.commit td.summary {\n    font-family: monospace;\n}\ndiv.main table.branchstatus tbody tr.commit td.review {\n    vertical-align: top;\n    text-align: right\n}\ndiv.main table.branchstatus tbody tr td.edit {\n    vertical-align: top;\n    text-align: right\n}\n\ndiv.main table.branchstatus tbody tr.note td.note {\n    font-family: serif;\n    text-align: right\n}\ndiv.main table.branchstatus tbody tr.note td.note span.user {\n    font-weight: bold\n}\n\ndiv.main table.branchstatus tbody:hover tr.commit,\ndiv.main table.branchstatus tbody:hover tr.note {\n    background-color: #eec\n}\n\ndiv.main table.branchstatus tbody tr.commit:hover {\n    background-color: #998;\n}\n\ndiv.main table.branchstatus tbody tr td.edit a.edit {\n    color: #eec;\n    text-decoration: none\n}\ndiv.main table.branchstatus tbody:hover tr td.edit a.edit {\n    color: #cca\n}\ndiv.main table.branchstatus tbody tr:hover td.edit a.edit {\n    color: #222\n}\ndiv.main table.branchstatus tbody.unknown tr.commit.own td.edit a.edit {\n    color: red;\n    font-weight: bold\n}\ndiv.main table.branchstatus tbody tr td.edit a.edit:hover {\n    text-decoration: underline\n}\n\ndiv.main table td.value input 
{\n    font-family: monospace;\n    width: 40em\n}\n\ndiv.main table td.value input[name=\"fetch\"] {\n    width: auto\n}\n\ndiv.main table td.value select {\n    width: 40em\n}\n\ndiv.comment > p {\n    margin: 5px 0\n}\n\ndiv.comment > p > span.review-id {\n    font-family: monospace\n}\n\ndiv.comment > p > select {\n    width: 100%\n}\n\ndiv.comment > p > a {\n    float: right\n}\n\ndiv.comment > div.warning {\n    font-weight: bold;\n    color: #f00\n}\n\ndiv.comment > div.header {\n    padding-top: 10px\n}\n\ndiv.comment > div.header > span.author {\n    font-weight: bold\n}\n\ndiv.comment > div.text {\n    font-family: monospace;\n    font-size: 11px;\n    background-color: #fff;\n    border: 1px solid #bbb;\n    margin-top: 3px;\n    padding: 5px;\n    white-space: pre-wrap\n}\n\ndiv.comment > div.text > textarea {\n    font-family: monospace;\n    font-size: 11px;\n    background-color: #fff;\n    border: none;\n    width: 100%;\n    resize: none\n}\n\ndiv.comment-dialog > div.resolution,\ndiv.comments > div.resolution {\n    font-weight: bold;\n    font-size: 130%;\n    text-align: center;\n    padding-top: 10px\n}\n\ndiv.legend {\n    margin-top: 10px;\n    border-top: 2px solid #A0A092;\n    text-align: right;\n    padding-top: 10px;\n    padding-right: 1em;\n    font-weight: bold\n}\n\ndiv.legend span {\n    padding: 3px 2em;\n    font-family: monospace\n}\n\ndiv.legend span.red {\n    background-color: red\n}\ndiv.legend span.yellow {\n    background-color: yellow\n}\ndiv.legend span.green {\n    background-color: lime\n}\n"
  },
  {
    "path": "src/resources/checkbranch.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction deleteNote(sha1, parentDialog)\n{\n  var content = $(\"<div class='comment' title='Delete Note'>Are you sure?</div>\");\n\n  function finish()\n  {\n    /* Set by the success callback below; the request is synchronous. */\n    var finished = false;\n\n    $.ajax({ async: false,\n             url: \"/deletecheckbranchnote?repository=\" + repository.id + \"&branch=\" + branch + \"&upstream=\" + upstream + \"&sha1=\" + sha1,\n             dataType: \"text\",\n             success: function (data)\n               {\n                 if (data == \"ok\")\n                   finished = true;\n                 else\n                   reportError(\"delete note\", \"Server reply: <i>\" + data + \"</i>\");\n               },\n             error: function ()\n               {\n                 reportError(\"delete note\", \"Request failed.\");\n               }\n           });\n\n    if (finished)\n    {\n      content.dialog(\"close\");\n      location.reload();\n    }\n\n    return finished;\n  }\n\n  content.dialog({ modal: true,\n                   buttons: { Delete: function () { content.dialog(\"close\"); if (finish()) { parentDialog.dialog(\"close\"); } },\n                              Cancel: function () { content.dialog(\"close\"); }}});\n}\n\nfunction editCommit(sha1, commit_id, has_note, old_review_id)\n{\n  var row = $(\"tr.commit#\" + sha1);\n  var text = row.parent(\"tbody.note\").find(\"span.text\").text();\n  var 
suggestions = \"\";\n\n  if (old_review_id == void 0)\n  {\n    var operation = new Operation({ action: \"suggest reviews\",\n                                    url: \"suggestreviews\",\n                                    data: { repository_id: repository.id,\n                                            sha1: sha1 }});\n    var result = operation.execute();\n\n    if (result)\n    {\n      suggestions = \"<p><b>Suggested reviews:</b><br><select><option>(nothing selected)</option>\";\n      for (var id in result.reviews)\n        suggestions += \"<option value=\" + id + \">[r/\" + id + \"] \" + result.reviews[id] + \"</option>\";\n      suggestions += \"</select></p>\";\n    }\n    else\n      return;\n  }\n\n  function rebase(review_id)\n  {\n    function proceed()\n    {\n      var operation = new Operation({ action: \"rebase review\",\n                                      url: \"rebasereview\",\n                                      data: { review_id: review_id,\n                                              sha1: sha1,\n                                              branch: branch },\n                                      wait: \"Rebasing review...\" });\n\n      if (operation.execute())\n      {\n        confirm.dialog(\"close\");\n        location.reload();\n      }\n    }\n\n    var confirm = $(\"<div class='comment' title='Confirm Review Rebase'>The review <a href='/r/\" + review_id + \"'>r/\" + review_id + \"</a> can be rebased to contain this single commit.  
If the commit is a squash of all changes in the review, this is the appropriate thing to do.</div>\");\n\n    confirm.dialog({ width: 400, modal: true, buttons: { \"Rebase Review\": function () { proceed(); }, \"Don't Rebase\": function () { confirm.dialog(\"close\"); content.dialog(\"close\"); location.reload(); } } });\n  }\n\n  function finish()\n  {\n    var new_review_id = content.find(\"input[type=text]\").val();\n    var new_text = content.find(\"textarea\").val();\n\n    if (new_review_id)\n      if (!/^[1-9][0-9]*$/.test(new_review_id))\n      {\n        alert(\"Invalid review ID; must be a positive integer!\");\n        return;\n      }\n      else\n        new_review_id = parseInt(new_review_id);\n\n    if (!new_review_id && /^\\s*$/.test(new_text))\n    {\n      alert(\"You must enter either a review ID or a comment (or both.)\");\n      return;\n    }\n\n    var finished = false;\n\n    $.ajax({ async: false,\n             type: \"POST\",\n             url: \"/addcheckbranchnote?repository=\" + repository.id + \"&branch=\" + branch + \"&upstream=\" + upstream + \"&sha1=\" + sha1 + (new_review_id ? 
\"&review=\" + new_review_id : \"\"),\n             contentType: \"text/plain\",\n             data: new_text,\n             dataType: \"text\",\n             success: function (data)\n               {\n                 if (data == \"rebase\")\n                   rebase(new_review_id);\n                 else if (data == \"ok\")\n                   finished = true;\n                 else\n                   reportError(\"add note\", \"Server reply: <i>\" + data + \"</i>\");\n               },\n             error: function ()\n               {\n                 reportError(\"add note\", \"Request failed.\");\n               }\n           });\n\n    if (finished)\n    {\n      content.dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  if (old_review_id === void 0)\n    old_review_id = \"\";\n\n  var content = $(\"<div class='comment flex' title='Edit Commit Meta-Data'>\" +\n                    \"<p><b>Review ID:</b> <span class='review-id'>r/<input type='text' value='\" + old_review_id + \"'></span>\" +\n                    suggestions +\n                    \"<p><b>Comment:</b></p>\" +\n                    \"<textarea class='text flexible' rows=5>\" + htmlify(text) + \"</textarea>\" +\n                  \"</div>\");\n\n  content.find(\"select\").change(function () { content.find(\"input[type=text]\").val(content.find(\"select\").val()); });\n  content.find(\"a\").button();\n\n  var buttons = {};\n\n  if (has_note)\n    buttons[\"Delete\"] = function () { deleteNote(sha1, content); }\n\n  buttons[\"Save\"] = function () { finish(); };\n  buttons[\"Cancel\"] = function () { content.dialog(\"close\"); };\n\n  content.dialog({ width: 600, modal: true, buttons: buttons });\n}\n\n$(document).ready(function ()\n  {\n    $(\"button.check\").click(function (ev)\n      {\n        var repository = $(\"select[name='repository']\").val();\n        var commit = $(\"input[name='commit']\").val();\n        var fetch = $(\"input[name='fetch']:checked\").size();\n        var 
upstream = $(\"input[name='upstream']\").val();\n\n        location.href = \"/checkbranch?repository=\" + encodeURIComponent(repository) +\n                                    \"&commit=\" + encodeURIComponent(commit) +\n                                    \"&fetch=\" + (fetch ? \"yes\" : \"no\") +\n                                    \"&upstream=\" + encodeURIComponent(upstream);\n      });\n\n    $(\"a.button\").button();\n  });\n"
  },
  {
    "path": "src/resources/comment.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.comment.draft > div.header > span.time:after {\n    content: \" (draft)\";\n    font-weight: bold;\n    color: #f00\n}\n\nspan.draft {\n    font-weight: bold;\n    color: #f00\n}\n\ndiv.comment > p {\n    margin: 0\n}\n\ndiv.comment > p.state {\n    margin: 0;\n    margin-bottom: 5px;\n    padding-bottom: 5px;\n    border-bottom: 1px solid black;\n    font-weight: bold\n}\n\ndiv.comment > div.warning {\n    font-weight: bold;\n    color: #f00\n}\n\ndiv.comment > div.header {\n    padding-top: 10px\n}\n\ndiv.comment > div.header > span.author {\n    font-weight: bold\n}\n\ndiv.comment > .text {\n    font-family: monospace;\n    font-size: 11px;\n    background-color: #fff;\n    border: 1px solid #bbb;\n    margin-top: 3px;\n    padding: 5px;\n    white-space: pre-wrap\n}\n\ndiv.comment > textarea {\n    width: 100%;\n    resize: none;\n    display: block;\n    outline: 0\n}\n\ndiv.comment-dialog > div.resolution,\ndiv.comments > div.resolution {\n    font-weight: bold;\n    font-size: 130%;\n    text-align: center;\n    padding-top: 10px\n}\n\n.comment-tooltip {\n    max-width: 40em\n}\n\n.comment-tooltip div.tooltip > div.header {\n    font-weight: bold\n}\n\n.comment-tooltip div.tooltip > div.text {\n    padding: 5px;\n    background-color: white;\n    border: 1px solid #bbb;\n    margin-top: 3px;\n  
  white-space: pre-wrap\n}\n\ndiv.marker {\n    position: absolute;\n    border: 2px solid;\n    width: 7px\n}\n\ndiv.marker.issue {\n    background-color: #f88;\n    border-color: #b66;\n    z-index: 2\n}\n\ndiv.marker.issue.addressed, div.marker.issue.closed {\n    background-color: #8f8;\n    border-color: #6b6;\n}\n\ndiv.marker.note {\n    background-color: #ff8;\n    border-color: #bb6;\n    z-index: 1\n}\n\ndiv.marker.new {\n    background-color: #88f;\n    border-color: #66b;\n    z-index: 3\n}\n\ndiv.marker.right {\n    border-top-right-radius: 10px;\n    border-bottom-right-radius: 10px;\n}\n\ndiv.marker.left {\n    border-top-left-radius: 10px;\n    border-bottom-left-radius: 10px;\n}\n\ndiv.comments {\n    max-width: 80em;\n}\n\ndiv.comments.left {\n    margin-left: 0\n}\n\ndiv.comments.center {\n    margin-left: auto;\n    margin-right: auto\n}\n\ndiv.comments.right {\n    margin-left: auto;\n    margin-right: 0\n}\n\ndiv.comments > div.buttons {\n    text-align: right;\n    margin-top: 3px;\n    font-size: 12px\n}\n\ndiv.comment-chain {\n    margin-top: 1em\n}\n\ndiv.comment-chain > table.file span.comment-chain-title {\n    font-weight: bold\n}\n\ndiv.comment-chain tr.content table.commit-info {\n    background-color: #fff;\n    margin-left: auto;\n    margin-right: auto;\n    border: 1px solid #bbb\n}\n\nbutton.hidden {\n    display: none\n}\n"
  },
  {
    "path": "src/resources/comment.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- Mode: js; js-indent-level: 2; indent-tabs-mode: nil -*- */\n\nvar commentChainsPerFile = {};\nvar commentChainById = {};\nvar commentChains = [];\n\nfunction Comment(id, author, time, state, text)\n{\n  this.id = id;\n  this.author = author;\n  this.time = time;\n  this.state = state;\n  this.text = text;\n}\n\nComment.prototype.getLeader = function ()\n  {\n    var leader = htmlify(this.text.substr(0, 80));\n    var linebreak = leader.indexOf(\"\\n\");\n\n    if (linebreak != -1)\n      leader = leader.substr(0, linebreak);\n\n    var period = leader.indexOf(\". 
\");\n\n    if (period != -1)\n      leader = leader.substr(0, period + 1);\n\n    if (this.text.length > 80)\n      leader += \"&#8230;\";\n\n    return leader;\n  };\n\nfunction CommentLines(file, sha1, firstLine, lastLine)\n{\n  this.file = file;\n  this.sha1 = sha1;\n  this.firstLine = firstLine;\n  this.lastLine = lastLine;\n}\n\nCommentLines.prototype.getFirstLine = function (chain)\n  {\n    for (var linenr = this.firstLine; linenr <= this.lastLine; ++linenr)\n    {\n      var base_id, line;\n\n      if (this.file !== null)\n      {\n        var file = files[this.sha1];\n        base_id = \"f\" + this.file + file.side + linenr;\n\n        if (typeof file.parent == \"number\")\n          base_id = \"p\" + file.parent + base_id;\n        else if (window.selectedParent != null)\n          base_id = \"p\" + selectedParent + base_id;\n      }\n      else\n        base_id = \"msg\" + this.firstLine;\n\n      line = document.getElementById(\"c\" + chain.id + base_id);\n      if (line)\n        return line;\n\n      line = document.getElementById(base_id);\n      if (line)\n        return line;\n    }\n\n    //console.log(\"first line missing: f\" + this.file + files[this.sha1].side + this.firstLine);\n    return null;\n  };\n\nCommentLines.prototype.getLastLine = function (chain)\n  {\n    for (var linenr = this.lastLine; linenr >= this.firstLine; --linenr)\n    {\n      var base_id, line;\n\n      if (this.file !== null)\n      {\n        var file = files[this.sha1];\n        base_id = \"f\" + this.file + file.side + linenr;\n\n        if (typeof file.parent == \"number\")\n          base_id = \"p\" + file.parent + base_id;\n        else if (window.selectedParent != null)\n          base_id = \"p\" + selectedParent + base_id;\n      }\n      else\n        base_id = \"msg\" + this.lastLine;\n\n      line = document.getElementById(\"c\" + chain.id + base_id);\n      if (line)\n        return line;\n\n      line = document.getElementById(base_id);\n      if (line)\n  
      return line;\n    }\n\n    //console.log(\"last line missing: f\" + this.file + files[this.sha1].side + this.lastLine);\n    return null;\n  };\n\nfunction CommentChain(id, user, type, type_is_draft, state, closed_by, addressed_by, comments, lines, markers)\n{\n  this.id = id;\n  this.user = user;\n  this.type = type;\n  this.type_is_draft = type_is_draft;\n  this.state = state;\n  this.closed_by = closed_by;\n  this.addressed_by = addressed_by;\n  this.comments = comments;\n  this.lines = lines;\n  this.markers = markers || null;\n}\n\nCommentChain.extraButtons = {};\n\nCommentChain.create = function (type_or_markers)\n  {\n    var chain_type = null;\n    var markers = null;\n    var paused = false;\n\n    if (typeof type_or_markers == \"string\")\n      chain_type = type_or_markers;\n    else\n      markers = type_or_markers;\n\n    var message = \"\";\n\n    function abort()\n    {\n      markers.remove();\n      currentMarkers = null;\n    }\n\n    var markersLocation;\n    var useChangeset;\n    var useFiles;\n\n    if (markers)\n    {\n      var m1 = /(?:p(\\d+))?f(\\d+)([on])(\\d+)/.exec(markers.firstLine.id);\n      if (m1)\n      {\n        var side = m1[3];\n        var parent;\n\n        if (m1[1] !== undefined)\n        {\n          parent = parseInt(m1[1]);\n          useChangeset = changeset[parent];\n          useFiles = files[parent];\n        }\n        else\n        {\n          useChangeset = changeset;\n          useFiles = files;\n        }\n\n        var file = parseInt(m1[2]);\n        var sha1 = side == 'o' ? 
useFiles[file].old_sha1 : useFiles[file].new_sha1;\n        var firstLine = parseInt(m1[4]);\n        var m2 = /(?:p\\d+)?f\\d+[on](\\d+)/.exec(markers.lastLine.id);\n        var lastLine = parseInt(m2[1]);\n\n        if (side == 'o' && markers.linesModified())\n        {\n          message = \"<p>\"\n                  +   \"<b>Warning:</b> An issue raised against the old version of \"\n                  +   \"modified lines will never be marked as addressed, and \"\n                  +   \"will thus need to be resolved manually.\"\n                  + \"</p>\";\n        }\n        else\n        {\n          var data = { review_id: review.id,\n                       origin: side == 'o' ? \"old\" : \"new\",\n                       parent_id: useChangeset.parent.id,\n                       child_id: useChangeset.child.id,\n                       file_id: file,\n                       offset: firstLine,\n                       count: lastLine + 1 - firstLine };\n\n          var operation = new Operation({ action: \"validate commented lines\",\n                                          url: \"validatecommentchain\",\n                                          data: data });\n          var result = operation.execute();\n\n          if (result.verdict == \"modified\")\n          {\n            var content = $(\"<div title='Warning!'>\"\n                          +   \"<p>\"\n                          +     \"One or more of the lines you are commenting are modified by a \"\n                          +       \"<a href='/\" + result.parent_sha1 + \"..\" + result.child_sha1 + \"?review=\" + review.id + \"#f\" + file + \"o\" + result.offset + \"'>later commit</a> \"\n                          +     \"in this review.\"\n                          +   \"</p>\"\n                          +   \"<p>\"\n                          +     \"An issue raised against already modified lines \"\n                          +     \"will never be marked as addressed, and will thus \"\n            
              +     \"need to be resolved manually.\"\n                          +   \"</p>\"\n                          + \"</div>\");\n\n            content.dialog({ modal: true, width: 400,\n                             buttons: { \"Comment Anyway\": function () { content.dialog(\"close\"); start(); },\n                                        \"Cancel\": function () { content.dialog(\"close\"); abort(); }}\n                           });\n\n            paused = true;\n          }\n          else if (result.verdict == \"transferred\")\n          {\n            message = \"<p>\"\n                    +   \"<b>Note:</b> This file is modified by \"\n                    +     (result.count > 1 ? result.count + \" later commits \" : \"a later commit \")\n                    +   \"in this review, without affecting the commented lines.  \"\n                    +   \"This comment will appear against each version of the file.\"\n                    + \"</p>\";\n          }\n          else if (result.verdict == \"invalid\")\n          {\n            var content = $(\"<div title='Error!'>\"\n                          +   \"<p>\"\n                          +     \"<b>It is not possible to comment these lines.</b>\"\n                          +   \"</p>\"\n                          +   \"<p>\"\n                          +     \"This is probably because this/these commits are not part of the review.\"\n                          +   \"</p>\"\n                          + \"</div>\");\n\n            content.dialog({ modal: true,\n                             buttons: { \"OK\": function () { content.dialog(\"close\"); }}\n                           });\n\n            abort();\n            return;\n          }\n        }\n\n        markersLocation = \"file\";\n      }\n      else\n      {\n        var m1 = /msg(\\d+)/.exec(markers.firstLine.id);\n        firstLine = parseInt(m1[1]);\n        var m2 = /msg(\\d+)/.exec(markers.lastLine.id);\n        lastLine = parseInt(m2[1]);\n\n     
   if (\"child\" in changeset)\n          useChangeset = changeset;\n        else\n          useChangeset = changeset[0];\n\n        markersLocation = \"commit\";\n      }\n    }\n\n    var content;\n\n    function finish(chain_type)\n    {\n      var text = content.find(\"textarea\").val();\n      var data = { review_id: review.id,\n                   chain_type: chain_type,\n                   text: text };\n\n      if (markers)\n      {\n        if (markersLocation == \"file\")\n        {\n          data.file_context = { origin: side == 'o' ? \"old\" : \"new\",\n                                file_id: file,\n                                child_id: useChangeset.child.id,\n                                offset: firstLine,\n                                count: lastLine + 1 - firstLine };\n\n          if (useChangeset.parent)\n            data.file_context.parent_id = useChangeset.parent.id;\n        }\n        else\n          data.commit_context = { commit_id: useChangeset.child.id,\n                                  offset: firstLine,\n                                  count: lastLine + 1 - firstLine };\n      }\n\n      var operation = new Operation({ action: \"create comment\",\n                                      url: \"createcommentchain\",\n                                      data: data });\n      var result = operation.execute();\n\n      if (result.status == \"ok\")\n      {\n        var comment = new Comment(result.comment_id, user, \"now\", \"draft\", text);\n\n        if (markers)\n        {\n          var lines = new CommentLines(file, sha1, firstLine, lastLine);\n          var chain = new CommentChain(result.chain_id, user, chain_type, false, \"draft\", null, null, [comment], lines, markers);\n\n          markers.commentChain = chain;\n          commentChains.push(chain);\n\n          if (!(file in commentChainsPerFile))\n            commentChainsPerFile[file] = [];\n\n          commentChainsPerFile[file].push(markers.commentChain);\n\n       
   markers.setType(chain_type);\n        }\n        else\n        {\n          var chain = new CommentChain(result.chain_id, user, chain_type, false, \"draft\", null, null, [comment], null, null);\n\n          var html = \"<tr class='comment draft \" + chain_type + \"'><td class='author'>\" + htmlify(user.displayName) + \"</td><td class='title'><a href='/showcomment?chain=\" + chain.id + \"'>\" + chain.comments[0].getLeader() + \"</a></td><td class='when'>now</td></tr>\";\n          if (chain_type == \"issue\")\n          {\n            target = $(\"tr#draft-issues\");\n            if (target.length == 0)\n              $(\"table.comments tr.h1\").after(\"<tr id='draft-issues'><td class='h2' colspan='3'><h2>Draft Issues<a href='/showcomments?review=\" + review.id + \"&amp;filter=draft-issues'>[display all]</h2></td></tr>\" + html);\n            else\n              target.after(html);\n          }\n          else\n          {\n            target = $(\"tr#draft-notes\");\n            if (target.length == 0)\n            {\n              target = $(\"tr#notes\");\n              if (target.length == 0)\n                target = $(\"tr.buttons\");\n              target.before(\"<tr id='draft-notes'><td class='h2' colspan='3'><h2>Draft Notes<a href='/showcomments?review=\" + review.id + \"&amp;filter=draft-notes'>[display all]</h2></td></tr>\" + html);\n            }\n            else\n              target.after(html);\n          }\n\n          /* Force \"compatible history navigation\" from now on. 
*/\n          unload = function () {}\n        }\n\n        updateDraftStatus(result.draft_status);\n\n        return true;\n      }\n\n      return false;\n    }\n\n    function start()\n    {\n      content = $(\"<div class='comment flex' title='Create Comment'>\" +\n                  message +\n                  \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n      var buttons;\n\n      if (chain_type != null)\n        buttons = { Save: function () { if (finish(chain_type)) { content.dialog(\"close\"); } } };\n      else\n      {\n        buttons = { \"Add issue\": function () { if (finish(\"issue\")) { markers = null; content.dialog(\"close\"); } },\n                    \"Add note\": function () { if (finish(\"note\")) { markers = null; content.dialog(\"close\"); } } };\n\n        function wrapDialogFunction(fn)\n        {\n          return function () { if (fn()) content.dialog(\"close\"); };\n        }\n\n        for (var title in CommentChain.extraButtons)\n          buttons[title] = wrapDialogFunction(CommentChain.extraButtons[title]);\n      }\n\n      var data = {};\n      if (markers)\n      {\n        data.context = markersLocation;\n\n        if (markersLocation == \"file\")\n        {\n          data.changeset = useChangeset.id;\n          data.path = useFiles[file].path;\n          data.sha1 = useFiles[file][side == \"o\" ? 
\"old_sha1\" : \"new_sha1\"];\n          data.lineIndex = firstLine - 1;\n        }\n        else\n        {\n          data.sha1 = useChangeset.child.sha1;\n          data.lineIndex = firstLine;\n        }\n\n        data.lineCount = lastLine - firstLine + 1;\n      }\n      else\n        data.context = \"general\";\n\n      var hook_results = hooks[\"create-comment\"].map(function (callback) { try { return callback(data); } catch (e) { return []; } });\n\n      for (var index1 = 0; index1 < hook_results.length; ++index1)\n      {\n        var hook_result = hook_results[index1];\n\n        if (hook_result)\n          for (var index2 = 0; index2 < hook_result.length; ++index2)\n          {\n            (function (hooked)\n             {\n               if (hooked.href)\n                 buttons[hooked.title] = function () { content.dialog(\"close\"); location.href = hooked.href; };\n               else\n                 buttons[hooked.title] = function () { if (hooked.callback(content.find(\"textarea\").val())) content.dialog(\"close\"); };\n             })(hook_result[index2]);\n          }\n      }\n\n      buttons[\"Cancel\"] = function () { content.dialog(\"close\");  };\n\n      function close()\n      {\n        if (markers && chain_type == null)\n          markers.remove();\n\n        currentMarkers = null;\n      }\n\n      content.dialog({ width: 600,\n                       buttons: buttons,\n                       closeOnEscape: false,\n                       close: close });\n    }\n\n    if (!paused)\n      start();\n  };\n\nCommentChain.prototype.getFirstLine = function ()\n  {\n    return this.lines.getFirstLine(this);\n  };\n\nCommentChain.prototype.getLastLine = function ()\n  {\n    return this.lines.getLastLine(this);\n  };\n\nCommentChain.prototype.removeDraftStatus = function ()\n  {\n    if (this.state == \"draft\")\n    {\n      this.state = \"open\";\n\n      var comment = this.comments[this.comments.length - 1];\n      if (comment.state == 
\"draft\")\n        comment.state = \"current\";\n    }\n  };\n\nCommentChain.currentDialog = null;\nCommentChain.reopening = null;\n\nCommentChain.prototype.display = function ()\n  {\n    if (CommentChain.currentDialog)\n    {\n      CommentChain.currentDialog.dialog(\"close\");\n      CommentChain.currentDialog = null;\n    }\n\n    var self = this;\n    var html = \"<div class='comment-dialog' title='\" + (this.type == \"issue\" ? \"Issue raised\" : \"Note\") + \" by \" + htmlify(this.comments[0].author.displayName) + \"'>\";\n\n    for (var index = 0; index < this.comments.length; ++index)\n    {\n      var comment = this.comments[index];\n      html += \"<div class='comment\" + (comment.state == \"draft\" ? \" draft\" : \"\") + \"'><div class='header'><span class='author'>\" + htmlify(comment.author.displayName) + \"</span> posted <span class='time'>\" + comment.time + \"</span></div><div class='text'>\" + htmlify(comment.text) + \"</div></div>\";\n    }\n\n    if (this.state != \"draft\" && this.state != \"open\")\n    {\n      var text;\n\n      switch (this.state)\n      {\n      case \"addressed\":\n        text = \"Addressed by <a href='/showcommit?review=\" + review.id + \"&amp;sha1=\" + this.addressed_by + \"'>\" + this.addressed_by.substr(0, 8) + \"</a>\";\n        break;\n\n      case \"closed\":\n        text = \"Resolved by \" + htmlify(this.closed_by.displayName);\n        break;\n      }\n\n      html += \"<div class='resolution'>\" + text + \"</div>\";\n    }\n\n    html += \"</div>\";\n\n    var content = $(html);\n    var buttons = {};\n\n    if (this.state == \"draft\" || comment.state == \"draft\")\n    {\n      buttons[\"Edit\"] = function () { self.editComment(comment, content); };\n      buttons[\"Delete\"] = function () { self.deleteComment(comment, content); };\n    }\n    else\n      buttons[\"Reply\"] = function () { self.reply(content); };\n\n    if (this.state == \"closed\" || this.addressed_by)\n      buttons[\"Reopen issue\"] = 
function () { content.dialog(\"close\"); self.reopen(); };\n\n    var back = this.type_is_draft ? \"back \" : \"\";\n\n    if (this.type == \"issue\")\n    {\n      if (this.state == \"open\" && !this.type_is_draft)\n        buttons[\"Resolve issue\"] = function () { self.resolve(content); };\n\n      if (back || user.options.ui.convertIssueToNote)\n        buttons[\"Convert \" + back + \"to note\"] = function () { self.morph(content); };\n    }\n    else\n      buttons[\"Convert \" + back + \"to issue\"] = function () { self.morph(content); };\n\n    var data = {};\n    if (this.markers)\n    {\n      var m1 = /(?:p(\\d+))?f(\\d+)([on])(\\d+)/.exec(this.markers.firstLine.id);\n      var m2 = /(?:p\\d+)?f\\d+[on](\\d+)/.exec(this.markers.lastLine.id);\n      if (m1 && m2)\n      {\n        var side = m1[3];\n\n        if (m1[1] !== undefined)\n        {\n          var parent = parseInt(m1[1]);\n          useChangeset = changeset[parent];\n          useFiles = files[parent];\n        }\n        else\n        {\n          useChangeset = changeset;\n          useFiles = files;\n        }\n\n        var file = parseInt(m1[2]);\n\n        data.context = \"file\";\n        data.changeset = useChangeset.id;\n        data.path = useFiles[file].path;\n        data.sha1 = useFiles[file][side == \"o\" ? \"old_sha1\" : \"new_sha1\"];\n        data.lineIndex = parseInt(m1[4]) - 1;\n        data.lineCount = parseInt(m2[1]) - data.lineIndex;\n      }\n      else\n      {\n        var m1 = /msg(\\d+)/.exec(this.markers.firstLine.id);\n        var m2 = /msg(\\d+)/.exec(this.markers.lastLine.id);\n\n        data.context = \"commit\";\n        data.sha1 = (\"child\" in changeset) ? 
changeset.child.sha1 : changeset[0].child.sha1;\n        data.lineIndex = parseInt(m1[1]);\n        data.lineCount = parseInt(m2[1]) - data.lineIndex + 1;\n      }\n    }\n    else\n      data.context = \"general\";\n\n    var hook_results = hooks[\"display-comment\"].map(function (callback) { try { return callback(data); } catch (e) { return []; } });\n\n    for (var index1 = 0; index1 < hook_results.length; ++index1)\n    {\n      var hook_result = hook_results[index1];\n\n      if (hook_result)\n        for (var index2 = 0; index2 < hook_result.length; ++index2)\n        {\n          (function (hooked)\n           {\n             if (hooked.href)\n               buttons[hooked.title] = function () { content.dialog(\"close\"); location.href = hooked.href; };\n             else\n               buttons[hooked.title] = function () { if (hooked.callback(content.find(\"textarea\").val())) content.dialog(\"close\"); };\n           })(hook_result[index2]);\n        }\n    }\n\n    buttons[\"Close\"] = function () { content.dialog(\"close\"); };\n\n    content.dialog({ width: 600, buttons: buttons, close: function () { CommentChain.currentDialog = null; }});\n\n    if (content.closest(\".ui-dialog\").height() > innerHeight)\n      content.dialog(\"option\", \"height\", innerHeight - 10);\n\n    CommentChain.currentDialog = content;\n  };\n\nCommentChain.prototype.reply = function (parentDialog)\n  {\n    var self = this;\n    var content = $(\"<div class='comment flex' title='Write Reply'>\" +\n                    \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n    function finish()\n    {\n      var text = content.find(\"textarea\").val();\n      var data = { chain_id: self.id,\n                   text: text };\n      var success = false;\n\n      var operation = new Operation({ action: \"add reply\",\n                                      url: \"createcomment\",\n                                      data: data });\n      var result = 
operation.execute();\n\n      if (result)\n      {\n        var comment = new Comment(result.comment_id, user, \"now\", \"draft\", text);\n        self.comments.push(comment);\n\n        var container = $(\"div.comment-chain#c\" + self.id);\n        if (container.length)\n        {\n          var buttons = container.find(\"div.comments\").find(\"div.buttons\");\n          var resolution = container.find(\"div.comments\").find(\"div.resolution\");\n          var target = resolution.size() ? resolution : buttons;\n\n          target.before(\"<div class='comment draft' id='c\" + self.id + \"c\" + comment.id + \"'>\" +\n                          \"<div class='header'><span class='author'>\" + htmlify(user.displayName) + \"</span> posted <span class='time'>now</span></div>\" +\n                          \"<div class='text' id='c\" + comment.id + \"text'>\" + htmlify(text) + \"</div>\" +\n                        \"</div>\");\n          buttons.children(\"button.reply\").addClass(\"hidden\").before(\"<button class='edit'>Edit</button><button class='delete'>Delete</button>\");\n          buttons.children(\"button.edit\").button().click(function () { self.editComment(comment, null); });\n          buttons.children(\"button.delete\").button().click(function () { self.deleteComment(comment, null); });\n\n          CommentMarkers.updateAll();\n        }\n\n        updateDraftStatus(result.draft_status);\n        return true;\n      }\n      else\n        return false;\n    }\n\n    content.dialog({ width: 600,\n                     buttons: { Save: function () { if (finish()) { content.dialog(\"close\"); if (parentDialog) parentDialog.dialog(\"close\"); } },\n                                Cancel: function () { content.dialog(\"close\"); }},\n                     closeOnEscape: false,\n                     modal: true });\n  };\n\nCommentChain.prototype.reopen = function (from_showcomment, from_onload)\n  {\n    var self = this;\n    var content;\n\n    function cancel()\n    
{\n      content.dialog(\"close\");\n      CommentChain.reopening = null;\n    }\n\n    function finish(markers)\n    {\n      var operation;\n\n      if (markers)\n      {\n        var m1 = /(?:p(\\d+))?f(\\d+)[on](\\d+)/.exec(markers.firstLine.id);\n\n        var useFiles;\n        if (m1[1] !== undefined)\n          useFiles = files[parseInt(m1[1])];\n        else\n          useFiles = files;\n\n        var file = parseInt(m1[2]);\n        var sha1 = useFiles[file].new_sha1;\n        var firstLine = parseInt(m1[3]);\n        var m2 = /(?:p\\d+)?f\\d+[on](\\d+)/.exec(markers.lastLine.id);\n        var lastLine = parseInt(m2[1]);\n\n        operation = new Operation({ action: \"reopen issue\",\n                                    url: \"reopenaddressedcommentchain\",\n                                    data: { chain_id: self.id,\n                                            commit_id: changeset.child.id,\n                                            sha1: sha1,\n                                            offset: firstLine,\n                                            count: lastLine + 1 - firstLine }});\n      }\n      else\n        operation = new Operation({ action: \"reopen issue\",\n                                    url: \"reopenresolvedcommentchain\",\n                                    data: { chain_id: self.id }});\n\n      var result = operation.execute();\n\n      if (result)\n      {\n        if (result.new_state == \"open\")\n        {\n          self.state = \"open\";\n\n          if (markers)\n          {\n            self.markers.setType(self.type, self.state);\n\n            self.lines.sha1 = sha1;\n            self.lines.firstLine = firstLine;\n            self.lines.lastLine = lastLine;\n\n            self.markers.updatePosition();\n          }\n        }\n        else\n        {\n          showMessage(\"Reopen Issue\",\n                      \"Issue still addressed!\",\n                      \"The issue was successfully transferred to the 
selected lines, \" +\n                      \"but those lines were in turn modified by a later commit in the \" +\n                      \"review, so the issue is still marked as addressed.\");\n        }\n\n        updateDraftStatus(result.draft_status);\n\n        var container = $(\"div.comment-chain#c\" + self.id);\n        if (container.length)\n        {\n          container.find(\"div.resolution\").remove();\n          CommentMarkers.updateAll();\n        }\n      }\n\n      /* When reopening a resolved chain, finish(null) is called without any\n         markers or dialog, so only clean up when they exist. */\n      if (markers)\n      {\n        markers.remove();\n        cancel();\n      }\n    }\n\n    if (this.addressed_by)\n    {\n      if (from_showcomment || changeset.child.sha1 != this.addressed_by)\n      {\n        content = $(\"<div title='Reopen Issue'>Addressed issues can only be reopened from a regular diff of the commit that addressed the issue.  Would you like to go there?</div>\");\n\n        function goThere()\n        {\n          content.dialog(\"close\");\n          location.href = \"/showcommit?review=\" + review.id + \"&sha1=\" + self.addressed_by + \"&reopen=\" + self.id;\n        }\n\n        function stayHere()\n        {\n          content.dialog(\"close\");\n        }\n\n        content.dialog({ width: 600,\n                         buttons: { \"Yes, go there\": goThere, \"No, stay here\": stayHere },\n                         resizable: false });\n      }\n      else\n      {\n        this.finish = finish;\n\n        content = $(\"<div title='Reopen Issue'>Please select the lines in the new version of the file where the comment should be transferred to.</div>\");\n\n        content.dialog({ width: 800,\n                         position: \"top\",\n                         buttons: { Cancel: cancel },\n                         resizable: false });\n\n        CommentChain.reopening = this;\n      }\n    }\n    else if (this.state == \"closed\")\n      finish(null);\n  };\n\nCommentChain.prototype.resolve = function (dialog)\n  {\n    var self = this;\n\n    function finish()\n    {\n      var operation = new 
Operation({ action: \"resolve issue\",\n                                      url: \"resolvecommentchain\",\n                                      data: { chain_id: self.id }});\n      var result = operation.execute();\n\n      if (result)\n      {\n        self.state = 'closed';\n        self.closed_by = user;\n\n        if (self.markers)\n          self.markers.setType(self.type, self.state);\n\n        var container = $(\"#c\" + self.id);\n        if (container.length)\n        {\n          container.find(\"div.buttons\").before(\"<div class='resolution'>Resolved by \" + htmlify(user.displayName) + \"</div>\");\n          container.find(\"button.resolve\").remove();\n\n          CommentMarkers.updateAll();\n        }\n\n        updateDraftStatus(result.draft_status);\n\n        if (dialog)\n          dialog.dialog(\"close\");\n      }\n    }\n\n    if (user.options.ui.resolveIssueWarning && user.id != this.user.id)\n    {\n      var content = $(\"<div title='Please Confirm'><p><b>You did not raise this issue.</b>  Are you sure you mean to resolve it explicitly?</p><p>If you fixed the code, you should push a commit with the fixes, which often closes the issue automatically.  And even if it does not, you may want to let the reviewer who raised the issue resolve it after reviewing your fix.</p></div>\");\n      content.dialog({ width: 400, modal: true, buttons: { \"Resolve issue\": function () { content.dialog(\"close\"); finish(); }, \"Do nothing\": function () { content.dialog(\"close\"); }}});\n    }\n    else\n      finish();\n  };\n\nCommentChain.prototype.morph = function (dialog, button)\n  {\n    var self = this;\n    var new_type = this.type == 'issue' ? 
'note' : 'issue';\n\n    var operation = new Operation({ action: \"change comment type\",\n                                    url: \"morphcommentchain\",\n                                    data: { chain_id: this.id,\n                                            new_type: new_type }});\n    var result = operation.execute();\n\n    if (result)\n    {\n      self.type = new_type;\n\n      if (self.markers)\n        self.markers.setType(self.type, self.state);\n\n      var title = $(\"#c\" + self.id + \" .comment-chain-title\");\n\n      if (new_type == 'note')\n        title.text(title.text().replace(\"Issue raised by\", \"Note by\"));\n      else\n        title.text(title.text().replace(\"Note by\", \"Issue raised by\"));\n\n      self.type_is_draft = !self.type_is_draft;\n\n      var back = self.type_is_draft ? \"back \" : \"\";\n\n      if (button)\n        if (new_type == 'note')\n          $(button).button(\"option\", \"label\", \"Convert \" + back + \"to issue\");\n        else if (back || user.options.ui.convertIssueToNote)\n          $(button).button(\"option\", \"label\", \"Convert \" + back + \"to note\");\n\n      updateDraftStatus(result.draft_status);\n\n      if (dialog)\n        dialog.dialog(\"close\");\n    }\n  };\n\nCommentChain.prototype.editComment = function (comment, parentDialog)\n  {\n    var self = this;\n    var content = $(\"<div class='comment flex' title='Edit Comment'>\" +\n                    \"<textarea class='text flexible' rows=8></textarea></div>\");\n    var textarea = content.find(\"textarea\");\n\n    textarea.val(comment.text);\n\n    function finish()\n    {\n      var new_text = textarea.val();\n\n      var operation = new Operation({ action: \"update comment\",\n                                      url: \"updatecomment\",\n                                      data: { comment_id: comment.id,\n                                              new_text: new_text }});\n      var result = operation.execute();\n\n      if 
(result)\n      {\n        comment.text = new_text;\n\n        $(\"#c\" + comment.id + \"text\").text(new_text);\n\n        updateDraftStatus(result.draft_status);\n        return true;\n      }\n      else\n        return false;\n    }\n\n    content.dialog({ width: 600,\n                     buttons: { Save: function () { if (finish()) { content.dialog(\"close\"); if (parentDialog) parentDialog.dialog(\"close\"); } },\n                                Cancel: function () { content.dialog(\"close\"); }},\n                     closeOnEscape: false,\n                     modal: true });\n  };\n\nCommentChain.prototype.deleteComment = function (comment, parentDialog)\n  {\n    var self = this;\n    var content = $(\"<div class='dialog' title='Delete Comment'>Are you sure?</div>\");\n\n    function finish()\n    {\n      var operation = new Operation({ action: \"delete comment\",\n                                      url: \"deletecomment\",\n                                      data: { comment_id: comment.id }});\n      var result = operation.execute();\n\n      if (result)\n      {\n        self.comments.pop();\n\n        if (self.comments.length == 0)\n          if (self.markers)\n          {\n            commentChains.splice(commentChains.indexOf(self), 1);\n            commentChainsPerFile[self.lines.file].splice(commentChainsPerFile[self.lines.file].indexOf(self), 1);\n            self.markers.remove();\n          }\n          else\n          {\n            $(\"#c\" + self.id).remove();\n            if ($(\"table.file\").length == 0)\n              location.href = \"/showreview?id=\" + review.id;\n          }\n        else\n        {\n          $(\"#c\" + self.id + \"c\" + comment.id).remove();\n\n          var buttons = $(\"#c\" + self.id + \" div.buttons\");\n          buttons.children(\"button.edit, button.delete\").remove();\n          buttons.children(\"button.reply\").removeClass(\"hidden\");\n        }\n\n        updateDraftStatus(result.draft_status);\n 
       return true;\n      }\n      else\n        return false;\n    }\n\n    content.dialog({ modal: true,\n                     buttons: { Delete: function () { content.dialog(\"close\"); if (finish() && parentDialog) { parentDialog.dialog(\"close\"); } },\n                                Cancel: function () { content.dialog(\"close\"); }}});\n  };\n\nCommentChain.prototype.toolTip = function ()\n  {\n    var html = \"<div class='tooltip'>\";\n    html += \"<div class='header'>\";\n    html += (this.type == \"issue\" ? \"Issue raised\" : \"Note\");\n    html += \" by \";\n    html += htmlify(this.comments[0].author.displayName);\n    if (this.closed_by) {\n      html += \" (closed by \" + htmlify(this.closed_by.displayName) + \")\";\n    } else if (this.addressed_by) {\n      html += \" (addressed by \" + this.addressed_by.substring(0, 7) + \")\";\n    }\n    html += \"</div>\";\n    html += \"<div class='text sourcefont'>\" + htmlify(this.comments[0].text) + \"</div>\";\n    html += \"</div>\";\n    return html;\n  };\n\nCommentChain.removeAll = function ()\n  {\n    if (typeof commentChains != \"undefined\")\n    {\n      for (var index = 0; index < commentChains.length; ++index)\n        commentChains[index].markers.remove();\n\n      commentChains = [];\n    }\n  };\n\nfunction CommentMarkers(commentChain)\n{\n  var self = this;\n\n  allMarkers.push(this);\n\n  if (this.commentChain = commentChain)\n  {\n    this.firstLine = commentChain.getFirstLine();\n    this.lastLine = commentChain.getLastLine();\n  }\n  else\n    this.firstLine = this.lastLine = null;\n\n  this.bothMarkers = $(\"<div class='marker left'></div><div class='marker right'></div>\");\n  this.leftMarker = this.bothMarkers.first();\n  this.rightMarker = this.bothMarkers.last();\n\n  if (commentChain)\n    this.setType(commentChain.type, commentChain.state);\n  else\n    this.setType(\"new\");\n\n  this.bothMarkers.tooltip({\n    content: function () { if (self.commentChain) return self.commentChain.toolTip() },\n  
  items: \"div.marker\",\n    tooltipClass: \"comment-tooltip\",\n    track: true,\n    hide: false\n  });\n\n  this.leftMarker.click(function () { if (self.commentChain) self.commentChain.display(); });\n  this.rightMarker.click(function () { if (self.commentChain) self.commentChain.display(); });\n\n  $(document.body).append(this.bothMarkers);\n\n  this.updatePosition();\n}\n\nCommentMarkers.prototype.setLines = function (firstLine, lastLine)\n{\n  this.firstLine = firstLine;\n  this.lastLine = lastLine;\n  this.updatePosition();\n}\n\nCommentMarkers.prototype.setType = function (type, state)\n{\n  this.bothMarkers.removeClass(\"issue note new open addressed closed\");\n  this.bothMarkers.addClass(type);\n\n  if (type == \"issue\" && typeof state != \"undefined\")\n    this.bothMarkers.addClass(state);\n}\n\nCommentMarkers.prototype.updatePosition = function ()\n  {\n    if (this.commentChain)\n    {\n      this.firstLine = this.commentChain.getFirstLine();\n      this.lastLine = this.commentChain.getLastLine();\n    }\n\n    if (this.firstLine)\n    {\n      var firstLine = $(this.firstLine);\n      var lastLine = $(this.lastLine);\n\n      if (firstLine.parents(\"table.file\").is(\".show.expanded\") ||\n          firstLine.parents(\"table.commit-msg\").size())\n      {\n        this.leftMarker.css(\"display\", \"block\");\n        this.rightMarker.css(\"display\", \"block\");\n\n        var top = firstLine.offset().top - 2;\n        var bottom = lastLine.offset().top + lastLine.height();\n\n        if (firstLine.hasClass(\"whole\"))\n        {\n          var linenr = firstLine.prevAll(\"td.linenr.old\");\n          this.leftMarker.offset({ top: top, left: linenr.offset().left - this.leftMarker.width() - 4 });\n        }\n        else if (firstLine.hasClass(\"old\") || firstLine.hasClass(\"single\") && !firstLine.hasClass(\"commit-msg\"))\n        {\n          var edge = firstLine.prevAll(\"td.edge\");\n          this.leftMarker.offset({ top: top, left: 
edge.offset().left + edge.width() - this.leftMarker.width() - 4 });\n        }\n        else\n          this.leftMarker.offset({ top: top, left: firstLine.offset().left - this.leftMarker.width() - 6 });\n\n        if (firstLine.hasClass(\"new\") || firstLine.hasClass(\"single\"))\n          this.rightMarker.offset({ top: top, left: firstLine.nextAll(\"td.edge\").offset().left });\n        else\n          this.rightMarker.offset({ top: top, left: firstLine.nextAll(\"td.middle\").offset().left + 2 });\n\n        this.leftMarker.height(bottom - top - 1);\n        this.rightMarker.height(bottom - top - 1);\n\n        return;\n      }\n    }\n\n    this.leftMarker.css(\"display\", \"none\");\n    this.rightMarker.css(\"display\", \"none\");\n  };\n\nCommentMarkers.prototype.remove = function ()\n  {\n    this.leftMarker.remove();\n    this.rightMarker.remove();\n\n    allMarkers.splice(allMarkers.indexOf(this), 1);\n  };\n\nCommentMarkers.prototype.linesModified = function ()\n  {\n    var iter = $(this.firstLine).closest(\"tr\");\n    var stop = $(this.lastLine).closest(\"tr\");\n\n    do\n    {\n      if (!iter.hasClass(\"context\"))\n        return true;\n\n      if (iter.is(stop))\n        break;\n\n      iter = iter.next(\"tr\");\n    }\n    while (iter.size());\n\n    return false;\n  };\n\nCommentMarkers.updateAll = function ()\n{\n  try\n  {\n    for (var index = 0; index < allMarkers.length; ++index)\n      allMarkers[index].updatePosition();\n  }\n  catch (e)\n  {\n  }\n}\n\nvar activeMarkers = null, anchorLine = null, currentMarkers = null, allMarkers = [];\n\nfunction startCommentMarking(ev)\n{\n  if (ev.ctrlKey || ev.shiftKey || ev.altKey || ev.metaKey || /showcomments?$/.test(location.pathname) || ev.button != 0)\n    return;\n\n  if (ev.currentTarget.id && !activeMarkers && !currentMarkers)\n  {\n    if (CommentChain.reopening && CommentChain.reopening.lines.file != $(ev.currentTarget).parents(\"table.file\").first().attr(\"critic-file-id\"))\n    {\n     
 showMessage(\"Not supported\", \"Not supported\", \"Reopening an issue against lines in a different file is not supported.\");\n      return;\n    }\n\n    anchorLine = ev.currentTarget;\n    activeMarkers = new CommentMarkers;\n    activeMarkers.setLines(anchorLine, anchorLine);\n\n    ev.preventDefault();\n  }\n}\n\nfunction continueCommentMarking(ev)\n{\n  if (activeMarkers && ev.currentTarget.id)\n    if (ev.currentTarget.parentNode.parentNode == activeMarkers.firstLine.parentNode.parentNode && ev.currentTarget.cellIndex == anchorLine.cellIndex)\n    {\n      var firstLine, lastLine;\n\n      if (ev.currentTarget.parentNode.sectionRowIndex < anchorLine.parentNode.sectionRowIndex)\n      {\n        firstLine = ev.currentTarget;\n        lastLine = anchorLine;\n      }\n      else\n      {\n        firstLine = anchorLine;\n        lastLine = ev.currentTarget;\n      }\n\n      activeMarkers.setLines(firstLine, lastLine);\n    }\n}\n\n/* This function is overridden on some pages. */\nfunction handleMarkedLines(markers)\n{\n  CommentChain.create(markers);\n}\n\nfunction endCommentMarking(ev)\n{\n  if (activeMarkers)\n  {\n    if (CommentChain.reopening)\n      CommentChain.reopening.finish(activeMarkers);\n    else\n    {\n      currentMarkers = activeMarkers;\n      handleMarkedLines(activeMarkers);\n    }\n\n    activeMarkers = null;\n    ev.preventDefault();\n  }\n}\n\nfunction markChainsAsRead(chain_ids)\n{\n  var operation = new Operation({ action: \"mark comments as read\",\n                                  url: \"markchainsasread\",\n                                  data: { chain_ids: chain_ids },\n                                  callback: function () {} });\n\n  operation.execute();\n}\n\n$(document).ready(function ()\n  {\n    if (typeof commentChains != \"undefined\")\n      $.each(commentChains, function (index, commentChain)\n        {\n          try\n          {\n            if (commentChain.lines.file !== null)\n            {\n              if 
(!(commentChain.lines.file in commentChainsPerFile))\n                commentChainsPerFile[commentChain.lines.file] = [];\n              commentChainsPerFile[commentChain.lines.file].push(commentChain);\n            }\n\n            commentChain.markers = new CommentMarkers(commentChain);\n          }\n          catch (e)\n          {\n            //console.log(e);\n          }\n        });\n\n    if (typeof review != \"undefined\")\n      $(\"td.line\")\n        .mousedown(startCommentMarking)\n        .mouseover(continueCommentMarking)\n        .mouseup(endCommentMarking);\n\n    CommentMarkers.updateAll();\n  });\n\n$(window).load(function ()\n  {\n    CommentMarkers.updateAll();\n\n    var match = /(?:\\?|&)reopen=(\\d+)(?:&|$)/.exec(location.search);\n    if (match)\n    {\n      for (var index in commentChains)\n        if (commentChains[index].id == match[1])\n        {\n          var chain = commentChains[index];\n          var file_id = chain.lines.file;\n\n          expandFile(file_id);\n\n          var first_line = $(chain.markers.firstLine);\n          var last_line = $(chain.markers.lastLine);\n\n          scrollTo(0, first_line.offset().top - innerHeight / 2 + (last_line.offset().top + last_line.height() - first_line.offset().top) / 2);\n\n          setTimeout(function () { chain.reopen(false, true); }, 10);\n        }\n    }\n  });\n\nonresize = function ()\n  {\n    CommentMarkers.updateAll();\n  };\n"
  },
  {
    "path": "src/resources/config.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.preferences {\n    table-layout: fixed\n}\n\ndiv.main table.preferences tr.line td {\n    padding-top: 1em\n}\n\ndiv.main table.preferences td.heading {\n    font-family: serif;\n    vertical-align: top\n}\n\ndiv.main table.preferences tr.customized td.heading {\n    font-weight: bold;\n}\n\ndiv.main table.preferences td.value {\n    font-family: monospace;\n    white-space: pre-wrap;\n}\n\ndiv.main table.preferences td.value div.text {\n    white-space: pre\n}\n\ndiv.main table.preferences td.value input.setting[type=text] {\n    font-family: monospace\n}\n\ndiv.main table.preferences td.value select.setting {\n    min-width: 30%\n}\n\ndiv.main table.preferences tr td.value span.reset {\n    display: none\n}\n\ndiv.main table.preferences tr.customized:hover td.value span.reset {\n    display: inline;\n    float: right\n}\n\ndiv.main table.preferences td.value span.also-configurable-per {\n    float: right;\n    padding-left: 1em;\n    font-family: sans-serif;\n    font-style: italic;\n    color: #444\n}\n\ndiv.main table.preferences td.help {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.preferences td.extension {\n    padding-top: 1.5em;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.preferences td.extension h2 span.name 
{\n    font-size: 150%;\n    padding-right: 0.5em\n}\n\ndiv.main table.preferences td.extension h2 span.author {\n    font-size: 120%;\n    padding-left: 0.5em\n}\n\ndiv.installed_sha1 {\n    text-align: center;\n    color: gray\n}\ndiv.installed_sha1 a {\n    text-decoration: none;\n    color: gray\n}\ndiv.installed_sha1 a:hover {\n    text-decoration: underline;\n}\n\n.notification.saved pre {\n    margin: 5px 0 3px 1em\n}\n\n.url-type {\n    flex-flow: row wrap;\n}\n\n.url-type .label {\n    font-weight: bold\n}\n\n.url-type .prefix {\n    font-family: monospace;\n    padding-left: 0.5rem;\n    margin-left: auto;\n}\n"
  },
  {
    "path": "src/resources/config.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar timer_id = null;\nvar notifications = {};\nvar saves_in_progress = 0;\n\nfunction scheduleSaveSettings()\n{\n  if (user.id === null)\n    /* Don't (try to) save if user is anonymous. */\n    return;\n\n  if (timer_id !== null)\n    clearTimeout(timer_id);\n\n  timer_id = setTimeout(saveSettings, 500);\n}\n\nfunction saveSettings(reset_item)\n{\n  if (user.id === null)\n    /* Don't (try to) save if user is anonymous. 
*/\n    return;\n\n  timer_id = null;\n\n  var data = { settings: [] };\n  var per_url = {};\n\n  function processElement(index, element)\n  {\n    var url = element.getAttribute(\"critic-url\"), value;\n    if (url)\n    {\n      var modified = false;\n\n      if (element instanceof HTMLInputElement)\n        if (element.type == \"checkbox\")\n        {\n          value = element.checked;\n          modified = value !== element.hasAttribute(\"checked\");\n          if (modified)\n            if (value)\n              element.setAttribute(\"checked\", \"checked\");\n            else\n              element.removeAttribute(\"checked\");\n        }\n        else\n        {\n          value = element.value;\n          modified = value !== element.getAttribute(\"value\");\n          if (modified)\n            element.setAttribute(\"value\", value);\n        }\n      else\n      {\n        value = element.value;\n        modified = value !== element.getAttribute(\"critic-value\");\n        if (modified)\n          element.setAttribute(\"critic-value\", value);\n      }\n\n      if (!modified)\n        return;\n\n      var items = per_url[url];\n      if (!items)\n      {\n        items = per_url[url] = {};\n        Object.defineProperty(items, \"CRITIC-EXTENSION\", { value: element.getAttribute(\"critic-extension\") });\n      }\n      items[element.getAttribute(\"name\")] = value;\n\n      if (value != JSON.parse(element.getAttribute(\"critic-default\")))\n        $(element).parents(\"tr.line\").addClass(\"customized\");\n      else\n        $(element).parents(\"tr.line\").removeClass(\"customized\");\n    }\n    else\n    {\n      if (element.type == \"checkbox\")\n        value = element.checked;\n      else if (element.type == \"number\")\n        value = parseInt(element.value);\n      else\n        value = String($(element).val());\n\n      if (value != JSON.parse(element.getAttribute(\"critic-current\")))\n      {\n        element.setAttribute(\"critic-current\", 
JSON.stringify(value));\n        data.settings.push({ item: element.name,\n                             value: value });\n        $(element).parents(\"tr.line\").addClass(\"customized\");\n      }\n    }\n  }\n\n  if (defaults)\n    data.defaults = true;\n\n  if (filter_id !== null)\n    data.filter_id = filter_id;\n  else if (repository_id !== null)\n    data.repository_id = repository_id;\n\n  if (reset_item)\n  {\n    var element = document.getElementsByName(reset_item)[0];\n    var default_value = JSON.parse(element.getAttribute(\"critic-default\"));\n    var item = { item: reset_item };\n\n    if (defaults && repository_id === null)\n      /* Editing global defaults => reset the default value rather\n         than deleting the override. */\n      item.value = default_value;\n\n    data.settings.push(item);\n\n    if (element.type == \"checkbox\")\n      element.checked = default_value;\n    else\n    {\n      element.value = default_value;\n      if (element instanceof HTMLSelectElement)\n        $(element).trigger(\"chosen:updated\");\n    }\n\n    $(element).parents(\"tr.line\").removeClass(\"customized\");\n    element.setAttribute(\"critic-current\", JSON.stringify(default_value));\n  }\n  else\n  {\n    $(\"td.value > input.setting, td.value > select.setting\")\n      .each(processElement);\n  }\n\n  function builtInSaved(result)\n  {\n    --saves_in_progress;\n\n    function showSavedNotification(title, details)\n    {\n      var html = \"<b>\" + title + \"</b>\";\n\n      if (details)\n        html += \"<pre>\" + details + \"</pre>\";\n\n      if (notifications[html])\n        notifications[html].remove();\n\n      notifications[html] = showNotification(\n        html, { className: \"saved\",\n                callback: function () { notifications[html] = null; }});\n    }\n\n    if (result)\n    {\n      if (result.saved_settings.length)\n      {\n        var title = reset_item ? 
\"Reset to default:\" : \"Saved settings:\";\n        var details = \"\";\n\n        result.saved_settings.forEach(\n          function (item)\n          {\n            details += htmlify(item) + \"\\n\";\n          });\n\n        showSavedNotification(title, details);\n      }\n    }\n\n    for (var url in per_url)\n    {\n      var items = per_url[url];\n\n      $.ajax({ async: false,\n               url: url,\n               type: \"POST\",\n               contentType: \"text/json\",\n               data: JSON.stringify(items),\n               dataType: \"text\",\n               success: function ()\n               {\n                 --saves_in_progress;\n                 showSavedNotification(\"Saved settings for extension \" + htmlify(items[\"CRITIC-EXTENSION\"]) + \".\");\n               },\n               error: function (xhr)\n               {\n                 --saves_in_progress;\n                 reportError(\"save settings for extension \" + items[\"CRITIC-EXTENSION\"], \"Request failed: \" + xhr.responseText);\n               }\n             });\n\n      ++saves_in_progress;\n    }\n  }\n\n  if (data.settings.length)\n  {\n    var operation = new Operation({ action: \"save settings\",\n                                    url: \"savesettings\",\n                                    data: data,\n                                    callback: builtInSaved });\n\n    operation.execute();\n\n    ++saves_in_progress;\n  }\n  else\n    builtInSaved();\n}\n\n$(document).ready(function ()\n  {\n    $(\"input[name='review.createViaPush']\").click(function (ev)\n      {\n        if (ev.target.checked)\n          showMessage(\"Important Note!\", \"Important Note!\", \"<p>Please note that when creating a review by pushing a branch whose name starts with <code>r/</code>, only the first (head) commit on the branch will be added to the review, and you will not be able to add its ancestor commits later.</p><p><strong>This feature cannot be used to create a review of 
multiple commits!</strong></p>\");\n      });\n\n    $(\"td.h1 .repository-select\")\n      .change(function (ev)\n        {\n          var repository = ev.target.value;\n          var params = {};\n\n          if (repository)\n            params.repository = repository;\n\n          if (defaults)\n            params.defaults = \"yes\";\n\n          var url = \"/config\";\n\n          if (Object.keys(params).length)\n            url += \"?\" + Object.keys(params).map(function (name) { return name + \"=\" + encodeURIComponent(params[name]); }).join(\"&\");\n\n          location.assign(url);\n        })\n      .chosen({ inherit_select_classes: true,\n                allow_single_deselect: true,\n                collapsed_width: \"auto\",\n                expanded_width: \"600px\" });\n\n    $(\"td.value > select.setting\")\n      .chosen({ inherit_select_classes: true,\n                disable_search: true,\n                collapsed_width: \"auto\",\n                expanded_width: \"30em\" });\n\n    $(\"td.value > .repository-select\")\n      .chosen({ inherit_select_classes: true,\n                allow_single_deselect: true,\n                collapsed_width: \"auto\",\n                expanded_width: \"40em\" })\n      .addClass(\"setting\");\n\n    $(\".setting\").bind(\"change input\", scheduleSaveSettings);\n\n    window.addEventListener(\"beforeunload\", function (ev)\n      {\n        saveSettings();\n\n        if (saves_in_progress > 0)\n          /* Firefox/IE looks at ev.returnValue; Chrome/Safari looks at the\n             function's return value.  Opera either doesn't fire the event\n             at all or behaves as Chrome. */\n          return ev.returnValue = (\"Modified settings are being saved.  You may want \" +\n                                   \"to wait a few seconds before leaving the page.\");\n      });\n  });\n"
  },
  {
    "path": "src/resources/confirmmerge.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main td.heading {\n    font-family: serif;\n    font-weight: bold;\n    text-align: right;\n    vertical-align: top\n}\n\ndiv.main td.value {\n    font-family: monospace;\n    white-space: pre-wrap;\n    vertical-align: top\n}\n"
  },
  {
    "path": "src/resources/confirmmerge.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\noverrideShowSquashedDiff = function (from_sha1)\n{\n  location.href = \"/confirmmerge?id=\" + confirmation_id + \"&tail=\" + from_sha1;\n}\n\n$(document).ready(function ()\n  {\n    $(\"button\").button();\n\n    $(\"button.confirmAll\").click(function (ev)\n      {\n        var tail = \"\";\n        if (typeof tail_sha1 == \"string\")\n          tail = \"&tail=\" + tail_sha1;\n        location.href = \"/confirmmerge?id=\" + confirmation_id + \"&confirm=yes\" + tail;\n      });\n\n    $(\"button.confirmNone\").click(function (ev)\n      {\n        location.href = \"/confirmmerge?id=\" + confirmation_id + \"&confirm=yes&tail=\" + merge_sha1;\n      });\n\n    $(\"button.cancel\").click(function (ev)\n      {\n        location.href = \"/confirmmerge?id=\" + confirmation_id + \"&cancel=yes\";\n      });\n\n    if (confirmed)\n    {\n      var content = $(\"<div title='Merge Confirmed'><p>Please repeat the 'git push' command that failed and redirected you here.  It will now allow this merge commit, and the additional commits it contributes (listed on this page), to be added to the review.</p></div>\");\n      content.dialog({ width: 600, height: 225, modal: true, buttons: { OK: function () { content.dialog(\"close\"); }}});\n    }\n  });\n"
  },
  {
    "path": "src/resources/createreview.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ninput.submit { font-size: 20px; background-color: lime }\n\ndiv.main table.basic tr.line td { padding-top: 0.5em }\ndiv.main table.basic td.heading { font-family: serif; font-weight: bold; text-align: right; width: 20% }\ndiv.main table.basic tr.line.description td.heading,\ndiv.main table.basic tr.line.recipients td.heading {\n    vertical-align: top;\n}\ndiv.main table.basic td.value { font-family: monospace; }\ndiv.main table.basic td.value input,\ndiv.main table.basic td.value textarea {\n    font: 10pt monospace;\n    width: 90%;\n}\n#branch_name {\n    /* subtract width of \"r/\" prefix */\n    width: calc(90% - 1.34em);\n}\ndiv.main table.basic td.value span.mode { font-style: italic }\ndiv.main table.basic td.value div.buttons { float: right }\ndiv.main table.basic td.status { text-align: right; width: 20%; font-weight: bold }\ndiv.main table.basic td.help { font-style: italic; border-bottom: 1px solid #cca; padding-left: 30% }\n\ndiv.main table.filters tr.applyfilters td.value { padding-top: 0.5em; text-align: right }\ndiv.main table.filters tr.applyfilters td.legend { padding-top: 0.5em; font-weight: bold }\n\ndiv.main table.filters tr.reviewers td { padding-top: 0.5em }\ndiv.main table.filters tr.reviewers td.reviewers { font-weight: bold; 
text-align: right; width: 30% }\ndiv.main table.filters tr.reviewers td.files { font-family: monospace; width: 70% }\ndiv.main table.filters tr.reviewers td.files span.file { white-space: pre }\ndiv.main table.filters tr.reviewers td.no-one { text-align: right; color: red; font-weight: bold }\n\ndiv.main table.filters tr.watchers td.spacer { padding-top: 0; border-bottom: 1px solid #cca }\ndiv.main table.filters tr.watchers td.heading { padding-top: 0.5em; text-align: right; font-weight: bold }\ndiv.main table.filters tr.watchers td.watchers { padding-top: 0.5em; }\n\ndiv.main table.filters tr.buttons td.spacer { padding-top: 0; border-bottom: 1px solid #cca }\ndiv.main table.filters tr.buttons td.buttons { padding-top: 0.5em; text-align: right }\n\n\ndiv.comment > div.text {\n    font-family: monospace;\n    font-size: 11px;\n    background-color: #fff;\n    border: 1px solid #bbb;\n    margin-top: 3px;\n    padding: 5px;\n    white-space: pre-wrap\n}\n\ndiv.comment > div.text > textarea {\n    font-family: monospace;\n    font-size: 11px;\n    background-color: #fff;\n    border: none;\n    width: 100%;\n    resize: none\n}\n\ndiv#recipients table {\n    width: 100%\n}\n\ndiv#recipients .key {\n    font-weight: bold\n}\n\ndiv#recipients select {\n    width: 100%\n}\n\ndiv#recipients input {\n    font-family: monospace;\n    width: 100%\n}\n\np.remotehost {\n    margin-top: 0;\n    margin-bottom: 3px\n}\np.remotepath {\n    margin-top: 3px;\n    margin-bottom: 0\n}\n\n\ncode.inset > a {\n    color: #222\n}\ncode.inset > a:hover {\n    color: blue;\n    text-decoration: underline\n}\n\nselect.repository-select, input.remote, input.workbranch, input.upstreamcommit {\n    width: 40em\n}\n\ninput.remote, input.workbranch, input.upstreamcommit {\n    font-family: monospace\n}\n"
  },
  {
    "path": "src/resources/createreview.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- Mode: js; js-indent-level: 2; indent-tabs-mode: nil -*- */\n\nvar reviewfilters = {};\nvar recipients_mode = \"opt-out\", recipients_included = {}, recipients_excluded = {};\n\nfunction splitReviewFilters(reviewfilters)\n{\n  var result = [];\n  for (var key in reviewfilters)\n  {\n    var data = JSON.parse(key);\n    data.type = reviewfilters[key];\n    result.push(data);\n  }\n  return result;\n}\n\nfunction submitReview()\n{\n  var branch_name = document.getElementById(\"branch_name\");\n  var summary = document.getElementById(\"summary\");\n  var description = document.getElementById(\"description\").value.trim();\n\n  if (invalid_branch_name && branch_name.value == invalid_branch_name)\n  {\n    alert(\"You need to edit the branch name, lazy!\");\n    branch_name.focus();\n    return;\n  }\n\n  if (branch_name.value.length <= 4)\n  {\n    alert(\"A branch name that short is not a good review identifier.  Please elaborate a little bit.\");\n    branch_name.focus();\n    return;\n  }\n\n  if (summary.value.length <= 8)\n  {\n    alert(\"A summary that short is not very meaningful.  
Please elaborate a little bit.\");\n    summary.focus();\n    return;\n  }\n\n  var data = { repository_id: repository.id,\n               commit_ids: review_data.commit_ids,\n               branch: \"r/\" + branch_name.value,\n               summary: summary.value.trim(),\n               reviewfilters: splitReviewFilters(reviewfilters),\n               recipientfilters: { mode: recipients_mode,\n                                   included: Object.keys(recipients_included),\n                                   excluded: Object.keys(recipients_excluded) },\n               applyfilters: $(\"input.applyfilters:checked\").size() != 0,\n               applyparentfilters: $(\"input.applyparentfilters:checked\").size() != 0 };\n\n  if (description)\n    data.description = description;\n  if (typeof fromBranch == \"string\")\n    data.frombranch = fromBranch;\n  if (typeof trackedbranch == \"object\")\n    data.trackedbranch = trackedbranch;\n\n  var operation = new Operation({ action: \"create review\",\n                                  url: \"submitreview\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n  {\n    if (result.extensions_output)\n      showMessage(\"Review Created\",\n                  \"Extension Output\",\n                  \"<pre>\" + htmlify(result.extensions_output) + \"</pre>\",\n                  function () { location.href = \"/r/\" + result.review_id; });\n    else\n      location.href = \"/r/\" + result.review_id;\n  }\n}\n\nfunction updateReviewersAndWatchers(new_reviewfilters)\n{\n  var success = false;\n\n  if (!new_reviewfilters)\n    new_reviewfilters = reviewfilters;\n\n  var data = { repository_id: repository.id,\n               commit_ids: review_data.commit_ids,\n               reviewfilters: splitReviewFilters(new_reviewfilters),\n               applyfilters: $(\"input.applyfilters:checked\").size() != 0,\n               applyparentfilters: 
$(\"input.applyparentfilters:checked\").size() != 0 };\n\n  var operation = new Operation({ action: \"update filters\",\n                                  url: \"reviewersandwatchers\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n  {\n    $(\"table.filters\").replaceWith(result.html);\n    $(\"table.filters\").find(\"button\").button();\n\n    connectApplyFilters();\n\n    reviewfilters = new_reviewfilters;\n    return true;\n  }\n  else\n    return false;\n}\n\nfunction updateFilters(filter_type)\n{\n  function addFilters(names, path)\n  {\n    var new_reviewfilters = {};\n\n    for (var key in reviewfilters)\n      new_reviewfilters[key] = reviewfilters[key];\n\n    names.forEach(\n      function (name)\n      {\n        var key = JSON.stringify({ username: name, path: path });\n        new_reviewfilters[key] = filter_type;\n      });\n\n    return updateReviewersAndWatchers(new_reviewfilters);\n  }\n\n  addReviewFiltersDialog({\n    filter_type: filter_type,\n    callback: addFilters,\n    reload_page: false\n  });\n}\n\nfunction addReviewer()\n{\n  updateFilters(\"reviewer\");\n}\n\nfunction addWatcher()\n{\n  updateFilters(\"watcher\");\n}\n\nfunction editRecipientList()\n{\n  var recipient_list_dialog =\n    $(\"<div id='recipients' title='Edit Recipient List'>\"\n    +   \"<p>The recipient list determines the list of users that receive \"\n    +      \"e-mails about various updates to the review.  The recipient \"\n    +      \"list is constructed from the list of users associated with the \"\n    +      \"review (reviewers and watchers) either in an opt-in or opt-out \"\n    +      \"fashion.  The default is opt-out, meaning all associated users \"\n    +      \"receive e-mails unless they specifically ask not to.  
By \"\n    +      \"choosing opt-in mode, the review owner can restrict the list \"\n    +      \"of recipients.</p>\"\n    +   \"<p>Note: the review owner (you) is always included in the \"\n    +      \"recipient list.</p>\"\n    +   \"<table>\"\n    +     \"<tr><td class=key>Mode:</td><td class=value>\"\n    +       \"<select id='mode'>\"\n    +         \"<option value='opt-out'>Opt-out (all users not specified below receive e-mails)</option>\"\n    +         \"<option value='opt-in'>Opt-in (only users specified below receive e-mails)</option>\"\n    +       \"</select>\"\n    +     \"</td></tr>\"\n    +     \"<tr><td class=key>Users:</td><td class=value>\"\n    +       \"<input id='users'>\"\n    +     \"</td></tr>\"\n    +   \"</table>\"\n    + \"</div>\");\n\n  if (recipients_mode == \"opt-out\")\n    names = Object.keys(recipients_excluded);\n  else\n    names = Object.keys(recipients_included);\n\n  recipient_list_dialog.find(\"#mode\").val(recipients_mode);\n  recipient_list_dialog.find(\"#users\").val(names.join(\", \"));\n\n  function save()\n  {\n    recipients_mode = recipient_list_dialog.find(\"#mode\").val();\n    recipients_included = {};\n    recipients_excluded = {};\n\n    var users = recipient_list_dialog.find(\"#users\").val().split(/[\\s,]+/g);\n    for (var index = 0; index < users.length; ++index)\n    {\n      var name = users[index];\n      if (name)\n        if (recipients_mode == \"opt-in\")\n          recipients_included[name] = true;\n        else\n          recipients_excluded[name] = true;\n    }\n\n    var mode;\n\n    if (recipients_mode == \"opt-in\")\n      if (Object.keys(recipients_included).length != 0)\n      {\n        mode = \"No-one except \";\n        users = Object.keys(recipients_included);\n      }\n      else\n        mode = \"No-one at all\";\n    else\n      if (Object.keys(recipients_excluded).length != 0)\n      {\n        mode = \"Everyone except \";\n        users = Object.keys(recipients_excluded);\n      }\n   
   else\n        mode = \"Everyone\";\n\n    $(\"span.mode\").text(mode);\n\n    if (users)\n      $(\"span.users\").text(users.join(\", \"));\n\n    recipient_list_dialog.dialog(\"close\");\n  }\n\n  function cancel()\n  {\n    recipient_list_dialog.dialog(\"close\");\n  }\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      save();\n  }\n\n  recipient_list_dialog.find(\"#users\").keypress(handleKeypress);\n\n  recipient_list_dialog.dialog({ width: 620,\n                                 modal: true,\n                                 buttons: { Save: save, Cancel: cancel }});\n\n  function enableAutoCompletion(result)\n  {\n    recipient_list_dialog.find(\"#users\").autocomplete(\n      { source: AutoCompleteUsers(result.users) });\n  }\n\n  var operation = new Operation({ action: \"get auto-complete data\",\n                                  url: \"getautocompletedata\",\n                                  data: { values: [\"users\"] },\n                                  callback: enableAutoCompletion });\n\n  operation.execute();\n}\n\nfunction connectApplyFilters()\n{\n  $(\"tr.applyfilters\").click(function (ev)\n    {\n      if (ev.target.nodeName.toLowerCase() != \"input\")\n      {\n        var checkbox = $(ev.currentTarget).find(\"input\");\n        checkbox.get(0).checked = !checkbox.get(0).checked;\n        updateReviewersAndWatchers();\n      }\n    });\n\n  $(\"tr.applyfilters input\").click(function (ev)\n    {\n      updateReviewersAndWatchers();\n    });\n}\n\n$(document).ready(function ()\n  {\n    connectApplyFilters();\n\n    $(\".repository-select\")\n      .change(\n        function ()\n        {\n          var name = $(this).val();\n\n          if (default_remotes[name])\n            $(\"input.remote\").val(default_remotes[name]);\n\n          if (default_branches[name])\n            $(\"input.upstreamcommit\").val(default_branches[name] ? 
\"refs/heads/\" + default_branches[name] : \"\");\n        })\n      .chosen({ inherit_select_classes: true });\n\n    function getCurrentRemote()\n    {\n      var remote = $(\"input.remote\").val();\n      if (!remote)\n        return undefined;\n      return remote.trim();\n    }\n\n    var input_workbranch = $(\"input.workbranch\");\n\n    input_workbranch.autocomplete({ source: AutoCompleteRef(getCurrentRemote, \"refs/heads/\"), html: true });\n    input_workbranch.keypress(\n      function (ev)\n      {\n        if (ev.keyCode == 13)\n          $(\"button.fetchbranch\").click();\n      });\n\n    var input_upstreamcommit = $(\"input.upstreamcommit\");\n\n    input_upstreamcommit.autocomplete({ source: AutoCompleteRef(), html: true });\n\n    $(\"button.fetchbranch\").click(\n      function ()\n      {\n        var branch = $(\"input.workbranch\").val().trim();\n        var upstream = $(\"input.upstreamcommit\").val().trim();\n\n        if (!branch)\n        {\n          showMessage(\"Invalid input!\", \"Invalid input!\", \"Please provide a non-empty branch name.\");\n          return;\n        }\n\n        if (!upstream)\n        {\n          showMessage(\"Invalid input!\", \"Invalid input!\", \"Please provide a non-empty upstream commit reference.\");\n          return;\n        }\n\n        function finish(result)\n        {\n          if (result)\n            location.href = (\"/createreview\" +\n                             \"?repository=\" + encodeURIComponent($(\"select.repository-select\").val().trim()) +\n                             \"&commits=\" + encodeURIComponent(result.commit_ids) +\n                             \"&remote=\" + encodeURIComponent(getCurrentRemote()) +\n                             \"&branch=\" + encodeURIComponent(branch) +\n                             \"&upstream=\" + encodeURIComponent(upstream) +\n                             \"&reviewbranchname=\" + encodeURIComponent(branch));\n        }\n\n        var operation = new 
Operation({ action: \"fetch remote branch\",\n                                        url: \"fetchremotebranch\",\n                                        data: { repository_name: $(\"select.repository-select\").val().trim(),\n                                                remote: getCurrentRemote(),\n                                                branch: branch,\n                                                upstream: upstream },\n                                        wait: \"Fetching branch...\",\n                                        callback: finish });\n\n        operation.execute();\n      });\n\n    if (getCurrentRemote())\n      input_workbranch.focus();\n    else\n      $(\"input.remote\").focus();\n  });\n"
  },
  {
    "path": "src/resources/createuser.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2014 the Critic contributors, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.createuser {\n    margin-top: 1em;\n    padding: 1em;\n    border: 1px solid #cca;\n    border-radius: 2px;\n    background-color: #fffff4\n}\n\ntable.createuser tr.header td {\n    text-align: left;\n    padding-left: 20%;\n    padding-bottom: 1rem;\n    font-family: sans-serif;\n    font-size: 120%;\n    text-decoration: underline\n}\n\ntable.createuser td.value a.external {\n    font-family: monospace;\n    background-color: white;\n    border: 1px solid #cca;\n    padding: 2px 6px\n}\n\ntable.createuser tr.button td {\n    padding-top: 1rem\n}\n\ntable.createuser tr.separator1 td {\n    border-bottom: 1px solid #cca;\n    padding-top: 1em\n}\n\ntable.createuser tr.separator2 td {\n    padding-top: 1em\n}\n\ntable.createuser tr.status td {\n    color: red;\n    font-weight: bold;\n    text-align: center;\n    padding: 0 0 1em 1em;\n}\n\ntable.createuser tr.status td > .message {\n    max-width: 40em;\n    text-align: left\n}\n\ntable.createuser tr.status.disabled {\n    display: none\n}\n\ntable.createuser td.key {\n    padding-left: 2em;\n    padding-right: 0.5em;\n    font-weight: bold;\n    text-align: right\n}\n\ntable.createuser td.value {\n    font-family: monospace;\n    padding-left: 0.5em;\n    padding-right: 2em;\n    text-align: left\n}\n\ntable.createuser td.value input {\n   
 font-family: monospace\n}\n\ntable.createuser tr.button td {\n    text-align: center\n}\n"
  },
  {
    "path": "src/resources/createuser.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2014 the Critic contributors, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction createUser() {\n  var data = {\n    username: $(\"#newusername\").val().trim(),\n    fullname: $(\"#fullname\").val().trim(),\n    email: $(\"#email\").val().trim()\n  };\n\n  if (external) {\n    data.external = external;\n  } else {\n    var password1 = $(\"#password1\").val();\n    var password2 = $(\"#password2\").val();\n\n    if (password1 != password2) {\n      showMessage(\"Invalid input\",\n                  \"Password mismatch!\",\n                  \"The same password must be entered in both fields.\");\n      return;\n    }\n\n    data.password = password1;\n  }\n\n  var operation = new Operation({\n    action: \"create user\",\n    url: \"registeruser\",\n    data: data\n  });\n\n  var result = operation.execute();\n  if (result) {\n    if (result.message) {\n      $(\".status\").removeClass(\"disabled\").find(\".message\").html(result.message);\n      if (result.focus)\n        $(result.focus).select().focus();\n    } else if (typeof target != \"undefined\") {\n      location.replace(target);\n    } else {\n      location.replace(\"/\");\n    }\n  }\n}\n\n$(function () {\n  if ($(\".status .message\").text().trim())\n    $(\".status\").removeClass(\"disabled\");\n\n  $(\".create\").click(createUser);\n});\n"
  },
  {
    "path": "src/resources/dashboard.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main table.reviews td.h2 a {\n    font-size: 50%;\n    margin-left: 1em\n}\n\ndiv.main table.reviews tr.review:hover {\n    background: #eec\n}\n\ndiv.main table.reviews tr.review td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ndiv.main table.reviews tr.review td.name {\n    font-weight: bold\n}\n\ndiv.main table.reviews tr.review td.lines,\ndiv.main table.reviews tr.review td.comments {\n    text-align: right\n}\n\na.repository, a.branch {\n    color: #222;\n    text-decoration: none\n}\n\na.branch {\n    white-space: nowrap\n}\n"
  },
  {
    "path": "src/resources/dashboard.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction markChainsAsRead(review_ids)\n{\n  var operation = new Operation({ action: \"mark comments as read\",\n                                  url: \"markchainsasread\",\n                                  data: { review_ids: review_ids },\n                                  callback: function (result) { if (result) location.reload(); } });\n\n  operation.execute();\n}\n\n$(document).ready(function ()\n  {\n    $(\"h1[title], h2[title]\").tooltip({\n      items: \"h1[title], h2[title]\"\n    });\n\n    $(\"div.main\").sortable({\n        handle: \"td.h1\",\n        stop: function ()\n          {\n            if (typeof history.replaceState == \"function\")\n            {\n              var items = [];\n              $(\"div.main > table.reviews\").each(function (index, element)\n                {\n                  items.push(element.id);\n                });\n\n              var href = location.href.replace(/(?:([?&]show=)[^&#]+|$)/, function (all, group1) { return (group1 ? group1 : \"?show=\") + items; });\n\n              history.replaceState(null, document.title, href);\n            }\n          }\n      });\n  });\n"
  },
  {
    "path": "src/resources/diff.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* The coloring of diffs (deleted => red, inserted => green, et.c.)\n\n   Basic line HTML:\n\n     <tr class=\"line [type]\">\n       ...\n       <td class=\"line old\">\n         [old code]\n       </td>\n       ...\n       <td class=\"line new\">\n         [new code]\n       </td>\n       ...\n     </tr>\n\n   The [type] is one of the following:\n\n     context    => a line that wasn't changed\n     whitespace => line with only white-space changes\n     replaced   => changed line where old and new version has little in common\n     modified   => changed line with inter-line diff\n     inserted   => line that only exists in new version ([old code] is empty)\n     deleted    => line that only exists in old version ([new code] is empty)\n\n*/\n\n/* White background color for context lines, the left side where the right side\n   is an inserted line and the right side where the left side is a deleted line;\n   and slightly darker white when hovered. */\ntr.line.context    > td.line,\ntr.line.inserted   > td.line.old,\ntr.line.deleted    > td.line.new {\n    background-color: #fff\n}\ntr.line.context:hover    > td.line {\n    background-color: #eee\n}\n\n/* Red background color on the left side of deleted or replaced lines; and\n   slightly darker red when hovered. 
*/\ntr.line.deleted  > td.line.old,\ntr.line.replaced > td.line.old {\n    background-color: #fdd\n}\ntr.line.deleted:hover  > td.line.old,\ntr.line.replaced:hover > td.line.old {\n    background-color: #ecc\n}\n\n/* Green background color on the left side of inserted or replaced lines; and\n   slightly darker green when hovered. */\ntr.line.inserted > td.line.new,\ntr.line.replaced > td.line.new {\n    background-color: #dfd\n}\ntr.line.inserted:hover > td.line.new,\ntr.line.replaced:hover > td.line.new {\n    background-color: #cec\n}\n\n/* Yellowish background color on both sides of modified lines; and slightly\n   darker yellowish when hovered. */\ntr.line.modified > td.line {\n    background-color: #ffffe6\n}\ntr.line.modified:hover > td.line {\n    background-color: #eec\n}\n\n/* White background color for white-space change lines and slightly darker white\n   when hovered.  This overrides the 'modified' style which otherwise also\n   applies to white-space lines. */\ntr.line.whitespace > td.line {\n    background-color: #fff\n}\ntr.line.whitespace:hover > td.line {\n    background-color: #eee\n}\n\n/* Blue background color on left side of conflict markers; and slightly darker\n   blue when hovered. 
*/\ntr.line.conflict > td.line.old {\n    background-color: #8af !important\n}\ntr.line.conflict:hover > td.line.old {\n    background-color: #79e !important\n}\n\n\n/* Inter-line diff highlight is done using i (italics) elements, with the\n   classes:\n\n   r   => replaced (both sides)\n   d   => deleted  (left side only)\n   i   => inserted (right side only)\n\n   Additionally, to represent changes in line ending on last line:\n\n   eol => marker element signaling missing line break\n*/\n\n/* We don't actually want italics: */\ntr.line i {\n    font-style: normal\n}\n\n/* Slightly darker color for replaced/edited portions of the line (and slightly\n   darker still when hovered): */\ntr.line          td.old i.r,\ntr.line.modified        i.d {\n    background-color: #fdd\n}\ntr.line:hover          td.old i.r,\ntr.line.modified:hover        i.d {\n    background-color: #ecc\n}\ntr.line          td.new i.r,\ntr.line.modified        i.i {\n    background-color: #dfd\n}\ntr.line:hover          td.new i.r,\ntr.line.modified:hover        i.i {\n    background-color: #cec\n}\n\n/* On line with white-space changes, render deleted white-space in red (the line\n   will otherwise be rendered as a context line.) */\ntr.line.whitespace i.d {\n    background-color: #fdd\n}\ntr.line.whitespace:hover i.d {\n    background-color: #ecc\n}\n\n/* On line with white-space changes, render inserted white-space in green (the\n   line will otherwise be rendered as a context line.) */\ntr.line.whitespace i.i {\n    background-color: #dfd\n}\ntr.line.whitespace:hover i.i {\n    background-color: #cec\n}\n\n/* This element will contain the text \"[missing linebreak]\" and be added last on\n   the last line of the file if it was changed and is missing a linebreak. */\ntr.line i.eol {\n    font-weight: bold;\n    float: right;\n    padding-right: 1em\n}\n"
  },
  {
    "path": "src/resources/editresource.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.paleyellow tr.select td.heading {\n    font-family: serif;\n    font-weight: bold;\n    text-align: right;\n    vertical-align: top\n}\n\ndiv.main table.paleyellow tr.select td.value {\n    vertical-align: top\n}\n\ndiv.main table.paleyellow tr.help td.help {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.paleyellow tr.value td.value {\n    padding-top: 1em;\n    padding-bottom: 1em\n}\n\ndiv.main table.paleyellow tr.value td.value textarea {\n    width: 100%\n}\n"
  },
  {
    "path": "src/resources/editresource.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction saveResource()\n{\n  var source = $(\"textarea\").first().val();\n\n  var operation = new Operation({ action: \"save resource\",\n                                  url: \"storeresource\",\n                                  data: { name: resource_name,\n                                          source: source },\n                                  wait: \"Saving changes...\" });\n\n  if (operation.execute())\n    original_source = source;\n}\n\nfunction resetResource()\n{\n  function proceed()\n  {\n    var operation = new Operation({ action: \"reset resource\",\n                                    url: \"resetresource\",\n                                    data: { name: resource_name }});\n\n    return operation.execute() != null;\n  }\n\n  var content = $(\"<div title='Confirm'><p><b>Are you sure you want to stop using your edited resource?</b></p><p>Note that you will be able to switch back to your current edited version later on, unless you save another edited version.</p></div>\");\n\n  content.dialog({ modal: true,\n                   width: 600,\n                   buttons: { \"Reset to built-in version\": function () { if (proceed()) { content.dialog(\"close\"); location.reload(); }},\n                              \"Keep edited version\": function () { content.dialog(\"close\"); 
}}});\n}\n\nfunction restoreResource()\n{\n  var operation = new Operation({ action: \"restore resource\",\n                                  url: \"restoreresource\",\n                                  data: { name: resource_name }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction switchResource(name)\n{\n  if (name && name != resource_name)\n  {\n    function switchNow()\n    {\n      location.replace(\"/editresource?name=\" + name);\n    }\n\n    $(\"select\").val(resource_name);\n\n    if (resource_name && $(\"textarea\").val() != original_source)\n    {\n      var content = $(\"<div title='Save First?'><p>You have edited this resource.  Do you want to save it before selecting another resource?</p></div>\");\n\n      content.dialog({ modal: true,\n                       width: 600,\n                       buttons: { \"Save and switch\": function () { if (saveResource()) { content.dialog(\"close\"); switchNow(); } },\n                                  \"Don't switch\": function () { content.dialog(\"close\"); },\n                                  \"Switch without saving\": function () { content.dialog(\"close\"); switchNow(); }}});\n    }\n    else\n      switchNow();\n  }\n  else\n    $(\"select\").val(resource_name);\n}\n\n$(document).ready(function ()\n  {\n    $(\"tr.select td.value select\").change(function (ev)\n      {\n        switchResource(ev.target.value);\n      });\n\n    $(\"button.save\").click(saveResource);\n    $(\"button.reset\").click(resetResource);\n    $(\"button.restore\").click(restoreResource);\n  });\n"
  },
  {
    "path": "src/resources/filterchanges.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.filter {\n    table-layout: fixed\n}\n\ndiv.main table.filter tr.header td.button {\n    vertical-align: top\n}\n\ndiv.main table.filter tr.footer td.spacer {\n    padding-top: 1em\n}\n\ndiv.main table.filter tr.footer td.button {\n    border-top: 1px solid #cca\n}\n\ndiv.main table.filter tr.footer td {\n    padding-top: 1em\n}\n\ndiv.main table.filter td.button {\n    text-align: right\n}\n\ndiv.main table.filter tr.headings td {\n    font-weight: bold\n}\n\ndiv.main table.filter tr.headings td {\n    padding-top: 1em\n}\n\ndiv.main table.filter tr td.select {\n    text-align: right;\n    padding-right: 1em\n}\n\ndiv.main table.filter tr td.path {\n    white-space: pre;\n    font-family: monospace\n}\n\ndiv.main table.filter tr td.right {\n    text-align: right\n}\n\ndiv.main table.filter tr td.help {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n"
  },
  {
    "path": "src/resources/filterchanges.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction checkDirectory(line)\n{\n  var level = parseInt(line.getAttribute(\"critic-level\"));\n  var all_checked = true;\n  var dirline = line;\n\n  $(line).nextAll(\"tr\").each(function (index, line)\n    {\n      if (parseInt(line.getAttribute(\"critic-level\")) <= level)\n        return false;\n\n      if (line.className == \"file\")\n        $(line).find(\"input\").each(function () { if (!this.checked) all_checked = false; });\n\n      return all_checked;\n    });\n\n  $(line).find(\"input\").each(function () { this.checked = all_checked; });\n}\n\nfunction checkDirectories()\n{\n  $(\"tr.directory\").each(function (index, line)\n    {\n      checkDirectory(line);\n    });\n}\n\n$(document).ready(function ()\n  {\n    $(\"tr.directory\").click(function (ev)\n      {\n        var line = $(ev.currentTarget);\n        var level = parseInt(line.attr(\"critic-level\"));\n        var checkbox = line.find(\"input\");\n\n        var on;\n        if (ev.target.nodeName.toLowerCase() != \"input\")\n        {\n          checkbox.each(function () { on = this.checked = !this.checked; });\n          ev.preventDefault();\n        }\n        else\n          checkbox.each(function () { on = this.checked; });\n\n        line.nextAll(\"tr\").each(function (index, line)\n          {\n            if 
(parseInt(line.getAttribute(\"critic-level\")) <= level)\n              return false;\n\n            $(line).find(\"input\").each(function () { this.checked = on; });\n          });\n\n        line.prevAll(\"tr.directory\").each(function (index, line)\n          {\n            var line_level = parseInt(this.getAttribute(\"critic-level\"));\n            if (line_level < level)\n            {\n              if (!on)\n                $(line).find(\"input\").each(function () { this.checked = false; });\n              else\n                checkDirectory(line);\n\n              level = line_level;\n            }\n          });\n      });\n\n    $(\"tr.file\").click(function (ev)\n      {\n        var line = $(ev.currentTarget);\n        var level = parseInt(line.attr(\"critic-level\"));\n\n        var on;\n        if (ev.target.nodeName.toLowerCase() != \"input\")\n        {\n          $(ev.currentTarget).find(\"input\").each(function () { on = this.checked = !this.checked; });\n          ev.preventDefault();\n        }\n        else\n          line.find(\"input\").each(function () { on = this.checked; });\n\n        line.prevAll(\"tr.directory\").each(function (index, line)\n          {\n            var line_level = parseInt(this.getAttribute(\"critic-level\"));\n            if (line_level < level)\n            {\n              if (!on)\n                $(line).find(\"input\").each(function () { this.checked = false; });\n              else\n                checkDirectory(line);\n\n              level = line_level;\n            }\n          });\n      });\n\n    $(\"button.display\").click(function ()\n      {\n        var files = [];\n\n        $(\"tr.file\").each(function (index, line)\n          {\n            var selected;\n\n            $(line).find(\"input\").each(function () { selected = this.checked; });\n\n            if (selected)\n              files.push(line.getAttribute(\"critic-file-id\"));\n          });\n\n        if (files.length != 0)\n          if (commitRange)\n            location.href = \"/showcommit?review=\" 
+ review.id + \"&first=\" + commitRange.first + \"&last=\" + commitRange.last + \"&filter=files&file=\" + files.join(\",\");\n          else\n            location.href = \"/showcommit?review=\" + review.id + \"&filter=files&file=\" + files.join(\",\");\n        else\n          alert(\"No files selected!\");\n      });\n  });\n\nkeyboardShortcutHandlers.push(function (key)\n  {\n    if (key == \"g\".charCodeAt(0))\n    {\n      $(\"button.display\").click();\n      return true;\n    }\n    else if (key == \"a\".charCodeAt(0))\n    {\n      $(\"tr.directory[critic-level=-1]\").click();\n      return true;\n    }\n  });\n"
  },
  {
    "path": "src/resources/home.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main table td.repositories { text-align: right; border-bottom: 1px solid #cca }\n\ndiv.main table.basic tr.line td { padding-top: 0.5em; vertical-align: baseline }\ndiv.main table.basic td.heading { font-family: serif; font-weight: bold; text-align: right; width: 20% }\ndiv.main table.basic td.value { font-family: monospace; }\ndiv.main table.basic td.value input.value { font-family: monospace; width: 32em }\ndiv.main table.basic td.value .buttons { float: right }\ndiv.main table.basic td.value .status { padding-left: 1em; font-weight: bold }\ndiv.main table.basic td.help { font-style: italic; text-align: right; border-bottom: 1px solid #cca }\n\ndiv.main table.basic tr.email td.value.multiple {\n    vertical-align: top\n}\n\ndiv.main table.basic td.value .addresses {\n    display: inline-block\n}\n\ndiv.main table.basic td.value .addresses label {\n    margin: 2px 0\n}\n\n.addresses .address input {\n    margin: 0 0.5rem 0 0\n}\n\n.addresses .address.selected .value {\n    font-weight: bold\n}\n\n.addresses .address:first-child {\n    margin-top: 0\n}\n\n.addresses .address:last-child {\n    margin-bottom: 0\n}\n\n.addresses .actions {\n    padding-left: 1rem;\n    margin-left: auto\n}\n\n.addresses .actions .action {\n    padding-left: 0.4rem;\n   
 text-decoration: none;\n    color: black\n}\n\n.addresses .actions a.action:hover {\n    text-decoration: underline\n}\n\n.addresses .actions .verified {\n    color: green\n}\n\n.addresses .actions .verified.now {\n    font-weight: bold\n}\n\n.addresses .actions .unverified {\n    color: red\n}\n\ndiv.main table.basic td.value a.external {\n    background-color: white;\n    border: 1px solid #cca;\n    padding: 2px 6px\n}\n\ntable.filters {\n    width: 90%;\n    margin-top: 1rem;\n}\n\ntable.filters tbody th { padding-top: 1em; }\ntable.filters tbody:first-child th { padding-top: 0; }\n\ntable.filters td.path,\ntable.filters td.delegates,\ntable.filters td.files {\n    font-family: monospace;\n}\ntable.filters td.files,\ntable.filters td.links {\n    text-align: right;\n}\n\ntable.filters td.path { width: 25%; }\ntable.filters td.delegates { width: 35%; }\ntable.filters td.title { width: 20%; }\ntable.filters td.data { width: 20%; font-family: monospace }\ntable.filters td.files { width: 10%; }\ntable.filters td.links { width: 25%; }\n\ntable.filters td.links a {\n    margin-left: 0.5em;\n    font-weight: bold;\n}\ntable.filters a {\n    color: #222;\n    text-decoration: none\n}\ntable.filters a:hover {\n    text-decoration: underline\n}\n\ndiv.password input {\n    width: 100%;\n}\n\ndiv.filterdialog select {\n    width: 100%\n}\n\ndiv.filterdialog input[type='text'] {\n    font-family: monospace;\n    width: 100%\n}\n\ndiv.filterdialog span.matchedfiles {\n    float: right;\n    font-style: italic\n}\n\ndiv.filterdialog span.matchedfiles.clickable:hover {\n    text-decoration: underline\n}\n\ndiv.reapplyresult th {\n    text-align: left\n}\n\ndiv.reapplyresult td.id {\n    text-align: right;\n    padding-right: 0.5em;\n    font-family: monospace\n}\n\ndiv.reapplyresult td.summary {\n    font-family: monospace;\n    background-color: #fffff4;\n    padding: 3px 1em 3px 1em;\n    border-left: 1px dotted #cca;\n    border-bottom: 1px dotted #cca;\n    border-right: 
1px dotted #cca\n}\n\ndiv.reapplyresult tr.first td.summary {\n    border-top: 1px dotted #cca\n}\n\ndiv.matchedfiles select {\n    width: 99%\n}\n\ndiv.unverified-dialog .address {\n    font-family: monospace\n}\n\ndiv.add-email-dialog input {\n    width: 100%;\n    font-family: monospace;\n}\n"
  },
  {
    "path": "src/resources/home.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction hasClass(element, cls)\n{\n  return new RegExp(\"(^|\\\\s)\" + cls + \"($|\\\\s)\").test(element.className)\n}\nfunction addClass(element, cls)\n{\n  if (!hasClass(element, cls))\n    element.className += \" \" + cls;\n}\nfunction removeClass(element, cls)\n{\n  if (hasClass(element, cls))\n    element.className = element.className.replace(new RegExp(\"(^|\\\\s+)\" + cls + \"($|(?=\\\\s))\"), \"\");\n}\n\nfunction saveFullname()\n{\n  var input = $(\"#user_fullname\");\n  var status = $(\"#status_fullname\");\n\n  var value = input.val().trim();\n\n  if (value == user.displayName)\n    status.text(\"Value not changed\");\n  else if (!value)\n    status.text(\"Empty name not saved\");\n  else\n  {\n    var operation = new Operation({ action: \"save changes\",\n                                    url: \"setfullname\",\n                                    data: { user_id: user.id,\n                                            value: value }});\n\n    if (operation.execute())\n    {\n      status.text(\"Value saved\");\n      user.displayName = value;\n    }\n  }\n}\n\nfunction resetFullname()\n{\n  var input = $(\"#user_fullname\");\n  var status = $(\"#status_fullname\");\n\n  input.val(user.displayName);\n  status.text(\"\");\n}\n\nfunction saveGitEmails()\n{\n  var input = $(\"#user_gitemails\");\n 
 var status = $(\"#status_gitemails\");\n\n  var value = input.val().trim();\n\n  if (value == user.gitEmails)\n    status.text(\"Value not changed\");\n  else if (!value)\n    status.text(\"Empty name not saved\");\n  else\n  {\n    var operation = new Operation({ action: \"save changes\",\n                                    url: \"setgitemails\",\n                                    data: { subject_id: user.id,\n                                            value: value.split(/,\\s*|\\s+/g) }});\n\n    if (operation.execute())\n    {\n      status.text(\"Value saved\");\n      user.gitEmails = value;\n    }\n  }\n}\n\nfunction resetGitEmails()\n{\n  var input = $(\"#user_gitemails\");\n  var status = $(\"#status_gitemails\");\n\n  input.val(user.gitEmails);\n  status.text(\"\");\n}\n\nfunction setPassword()\n{\n  var dialog = $(\"<div class=password title='Set password'>\"\n               +   \"<p><b>New password:</b><br>\"\n               +     \"<input id=newpw1 type=password>\"\n               +   \"</p>\"\n               +   \"<p><b>New password, again:</b><br>\"\n               +     \"<input id=newpw2 type=password>\"\n               +   \"</p>\"\n               + \"</div>\");\n\n  function save()\n  {\n    var newpw1 = $(\"#newpw1\").val();\n    var newpw2 = $(\"#newpw2\").val();\n\n    if (newpw1 != newpw2)\n    {\n      showMessage(\"Invalid input\",\n                  \"New password mismatch!\",\n                  \"The new password must be input twice.\");\n      return;\n    }\n\n    var operation = new Operation({ action: \"set password\",\n                                    url: \"changepassword\",\n                                    data: { subject: user.id,\n                                            new_pw: newpw1 }});\n\n    if (operation.execute())\n    {\n      dialog.dialog(\"close\");\n      showMessage(\"Success\", \"Password set!\", null, function () { location.reload(); });\n    }\n  }\n\n  function cancel()\n  {\n    
dialog.dialog(\"close\");\n  }\n\n  dialog.find(\"input\").keypress(\n    function (ev)\n    {\n      if (ev.keyCode == 13)\n      {\n        if ($(\"#newpw1\").is(\":focus\"))\n          $(\"#newpw2\").focus();\n        else if ($(\"#newpw2\").is(\":focus\"))\n          save();\n      }\n    });\n\n  dialog.dialog({ width: 400,\n                  modal: true,\n                  buttons: { \"Save\": save, \"Cancel\": cancel }});\n\n  $(\"newpw1\").focus();\n}\n\nfunction changePassword()\n{\n  var dialog = $(\"<div class=password title='Change password'>\"\n               +   \"<p><b>Current password:</b><br>\"\n               +     \"<input id=currentpw type=password>\"\n               +   \"</p>\"\n               +   \"<p><b>New password:</b><br>\"\n               +     \"<input id=newpw1 type=password>\"\n               +   \"</p>\"\n               +   \"<p><b>New password, again:</b><br>\"\n               +     \"<input id=newpw2 type=password>\"\n               +   \"</p>\"\n               + \"</div>\");\n\n  function save()\n  {\n    var currentpw = $(\"#currentpw\").val();\n    var newpw1 = $(\"#newpw1\").val();\n    var newpw2 = $(\"#newpw2\").val();\n\n    if (newpw1 != newpw2)\n    {\n      showMessage(\"Invalid input\",\n                  \"New password mismatch!\",\n                  \"The new password must be input twice.\");\n      return;\n    }\n\n    var operation = new Operation({ action: \"change password\",\n                                    url: \"changepassword\",\n                                    data: { subject: user.id,\n                                            current_pw: currentpw,\n                                            new_pw: newpw1 } });\n\n    if (operation.execute())\n    {\n      dialog.dialog(\"close\");\n      showMessage(\"Success\", \"Password changed!\");\n    }\n  }\n\n  function cancel()\n  {\n    dialog.dialog(\"close\");\n  }\n\n  dialog.find(\"input\").keypress(\n    function (ev)\n    {\n      if (ev.keyCode 
== 13)\n      {\n        if ($(\"#currentpw\").is(\":focus\"))\n          $(\"#newpw1\").focus();\n        else if ($(\"#newpw1\").is(\":focus\"))\n          $(\"#newpw2\").focus();\n        else if ($(\"#newpw2\").is(\":focus\"))\n          save();\n      }\n    });\n\n  dialog.dialog({ width: 400,\n                  modal: true,\n                  buttons: { \"Save\": save, \"Cancel\": cancel }});\n\n  $(\"#currentpw\").focus();\n}\n\nfunction ModificationChecker(current, input, status)\n{\n  var is_modified_last = false;\n\n  setInterval(\n    function ()\n    {\n      var is_modified_now = input.val() != current();\n\n      if (is_modified_last != is_modified_now)\n      {\n        status.text(is_modified_now ? \"Modified\" : \"\");\n        input.nextAll(\"button\").prop(\"disabled\", !is_modified_now);\n      }\n    }, 100);\n}\n\nfunction showUnverifiedAddressDialog(ev) {\n  ev.preventDefault();\n\n  var context = $(this).closest(\".address\");\n  var address = context.find(\".value\").text();\n  var content = $(\"<div class='unverified-dialog' title='Unverified email address'>\" +\n                  \"<p>The address <span class='address inset'>\" + htmlify(address) +\n                  \"</span> needs to be verified as valid and in your control. 
\" +\n                  \"A verification email has been sent to the address already \" +\n                  \"and should arrive shortly.</p>\" +\n                  \"<p>If you suspect it has been lost in transit, you can \" +\n                  \"request another one.</p>\" +\n                  \"</div>\");\n\n  function sendVerificationEmail() {\n    var operation = new Operation({\n      action: \"request verification email\",\n      url: \"requestverificationemail\",\n      data: {\n        email_id: context.data(\"email-id\")\n      }\n    });\n\n    if (operation.execute())\n      content.dialog(\"close\");\n  }\n\n  function close() {\n    content.dialog(\"close\");\n  }\n\n  content.dialog({\n    modal: true,\n    width: 600,\n    buttons: {\n      \"Send verification email\": sendVerificationEmail,\n      \"Close\": close\n    }\n  });\n}\n\nfunction showDeleteAddressDialog(ev) {\n  ev.preventDefault();\n\n  var context = $(this).closest(\".address\");\n  var is_current = context.is(\".selected\");\n\n  function deleteAddress(dialog) {\n    var operation = new Operation({\n      action: \"delete address\",\n      url: \"deleteemailaddress\",\n      data: {\n        email_id: context.data(\"email-id\")\n      }\n    });\n\n    if (operation.execute()) {\n      if (dialog)\n        dialog.dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  if (is_current) {\n    if (context.closest(\".addresses\").children(\".address\").size() > 1) {\n      showMessage(\"Not allowed\", \"Will not delete current address\",\n                  \"This email address is your current address.  
Please select \" +\n                  \"one of the other addresses as your current address before \" +\n                  \"deleting it.\");\n      return;\n    } else {\n      var content =\n        $(\"<div class='delete-current-dialog' title='Delete current address?'>\" +\n          \"<p>Deleting your current email address means Critic will \" +\n          \"stop sending emails to you.  Are you sure you want that?</p>\" +\n          \"</div>\");\n\n      content.dialog({\n        modal: true,\n        width: 600,\n        buttons: {\n          \"Delete address\": function () {\n            deleteAddress(content);\n          },\n          \"Do nothing\": function () {\n            content.dialog(\"close\");\n          }\n        }\n      });\n    }\n  } else {\n    deleteAddress();\n  }\n}\n\nfunction showSelectEmailAddressDialog(ev) {\n  var context = $(this).closest(\".address\");\n  var is_unverified = context.find(\".unverified\").size() != 0;\n\n  function selectAddress(dialog) {\n    var operation = new Operation({\n      action: \"select address\",\n      url: \"selectemailaddress\",\n      data: {\n        email_id: context.data(\"email-id\")\n      }\n    });\n\n    if (operation.execute()) {\n      context.closest(\".addresses\")\n        .find(\".address\").not(context).removeClass(\"selected\");\n      context.addClass(\"selected\");\n      context.find(\"input\").prop(\"checked\", true);\n\n      if (dialog)\n        dialog.dialog(\"close\");\n    }\n  }\n\n  if (is_unverified) {\n    var content =\n      $(\"<div class='select-unverified-dialog' title='Select unverified address?'>\" +\n        \"<p>Selecting an unverified email address means Critic will stop \" +\n        \"sending emails to you until the address has been verified.  
Are you \" +\n        \"sure you want that?</p>\" +\n        \"</div>\");\n\n    content.dialog({\n      modal: true,\n      width: 600,\n      buttons: {\n        \"Select address\": function () {\n          selectAddress(content);\n        },\n        \"Do nothing\": function () {\n          $(\".address.selected input\").prop(\"checked\", true);\n          content.dialog(\"close\");\n        }\n      }\n    });\n\n    ev.preventDefault();\n  } else {\n    selectAddress();\n  }\n}\n\nfunction showAddEmailAddressDialog() {\n  var content =\n    $(\"<div class='add-email-dialog' title='Add primary address'>\" +\n      \"<p>Add a primary email address.  You can have several addresses registered, \" +\n      \"but emails will only be sent to the one that is selected.</p>\" +\n      \"</div>\");\n\n  if (verifyEmailAddresses) {\n    content.append(\"<p>Note that a verification email will be sent to the added \" +\n                   \"email address, containing a link that must be followed before \" +\n                   \"Critic will send any other emails to the address.</p>\");\n  }\n\n  content.append(\"<p><b>Email address:</b><br><input placeholder='user@domain'></p>\");\n\n  function isValidAddress() {\n    var address = content.find(\"input\").val().trim();\n    return /^[^@]+@[^.]+(?:\\.[^.]+)*$/.test(address);\n  }\n\n  function addAddress() {\n    if (!isValidAddress()) {\n      showMessage(\"Invalid email address\", \"Invalid email address\",\n                  \"That does not look like a valid email address. 
\" +\n                  \"Please try again.\");\n    } else {\n      var operation = new Operation({\n        action: \"add email address\",\n        url: \"addemailaddress\",\n        data: {\n          subject_id: user.id,\n          email: content.find(\"input\").val().trim()\n        }\n      });\n\n      if (operation.execute()) {\n        content.dialog(\"close\");\n        location.reload();\n      }\n    }\n  }\n\n  content.dialog({\n    modal: true,\n    width: 600,\n    buttons: {\n      \"Add address\": function () {\n        addAddress();\n      },\n      \"Do nothing\": function () {\n        content.dialog(\"close\");\n      }\n    }\n  });\n\n  content.find(\"input\").keypress(function (ev) {\n    if (ev.keyCode == 13 && isValidAddress())\n      addAddress();\n  });\n}\n\n$(function ()\n  {\n    var fullname_input = $(\"#user_fullname\");\n    var fullname_status = $(\"#status_fullname\");\n\n    if (fullname_input.size() && fullname_status.size())\n      new ModificationChecker(function () { return user.displayName; }, fullname_input, fullname_status);\n\n    $(\".unverified\").click(showUnverifiedAddressDialog);\n    $(\".delete\").click(showDeleteAddressDialog);\n    $(\".address input\").click(showSelectEmailAddressDialog);\n    $(\".addemail\").click(showAddEmailAddressDialog);\n\n    if (/^\\?email_verified=\\d+/.test(location.search)) {\n      if (typeof history.replaceState == \"function\") {\n        var new_url = \"/home\";\n        var match = /&(.+)$/.exec(location.search);\n        if (match)\n          new_url += \"?\" + match[1];\n        history.replaceState(null, document.title, new_url);\n      }\n    }\n\n    var gitemails_input = $(\"#user_gitemails\");\n    var gitemails_status = $(\"#status_gitemails\");\n\n    if (gitemails_input.size() && gitemails_status.size())\n      new ModificationChecker(function () { return user.gitEmails; }, gitemails_input, gitemails_status);\n  });\n\nfunction deleteFilterById(filter_id)\n{\n  var 
operation = new Operation({ action: \"delete filter\",\n                                  url: \"deletefilter\",\n                                  data: { filter_id: filter_id }});\n\n  return !!operation.execute();\n}\n\nfunction deleteExtensionHookFilterById(filter_id)\n{\n  var operation = new Operation({ action: \"delete filter\",\n                                  url: \"deleteextensionhookfilter\",\n                                  data: { subject_id: user.id,\n                                          filter_id: filter_id }});\n\n  return !!operation.execute();\n}\n\nfunction editFilter(repository_name, filter_id, filter_type, filter_path, filter_data)\n{\n  function getPaths(prefix, callback)\n  {\n    var repository_name = repository.val();\n\n    if (repository_name)\n    {\n      var operation = new Operation({ action: \"fetch path suggestions\",\n                                      url: \"getrepositorypaths\",\n                                      data: { prefix: prefix,\n                                              repository_name: repository_name },\n                                      callback: function (result)\n                                                {\n                                                  if (result)\n                                                    callback(result.paths, true);\n                                                }});\n      operation.execute();\n      return operation;\n    }\n    else\n      return null;\n  }\n\n  if (typeof no_repositories != \"undefined\")\n  {\n    /* There are no repositories. */\n\n    showMessage(\"Impossible!\", \"No repositories\",\n                (\"There are no repositories in this Critic system, and it is \" +\n                 \"consequently impossible to create filters.  
You might want to \" +\n                 \"<a href=/newrepository>add a repository</a>.\"));\n    return;\n  }\n\n  var dialog = $(\"div.hidden > div.filterdialog\").clone();\n\n  dialog.addClass(\"active\");\n\n  if (filter_id)\n    dialog.attr(\"title\", \"Edit Filter\");\n  else\n    dialog.attr(\"title\", \"Add Filter\");\n\n  var repository = dialog.find(\"select[name='repository']\");\n  var type = dialog.find(\"select[name='type']\");\n  var path = dialog.find(\"input[name='path']\");\n  var matchedfiles = dialog.find(\"span.matchedfiles\");\n  var delegates = dialog.find(\"input[name='delegates']\");\n  var apply = dialog.find(\"input[name='apply']\");\n\n  var matchedfiles_repository = null;\n  var matchedfiles_path = null;\n  var matchedfiles_error = null;\n\n  matchedfiles.click(\n    function ()\n    {\n      if (matchedfiles_error)\n        showMessage(\"Error\", \"Invalid pattern!\", matchedfiles_error);\n      else if (matchedfiles_repository && matchedfiles_path)\n        showMatchedFiles(matchedfiles_repository, matchedfiles_path);\n    });\n\n  function updateMatchedFiles()\n  {\n    if (!repository.val())\n      return;\n\n    var repository_value = repository.val();\n    var path_value = path.val().trim();\n\n    function update(result)\n    {\n      if (result)\n      {\n        matchedfiles.text(\"Matches \" + result.count + \" file\" + (result.count == 1 ? 
\"\" : \"s\"));\n        if (result.count != 0)\n        {\n          matchedfiles.addClass(\"clickable\");\n          matchedfiles_repository = repository_value;\n          matchedfiles_path = path_value;\n        }\n        else\n        {\n          matchedfiles.removeClass(\"clickable\");\n          matchedfiles_repository = matchedfiles_path = null;\n        }\n        matchedfiles_error = null;\n      }\n    }\n\n    function invalid(result)\n    {\n      matchedfiles.text(\"Invalid pattern!\");\n      matchedfiles.addClass(\"clickable\");\n      matchedfiles_repository = matchedfiles_path = null;\n      matchedfiles_error = result.message;\n      return true;\n    }\n\n    if (path_value && path_value != \"/\")\n    {\n      var operation = new Operation({ action: \"count matched files\",\n                                      url: \"countmatchedpaths\",\n                                      data: { single: { repository_name: repository.val(),\n                                                        path: path_value }},\n                                      callback: update,\n                                      failure: { invalidpattern: invalid }});\n\n      operation.execute();\n    }\n    else\n    {\n      matchedfiles.text(\"Matches all files.\");\n      matchedfiles.removeClass(\"clickable\");\n      matchedfiles_repository = matchedfiles_path = null;\n      matchedfiles_error = null;\n    }\n  }\n\n  if (filter_id !== void 0)\n  {\n    repository.val(repository_name);\n    type.val(filter_type);\n    path.val(filter_path);\n\n    updateFilterType().val(filter_data);\n    updateMatchedFiles();\n  }\n  else\n  {\n    type.val(\"reviewer\");\n    path.val(\"\");\n    delegates.val(\"\");\n    delegates.prop(\"disabled\", false);\n  }\n\n  function updateFilterType()\n  {\n    switch (type.val())\n    {\n    case \"reviewer\":\n      dialog.find(\".regular\").show();\n      dialog.find(\".extensionhook\").hide();\n      delegates.prop(\"disabled\", 
false);\n      return delegates;\n\n    case \"watcher\":\n    case \"ignored\":\n      dialog.find(\".regular\").show();\n      dialog.find(\".extensionhook\").hide();\n      delegates.prop(\"disabled\", true);\n      return $();\n\n    case \"extensionhook\":\n      var key = selectedExtensionHookKey();\n      dialog.find(\".regular, .extensionhook\").hide();\n      dialog.find(\".\" + key).show();\n      return dialog.find(\".\" + key + \" input\");\n    }\n  }\n\n  function selectedExtensionHookKey()\n  {\n    var option = type.find(\":selected\");\n    return option.data(\"extension-id\") + \"_\" + option.data(\"filterhook-name\");\n  }\n\n  function saveExtensionHookFilter(path_value)\n  {\n    var option = type.find(\":selected\");\n\n    var data = { subject_id: user.id,\n                 extension_id: option.data(\"extension-id\"),\n                 repository_name: repository.val(),\n                 filterhook_name: option.data(\"filterhook-name\"),\n                 path: path_value };\n\n    var data_input = dialog.find(\".\" + selectedExtensionHookKey() + \" input\");\n    if (data_input.length)\n      data.data = data_input.val();\n\n    if (filter_id !== void 0)\n      data.replaced_filter_id = filter_id;\n\n    var operation = new Operation({ action: \"save filter\",\n                                    url: \"addextensionhookfilter\",\n                                    data: data });\n    var result = operation.execute();\n\n    if (result)\n    {\n      dialog.dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  function saveFilter()\n  {\n    var type_value = type.val();\n    var path_value = path.val().trim();\n    var delegates_value;\n\n    if (type_value == \"reviewer\")\n      delegates_value = delegates.val().trim().split(/\\s*,\\s*|\\s+/g);\n    else\n      delegates_value = [];\n\n    if (!repository.val())\n    {\n      showMessage(\"Invalid input\", \"No repository selected!\", \"Please select a repository.\",\n              
    function () { repository.focus(); });\n      return;\n    }\n\n    if (type_value == \"extensionhook\")\n    {\n      saveExtensionHookFilter(path_value);\n      return;\n    }\n\n    var data = { filter_type: type_value,\n                 path: path_value,\n                 delegates: delegates_value,\n                 repository_name: repository.val() };\n\n    if (filter_id !== void 0)\n      data.replaced_filter_id = filter_id;\n\n    var operation = new Operation({ action: \"save filter\",\n                                    url: \"addfilter\",\n                                    data: data });\n    var result = operation.execute();\n\n    if (result)\n    {\n      var do_apply = apply.is(\":checked\");\n      dialog.dialog(\"close\");\n      if (do_apply)\n        reapplyFilters(result.filter_id, true);\n      else\n        location.reload();\n    }\n  }\n\n  function deleteFilter()\n  {\n    if (deleteFilterById(filter_id))\n    {\n      dialog.dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  function closeDialog()\n  {\n    dialog.dialog(\"close\");\n  }\n\n  var buttons = {};\n\n  buttons[\"Save\"] = saveFilter;\n\n  if (filter_id)\n  {\n    buttons[\"Delete\"] = deleteFilter;\n    buttons[\"Close\"] = closeDialog;\n  }\n  else\n    buttons[\"Cancel\"] = closeDialog;\n\n  dialog.dialog({ width: 600,\n                  modal: true,\n                  buttons: buttons });\n\n  dialog.find(\".repository-select\").chosen({ inherit_select_classes: true });\n  dialog.find(\"select[name='type']\").chosen({ disable_search: true });\n\n  if (!repository.val())\n    repository.focus();\n  else\n    path.focus();\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      saveFilter();\n  }\n\n  path.keypress(handleKeypress);\n  delegates.keypress(handleKeypress);\n\n  path.change(updateMatchedFiles);\n  type.change(updateFilterType);\n\n  path.autocomplete({ source: AutoCompletePath(getPaths),\n                      html: true 
});\n}\n\nfunction showMatchedFiles(repository_name, path)\n{\n  function finished(result)\n  {\n    if (result)\n    {\n      var options = [];\n\n      for (var index = 0; index < result.paths.length; ++index)\n      {\n        options.push(\"<option>\" + htmlify(result.paths[index]) + \"</option>\");\n      }\n\n      var dialog = $(\"<div class=matchedfiles><select multiple>\" + options.join(\"\") + \"</select></div>\");\n\n      dialog.attr(\"title\", options.length + \" file\" + (options.length != 1 ? \"s\" : \"\") + \" matched by \" + path);\n      dialog.find(\"select\").attr(\"size\", Math.min(20, options.length));\n      dialog.dialog({ width: 600, buttons: { \"Close\": function () { dialog.dialog(\"close\"); }}});\n    }\n  }\n\n  var operation = new Operation({ action: \"fetch matched paths\",\n                                  url: \"getmatchedpaths\",\n                                  data: { repository_name: repository_name,\n                                          path: path,\n                                          user_id: user.id },\n                                  wait: \"Fetching matched paths...\",\n                                  cancelable: true,\n                                  callback: finished });\n\n  operation.execute();\n}\n\nfunction reapplyFilters(filter_id, reload_when_finished)\n{\n  function finished(result)\n  {\n    if (result && filter_id === void 0)\n    {\n      var changes, first;\n\n      if (result.assigned_reviews.length == 0 &&\n          result.watched_reviews.length == 0)\n        changes = \"<tr><th colspan=2>No changes.</th></tr>\";\n      else\n      {\n        changes = \"\";\n\n        if (result.assigned_reviews.length > 0)\n        {\n          changes += \"<tr><th colspan=2>Reviews with new changes to review:</th></tr>\";\n          first = \" class=first\";\n\n          result.assigned_reviews.forEach(\n            function (review_id)\n            {\n              changes += \"<tr\" + first + 
\"><td class=id><a href=/r/\" + review_id + \">r/\" + review_id + \"</a></td><td class=summary>\" + htmlify(result.summaries[review_id]) + \"</td></tr>\";\n              first = \"\";\n            });\n        }\n\n        if (result.watched_reviews.length > 0)\n        {\n          changes += \"<tr><th colspan=2>New watched reviews:</th></tr>\";\n          first = \" class=first\";\n\n          result.watched_reviews.forEach(\n            function (review_id)\n            {\n              changes += \"<tr\" + first + \"><td><a href=/r/\" + review_id + \">r/\" + review_id + \"</a></td><td>\" + htmlify(result.summaries[review_id]) + \"</td></tr>\";\n              first = \"\";\n            });\n        }\n      }\n\n      var dialog = $(\"<div class=reapplyresult>\"\n                   +   \"<h1>Result:</h1>\"\n                   +   \"<table>\"\n                   +     changes\n                   +   \"</table>\"\n                   + \"</div>\");\n\n      dialog.dialog({ width: 800,\n                      buttons: { Close: function () { dialog.dialog(\"close\"); }},\n                      close: function () { if (reload_when_finished) location.reload(); }});\n    }\n    else if (reload_when_finished)\n      location.reload();\n  }\n\n  var operation = new Operation({ action: \"reapply filters\",\n                                  url: \"reapplyfilters\",\n                                  data: { filter_id: filter_id },\n                                  wait: \"Please wait...\",\n                                  callback: finished });\n\n  operation.execute();\n}\n\nfunction countMatchedFiles()\n{\n  if (typeof count_matched_files == \"undefined\" || count_matched_files.length == 0)\n    return;\n\n  var item = count_matched_files.shift();\n\n  function update(result)\n  {\n    if (result)\n    {\n      result.filters.forEach(\n        function (filter)\n        {\n          var link = $(\"#f\" + filter.id);\n          if (filter.count)\n            
link.text(filter.count + \" file\" + (filter.count == 1 ? \"\" : \"s\"));\n          else\n            link.replaceWith(\"no files\");\n        });\n      countMatchedFiles();\n    }\n  }\n\n  var operation = new Operation({ action: \"count matched files\",\n                                  url: \"countmatchedpaths\",\n                                  data: { multiple: item,\n                                          user_id: user.id },\n                                  callback: update });\n\n  operation.execute();\n}\n\n$(function ()\n  {\n    countMatchedFiles();\n  });\n"
  },
  {
    "path": "src/resources/log.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.log { background: #ffffe6; table-layout: fixed }\n\ndiv.main > table.log,\ntbody.content table.log {\n    padding: 1em; margin-top: 1em; border: 1px solid #cca; border-radius: 4px\n}\n\ntbody.content table.log {\n    width: 70%\n}\n\ndiv.main > table.log > thead > tr.headings > td,\ntbody.content table.log > thead > tr.headings > td {\n    padding-top: 0.5em\n}\n\ntable.log table.log { width: 98% }\n\ntable.log tr.title td { padding-left: 1em }\ntable.log tr.title td { border-bottom: 1px solid #cca }\ntable.log tr.title td h1 { margin-top: 0; margin-bottom: 0.5em }\n\ntable.log tr.headings td { padding-left: 1em; font-weight: bold; text-decoration: underline }\ntable.log tr.headings td.when { text-align: right; padding-right: 1em }\ntable.log tr.headings td.type { text-decoration: none }\ntable.log tr.headings td.author { text-align: right }\n\ntable.log tr.commit:hover { background: #eec }\ntable.log tr.commit td { padding-left: 1em; padding-top: 3px; padding-bottom: 3px }\ntable.log tr.commit td.when { text-align: right; font-weight: bold; padding-right: 1em }\ntable.log tr.commit td.type { font-weight: bold }\ntable.log tr.commit td.summary {\n    font-family: monospace;\n    font-size: 10pt;\n    background-color: #fffff4;\n    border-left: 1px dotted #cca;\n    border-bottom: 1px dotted #cca;\n 
   border-right: 1px dotted #cca;\n    white-space: nowrap;\n    overflow: hidden\n}\ntable.log tr.commit:hover td.summary { background-color: #eed }\ntable.log tr.commit.highlight td.summary { background-color: #dfd }\ntable.log tr.commit td.summary.selected { background-color: #eed }\ntable.log tr.commit:first-child td.summary { border-top: 1px dotted #cca }\ntable.log tr.commit td.author { text-align: right; font-weight: bold }\ntable.log tr.commit.selected { background: #ddb }\ntable.log tr.commit td.summary .nocomment,\ntable.log tr.commit td.summary .rebase\n{\n    font-style: italic;\n    text-decoration: none\n}\n\ntable.log > thead > tr.basemerge > td,\ntable.log > thead > tr.rebase > td { padding-top: 0.5em; padding-bottom: 0.5em; font-weight: bold }\ntable.log > thead > tr.upstream > td { padding-top: 0.5em; font-weight: bold }\ntable.log > thead > tr.error > td { padding-top: 0.5em; padding-bottom: 0.5em; font-weight: bold }\n\ntable.log > tfoot > tr > td {\n    text-align: right; border-top: 1px solid #cca; padding-top: 0.5em\n}\n\ntable.log.collapsable.collapsed > thead.title { display: table-row-group }\ntable.log.collapsable.collapsed > thead.title > tr.headings { display: none }\ntable.log.collapsable.collapsed > thead,\ntable.log.collapsable.collapsed > tbody { display: none }\n\ntr.sublog > td { padding-bottom: 0.5em }\ntr.sublog > td > table.log { padding-top: 0.5em; padding-bottom: 0.5em }\ntr.sublog > td > table.log tr > td.when { border-left: 3px solid #cca; }\n\ndiv.main div.submit { margin-left: 30%; margin-right: 30%; background-color: #8f8; border: 3px solid #6b6; text-align: center; margin-top: 1em; margin-bottom: 1em; border-radius: 10px; color: #141 }\ndiv.main div.submit:hover { background-color: #7e7; border-color: #5a5; cursor: pointer }\ndiv.main div.submit:active { background-color: #6d6; border-color: #494; cursor: pointer }\n\ndiv.marker {\n    position: absolute;\n    border: 2px solid;\n    width: 7px;\n    background-color: #88f;\n 
   border-color: #66b\n}\n\ndiv.marker.right {\n    border-top-right-radius: 10px;\n    border-bottom-right-radius: 10px;\n}\n\ndiv.marker.left {\n    border-top-left-radius: 10px;\n    border-bottom-left-radius: 10px;\n}\n\nselect.base option.base {\n    font-weight: bold\n}\n\ntable.log.relevant tr.commit.merge { display: none }\ntable.log.relevant td h1 a {\n    font-size: 12px;\n    margin-left: 1em\n}\n\n.followup-tooltip {\n    max-width: 600px\n}\n\n.followup-tooltip .header {\n    vertical-align: top;\n    font-style: italic;\n    white-space: nowrap;\n    padding-right: 0.5em\n}\n\n.followup-tooltip .summary {\n    font-style: normal;\n    font-family: monospace\n}\n"
  },
  {
    "path": "src/resources/log.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar anchorCommit = null, focusCommit = null, commitMarkers = null;\n\nfunction CommitMarkers(commits)\n{\n  this.bothMarkers = $(\"<div class='marker left'></div><div class='marker right'></div>\");\n\n  $(document.body).append(this.bothMarkers);\n\n  this.leftMarker = this.bothMarkers.first();\n  this.rightMarker = this.bothMarkers.last();\n\n  this.firstCommit = this.lastCommit = null;\n  this.setCommits(commits);\n}\n\nCommitMarkers.prototype.setCommits = function (commits)\n  {\n    if (this.firstCommit)\n      this.firstCommit.parent().removeClass(\"first\");\n    if (this.lastCommit)\n      this.lastCommit.parent().removeClass(\"last\");\n\n    $(\"td.summary.selected\").removeClass(\"selected\");\n\n    this.firstCommit = commits.first().children(\"td.summary\");\n    this.lastCommit = commits.last().children(\"td.summary\");\n\n    this.firstCommit.addClass(\"selected\");\n    this.lastCommit.addClass(\"selected\");\n\n    if (this.firstCommit.get(0) != this.lastCommit.get(0))\n    {\n      this.lastCommit.parent().addClass(\"last\");\n      this.firstCommit.parent().nextUntil(\"tr.commit.last\").children(\"td.summary\").addClass(\"selected\");\n      this.lastCommit.parent().removeClass(\"last\");\n    }\n\n    this.updatePosition();\n  };\n\nCommitMarkers.prototype.updatePosition = function ()\n  
{\n    var firstOffset = this.firstCommit.offset();\n    var top = firstOffset.top - parseInt(this.firstCommit.css(\"padding-top\"));\n    var bottom = this.lastCommit.offset().top + this.lastCommit.height() + parseInt(this.lastCommit.css(\"padding-bottom\"));\n\n    this.leftMarker.offset({ top: top, left: firstOffset.left - parseInt(this.leftMarker.outerWidth()) - 1 });\n    this.rightMarker.offset({ top: top, left: firstOffset.left + this.firstCommit.outerWidth() + 1 });\n\n    this.bothMarkers.height(bottom - top + 2);\n  };\n\nCommitMarkers.prototype.remove = function ()\n  {\n    $(\"td.summary.selected\").removeClass(\"selected\");\n\n    this.leftMarker.remove();\n    this.rightMarker.remove();\n  };\n\nfunction rebase(name, base, newBaseBase, oldCount, newCount, baseOldCount, baseNewCount)\n{\n  function finish()\n  {\n    $.ajax({ async: false,\n             url: \"/rebasebranch?repository=\" + repository.id + \"&name=\" + encodeURIComponent(name) + \"&base=\" + encodeURIComponent(base),\n             dataType: \"text\",\n             success: function (data)\n               {\n                 if (data == \"ok\")\n                   location.replace(\"/log?repository=\" + repository.id + \"&branch=\" + encodeURIComponent(name));\n                 else\n                   reportError(\"update base branch\", \"Server reply: <i style='white-space: pre'>\" + htmlify(data) + \"</i>\");\n               },\n             error: function ()\n               {\n                 reportError(\"update base branch\", \"Request failed.\");\n               }\n           });\n  }\n\n  var content = $(\"<div title='Please Confirm'><p>You are about to update Critic's record of the branch <b>\" + htmlify(name) + \"</b>.  It used to contain \" + oldCount + \" commits and will now contain these \" + newCount + \" commits instead.</p>\" + (typeof baseOldCount == \"number\" ? 
\"<p>In addition, the branch <b>\" + htmlify(base) + \"</b> will be modified to have <b>\" + htmlify(newBaseBase) + \"</b> as its new base branch instead of <b>\" + htmlify(name) + \"</b>, and will contain \" + baseNewCount + \" commits instead of \" + baseOldCount + \" commits.</p>\" : \"\") + \"<p><b>Note:</b> The git repository will not be affected at all by this.</p><p>Are you sure you want to do this?</p></div>\");\n  content.dialog({ width: 400, modal: true, buttons: { \"Perform Rebase\": finish, \"Do Nothing\": function () { content.dialog(\"close\"); }}});\n}\n\nfunction showRelevantMerges(ev)\n{\n  $(ev.currentTarget).parents(\"table.log.relevant\").find(\"tr.commit.merge\").show();\n  var text = ev.currentTarget.firstChild;\n  text.nodeValue = text.nodeValue.replace(\"Show\", \"Hide\");\n  ev.currentTarget.onclick = hideRelevantMerges;\n}\n\nfunction hideRelevantMerges(ev)\n{\n  $(ev.currentTarget).parents(\"table.log.relevant\").find(\"tr.commit.merge\").hide();\n  var text = ev.currentTarget.firstChild;\n  text.nodeValue = text.nodeValue.replace(\"Hide\", \"Show\");\n  ev.currentTarget.onclick = showRelevantMerges;\n}\n\nvar overrideShowSquashedDiff = null;\n\nfunction resetSelection()\n{\n  if (anchorCommit && commitMarkers)\n  {\n    commitMarkers.remove();\n    commitMarkers = null;\n  }\n\n  if (typeof automaticAnchorCommit == \"string\")\n    anchorCommit = $(\"#\" + automaticAnchorCommit);\n  else\n    anchorCommit = null;\n\n  focusCommit = null;\n}\n\nfunction executeSelection(commit)\n{\n  if (anchorCommit && commitMarkers)\n  {\n    if (commit.size() && commit.get(0).parentNode == anchorCommit.get(0).parentNode)\n    {\n      var re_sha1 = /[0-9a-f]{8,40}(?=\\?|$)/;\n\n      var to_sha1 = re_sha1.exec($(\"td.summary.selected > a\").first().attr(\"href\"))[0]; //.parent(\"tr.commit\").attr(\"id\");\n      var from_sha1 = re_sha1.exec($(\"td.summary.selected > a\").last().attr(\"href\"))[0]; //.parent(\"tr.commit\").attr(\"id\");\n\n      if 
(overrideShowSquashedDiff)\n        overrideShowSquashedDiff(from_sha1, to_sha1);\n      else\n        location.href = \"/showcommit?first=\" + from_sha1 + \"&last=\" + to_sha1 + (typeof review != \"undefined\" && typeof review.id == \"number\" ? \"&review=\" + review.id : \"\");\n    }\n\n    resetSelection();\n    return true;\n  }\n\n  return false;\n}\n\n$(document).ready(function ()\n  {\n    $(\"tr.commit td.summary\").click(function (ev)\n      {\n        resetSelection();\n      });\n\n    $(\"tr.commit td.summary\").mousedown(function (ev)\n      {\n        if (ev.button != 0 || ev.ctrlKey || ev.shiftKey || ev.altKey || ev.metaKey)\n          return;\n\n        if (!executeSelection($(ev.target).parents(\"tr.commit\")))\n        {\n          anchorCommit = $(ev.currentTarget).parent(\"tr.commit\");\n          focusCommit = null;\n\n          ev.preventDefault();\n        }\n      });\n\n    $(\"tr.commit td.summary\").mouseover(function (ev)\n      {\n        if (anchorCommit)\n        {\n          var commit = $(ev.currentTarget).parent(\"tr.commit\");\n\n          if (commit.size() && commit.get(0).parentNode == anchorCommit.get(0).parentNode)\n            if (!commitMarkers)\n            {\n              if (commit.get(0) != anchorCommit.get(0))\n              {\n                focusCommit = commit;\n                commitMarkers = new CommitMarkers(anchorCommit.add(focusCommit));\n              }\n            }\n            else\n            {\n              if (commit.get(0) == anchorCommit.get(0))\n              {\n                commitMarkers.remove();\n                commitMarkers = null;\n              }\n              else\n              {\n                focusCommit = commit;\n                commitMarkers.setCommits(anchorCommit.add(focusCommit));\n              }\n            }\n\n          if (commitMarkers)\n            commitMarkers.updatePosition();\n        }\n\n        ev.stopPropagation();\n      });\n\n    
$(document).mouseover(function (ev)\n      {\n        if (typeof automaticAnchorCommit == \"string\")\n          resetSelection();\n      });\n\n    $(document).mouseup(function (ev)\n      {\n        if (!executeSelection($(ev.target).parents(\"tr.commit\")))\n          resetSelection();\n      });\n\n    $(\"select.base\").change(function (ev)\n      {\n        if (ev.target.value != \"*\")\n          location.replace(\"/log?repository=\" + repository.id + \"&branch=\" + encodeURIComponent(branch.name) + \"&base=\" + encodeURIComponent(ev.target.value));\n      });\n\n    $(\"span.squash, span.fixup\").tooltip({\n      items: \"span.squash, span.fixup\",\n      content: function ()\n        {\n          var element = $(this);\n\n          return $(\"<table class='tooltip'><tr><td class=header>\" +\n                   (element.hasClass(\"squash\") ? \"Squash into\" : \"Fixup of\") +\n                   \"</td><td class=summary>\" +\n                   htmlify(element.attr(\"critic-ref\")) +\n                   \"</td></tr></table>\");\n        },\n      track: true,\n      hide: false,\n      tooltipClass: \"followup-tooltip\"\n    });\n\n    resetSelection();\n\n    var highlight = $(\"tr.commit.highlight td\");\n    if (highlight.size())\n      window.scrollTo(pageXOffset, highlight.offset().top - (innerHeight - highlight.height()) / 2);\n\n    $(\"table.log.collapsable > thead.title\").click(function (ev)\n      {\n        $(ev.currentTarget.parentNode).toggleClass(\"collapsed\");\n      });\n  });\n"
  },
  {
    "path": "src/resources/login.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.login {\n    margin-top: 1em;\n}\n\ntable.login tr.separator1 td {\n    border-bottom: 1px solid #cca;\n    padding-top: 1em\n}\n\ntable.login tr.separator2 td {\n    padding-top: 1em\n}\n\ntable.login tr.status td {\n    color: red;\n    font-weight: bold;\n    text-align: center;\n    padding-bottom: 1em\n}\n\ntable.login tr.status.disabled {\n    display: none\n}\n\ntable.login td.key {\n    padding-left: 2em;\n    padding-right: 0.5em;\n    font-weight: bold;\n    text-align: right\n}\n\ntable.login td.value {\n    padding-left: 0.5em;\n    padding-right: 2em\n}\n\ntable.login tr.login td {\n    text-align: center;\n    padding-top: 1em\n}\n\ntable.login tr.external td div,\ntable.login tr.register td,\ntable.login tr.continue td {\n    text-align: right;\n    font-weight: bold\n}\n\ntable.login tr.continue a, table.login tr.continue a:visited {\n    color: #222\n}\n"
  },
  {
    "path": "src/resources/login.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n$(function () {\n  var fields = $(\"input.field\");\n  var submit = $(\"input.login\");\n  var form = $(\"form\");\n\n  fields.each(function (index) {\n    $(this).keypress(function (ev) {\n      if (ev.keyCode == 13) {\n        if (index == fields.length - 1)\n          submit.click();\n        else\n          fields[index + 1].focus();\n      }\n    });\n  });\n\n  submit.button();\n\n  form.submit(function (ev) {\n    var data = {\n      fields: {}\n    };\n\n    fields.each(function () {\n      data.fields[this.name] = this.value;\n    });\n\n    var operation = new Operation({\n      action: \"login\",\n      url: \"validatelogin\",\n      data: data\n    });\n\n    var result = operation.execute();\n\n    if (!result || result.message) {\n      ev.preventDefault();\n\n      if (result) {\n        $(\"tr.status td\").text(result.message);\n        $(\"tr.status\").removeClass(\"disabled\");\n      }\n    }\n  });\n});\n"
  },
  {
    "path": "src/resources/manageextensions.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntr.item > td.value > span.name {\n    border-bottom: 1px solid #cca\n}\n\ntr.item > td.value > span.installed {\n    font-weight: bold;\n    color: #0e0\n}\n\ntr.item > td.value > span.details {\n    float: right\n}\n\ntr.item > td.value > div.description {\n    padding-top: 1em;\n    padding-bottom: 1em;\n    font-family: serif\n}\n\ntr.item > td.value > div.description.broken {\n    font-weight: bold;\n    color: red\n}\n\ntr.item > td.value > div.description.broken > a {\n    text-decoration: none\n}\n\ntr.item > td.value > div.description.broken > a:hover {\n    text-decoration: underline\n}\n\ntr.item > td.value > div.authors > b {\n    padding-right: 1em\n}\n\ntable.roles {\n    margin-left: 1em\n}\n\ntable.roles > tbody > tr > th {\n    padding-top: 0.5em;\n    border-bottom: 1px solid #cca;\n}\n\ntable.roles > tbody > tr > td.description {\n    font-family: serif\n}\n\ntable.roles > tbody > tr > td.pattern {\n    padding-right: 1em\n}\n\ntable.roles > tbody > tr > td > ul {\n    padding-left: 1.5em;\n    margin: 0;\n    font-family: serif\n}\n\ntable.roles span.inactive {\n    font-weight: bold;\n    color: red\n}\n"
  },
  {
    "path": "src/resources/manageextensions.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction installExtension(author_name, extension_name, version, universal)\n{\n  $(\"button\").prop(\"disabled\", true);\n\n  var data = { extension_name: extension_name,\n               version: version,\n               universal: Boolean(universal) };\n\n  if (author_name)\n    data.author_name = author_name;\n\n  var operation = new Operation({ action: \"install extension\",\n                                  url: \"installextension\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n    showMessage(\"Extension installed!\",\n                extension_name + \" installed!\",\n                \"The extension was installed successfully.\",\n                function () { location.reload(); });\n}\n\nfunction uninstallExtension(author_name, extension_name, universal)\n{\n  $(\"button\").prop(\"disabled\", true);\n\n  var data = { extension_name: extension_name,\n               universal: Boolean(universal) };\n\n  if (author_name)\n    data.author_name = author_name;\n\n  var operation = new Operation({ action: \"uninstall extension\",\n                                  url: \"uninstallextension\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n    location.reload();\n}\n\nfunction 
reinstallExtension(author_name, extension_name, version, universal)\n{\n  $(\"button\").prop(\"disabled\", true);\n\n  var data = { extension_name: extension_name,\n               version: version,\n               universal: Boolean(universal) };\n\n  if (author_name)\n    data.author_name = author_name;\n\n  var operation = new Operation({ action: \"reinstall extension\",\n                                  url: \"reinstallextension\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n    location.reload();\n}\n\nfunction clearExtensionStorage(author_name, extension_name)\n{\n  function clear()\n  {\n    var data = { extension_name: extension_name };\n\n    if (author_name)\n      data.author_name = author_name;\n\n    var operation = new Operation({ action: \"clear extension storage\",\n                                    url: \"clearextensionstorage\",\n                                    data: data });\n\n    if (operation.execute()) {\n      close();\n      location.reload();\n    }\n  }\n\n  function close()\n  {\n    dialog.dialog(\"close\");\n  }\n\n  var dialog = $(\n    \"<div title='Please confirm'>\" +\n    \"<p>Clearing an extension's storage deletes whatever state the extension \" +\n    \"has stored about your use of it since you first installed it.  
The \" +\n    \"state cannot be restored!</p><p><b>Are you sure?</b></p>\" +\n    \"</div>\");\n\n  dialog.dialog({\n    width: 600,\n    modal: true,\n    buttons: {\n      \"Clear storage\": clear,\n      \"Do nothing\": close\n    }\n  });\n}\n\n$(function ()\n  {\n    $(\"a.button\").button();\n\n    $(\"select.details\").change(function (ev)\n      {\n        var select = $(ev.currentTarget);\n        var previous = JSON.stringify(selected_versions);\n        var value = select.val();\n        var author_name = select.attr(\"critic-author\");\n        var extension_name = select.attr(\"critic-extension\");\n        var key;\n\n        if (author_name)\n          key = author_name + \"/\" + extension_name;\n        else\n          key = extension_name;\n\n        if (!value)\n          delete selected_versions[key];\n        else if (value == \"live\")\n          selected_versions[key] = null;\n        else\n          /* value = \"version/*\" */\n          selected_versions[key] = value.substring(8);\n\n        var next = JSON.stringify(selected_versions);\n\n        if (next != previous)\n        {\n          /* Restore state before we leave the page: */\n          selected_versions = JSON.parse(previous);\n\n          location.href = \"/manageextensions?select=\" + encodeURIComponent(next) + \"&focus=\" + encodeURIComponent(key);\n        }\n      });\n  });\n"
  },
  {
    "path": "src/resources/managereviewers.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.manage {\n    table-layout: fixed;\n}\n\ndiv.main table.manage tr.current td.select,\ndiv.main table.manage tr.reviewer td.select,\ndiv.main table.manage tr.headings td {\n    font-weight: bold\n}\n\ndiv.main table.manage tr.current td,\ndiv.main table.manage tr.reviewer td,\ndiv.main table.manage tr.headings td {\n    padding-top: 1em\n}\n\ndiv.main table.manage tr.reviewer span.message {\n    font-weight: normal;\n    font-style: italic;\n    padding-left: 2em\n}\n\ndiv.main table.manage tr.reviewer input.reviewer {\n    font-family: monospace\n}\n\ndiv.main table.manage tr td.select {\n    text-align: right;\n    padding-right: 1em\n}\n\ndiv.main table.manage tr.current td.select {\n    vertical-align: top\n}\n\ndiv.main table.manage tr.current td.value {\n    font-family: monospace\n}\n\ndiv.main table.manage tr td.path {\n    white-space: pre;\n    font-family: monospace\n}\n\ndiv.main table.manage tr td.right {\n    text-align: right\n}\n\ndiv.main table.manage tr td.help {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n"
  },
  {
    "path": "src/resources/managereviewers.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar currentReviewer = null;\n\nfunction selectReviewer(reset)\n{\n  var reviewer = $(\"input.reviewer\").val();\n\n  if (reviewer == currentReviewer)\n    return;\n\n  if (reviewer == \"\")\n    if (reset)\n      $(\"input.reviewer\").val(reviewer = currentReviewer);\n    else\n      return;\n\n  function handleNoSuchUser(result)\n  {\n    $(\"tr.reviewer span.message\").text(\"No such user.\");\n    return true;\n  }\n\n  $(\"tr.reviewer span.message\").text(\"\");\n\n  var operation = new Operation({ action: \"fetch assigned changes\",\n                                  url: \"getassignedchanges\",\n                                  data: { review_id: review.id,\n                                          user_name: reviewer },\n                                  failure: { nosuchuser: handleNoSuchUser }});\n  var result = operation.execute();\n\n  if (result)\n  {\n    var files = {};\n\n    for (var index = 0; index < result.files.length; ++index)\n      files[result.files[index]] = true;\n\n    currentReviewer = reviewer;\n\n    $(\"tr.file\").each(\n      function ()\n      {\n        $(this).find(\"input\").get(0).checked = $(this).attr(\"critic-file-id\") in files;\n      });\n\n    checkDirectories();\n\n    $(\"input.reviewer\").autocomplete(\"close\");\n  }\n}\n\nfunction checkDirectory(line)\n{\n 
 line = $(line);\n\n  var level = parseInt(line.attr(\"critic-level\"));\n  var all_checked = true;\n  var dirline = line;\n\n  line.nextAll(\"tr\").each(function (index, line)\n    {\n      if (parseInt(line.getAttribute(\"critic-level\")) <= level)\n        return false;\n\n      if (line.className == \"file\")\n        $(line).find(\"input\").each(function () { if (!this.checked) all_checked = false; });\n\n      return all_checked;\n    });\n\n  line.find(\"input\").each(function () { this.checked = all_checked; });\n}\n\nfunction checkDirectories()\n{\n  $(\"tr.directory\").each(function (index, line)\n    {\n      checkDirectory(line);\n    });\n}\n\n$(document).ready(function ()\n  {\n    $(\"tr.directory\").click(function (ev)\n      {\n        var line = $(ev.currentTarget);\n        var level = parseInt(line.attr(\"critic-level\"));\n        var checkbox = line.find(\"input\").get(0);\n\n        var on;\n        if (ev.target.nodeName.toLowerCase() != \"input\")\n        {\n          on = checkbox.checked = !checkbox.checked;\n          ev.preventDefault();\n        }\n        else\n          on = checkbox.checked;\n\n        line.nextAll(\"tr\").each(\n          function (index, line)\n          {\n            if (parseInt(line.getAttribute(\"critic-level\")) <= level)\n              return false;\n\n            $(line).find(\"input\").each(function () { this.checked = on; });\n          });\n\n        line.prevAll(\"tr.directory\").each(\n          function (index, line)\n          {\n            var line_level = parseInt(line.getAttribute(\"critic-level\"));\n            if (line_level < level)\n            {\n              if (!checkbox.checked)\n                $(line).find(\"input\").each(function () { this.checked = false; });\n              else\n                checkDirectory(line);\n\n              level = line_level;\n            }\n          });\n      });\n\n    $(\"tr.file\").click(function (ev)\n      {\n        var line = 
$(ev.currentTarget);\n        var level = parseInt(line.attr(\"critic-level\"));\n        var checkbox = line.find(\"input\").get(0);\n\n        var on;\n        if (ev.target.nodeName.toLowerCase() != \"input\")\n        {\n          on = checkbox.checked = !checkbox.checked;\n          ev.preventDefault();\n        }\n        else\n          on = checkbox.checked;\n\n        line.prevAll(\"tr.directory\").each(\n          function (index, line)\n          {\n            var line_level = parseInt(this.getAttribute(\"critic-level\"));\n            if (line_level < level)\n            {\n              if (!checkbox.checked)\n                $(line).find(\"input\").each(function () { this.checked = false; });\n              else\n                checkDirectory(line);\n\n              level = line_level;\n            }\n          });\n      });\n\n    $(\"input.reviewer\").autocomplete({ source: users });\n    $(\"input.reviewer\").keypress(function (ev)\n      {\n        if (ev.keyCode == 13)\n          selectReviewer(false);\n      });\n    $(\"input.reviewer\").blur(function (ev)\n      {\n        setTimeout(function () { selectReviewer(true); }, 100);\n      });\n\n    $(\"button.save\").click(function ()\n      {\n        var files = [], reviewer = $(\"input.reviewer\").val();\n\n        $(\"tr.file\").each(function ()\n          {\n            var row = $(this);\n            if (row.find(\"input\").get(0).checked)\n              files.push(parseInt(row.attr(\"critic-file-id\")));\n          });\n\n        var operation = new Operation({ action: \"assign changes\",\n                                        url: \"setassignedchanges\",\n                                        data: { review_id: review.id,\n                                                user_name: reviewer,\n                                                files: files }});\n\n        if (operation.execute())\n          $(\"tr.reviewer span.message\").text(\"Assignments saved.\");\n      });\n\n    
$(\"input.reviewer\").val(user.name);\n\n    selectReviewer();\n\n    $(\"span.reviewer\").click(function (ev)\n      {\n        $(\"input.reviewer\").val($(this).attr(\"critic-username\"));\n        selectReviewer();\n      });\n  });\n"
  },
  {
    "path": "src/resources/message.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n.message {\n    width: 50%;\n}\n\n.message h1.center {\n    text-align: center;\n    border-bottom: 0;\n}\n\n.message > p {\n    margin-left: 0.5rem;\n    margin-right: 0.5rem;\n    font-size: 1rem;\n}\n\n.message > .pre {\n    white-space: pre;\n    font-family: monospace;\n}\n"
  },
  {
    "path": "src/resources/newrepository.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main table.paleyellow td.heading { font-family: serif; font-weight: bold; text-align: right; width: 20% }\ndiv.main table.paleyellow td.prefix { font-family: monospace; text-align: right; white-space: nowrap }\ndiv.main table.paleyellow td.value { font-family: monospace; padding-left: 5px }\ndiv.main table.paleyellow td.value input { font-family: monospace; width: 100% }\ndiv.main table.paleyellow td.suffix { font-family: monospace; padding-left: 5px }\ndiv.main table.paleyellow tr.help td { font-style: italic; text-align: right; border-bottom: 1px solid #cca }\n\ndiv.main table.paleyellow tr.buttons td { padding-top: 1em }\n"
  },
  {
    "path": "src/resources/newrepository.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n$(document).ready(\n  function ()\n  {\n    function updateBranchDisabled()\n    {\n      if ($(\"input[name='remote']\").val().trim())\n        $(\"input[name='branch']\").prop(\"disabled\", false);\n      else\n        $(\"input[name='branch']\").prop(\"disabled\", true);\n    }\n\n    $(\"input[name='remote']\")\n      .keyup(updateBranchDisabled)\n      .change(updateBranchDisabled)\n      .bind(\"input\", updateBranchDisabled);\n\n    $(\"button.add\").click(\n      function (ev)\n      {\n        var name = $(\"input[name='name']\").val();\n        var path = $(\"input[name='path']\").val();\n        var remote = $(\"input[name='remote']\").val();\n        var branch = $(\"input[name='branch']\").val();\n\n        var data = { name: name,\n                     path: path };\n\n        if (remote.trim())\n          data.mirror = { remote_url: remote,\n                          remote_branch: branch,\n                          local_branch: branch };\n\n        var operation = new Operation({ action: \"add repository\",\n                                        url: \"addrepository\",\n                                        data: data });\n        if (operation.execute())\n          location.href = \"/repositories#\" + name;\n      });\n\n    function handleKeypress(ev)\n    {\n      if (ev.keyCode == 
13)\n        $(\"button.add\").click();\n    }\n\n    $(\"input[name='name']\").keypress(handleKeypress);\n    $(\"input[name='path']\").keypress(handleKeypress);\n    $(\"input[name='remote']\").keypress(handleKeypress);\n    $(\"input[name='branch']\").keypress(handleKeypress);\n\n    $(\"input[name='name']\").focus();\n  });\n"
  },
  {
    "path": "src/resources/news.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable tr.item:hover {\n    background-color: #eec;\n    cursor: pointer\n}\n\ntable tr.item td.date {\n    text-align: right;\n    width: 15%;\n    padding-right: 1em\n}\n\ntable tr.item td.title {\n    font-weight: bold\n}\n\ntable tr.item td.status {\n    text-align: right;\n    font-weight: bold;\n    color: red;\n    width: 15%\n}\n\ntable tr.nothing td.nothing {\n    text-align: center;\n    font-weight: bold;\n    padding-top: 1em\n}\n\ndiv.main table td.show {\n    text-align: center;\n    padding-left: 0;\n    padding-top: 1em\n}\n\ndiv.main table td.show div {\n    padding-top: 1em;\n    border-top: 1px solid #cca\n}\n"
  },
  {
    "path": "src/resources/news.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction addOrEditNewsItem(edit_item_id, edit_text)\n{\n  function finish()\n  {\n    var text = content.find(\"textarea\").val();\n    var operation;\n\n    if (edit_item_id)\n      operation = new Operation({ action: \"edit news item\",\n                                  url: \"editnewsitem\",\n                                  data: { item_id: edit_item_id,\n                                          text: text }});\n    else\n      operation = new Operation({ action: \"add news item\",\n                                  url: \"addnewsitem\",\n                                  data: { text: text }});\n\n    return !!operation.execute();\n  }\n\n  var verb = edit_item_id ? 
\"Edit\" : \"Create\";\n\n  var content = $(\"<div class='comment flex' title='\" + verb + \" News Item'>\" +\n                  \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n  if (edit_text)\n    content.find(\"textarea\").val(edit_text);\n\n  var buttons = { Save: function () { if (finish()) { $(content).dialog(\"close\"); location.reload(); } },\n                  Cancel: function () { $(content).dialog(\"close\"); } };\n\n  content.dialog({ width: 600,\n                   buttons: buttons,\n                   closeOnEscape: false });\n}\n\n$(document).ready(function ()\n  {\n    $(\"button, a.show, a.back\").button();\n    $(\"button.addnewsitem\").click(function () { addOrEditNewsItem(false); });\n    $(\"button.editnewsitem\").click(function () { addOrEditNewsItem(news_item_id, news_text); });\n    $(\"tr.item\").click(function (ev)\n      {\n        var target = $(ev.currentTarget);\n        location.search = \"item=\" + target.attr(\"critic-item-id\");\n      });\n  });\n"
  },
  {
    "path": "src/resources/overrides.css",
    "content": ".ui-widget,\n.ui-widget input,\n.ui-widget select,\n.ui-widget textarea,\n.ui-widget button,\n.repository-select\n{\n    font-family: Verdana, Helvetica, Arial, sans-serif\n}\n\n.chosen-container .chosen-drop {\n    background-color: #f2ece0;\n}\n\n.chosen-container-single .chosen-single {\n    border-color: #cdc3b7;\n}\n\n.chosen-container .chosen-results li.group-result {\n    color: #444;\n    border-bottom: 1px solid #444;\n}\n\n.chosen-container-multi .chosen-choices li.search-field input[type=\"text\"] {\n    -webkit-box-sizing: content-box;\n    -moz-box-sizing: content-box;\n    box-sizing: content-box;\n}\n\n.chosen-container-single .chosen-single,\n.chosen-container-active.chosen-with-drop .chosen-single,\n.chosen-container-multi .chosen-choices li.search-choice {\n    background: #ede4d4 url(third-party/images/ui-bg_glass_70_ede4d4_1x400.png) 50% 50% repeat-x;\n}\n\n.chosen-container .chosen-drop,\n.chosen-container-active .chosen-single,\n.chosen-container-active.chosen-with-drop .chosen-single,\n.chosen-container-active .chosen-choices {\n    border-color: #f5ad66;\n}\n"
  },
  {
    "path": "src/resources/rebasetrackingreview.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nselect, input {\n    width: 50%\n}\n\ninput {\n    font-family: monospace;\n}\n"
  },
  {
    "path": "src/resources/rebasetrackingreview.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction editNewBranch(button)\n{\n  var dropdown = $(\"select#newbranch\");\n  var selected = dropdown.val();\n\n  dropdown.replaceWith(\"<input id=newbranch>\");\n\n  $(\"input#newbranch\").val(selected);\n\n  $(button).remove();\n}\n\nfunction fetchBranch()\n{\n  var newbranch = \"refs/heads/\" + $(\"#newbranch\").val().trim();\n  var upstream = $(\"#upstream\").val().trim();\n\n  if (!newbranch)\n  {\n    showMessage(\"Invalid input!\", \"Invalid input!\", \"Please provide a non-empty branch name.\");\n    return;\n  }\n\n  if (!upstream)\n  {\n    showMessage(\"Invalid input!\", \"Invalid input!\", \"Please provide a non-empty upstream.\");\n    return;\n  }\n\n  function finish(result)\n  {\n    if (result)\n      location.href = (\"/rebasetrackingreview\" +\n                       \"?review=\" + encodeURIComponent(review.id) +\n                       \"&newbranch=\" + encodeURIComponent(newbranch) +\n                       \"&upstream=\" + encodeURIComponent(upstream) +\n                       \"&newhead=\" + encodeURIComponent(result.head_sha1) +\n                       \"&newupstream=\" + encodeURIComponent(result.upstream_sha1));\n  }\n\n  var operation = new Operation({ action: \"fetch remote branch\",\n                                  url: \"fetchremotebranch\",\n            
                      data: { repository_name: repository.name,\n                                          remote: trackedbranch.remote,\n                                          branch: newbranch,\n                                          upstream: upstream },\n                                  wait: \"Fetching branch...\",\n                                  callback: finish });\n\n  operation.execute();\n}\n\nfunction rebaseReview()\n{\n  function finish(result)\n  {\n    if (result)\n      location.replace(\"/r/\" + review.id);\n  }\n\n  var data = { review_id: review.id,\n               new_head_sha1: check.new_head_sha1,\n               new_trackedbranch: check.new_trackedbranch };\n\n  if (check.new_upstream_sha1)\n    data.new_upstream_sha1 = check.new_upstream_sha1;\n\n  var operation = new Operation({ action: \"rebase review\",\n                                  url: \"rebasereview\",\n                                  data: data,\n                                  wait: \"Rebasing review...\",\n                                  callback: finish });\n\n  operation.execute();\n}\n\n$(function ()\n  {\n    $(\"input#newbranch\").autocomplete({ source: AutoCompleteRef(trackedbranch.remote, \"refs/heads/\"), html: true });\n    $(\"input#upstream\").autocomplete({ source: AutoCompleteRef(trackedbranch.remote), html: true });\n\n    function updateConflictsStatus(result)\n    {\n      if (result)\n      {\n        var message;\n\n        if (result.has_conflicts && result.has_changes)\n          message = \"Has conflicts and other changes.\";\n        else if (result.has_conflicts)\n          message = \"Has conflicts.\";\n        else if (result.has_changes)\n          message = \"Has unexpected changes!\";\n\n        var status_conflicts = $(\"#status_conflicts\");\n\n        status_conflicts.text(message || \"Clean.\");\n\n        if (message)\n          status_conflicts.attr(\"href\", result.url);\n\n        
$(\"button#rebasereview\").removeAttr(\"disabled\").button(\"refresh\");\n      }\n    }\n\n    function updateMergeStatus(result)\n    {\n      if (result)\n      {\n        var message;\n\n        if (result.has_conflicts)\n          message = \"Will need review.\";\n\n        var status_merge = $(\"#status_merge\");\n\n        status_merge.text(message || \"Clean.\");\n\n        if (message)\n          status_merge.attr(\"href\", \"/showcommit?repository=\" + repository.id + \"&sha1=\" + result.merge_sha1);\n\n        var conflicts_status = new Operation({ action: \"check conflicts status\",\n                                               url: \"checkconflictsstatus\",\n                                               data: { review_id: review.id,\n                                                       merge_sha1: result.merge_sha1 },\n                                               callback: updateConflictsStatus });\n\n        conflicts_status.execute();\n\n        $(\"#status_conflicts\").text(\"Checking...\");\n      }\n    }\n\n    function updateHistoryRewriteStatus(result)\n    {\n      if (result)\n      {\n        var status_historyrewrite = $(\"#status_historyrewrite\");\n\n        status_historyrewrite.text(result.valid ? 
\"Clean.\" : \"Not valid!\");\n\n        if (result.valid)\n          $(\"button#rebasereview\").removeAttr(\"disabled\").button(\"refresh\");\n      }\n    }\n\n    if (typeof check != \"undefined\")\n    {\n      if (check.rebase_type == \"history\")\n      {\n        var historyrewrite_status = new Operation({ action: \"check history rewrite status\",\n                                                    url: \"checkhistoryrewritestatus\",\n                                                    data: { review_id: review.id,\n                                                            new_head_sha1: check.new_head_sha1 },\n                                                    callback: updateHistoryRewriteStatus });\n\n        historyrewrite_status.execute();\n\n        $(\"#status_historyrewrite\").text(\"Checking...\");\n      }\n      else if (check.rebase_type == \"move:ff\")\n      {\n        var merge_status = new Operation({ action: \"check merge status\",\n                                           url: \"checkmergestatus\",\n                                           data: { review_id: review.id,\n                                                   new_head_sha1: check.new_head_sha1,\n                                                   new_upstream_sha1: check.new_upstream_sha1 },\n                                           callback: updateMergeStatus });\n\n        merge_status.execute();\n\n        $(\"#status_merge\").text(\"Checking...\");\n      }\n      else\n      {\n        var conflicts_status = new Operation({ action: \"check conflicts status\",\n                                               url: \"checkconflictsstatus\",\n                                               data: { review_id: review.id,\n                                                       new_head_sha1: check.new_head_sha1,\n                                                       new_upstream_sha1: check.new_upstream_sha1 },\n                                               callback: 
updateConflictsStatus });\n\n        conflicts_status.execute();\n\n        $(\"#status_conflicts\").text(\"Checking...\");\n      }\n    }\n  });\n"
  },
  {
    "path": "src/resources/repositories.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.repositories {\n    min-width: 60%;\n    margin-top: 1rem;\n    font-family: monospace;\n}\n\ntable.repositories tr.repository:hover {\n    background-color: #eed;\n    cursor: pointer;\n}\n\ntable.repositories .details {\n    display: none\n}\ntable.repositories .details.show {\n    display: table-row\n}\n\ntable.repositories .details .buttons {\n    text-align: right;\n    padding-top: 0.5em\n}\ntable.repositories .details > td {\n    padding: 0.25rem 0 0;\n}\ntable.repositories .upstream {\n    text-align: right;\n}\n\ntable.trackedbranches {\n    width: 100%\n}\n\ntable.trackedbranches .title {\n    text-align: center;\n}\ntable.trackedbranches th {\n    border-bottom: 1px solid #222;\n    font-family: serif\n}\n\ntable.trackedbranches tr.branch td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ntable.trackedbranches tr.branch:hover {\n    background-color: #eed;\n    cursor: pointer\n}\n\ntable.trackedbranches .localname, table.trackedbranches .remote, table.trackedbranches .remotename {\n    padding-right: 1em\n}\n\ntable.trackedbranches .enabled {\n    text-align: right\n}\n\ntable.trackedbranches .users {\n    padding-left: 1em\n}\n\ntable.trackedbranches td.buttons {\n    text-align: right\n}\n\ndiv.trackedbranchlog .log {\n    max-height: 500px;\n    overflow: 
auto\n}\n\ndiv.trackedbranchlog .log > table {\n    width: 100%\n}\n\ndiv.trackedbranchlog .from,\ndiv.trackedbranchlog .to {\n    font-family: monospace\n}\n\ndiv.trackedbranchlog .range > a {\n    text-decoration: none\n}\n\ndiv.trackedbranchlog .output {\n    padding-bottom: 0.5em\n}\n\ndiv.trackedbranchlog .output pre {\n    font-family: monospace;\n    white-space: pre;\n    background-color: #fff;\n    margin: 0;\n    border: 1px solid #888;\n    padding: 3px;\n}\n\ndiv#addtrackedbranch table {\n    width: 100%\n}\n\ndiv#addtrackedbranch td {\n    white-space: nowrap\n}\n\ndiv#addtrackedbranch td.key {\n    font-weight: bold\n}\n\ndiv#addtrackedbranch td.value {\n    font-family: monospace\n}\n\ndiv#addtrackedbranch td.note {\n    font-family: serif;\n    font-style: italic;\n    white-space: normal;\n    padding-bottom: 0.5em\n}\n\ndiv#addtrackedbranch input {\n    font-family: monospace;\n    width: 100%\n}\n"
  },
  {
    "path": "src/resources/repositories.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n$(function ()\n  {\n    $(\"a.button\").button();\n\n    $(\"tr.repository\").click(\n      function (ev)\n      {\n        $(ev.currentTarget).next(\"tr.details\").toggleClass(\"show\");\n        ev.preventDefault();\n      });\n\n    if (location.hash)\n      $(\"tr.details.\" + location.hash.substring(location.hash.indexOf(\"#\") + 1)).addClass(\"show\");\n\n    $(\"tr.branch\").click(\n      function (ev)\n      {\n        var branch_row = $(ev.currentTarget);\n\n        var branch_id = parseInt(branch_row.attr(\"critic-branch-id\"));\n        var user_ids = branch_row.attr(\"critic-user-ids\").split(\",\").map(Number);\n\n        var operation = new Operation({ action: \"fetch log\",\n                                        url: \"trackedbranchlog\",\n                                        data: { branch_id: branch_id } });\n        var result = operation.execute();\n\n        var html = \"<div class='trackedbranchlog' title='Update Log'>\";\n\n        if (result.items.length)\n        {\n          html += \"<div class='log'><table><tr><th colspan=2>Update log:</th></tr>\";\n\n          for (var index = 0; index < result.items.length; ++index)\n          {\n            var item = result.items[index];\n\n            html += \"<tr><td class='when'>\" + 
new Date(item.time * 1000) + \"</td>\" +\n                        \"<td class='range'>\";\n            if (item.from_sha1 !== null && item.to_sha1 !== null)\n            {\n              html += \"<a href='\" + result.repository.name + \"/\" + item.from_sha1 + \"..\" + item.to_sha1 + \"'>\" +\n                        item.from_sha1.substring(0, 8) + \"..\" + item.to_sha1.substring(0, 8) +\n                      \"</a>\";\n            }\n            else\n            {\n              html += \"N/A\";\n            }\n            html += \"</td></tr>\";\n\n            if (item.hook_output.trim())\n              html += \"<tr><td class='output' colspan=2><pre>\" + htmlify(item.hook_output) + \"</pre></td></tr>\";\n          }\n\n          html += \"</table></div>\";\n        }\n\n        html += \"<p>\";\n        html += \"<b>Last check:</b> \" + (result.previous === null ? \"Never\" : new Date(result.previous * 1000)) + \"<br>\";\n        html += \"<b>Next scheduled check:</b> \" + (result.next === null ? 
\"ASAP\" : new Date(result.next * 1000));\n        html += \"</p>\";\n\n        html += \"</div>\";\n\n        var dialog = $(html);\n\n        function triggerUpdate()\n        {\n          var operation = new Operation({ action: \"trigger update\",\n                                          url: \"triggertrackedbranchupdate\",\n                                          data: { branch_id: branch_id }});\n          var result = operation.execute();\n\n          if (result)\n            dialog.dialog(\"close\");\n        }\n\n        function disable()\n        {\n          function finish()\n          {\n            confirm.dialog(\"close\");\n\n            var operation = new Operation({ action: \"disable tracking\",\n                                            url: \"disabletrackedbranch\",\n                                            data: { branch_id: branch_id }});\n            var result = operation.execute();\n\n            if (result)\n            {\n              dialog.dialog(\"close\");\n              branch_row.find(\"td.enabled\").text(\"No\");\n            }\n          }\n\n          var confirm = $(\"<div title='Disable Branch Tracking'><p>Are you sure you want to disable the tracking of this branch?</p></div>\");\n\n          confirm.dialog({ buttons: { \"Disable the tracking\": finish,\n                                      \"Do nothing\": function () { confirm.dialog(\"close\"); }}});\n        }\n\n        function enable()\n        {\n          var operation = new Operation({ action: \"enable tracking\",\n                                          url: \"enabletrackedbranch\",\n                                          data: { branch_id: branch_id }});\n          var result = operation.execute();\n\n          if (result)\n          {\n            dialog.dialog(\"close\");\n            branch_row.find(\"td.enabled\").text(\"Yes\");\n          }\n        }\n\n        function deleteTrackedBranch()\n        {\n          function finish()\n          
{\n            confirm.dialog(\"close\");\n\n            var operation = new Operation({ action: \"delete tracking\",\n                                            url: \"deletetrackedbranch\",\n                                            data: { branch_id: branch_id }});\n            var result = operation.execute();\n\n            if (result)\n            {\n              dialog.dialog(\"close\");\n              branch_row.remove();\n            }\n          }\n\n          var confirm = $(\"<div title='Delete Branch Tracking'><p>Are you sure you want to delete the tracking of this branch?</p></div>\");\n\n          confirm.dialog({ buttons: { \"Delete the tracking\": finish,\n                                      \"Do nothing\": function () { confirm.dialog(\"close\"); }}});\n        }\n\n        var buttons = {};\n\n        if (user_ids.indexOf(user.id) != -1 || user.administrator)\n        {\n          if (branch_row.find(\"td.enabled\").text() == \"Yes\")\n          {\n            buttons[\"Update now\"] = triggerUpdate;\n            buttons[\"Disable\"] = disable;\n          }\n          else if (branch_row.find(\"td.enabled\").text() == \"No\")\n            buttons[\"Enable\"] = enable;\n\n          buttons[\"Delete\"] = deleteTrackedBranch;\n        }\n\n        buttons[\"Close\"] = function () { dialog.dialog(\"close\"); };\n\n        dialog.dialog({ width: 600, buttons: buttons });\n\n        ev.preventDefault();\n      });\n  });\n\nfunction addTrackedBranch(repository_id)\n{\n  var repository = repositories[repository_id];\n\n  var dialog = $(\"<div title='Add Tracked Branch' id=addtrackedbranch>\" +\n                 \"<table>\" +\n                 \"<tr><td class=key>Source repository:</td>\" +\n                 \"<td class=value><input id=sourcelocation value='\" + htmlify(repository.defaultRemoteLocation) + \"'></td></tr>\" +\n                 \"<tr><td class=key></td><td class=note>Source repository location/URL.</td></tr>\" +\n                 
\"<tr><td class=key>Source branch name:</td><td class=value><input id=sourcename></td></tr>\" +\n                 \"<tr><td class=key></td><td class=note>Name of the branch in the source repository, without the leading \\\"refs/heads/\\\".</td></tr>\" +\n                 \"<tr><td class=key>Target repository:</td><td class=value>\" + htmlify(repository.location) + \"</td></tr>\" +\n                 \"<tr><td class=key>Target branch name:</td><td class=value><input id=targetname></td></tr>\" +\n                 \"<tr><td class=key></td><td class=note>Name of the branch in Critic's repository, without the leading \\\"refs/heads/\\\".</td></tr>\" +\n                 \"<tr><td class=key>Users:</td><td class=value><input id=users value='\" + htmlify(user.name) + \"'></td></tr>\" +\n                 \"<tr><td class=key></td><td class=note>Space or comma separated list of users to send emails to if the tracking fails.</td></tr>\" +\n                 \"</div>\");\n\n  var sourcename = dialog.find(\"#sourcename\");\n  var targetname = dialog.find(\"#targetname\");\n\n  sourcename.change(\n    function ()\n    {\n      if (targetname.val() == \"\")\n        targetname.val(sourcename.val());\n    });\n\n  function finish()\n  {\n    var source_location = dialog.find(\"#sourcelocation\").val().trim();\n    var source_name = dialog.find(\"#sourcename\").val().trim();\n    var target_name = dialog.find(\"#targetname\").val().trim();\n    var users = dialog.find(\"#users\").val().split(/\\s*,\\s*|\\s+/g);\n    var errors = [];\n\n    if (source_location == \"\")\n      errors.push(\"Empty source repository.\");\n\n    if (source_name == \"\")\n      errors.push(\"Empty source branch name.\");\n\n    if (target_name == \"\")\n      errors.push(\"Empty target branch name.\");\n\n    if (errors.length == 0)\n    {\n      var operation = new Operation({ action: \"add tracked branch\",\n                                      url: \"addtrackedbranch\",\n                                  
    data: { repository_id: repository_id,\n                                              source_location: source_location,\n                                              source_name: source_name,\n                                              target_name: target_name,\n                                              users: users }});\n\n      var result = operation.execute();\n\n      if (result)\n      {\n        dialog.dialog(\"close\");\n        location.reload();\n      }\n    }\n    else\n      alert(errors.join(\"\\n\"));\n  }\n\n  function cancel()\n  {\n    dialog.dialog(\"close\");\n  }\n\n  var buttons = { \"Add Tracked Branch\": finish, \"Cancel\": cancel };\n\n  dialog.dialog({ width: 600, buttons: buttons });\n\n  if (repository.defaultRemoteLocation)\n    dialog.find(\"#sourcename\").focus();\n  else\n    dialog.find(\"#sourcelocation\").focus();\n}\n"
  },
  {
    "path": "src/resources/review.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv#draftStatus {\n    text-align: right;\n    font-weight: bold;\n    font-size: larger\n}\n\ndiv#draftStatus span.draft {\n    color: red\n}\n\ndiv#draftStatus span.buttons {\n    font-size: 12px\n}\n\ndiv.editowner input {\n    font-family: monospace;\n    margin-top: 0.5em;\n    width: 100%\n}\n"
  },
  {
    "path": "src/resources/review.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- Mode: js; js-indent-level: 2; indent-tabs-mode: nil -*- */\n\nfunction updateDraftStatus(data)\n{\n  if (typeof data == \"string\")\n  {\n    var match = /^ok:(?:.*,)?approved=(\\d+),disapproved=(\\d+),(?:approvedBinary=(\\d+),disapprovedBinary=(\\d+),)?comments=(\\d+),reopened=(\\d+),closed=(\\d+),morphed=(\\d+)$/.exec(data);\n\n    if (!match)\n      return;\n\n    data = { reviewedNormal: parseInt(match[1]),\n             unreviewedNormal: parseInt(match[2]),\n             reviewedBinary: parseInt(match[3]),\n             unreviewedBinary: parseInt(match[4]),\n             writtenComments: parseInt(match[5]),\n             reopenedIssues: parseInt(match[6]),\n             resolvedIssues: parseInt(match[7]),\n             morphedChains: parseInt(match[8]) }\n  }\n\n  var items = [];\n\n  function renderCount(count, what)\n  {\n    return count + \" \" + what + (count != 1 ? 
\"s\" : \"\");\n  }\n\n  if (data.reviewedNormal != '0')\n    items.push(\"<span class='approved'>reviewed \" + renderCount(data.reviewedNormal, \"line\") + \"</span>\");\n  if (data.unreviewedNormal != '0')\n    items.push(\"<span class='disapproved'>unreviewed \" + renderCount(data.unreviewedNormal, \"line\") + \"</span>\");\n  if (data.reviewedBinary != '0')\n    items.push(\"<span class='approved'>reviewed \" + renderCount(data.reviewedBinary, \"binary file\") + \"</span>\");\n  if (data.unreviewedBinary != '0')\n    items.push(\"<span class='disapproved'>unreviewed \" + renderCount(data.unreviewedBinary, \"binary file\") + \"</span>\")\n  if (data.writtenComments != '0')\n    items.push(\"<span class='comments'>wrote \" + renderCount(data.writtenComments, \"comment\") + \"</span>\");\n  if (data.reopenedIssues != '0')\n    items.push(\"<span class='reopened'>reopened \" + renderCount(data.reopenedIssues, \"issue\") + \"</span>\");\n  if (data.resolvedIssues != '0')\n    items.push(\"<span class='closed'>resolved \" + renderCount(data.resolvedIssues, \"issue\") + \"</span>\");\n  if (data.morphedChains != '0')\n    items.push(\"<span class='morphed'>morphed \" + renderCount(data.morphedChains, \"comment\") + \"</span>\");\n\n  var draftStatus = $(\"#draftStatus\");\n\n  if (items.length == 0)\n    draftStatus.empty().nextAll(\"div.buttons\").show();\n  else\n  {\n    draftStatus.html(\"<span class='draft'>Draft:</span>\"\n                    + \" \" + items.join(\", \") + \" \"\n                    + \"<span class='buttons'>\"\n                    +   \"<button onclick='previewChanges();'>Preview</button>\"\n                    +   \"<button onclick='submitChanges();'>Submit</button>\"\n                    +   \"<button onclick='cancelChanges();'>Abort</button>\"\n                    + \"</span>\");\n    draftStatus.find(\"button\").button();\n    draftStatus.nextAll(\"div.buttons\").hide();\n  }\n\n  if (typeof CommentMarkers != \"undefined\")\n    
CommentMarkers.updateAll();\n}\n\nfunction markFile(status, file_id, parent_index)\n{\n  var id_prefix = parent_index !== null ? \"p\" + parent_index : \"\";\n  var checkbox = document.getElementById(id_prefix + \"a\" + file_id);\n\n  if (!checkbox)\n    return;\n\n  var changeset_ids;\n  var reviewableFiles;\n\n  if (parent_index !== null)\n  {\n    changeset_ids = [changeset[parent_index].id];\n    reviewableFiles = changeset[parent_index].reviewableFiles;\n  }\n  else\n  {\n    changeset_ids = changeset.ids;\n    reviewableFiles = changeset.reviewableFiles;\n  }\n\n  if (reviewableFiles[file_id] == status)\n    return;\n\n  var checked = status == \"reviewed\";\n\n  if (checkbox.checked != checked)\n    checkbox.checked = checked;\n\n  var data = { review_id: review.id,\n               reviewed: status == \"reviewed\",\n               changeset_ids: changeset_ids,\n               file_ids: [file_id] };\n\n  function finish(result)\n  {\n    if (result)\n    {\n      reviewableFiles[file_id] = status;\n      updateDraftStatus(result.draft_status);\n    }\n    else\n      checkbox.checked = !checked;\n\n    var all_checked;\n\n    if (checkbox.checked)\n    {\n      all_checked = true;\n      $(checkbox).parents(\"table.commit-files\").find(\"td.approve.file > input\").each(function (index, element) { if (!element.checked) all_checked = false; });\n    }\n    else\n      all_checked = false;\n\n    $(checkbox).parents(\"table.commit-files\").find(\"td.approve.everything > input\").each(function (index, element) { element.checked = all_checked; });\n  }\n\n  var callback;\n\n  if (user.options.ui.asynchronousReviewMarking)\n    callback = finish;\n  else\n    callback = null;\n\n  var operation = new Operation({ action: \"mark files as \" + status,\n                                  url: \"markfiles\",\n                                  data: data,\n                                  callback: callback });\n\n  if (callback)\n    operation.execute();\n  else\n    
finish(operation.execute());\n}\n\nfunction markAllFiles(status)\n{\n  var changeset_ids = changeset.ids;\n  var file_ids = [];\n\n  var reviewableFiles = changeset.reviewableFiles;\n\n  for (var file_id in reviewableFiles)\n    if (/^\\d+$/.test(file_id) && reviewableFiles[file_id] != status)\n      file_ids.push(parseInt(file_id));\n\n  if (!file_ids.length)\n    return;\n\n  var data = { review_id: review.id,\n               reviewed: status == \"reviewed\",\n               changeset_ids: changeset_ids,\n               file_ids: file_ids };\n\n  var operation = new Operation({ action: \"mark files as \" + status,\n                                  url: \"markfiles\",\n                                  data: data });\n  var result = operation.execute();\n\n  if (result)\n  {\n    for (var index = 0; index < file_ids.length; ++index)\n      reviewableFiles[file_ids[index]] = status;\n    updateDraftStatus(result.draft_status);\n  }\n}\n\nfunction previewChanges()\n{\n  location.href = \"/showbatch?review=\" + review.id;\n}\n\nfunction submitChanges()\n{\n  function start()\n  {\n    function finish(remark)\n    {\n      var data = { review_id: review.id };\n\n      if (!/^\\s*$/.test(remark))\n        data.remark = remark;\n\n      var operation = new Operation({ action: \"submit changes\",\n                                      url: \"submitchanges\",\n                                      data: data,\n                                      wait: \"Submitting changes...\" });\n      var result = operation.execute();\n\n      if (result)\n      {\n        if (result.profiling)\n          console.log(result.profiling);\n\n        return true;\n      }\n      else\n        return false;\n    }\n\n    var operation = new Operation({ action: \"determine review state change\",\n                                    url: \"reviewstatechange\",\n                                    data: { 
review_id: review.id }});\n    var result = operation.execute();\n    var state_change = \"\";\n\n    if (result)\n      if (result.current_state == \"open\" && result.new_state == \"accepted\")\n        state_change = \"<p class='state'>With these changes, the review will be ACCEPTED.</p>\";\n      else if (result.current_state == \"accepted\" && result.new_state == \"open\")\n        state_change = \"<p class='state'>With these changes, the review will NO LONGER be ACCEPTED.</p>\";\n\n    var content = $(\"<div class='comment flex' title='Submit Changes'>\" +\n                    state_change +\n                    \"<p class='message' style='margin: 0'>Additional note (optional):</p>\" +\n                    \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n    var buttons;\n\n    buttons = {\n      Submit: function () { if (finish(content.find(\"textarea\").val())) { $(content).dialog(\"close\"); location.reload(); } },\n      Cancel: function () { $(content).dialog(\"close\"); }\n    };\n\n    content.dialog({ width: 600,\n                     buttons: buttons });\n  }\n\n  Operation.whenIdle(start);\n}\n\nfunction cancelChanges()\n{\n  function finish()\n  {\n    var data = { review_id: review.id,\n                 what: {} };\n\n    if (0 != ((data.what[\"approval\"] = approval.is(\":checked\")) +\n              (data.what[\"comments\"] = comments.is(\":checked\")) +\n              (data.what[\"metacomments\"] = metacomments.is(\":checked\"))))\n    {\n      var operation = new Operation({ action: \"abort changes\",\n                                      url: \"abortchanges\",\n                                      data: data });\n      var result = operation.execute();\n\n      if (result)\n      {\n        location.reload();\n\n        if (result.profiling)\n          console.log(result.profiling);\n      }\n      else\n        return false;\n    }\n\n    return true;\n  }\n\n  var content = $(\"<div title='Warning!'><p><b>Aborted changes 
will be lost permanently.</b></p><legend><input type=checkbox id=what_approval checked>Abort reviewed/unreviewed changes</legend><legend><input type=checkbox id=what_comments checked>Abort written comments</legend><legend><input type=checkbox id=what_metacomments checked>Abort reopened/morphed comments</legend></div>\");\n\n  var approval = content.find(\"input#what_approval\");\n  var comments = content.find(\"input#what_comments\");\n  var metacomments = content.find(\"input#what_metacomments\");\n\n  content.find(\"legend\").click(function (ev) { if (ev.target.nodeName.toLowerCase() != \"input\") $(ev.currentTarget).find(\"input\").click(); });\n\n  content.dialog({ width: 400, modal: true, buttons: { \"Abort Changes\": function () { if (finish()) content.dialog(\"close\"); }, \"Do Nothing\": function () { content.dialog(\"close\"); }}});\n}\n\nfunction closeReview()\n{\n  function proceed()\n  {\n    var operation = new Operation({ action: \"close review\",\n                                    url: \"closereview\",\n                                    data: { review_id: review.id }});\n\n    if (operation.execute())\n      location.reload();\n  }\n\n  var is_owner = false;\n\n  for (var index = 0; index < review.owners.length; ++index)\n  {\n    if (user.id == review.owners[index].id)\n    {\n      is_owner = true;\n      break;\n    }\n  }\n\n  if (!is_owner)\n  {\n    var content = $(\"<div title='Please Confirm'><p><b>You are not the owner of this review.</b>  Are you sure you mean to close it?</p></div>\");\n    content.dialog({ width: 400, modal: true, buttons: { \"Close Review\": function () { content.dialog(\"close\"); proceed(); }, \"Do Nothing\": function () { content.dialog(\"close\"); }}});\n  }\n  else\n    proceed();\n}\n\nfunction dropReview()\n{\n  function proceed()\n  {\n    var operation = new Operation({ action: \"drop review\",\n                                    url: \"dropreview\",\n                                    data: { review_id: 
review.id }});\n\n    if (operation.execute())\n      location.reload();\n  }\n\n  var content = $(\"<div title='Please Confirm'><p>Are you sure you mean to drop the review?</p></div>\");\n  content.dialog({ width: 400, modal: true, buttons: { \"Drop Review\": function () { content.dialog(\"close\"); proceed(); }, \"Do Nothing\": function () { content.dialog(\"close\"); }}});\n}\n\nfunction reopenReview()\n{\n  var operation = new Operation({ action: \"reopen review\",\n                                  url: \"reopenreview\",\n                                  data: { review_id: review.id }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction pingReview()\n{\n  var content = $(\"<div class='comment flex' title='Ping Review'>\" +\n                  \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n  function finish(type)\n  {\n    var operation = new Operation({ action: \"ping review\",\n                                    url: \"pingreview\",\n                                    data: { review_id: review.id,\n                                            note: content.find(\"textarea\").val() }});\n\n    return operation.execute() != null;\n  }\n\n  var buttons = { \"Send Ping\": function () { if (finish()) { $(content).dialog(\"close\"); } },\n                  \"Cancel\": function () { $(content).dialog(\"close\"); } };\n\n  content.dialog({ width: 600,\n                   buttons: buttons,\n                   closeOnEscape: false });\n}\n\nfunction editSummary()\n{\n  function finish(type)\n  {\n    var operation = new Operation({ action: \"update review\",\n                                    url: \"updatereview\",\n                                    data: { review_id: review.id,\n                                            new_summary: content.find(\"input\").val() }});\n\n    return operation.execute() != null;\n  }\n\n  function checkFinished()\n  {\n    if (finish()) { $(content).dialog(\"close\"); location.reload(); }\n  
}\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      checkFinished();\n  }\n\n  var content = $(\"<div class='comment' title='Edit Summary'><div class='text'>Enter new summary:<br><input></div></div>\");\n\n  content.find(\"input\").val($(\"#summary\").text());\n  content.find(\"input\").keypress(handleKeypress);\n\n  var buttons = {\n                  \"Save\": function () { checkFinished(); },\n                  Cancel: function () { $(content).dialog(\"close\"); }\n                };\n\n  content.dialog({ width: 600,\n                   buttons: buttons });\n}\n\nfunction editDescription()\n{\n  function finish(type)\n  {\n    var operation = new Operation({ action: \"update review\",\n                                    url: \"updatereview\",\n                                    data: { review_id: review.id,\n                                            new_description: content.find(\"textarea\").val() }});\n\n    return operation.execute() != null;\n  }\n\n  var self = this;\n  var content = $(\"<div class='comment flex' title='Edit Description'>\" +\n                  \"<textarea class='text flexible' rows=8></textarea></div>\");\n\n  content.find(\"textarea\").val($(\"#description\").text());\n\n  var buttons = {\n    Save: function () { if (finish()) { $(content).dialog(\"close\"); location.reload(); } },\n    Cancel: function () { $(content).dialog(\"close\"); }\n  };\n\n  content.dialog({ width: 600,\n                   buttons: buttons,\n                   closeOnEscape: false });\n}\n\nfunction editOwners()\n{\n  function finish(type)\n  {\n    var operation = new Operation({ action: \"update review\",\n                                    url: \"updatereview\",\n                                    data: { review_id: review.id,\n                                            new_owners: content.find(\"input\").val().split(/[,\\s]+/g) }});\n\n    return operation.execute() != null;\n  }\n\n  function checkFinished()\n  {\n    if 
(finish())\n    {\n      $(content).dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      checkFinished();\n  }\n\n  var self = this;\n  var content = $(\"<div class='editowner' title='Change Review Owner'><p>Please enter the user name(s) of the new review owner(s):<br><input></p></div>\");\n\n  content.find(\"input\").val(owners.map(function (user) { return user.name; }).join(\", \"));\n  content.find(\"input\").keypress(handleKeypress);\n\n  var buttons = {\n    Save: function () { checkFinished(); },\n    Cancel: function () { $(content).dialog(\"close\"); }\n  };\n\n  content.dialog({ width: 400,\n                   modal: true,\n                   buttons: buttons });\n}\n\n$(document).ready(function ()\n  {\n    $(\"td.approve input\").change(function (ev)\n      {\n        var target = $(ev.currentTarget);\n        if (target.parents(\"td\").hasClass(\"everything\"))\n        {\n          markAllFiles(ev.currentTarget.checked ? \"reviewed\" : \"pending\");\n\n          target.parents(\"table\").find(\"td.approve.file input\").each(function (index, element)\n            {\n              element.checked = ev.currentTarget.checked;\n            });\n        }\n        else\n        {\n          var row = target.parents(\"tr\");\n          var file_id = parseInt(row.attr(\"critic-file-id\"));\n          var parent_index = target.attr(\"critic-parent-index\");\n\n          if (parent_index)\n            parent_index = parseInt(parent_index);\n          else\n            parent_index = null;\n\n          markFile(ev.currentTarget.checked ? 
\"reviewed\" : \"pending\", file_id, parent_index);\n        }\n      });\n\n    if (typeof updateCheckInterval != \"undefined\" && updateCheckInterval)\n    {\n      function processResult(result)\n      {\n        if (result.stale)\n        {\n          var content = $(\"<div title='Review Updated'>\" +\n                          \"<p>The review has been updated since you loaded \" +\n                          \"this page.  Would you like to reload?</p>\" +\n                          \"</div>\");\n\n          content.dialog({\n            modal: true,\n            buttons: { \"Reload\": function () { content.dialog(\"close\"); location.reload(); },\n                       \"Do Nothing\": function () { content.dialog(\"close\"); }}\n          });\n\n          return;\n        }\n        else\n        {\n          updateCheckInterval = result.interval;\n        }\n\n        if (updateCheckInterval)\n          setTimeout(checkSerial, updateCheckInterval * 1000);\n      }\n\n      function checkSerial()\n      {\n        var operation = new Operation({ action: \"check serial\",\n                                        url: \"checkserial\",\n                                        data: { \"review_id\": review.id,\n                                                \"serial\": review.serial },\n                                        callback: processResult });\n        operation.execute();\n      }\n\n      setTimeout(checkSerial, updateCheckInterval * 1000);\n    }\n  });\n"
  },
  {
    "path": "src/resources/reviewfilters.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction addReviewFiltersDialog(options)\n{\n  var title, message;\n\n  if (options.filter_type == \"reviewer\")\n  {\n    title = \"Add Reviewer\";\n    message = \"<p>Make specified users reviewers of given path during this review.</p>\";\n  }\n  else\n  {\n    title = \"Add Watcher\";\n    message = \"<p>Make specified users watchers of given path during this review.  \"\n            +    \"If a user would normally be a reviewer of the path, he/she is \"\n            +    \"reduced to just a watcher.</p>\";\n  }\n\n  var content = $(\"<div class='comment' title='\" + title + \"'>\"\n                +   message\n                +   \"<p>\"\n                +     \"<b>User name(s):</b><br>\"\n                +     \"<input class='name sourcefont' style='width: 100%'><br>\"\n                +     \"<b>Directory:</b><br>\"\n                +     \"<input class='path sourcefont' style='width: 100%'\"\n                +           \" placeholder='Leave empty for \\\"everything\\\"'>\"\n                +   \"</p>\"\n                + \"</div>\");\n\n  function finish()\n  {\n    var names = content.find(\"input.name\").val().trim().split(/[, ]+/);\n    var path = content.find(\"input.path\").val().trim();\n\n    /* Filter out empty names. 
*/\n    names = names.filter(function (name) { return name; });\n\n    if (!path)\n      path = \"/\";\n\n    if (names.length)\n      return options.callback(names, path);\n    else\n      return false;\n  }\n\n  function checkFinished()\n  {\n    if (finish())\n    {\n      $(content).dialog(\"close\");\n\n      if (options.reload_page)\n        location.reload();\n    }\n  }\n\n  function handleKeypress(ev)\n  {\n    if (ev.keyCode == 13)\n      checkFinished();\n  }\n\n  content.find(\"input\").keypress(handleKeypress);\n\n  var buttons = {\n    \"Add Filter\": function () { checkFinished(); },\n    \"Cancel\": function () { content.dialog(\"close\"); }\n  };\n\n  content.dialog({ width: 600, height: \"auto\",\n                   modal: true,\n                   buttons: buttons });\n\n  function enableAutoCompletion(result)\n  {\n    content.find(\"input.name\").autocomplete({\n      source: AutoCompleteUsers(result.users)\n    });\n    content.find(\"input.path\").autocomplete({\n      source: AutoCompletePath(result.paths),\n      html: true\n    });\n  }\n\n  var data = { values: [ \"users\", \"paths\" ] };\n\n  if (window.review)\n    /* Called from review front-page. */\n    data.review_id = review.id;\n  else\n    /* Called from \"Create Review\" page. */\n    data.changeset_ids = review_data.changeset_ids;\n\n  var operation = new Operation({ action: \"get auto-complete data\",\n                                  url: \"getautocompletedata\",\n                                  data: data,\n                                  callback: enableAutoCompletion });\n\n  operation.execute();\n}\n"
  },
  {
    "path": "src/resources/ruler.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2013 Rafał Chłodnicki, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n$(document).ready(function ()\n  {\n    var spaces = (new Array(rulerColumn + 1)).join(\" \");\n    var space = $('<span class=\"sourcefont\" style=\"white-space: pre\">' + spaces + '</span>');\n    $(\"body\").append(space);\n    var space_width = space.width();\n    space.remove();\n\n    $(\"head\").append('<style>td.line { background-image: url(/static-resource/ruler.png);' +\n      'background-position: ' + space_width + 'px 0;' +\n      'background-repeat: repeat-y; }</style>');\n  });\n"
  },
  {
    "path": "src/resources/search.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n.search,\n.callout.quicksearch {\n    -ms-flex: 1;\n    -webkit-flex: 1;\n    flex: 1;\n}\n.callout.quicksearch {\n    -ms-flex-item-align: start;\n    -webkit-align-self: flex-start;\n    align-self: flex-start;\n    max-width: 19rem;\n    margin: 1.5rem .5rem 0 1.5rem;\n}\n.search {\n    max-width: 50em;\n    margin: 1.5rem .5rem;\n}\n\n.search > * {\n    margin-bottom: 1.5rem;\n}\n.search > *:last-child {\n    margin-bottom: 0;\n}\n\n.input-options {\n    float: right;\n    font-size: .8em;\n}\n\n.search input[type=text],\n.search select {\n    margin-top: .25rem;\n    width: 100%;\n}\n\n.search-query {\n    width: 100%;\n}\n\n.search-freetext,\n.search-repository {\n    margin-right: 1.5rem;\n}\n\n.search-repository,\n.search-branch {\n    -ms-flex: 1;\n    -webkit-flex: 1;\n    flex: 1;\n}\n.search-freetext {\n    -ms-flex: 5;\n    -webkit-flex: 5;\n    flex: 5;\n}\n.search-state {\n    -ms-flex: 2;\n    -webkit-flex: 2;\n    flex: 2;\n}\n\n.search-buttons > * {\n    margin-right: 0.5rem\n}\n\n.search-result {\n    margin: 0 20%\n}\n\n.search-result h2 {\n    font-weight: normal;\n    font-size: 1.4rem;\n    margin: 1rem 0 0.5rem 0\n}\n\n.search-result tr.review td {\n    background-color: inherit !important\n}\n"
  },
  {
    "path": "src/resources/search.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\n$(function () {\n  var searchForm = document.forms.search;\n\n  function doSearch(query) {\n    function handleResult(table, result) {\n      $(\".search-result\")\n        .css({ display: \"block\" })\n        .children(\".callout\").empty().append(table);\n      history.replaceState(null, null, \"/search?\" + result.query_string);\n    }\n\n    quickSearch(query, searchForm.query ? 
handleResult : null);\n  }\n\n  if (!searchForm.query) {\n    // Disable the freetext input when none of its checkboxes are checked,\n    // and re-enable it otherwise\n    [ searchForm.freetextSummary, searchForm.freetextDescription ].forEach(\n      function (checkbox, idx, checkboxes) {\n        checkbox.addEventListener('click', function () {\n          var someChecked = checkboxes.some(\n            function (cbox) { return cbox.checked; }\n          );\n          if (searchForm.freetext.disabled === someChecked) {\n            searchForm.freetext.disabled = !someChecked;\n          }\n        });\n      });\n  }\n\n  searchForm.addEventListener('submit', function (evt) {\n    evt.preventDefault();\n    var form = this;\n    function phrases(value)\n    {\n      return value.match(/\"[^\"]+\"|'[^']+'|\\S+/g).map(\n        function (phrase)\n        {\n          var match = /^(?:'([^']+)'|\"([^\"]+)\")$/.exec(phrase);\n          if (match)\n            return match[1] || match[2] || \"\";\n          else\n            return phrase;\n        });\n    }\n\n    function tokens(value)\n    {\n      return value.split(/[\\s,]+/g).map(\n        function (item)\n        {\n          return item.trim();\n        }).filter(\n          Boolean\n        );\n    }\n\n    function with_keyword(keyword)\n    {\n      return function (term) { return term ? 
keyword + \":'\" + term + \"'\" : \"\"; };\n    }\n\n    var query;\n\n    if (form.query) {\n      query = form.query.value.trim();\n    } else {\n      var terms = [];\n\n      var freetext = form.freetext.value.trim();\n      if (freetext) {\n        var textphrases = phrases(freetext);\n        if (form.freetextSummary.checked && form.freetextDescription.checked) {\n          terms.push.apply(terms, textphrases.map(with_keyword(\"text\")));\n        } else if (form.freetextSummary.checked) {\n          terms.push.apply(terms, textphrases.map(with_keyword(\"summary\")));\n        } else if (form.freetextDescription.checked) {\n          terms.push.apply(terms, textphrases.map(with_keyword(\"description\")));\n        }\n      }\n\n      var users = tokens(form.user.value.trim()), usersKey;\n      if (form.userOwner.checked && form.userReviewer.checked) {\n        usersKey = \"owner-or-reviewer\";\n      } else if (form.userOwner.checked) {\n        usersKey = \"owner\";\n      } else if (form.userReviewer.checked) {\n        usersKey = \"reviewer\";\n      } else {\n        usersKey = \"user\";\n      }\n      terms.push.apply(terms, users.map(with_keyword(usersKey)));\n\n      var repository = form.repository.value;\n      if (repository) {\n        terms.push(with_keyword(\"repository\")(repository));\n      }\n\n      var branch = form.branch.value.trim();\n      if (branch) {\n        terms.push(with_keyword(\"branch\")(branch));\n      }\n\n      var paths = tokens(form.path.value.trim());\n      terms.push.apply(terms, paths.map(with_keyword(\"path\")));\n\n      var state = form.state.value;\n      if (state) {\n        terms.push(with_keyword(\"state\")(state));\n      }\n\n      query = terms.join(\" \");\n    }\n\n    doSearch(query);\n  });\n\n  $(document.forms.search.user)\n    .autocomplete({ source: AutoCompleteUsers(users) });\n\n  $(\"select[name='state']\").chosen();\n\n  $(\".repository-select\")\n    .chosen({ inherit_select_classes: true,\n  
            allow_single_deselect: true,\n              collapsed_width: \"100%\",\n              expanded_width: \"40em\" });\n\n  if (searchForm.query && searchForm.query.value.trim()) {\n    doSearch(searchForm.query.value.trim());\n  }\n});\n"
  },
  {
    "path": "src/resources/services.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.services {\n    margin-top: 1rem;\n    font-family: monospace\n}\n\ntr.service:hover {\n    background-color: #eed;\n}\n\ntr.service > td.commands a {\n    visibility: hidden;\n    text-decoration: none\n}\ntr.service:hover > td.commands a {\n    visibility: visible\n}\n\ntable.services .pid,\ntable.services .rss,\ntable.services .cpu,\ntable.services .uptime {\n    text-align: right;\n}\n\ndiv.servicelog > pre {\n    border: 1px solid #222;\n    padding: 0.5em;\n    background: white;\n    overflow: auto\n}\n"
  },
  {
    "path": "src/resources/services.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n\"use strict\";\n\nfunction restartService(service_name)\n{\n  var operation = new Operation({ action: \"restart service\",\n                                  url: \"restartservice\",\n                                  data: { service_name: service_name },\n                                  wait: \"Restarting service...\" });\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction getServiceLog(service_name)\n{\n  var content;\n\n  var operation = new Operation({ action: \"fetch service log\",\n                                  url: \"getservicelog\",\n                                  data: { service_name: service_name },\n                                  wait: \"Fetching service log...\" });\n  var result = operation.execute();\n\n  if (result)\n  {\n    content = $(\"<div class='servicelog flex' title='Service Log'>\" +\n                \"<pre class=flexible></pre></div>\");\n    content.find(\"pre\").text(result.lines.join(\"\\n\"));\n    content.dialog({ width: Math.min($(document).width() - 100, 1024),\n                     buttons: { Close: function () { content.dialog(\"close\"); }} });\n  }\n}\n"
  },
  {
    "path": "src/resources/showbatch.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntable.files {\n    margin: 0;\n}\ntable.files tbody {\n    font-family: monospace;\n}\n\ntable.files .path {\n    text-align: left;\n    padding-left: 0.25em;\n    padding-right: 1em;\n    white-space: pre;\n}\ntable.files .lines {\n    text-align: right;\n}\n\ndiv.main table.comments td.h2 {\n    padding-top: 2em;\n    padding-left: 2em;\n}\n\ndiv.main table.comments td.h3 {\n    padding-top: 1.5em;\n    padding-left: 1.5em;\n}\n\ndiv.main table.comments td.h3 h3 {\n    font-size: 0.9rem;\n    border-bottom: 1px solid #cca;\n    display: inline-block;\n    margin-top: 0;\n    margin-bottom: 0.5em\n}\n\ndiv.main table.comments td.h3 h3 a {\n    font-size: 60%;\n    margin-left: 1em\n}\n"
  },
  {
    "path": "src/resources/showbranch.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main { margin-bottom: 1em }\n"
  },
  {
    "path": "src/resources/showcomment.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nkeyboardShortcutHandlers.push(function (key)\n  {\n    for (var chain_id = null in commentChainById)\n      break;\n\n    if (chain_id === null)\n      return false;\n\n    if (key == \"m\".charCodeAt(0))\n      contextLines = Math.ceil(contextLines * 1.5) || 1;\n    else if (key == \"l\".charCodeAt(0))\n      contextLines = Math.floor(contextLines / 1.5);\n    else\n      return false;\n\n    location.replace(\"/showcomment?chain=\" + chain_id + \"&context=\" + contextLines);\n    return true;\n  });\n"
  },
  {
    "path": "src/resources/showfile.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.file {\n    table-layout: fixed\n}\n\ndiv.main table.file td.h1 {\n    border-bottom: none\n}\n\ndiv.main table td.line {\n    padding-left: 0\n}\n\ndiv.main table td.h1 h1 > a {\n    text-decoration: none;\n    color: #222\n}\ndiv.main table td.h1 h1 > a.root {\n    font-size: 80%\n}\ndiv.main table td.h1 h1 > a:hover {\n    text-decoration: underline;\n    color: #222\n}\ndiv.main table td.h1 h1 span {\n    padding-left: 3px;\n    padding-right: 3px;\n}\n\ntable.file col.edge {\n    width: 1em\n}\n\ntable.file col.linenr {\n    width: 4em\n}\n\ntable.file col.middle {\n    width: 1em\n}\n\ntable.file col.line {\n    width: 50%\n}\n\ntable.file > tbody.spacer > tr > td {\n    background-color: #eee;\n    text-align: center;\n    border-left: 1px solid #888;\n    border-right: 1px solid #888\n}\n\ntable.file > tbody.spacer.bottom > tr > td {\n    border-bottom: 1px solid #888\n}\n\ntable.file > tbody.spacer.top > tr > td {\n    border-top: 1px solid #888\n}\n\ntable.file > tbody.spacer > tr > td {\n    padding: 0\n}\n\ntable.file > tbody.spacer > tr + tr.spacer {\n    display: none\n}\n\ntable.file > tbody.spacer > tr.expand > td,\ntable.file > tbody.spacer > tr.context > td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ntable.file > tbody.spacer > tr.expand > td > select {\n    
background-color: #eee;\n    border: 0\n}\n\ntable.file > tbody.lines > tr > td.edge,\ntable.file > tbody.lines > tr > td.middle,\ntable.file > tbody.lines > tr > td.linenr {\n    background-color: #eee\n}\n\ntable.file > tbody.lines > tr > td.edge:first-child {\n    border-left: 1px solid #888\n}\n\ntable.file > tbody.lines > tr > td.edge:last-child {\n    border-right: 1px solid #888\n}\n\ntable.file > tbody.lines > tr > td.linenr.old {\n    text-align: right;\n    padding-right: 3px\n}\n\ntable.file > tbody.lines > tr > td.linenr.new {\n    padding-left: 3px\n}\n\ntable.file > tbody.lines > tr > td.line {\n    background-color: white;\n    font-family: monospace;\n    border-left: 1px solid #888;\n    border-right: 1px solid #888;\n    white-space: pre-wrap;\n    overflow: hidden;\n    padding-left: 3px\n}\n\ntable.file > tbody.lines > tr > td.comment {\n    white-space: pre-wrap;\n    border-left: 1px solid #888;\n    border-right: 1px solid #888\n}\n\ntable.file > tbody.lines > tr:first-child > td.line {\n    border-top: 1px solid #888;\n    padding-top: 3px\n}\n\ntable.file > tbody.lines > tr:last-child > td.line {\n    border-bottom: 1px solid #888;\n    padding-bottom: 3px\n}\n\ntable.file > tbody.lines > tr.context:hover > td.line,\ntable.file > tbody.lines > tr.context.markold > td.line.old,\ntable.file > tbody.lines > tr.context.marknew > td.line.new\n{\n    background-color: #eee\n}\n\ntable.file > tbody.lines > tr > td.line.highlight {\n    background-color: #dfd\n}\ntable.file > tbody.lines > tr:hover > td.line.highlight {\n    background-color: #cec\n}\n\ntd.line.highlight b.t { color: #aca }\n\n\ntable.file > tbody.lines > tr.commented > td.line.commented {\n    background-color: #eef\n}\ntable.file > tbody.lines > tr.commented:hover > td.line.commented {\n    background-color: #dde\n}\n\ntable.file > tbody.lines > tr.commenthead > td.commenthead {\n    font-family: sans-serif;\n    background-color: #ccf;\n    font-weight: bold;\n    border-top: 
1px solid #99b;\n    border-bottom: 1px solid #99b;\n    text-align: center\n}\n\ntable.file > tbody.lines > tr.commentchain > td.commentchain,\ntable.file > tbody.lines > tr.commentchain > td.comment {\n    font-family: sans-serif;\n    background-color: #ccf;\n    border-bottom: 1px solid #99b;\n    padding-left: 3px;\n    padding-right: 3px;\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div {\n    padding: 3px\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div.comment.draft > span.time:after {\n    content: \" (draft)\";\n    font-weight: bold;\n    color: #f00\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div.warning {\n    font-weight: bold;\n    color: #f00\n}\n\ntable.file > tbody.lines > tr.commentchain > td > div > span.type {\n    font-weight: bold\n}\n\ntable.file > tbody.lines > tr.commentchain > td > div > span.author {\n    font-weight: bold\n}\n\ntable.file > tbody.lines > tr.commentchain > td > div > span.draft {\n    font-weight: bold;\n    color: #f00\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div > div.text {\n    font-family: monospace;\n    background-color: #ddf;\n    border: 1px solid #99b;\n    margin-top: 3px;\n    padding: 5px;\n    white-space: pre-wrap\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div > div.text > textarea {\n    background-color: #ddf;\n    border: 0;\n    width: 100%\n}\n\ntable.file > tbody.lines > tr.commentchain > td.comment > div.buttons {\n    text-align: right;\n    padding-top: 3px\n}\n\ntable.file > tbody.content > tr > td {\n    background-color: #eee\n}\n\ntable.file > tbody.deleted > tr > td {\n    background-color: #eee;\n    text-align: center\n}\n\ntable.file > tbody.deleted > tr > td > h2 {\n    margin-top: 0\n}\n\ntable.file > tbody.binary > tr > td {\n    background-color: #eee;\n    text-align: center\n}\n\ntable.file > tbody.binary > tr > td > h2 {\n    margin-top: 0\n}\n\ntable.commit-info { margin-top: 1em; 
width: auto !important }\n\ntable.commit-info span.branches,\ntable.commit-info span.tags {\n    margin-left: 1em\n}\n\ntable.commit-info span.branch,\ntable.commit-info span.tag {\n    margin-right: 0.2em\n}\n\npre.commit-msg { padding: 0.5em 1em  }\n\nb.tab:before { content: \"\\2192\" }\nb.tab { color: #ccc }\n\ntr.modified b.tab {\n    color: #cca\n}\n\ntr.replaced > td.old b.tab,\ntr.deleted > td.old b.tab {\n    color: #caa\n}\n\ntr.replaced > td.new b.tab,\ntr.inserted > td.new b.tab {\n    color: #aca\n}\n\ninput.approve {\n    background-color: #eec;\n    font-weight: bold\n}\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.playground {\n    white-space: pre;\n    font-family: monospace\n}\n\ndiv.parent {\n    display: none\n}\n\ndiv.parent.show {\n    display: block\n}\n\ndiv.parent > h1 {\n    padding-left: 1em;\n    font-size: 150%;\n    font-weight: bold\n}\n\ndiv.detectmoves select {\n    width: 100%;\n    background-color: #fff;\n    padding: 3px\n}\n\ntable.commit-info tr.commit-msg > td {\n    padding-top: 1.5em;\n    padding-bottom: 1.5em\n}\n\ntable.commit-msg tr.line:hover {\n    background-color: #eee\n}\n\ntable.commit-msg tr.line td.edge {\n    padding: 0 1em 0 1em\n}\n\ntable.commit-msg tr.line td.line {\n    white-space: pre;\n    font-family: monospace;\n    padding: 0\n}\n\ntable.commit-msg tr.line.first td.line {\n    font-weight: bold\n}\n"
  },
  {
    "path": "src/resources/showfile.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* -*- mode: text; indent-tabs-mode: nil -*- */\n\nfunction highlightLines(markers)\n{\n  var start = /f\\d+n(\\d+)/.exec(markers.firstLine.id)[1];\n  var end = /f\\d+n(\\d+)/.exec(markers.lastLine.id)[1];\n  var href = location.href;\n\n  var match = /^(.*)&line=\\d+(?:-\\d+)?(.*)$/.exec(href);\n\n  if (match)\n    href = match[1] + match[2];\n\n  if (start == end)\n    location.href = href + \"&line=\" + start;\n  else\n    location.href = href + \"&line=\" + start + \"-\" + end;\n\n  markers.remove();\n  currentMarkers = null;\n}\n\nvar defaultHandleMarkedLines = handleMarkedLines;\n\nhandleMarkedLines = function (markers)\n  {\n    if (typeof review == \"undefined\")\n      highlightLines(markers);\n    else\n    {\n      CommentChain.extraButtons = { \"Link to Lines\": function () { highlightLines(currentMarkers); return true; } };\n      defaultHandleMarkedLines(markers);\n    }\n  };\n\n$(document).ready(function ()\n  {\n    $(\"td.line\").mousedown(startCommentMarking);\n    $(\"td.line\").mouseover(continueCommentMarking);\n    $(\"td.line\").mouseup(endCommentMarking);\n\n    if (typeof firstSelectedLine == \"number\")\n    {\n      var markers = new CommentMarkers;\n      markers.setLines($(\"td.first-selected\"), $(\"td.last-selected\"));\n      markers.setType(\"issue\", \"open\");\n    }\n  
});\n"
  },
  {
    "path": "src/resources/showreview.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.basic td.h1 h1 a.edit,\ndiv.main table.progress td.h1 h1 a {\n    font-size: 12px\n}\n\ndiv.main table.basic td.h1 h1 a.edit {\n    padding-left: 1em\n}\n\ndiv.main table.basic tr.line > td {\n    padding-top: 0.5em\n}\n\ndiv.main table.basic tr.line > td div.buttons {\n    float: right\n}\n\ndiv.main table.basic tr.line > td div.buttons button {\n    margin-left: 3px\n}\n\ndiv.main table.basic td.heading {\n    font-family: serif;\n    font-weight: bold;\n    text-align: right;\n    vertical-align: top\n}\n\ndiv.main table.basic td.value {\n    font-family: monospace;\n    white-space: pre-wrap;\n    vertical-align: top\n}\n\ndiv.main table.basic code.branch.archived {\n    text-decoration: line-through\n}\n\ndiv.main table.basic span.user.retired > span.name {\n    text-decoration: line-through\n}\ndiv.main table.basic span.user > span.status {\n    color: red\n}\n\ndiv.main table.basic table.reviewfilters {\n    white-space: normal;\n    margin-top: 1em\n}\n\ndiv.main table.basic table.reviewfilters tbody.hidden,\ndiv.main table.basic table.reviewfilters tfoot.hidden {\n    display: none\n}\n\ndiv.main table.basic table.reviewfilters tfoot tr td {\n    font-style: italic\n}\n\ndiv.main table.basic table.reviewfilters tr.h1 th.h1 {\n    font-family: serif;\n    font-weight: bold;\n    
border-bottom: 1px solid #222\n}\n\ndiv.main table.basic table.reviewfilters td {\n    vertical-align: top\n}\n\ndiv.main table.basic table.reviewfilters td.username {\n    font-weight: bold;\n    text-align: right\n}\n\ndiv.main table.basic table.reviewfilters td.reviews {\n    font-family: sans-serif;\n    text-align: center;\n    padding-left: 0.5em;\n    padding-right: 0.5em\n}\n\ndiv.main table.basic table.reviewfilters td.path {\n    white-space: pre\n}\n\ndiv.main table.basic table.reviewfilters tr.filter td.remove {\n    text-align: right;\n    padding-left: 0.5em\n}\n\ndiv.main table.basic table.reviewfilters tr.filter td.remove a {\n    visibility: hidden\n}\n\ndiv.main table.basic table.reviewfilters tr.filter:hover td.remove a {\n    visibility: visible;\n    color: red;\n    text-decoration: none\n}\ndiv.main table.basic table.reviewfilters tr.filter:hover td.remove a:hover {\n    text-decoration: underline\n}\n\ndiv.main table.basic td.value div.text {\n    white-space: pre-wrap\n}\n\ndiv.main table.basic td.value input.value {\n    font-family: monospace;\n    width: 32em\n}\n\ndiv.main table.basic td.right {\n    text-align: right\n}\n\ndiv.main table.basic td.help {\n    font-style: italic;\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.basic td.value span.mode {\n    font-style: italic\n}\n\ndiv.main table.progress {\n    text-align: center;\n}\n\ndiv.main table.progress .h1 h1 {\n    text-align: left;\n}\n\ndiv.main table.progress td.percent h1 {\n    font-size: 100px;\n    margin: 10px\n}\n\ndiv.main table.progress td.percent h1 span.comments {\n    font-size: 30%;\n    padding-left: 1em;\n    padding-right: 1em\n}\n\ndiv.main table.progress td.percent h1 div {\n    margin-top: -60px;\n}\n\ndiv.main table.progress td.percent h1 div span.remark {\n    font-size: 12px;\n    padding-left: 1em;\n    padding-right: 1em\n}\n\ndiv.main table.progress td.stuck {\n    padding-bottom: 10px\n}\n\ndiv.main table.progress 
td.stuck a {\n    font-weight: bold;\n    color: red;\n    text-decoration: none\n}\n\ndiv.main table.progress td.stragglers {\n    font-weight: bold;\n    text-decoration: underline;\n    padding-bottom: 3px\n}\n\ndiv.main table.progress td.straggler {\n    font-family: monospace\n}\n\ndiv.main table.progress td.straggler.absent:after {\n    content: \" (absent)\";\n    color: red\n}\n\ndiv.main table.progress td.pinging {\n    font-style: italic;\n    padding-top: 5px\n}\n\ndiv.main table.progress td.pinging span {\n    border-top: 1px solid #cca\n}\n\ndiv.main table.shared tr.h1 td.buttons {\n    text-align: right;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.shared tr.reviewers td\n{\n    padding-top: 0.5em;\n    vertical-align: top\n}\n\ndiv.main table.shared tr.reviewers td.reviewers\n{\n    font-weight: bold;\n    text-align: right;\n    width: 30%\n}\n\ndiv.main table.shared tr.reviewers td.files\n{\n    font-family: monospace;\n    width: 40%\n}\n\ndiv.main table.shared tr.reviewers td.files span.file\n{\n    white-space: pre\n}\n\ndiv.main table.shared tr.reviewers td.files\n{\n    width: 30%\n}\n\ndiv.main table.log td.approval,\ndiv.main table.log td.total {\n    text-align: right;\n    white-space: pre\n}\n\ndiv.main table.log td.approval.user span,\ndiv.main table.log td.total.user span {\n    outline: 3px dotted #f66;\n    padding-left: 3px;\n    padding-right: 3px\n}\n\ndiv.main table.log td.approval span,\ndiv.main table.log td.total span {\n    font-family: monospace\n}\n\ndiv.main table.log td.approval.user span,\ndiv.main table.log td.total.user span {\n    font-weight: bold\n}\n\ndiv.main table.log td.approval.approved span {\n    background-color: #dfd\n}\n\ndiv.main table.log td.approval.pending span {\n    background-color: #fdd\n}\n\ndiv.main table.comments > tr > td,\ndiv.main table.batches > tr > td {\n    padding-left: 1em\n}\n\ndiv.main table.comments td.h1,\ndiv.main table.batches td.h1 {\n    border-bottom: 1px solid 
#cca\n}\n\ndiv.main table.comments td.h1 h1,\ndiv.main table.batches td.h1 h1 {\n    margin-top: 0;\n    margin-bottom: 0.5em\n}\n\ndiv.main table.comments td.h1 h1 a {\n    font-size: 50%;\n    margin-left: 1em\n}\n\ndiv.main table.comments td.h2 {\n    padding-top: 1em;\n    padding-left: 2em;\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table.comments td.h2 h2 {\n    margin-top: 0;\n    margin-bottom: 0.5em\n}\n\ndiv.main table.comments td.h2 a {\n    font-size: 50%;\n    margin-left: 1em\n}\n\ndiv.main table.comments tr.comment td,\ndiv.main table.batches tr.batch td {\n    padding-top: 3px;\n    padding-bottom: 3px\n}\n\ndiv.main table.comments tr.comment:hover,\ndiv.main table.batches tr.batch:hover {\n    background: #eec\n}\n\ndiv.main table.comments td.author,\ndiv.main table.batches td.author {\n    font-weight: bold;\n    white-space: pre\n}\n\ndiv.main table.comments td.title,\ndiv.main table.batches td.title {\n    font-family: monospace;\n    white-space: nowrap;\n    text-overflow: ellipsis;\n    overflow: hidden;\n    width: 70%\n}\n\ndiv.main table.comments td.when,\ndiv.main table.batches td.when {\n    text-align: right;\n    white-space: pre;\n    padding-right: 1em\n}\n\ndiv.main table.comments td.buttons {\n    border-top: 1px solid #cca;\n    padding-top: 15px;\n    text-align: right\n}\n\ndiv.main table.batches td.title span.numbers {\n    color: black;\n    float: right;\n    font-family: sans-serif;\n    font-size: 90%\n}\n\ndiv.summary-tooltip > div.header {\n    font-weight: bold;\n    text-decoration: underline\n}\n\ndiv.summary-tooltip > div.reviewer {\n    font-family: monospace\n}\ndiv.summary-tooltip > div.reviewer > span.absent {\n    color: red\n}\n\ndiv.text input {\n    margin-top: 1em;\n    padding: 3px;\n    width: 95%\n}\n\ndt {\n    font-weight: bold\n}\n\ndd {\n    padding-top: 0.5em;\n    padding-bottom: 1em\n}\n\ndd .notsupported {\n    font-weight: bold;\n    color: red;\n    margin-bottom: 
0.5em\n}\n\ndiv.specifyupstream input[name='sha1'] {\n    font-family: monospace;\n    width: 100%\n}\n\ndiv.specifyupstream select {\n    width: 100%\n}\n\ndiv.removefilter div {\n    padding-left: 1em\n}\n\ndiv.removefilter .user {\n    font-weight: bold\n}\n\ndiv.removefilter .path {\n    font-family: monospace\n}\n\ndiv.main > div.error {\n    width: 50%;\n    margin-left: auto;\n    margin-right: auto;\n    border: 10px solid red;\n    border-radius: 20px;\n    padding: 0.5em 2em 2em 2em;\n    font-weight: bold\n}\n\ndiv.main > div.error > h1 {\n    text-decoration: underline\n}\n\ncode.inset > a {\n    color: #222;\n    text-decoration: none\n}\ncode.inset > a:hover {\n    color: blue;\n    text-decoration: underline\n}\n\np.tracking {\n    font-style: italic;\n    padding-left: 2em\n}\np.tracking.disabled {\n    text-decoration: line-through;\n    color: red\n}\n\nspan.lastupdate {\n    font-style: italic;\n    padding-left: 1em\n}\n\ndiv.enabletracking input {\n    font-family: monospace;\n    margin-top: 0.5em;\n    width: 100%\n}\n\ndiv.operation-performed h1 {\n    text-align: center;\n    font-size: 120%\n}\n"
  },
  {
    "path": "src/resources/showreview.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nfunction archiveBranch()\n{\n  function finished(result)\n  {\n    if (result)\n    {\n      var done = $(\"<div class=operation-performed title=Status>\" +\n                   \"<h1>Branch archived</h1>\" +\n                   \"</div>\");\n\n      done.dialog({\n        modal: true,\n        buttons: {\n          OK: function () {\n            done.dialog(\"close\");\n            location.reload();\n          }\n        }\n      });\n    }\n  }\n\n  var operation = new Operation({\n    action: \"archive branch\",\n    url: \"archivebranch\",\n    data: { review_id: review.id },\n    callback: finished\n  });\n\n  operation.execute();\n}\n\nfunction resurrectBranch()\n{\n  function finish(result)\n  {\n    if (result)\n    {\n      var scheduled_archival_note = \"\";\n\n      if (result.delay)\n      {\n        var delay = String(result.delay) + \" day\";\n        if (result.delay > 1)\n          delay += \"s\";\n        scheduled_archival_note = (\n          \"<p>Note that since the review is not open, its branch was \" +\n          \"scheduled to be archived again in \" + delay + \".</p>\");\n      }\n\n      var done = $(\"<div class=operation-performed title=Status>\" +\n                   \"<h1>Branch resurrected</h1>\" +\n                   scheduled_archival_note +\n                   
\"</div>\");\n\n      done.dialog({\n        width: scheduled_archival_note ? 600 : void 0,\n        modal: true,\n        buttons: {\n          OK: function () {\n            done.dialog(\"close\");\n            location.reload();\n          }\n        }\n      });\n    }\n  }\n\n  var operation = new Operation({\n    action: \"resurrect branch\",\n    url: \"resurrectbranch\",\n    data: { review_id: review.id },\n    callback: finish\n  });\n\n  operation.execute();\n}\n\nfunction shortDate(date)\n{\n  function pad(number, width)\n  {\n    var result = String(number);\n    while (result.length < width)\n      result = \"0\" + result;\n    return result;\n  }\n\n  return (pad(date.getFullYear(), 4) +\n          \"-\" + pad(date.getMonth() + 1, 2) +\n          \"-\" + pad(date.getDate(), 2) +\n          \" \" + pad(date.getHours(), 2) +\n          \":\" + pad(date.getMinutes(), 2));\n}\n\nfunction triggerUpdate(branch_id)\n{\n  var operation = new Operation({ action: \"trigger update\",\n                                  url: \"triggertrackedbranchupdate\",\n                                  data: { branch_id: branch_id }});\n\n  if (operation.execute())\n  {\n    var done = $(\"<div title='Status' style='text-align: center; padding-top: 2em'>Branch update triggered.</div>\");\n    done.dialog({ modal: true, buttons: { OK: function () { done.dialog(\"close\"); }}});\n  }\n}\n\nfunction enableTracking(branch_id, remote, current_remote_name)\n{\n  function finish()\n  {\n    var operation = new Operation({ action: \"enable tracking\",\n                                    url: \"enabletrackedbranch\",\n                                    data: { branch_id: branch_id,\n                                            new_remote_name: remote_name.val() }});\n\n    return Boolean(operation.execute());\n  }\n\n  var self = this;\n  var content = $(\"<div class='enabletracking' title='Enable Tracking'><p><b>Remote branch name:</b><br><input></p></div>\");\n  var remote_name = 
content.find(\"input\");\n\n  remote_name\n    .val(current_remote_name)\n    .autocomplete({ source: AutoCompleteRef(remote, \"refs/heads/\"), html: true });\n\n  var buttons = {\n    \"Enable Tracking\": function () { if (finish()) { content.dialog(\"close\"); location.reload(); } },\n    \"Cancel\": function () { content.dialog(\"close\"); }\n  };\n\n  content.dialog({ width: 400,\n                   buttons: buttons });\n}\n\nfunction disableTracking(branch_id)\n{\n  var operation = new Operation({ action: \"disable tracking\",\n                                  url: \"disabletrackedbranch\",\n                                  data: { branch_id: branch_id }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction watchReview()\n{\n  var operation = new Operation({ \"action\": \"watch review\",\n                                  \"url\": \"watchreview\",\n                                  \"data\": { \"review_id\": review.id,\n                                            \"subject_name\": user.name }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction unwatchReview()\n{\n  var operation = new Operation({ \"action\": \"unwatch review\",\n                                  \"url\": \"unwatchreview\",\n                                  \"data\": { \"review_id\": review.id,\n                                            \"subject_name\": user.name }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction filterPartialChanges()\n{\n  var content = $(\"<div title='Filter Partial Changes'>Please select the desired range of commits below using click-and-drag.</div>\");\n\n  function cancel()\n  {\n    content.dialog(\"close\");\n    overrideShowSquashedDiff = null;\n  }\n\n  content.dialog({ width: 800,\n                   position: \"top\",\n                   buttons: { Cancel: cancel },\n                   resizable: false });\n\n  overrideShowSquashedDiff = function (from_sha1, to_sha1)\n    {\n      
overrideShowSquashedDiff = null;\n      content.dialog(\"close\");\n\n      location.href = \"/filterchanges?review=\" + review.id + \"&first=\" + from_sha1 + \"&last=\" + to_sha1;\n    };\n}\n\nfunction updateFilters(filter_type)\n{\n  function addFilters(names, path)\n  {\n    var operation = new Operation({\n      action: \"add review filters\",\n      url: \"addreviewfilters\",\n      data: { review_id: review.id,\n              filters: [{ type: filter_type,\n                          user_names: names,\n                          paths: [path] }] }\n    });\n\n    return operation.execute() != null;\n  }\n\n  addReviewFiltersDialog({\n    filter_type: filter_type,\n    callback: addFilters,\n    reload_page: true\n  });\n}\n\nfunction addReviewer()\n{\n  updateFilters(\"reviewer\");\n}\n\nfunction addWatcher()\n{\n  updateFilters(\"watcher\");\n}\n\nfunction removeReviewFilter(filter_id, filter_user, filter_type, filter_path, confirm)\n{\n  function finish()\n  {\n    var operation = new Operation({ action: \"remove review filter\",\n                                    url: \"removereviewfilter\",\n                                    data: { filter_id: filter_id }});\n\n    if (operation.execute())\n    {\n      location.reload();\n      return true;\n    }\n    else\n      return false;\n  }\n\n  if (confirm)\n  {\n    var content = $(\"<div class='removefilter' title='Confirm'><p>Please confirm that you mean to remove the filter that makes</p><div class=user>\" + htmlify(filter_user) + \"</div><p>a \" + filter_type + \" of</p><div class=path>\" + filter_path + \"</div><p>An email will be sent to the user about the change and its effect on assignments.</p></div>\");\n\n    content.dialog({ width: 400,\n                     buttons: { \"Remove the filter\": function () { if (finish()) content.dialog(\"close\"); },\n                                \"Do nothing\": function () { content.dialog(\"close\"); } },\n                     modal: true });\n  }\n  else\n    
finish();\n}\n\nfunction applyFilters(what)\n{\n  var query_url, apply_url;\n\n  if (what == \"global\")\n  {\n    query_url = \"queryglobalfilters\";\n    apply_url = \"applyglobalfilters\";\n  }\n  else\n  {\n    query_url = \"queryparentfilters\";\n    apply_url = \"applyparentfilters\";\n  }\n\n  function openDialog(result)\n  {\n    if (!result)\n      return;\n\n    function proceed()\n    {\n      function finish(result)\n      {\n        if (result)\n        {\n          dialog.dialog(\"close\");\n          location.reload();\n        }\n      }\n\n      var operation = new Operation({ action: \"update review filters\",\n                                      url: apply_url,\n                                      data: { review_id: review.id },\n                                      wait: \"Applying filters ...\",\n                                      callback: finish });\n\n      operation.execute();\n    }\n\n    function cancel()\n    {\n      dialog.dialog(\"close\");\n    }\n\n    var html = \"<div title='Apply \" + what + \" filters'>\";\n    var buttons = {};\n\n    if (result.reviewers.length || result.watchers.length)\n    {\n      html += (\"<p>By applying \" + what + \" filters to this review, the \" +\n               \"following new reviewers and watchers would be added:</p>\");\n\n      if (result.reviewers.length)\n      {\n        html += \"<p>New reviewers:</p><ul>\";\n        result.reviewers.forEach(\n          function (user)\n          {\n            html += \"<li>\" + htmlify(user.displayName + \" <\" + user.email + \">\") + \"</li>\";\n          });\n        html += \"</ul>\";\n      }\n\n      if (result.watchers.length)\n      {\n        html += \"<p>New watchers:</p><ul>\";\n        result.watchers.forEach(\n          function (user)\n          {\n            html += \"<li>\" + htmlify(user.displayName + \" <\" + user.email + \">\") + \"</li>\";\n          });\n        html += \"</ul>\";\n      }\n\n      buttons[\"Apply Filters\"] 
= proceed;\n    }\n    else\n    {\n      html += (\"<p>Applying \" + what + \" filters to this review would \" +\n               \"not cause any immediate changes.  It may however affect \" +\n               \"what happens when adding additional changes to the review \" +\n               \"in the future.</p>\");\n    }\n\n    buttons[\"Cancel\"] = cancel;\n\n    html += \"</div>\";\n\n    var dialog = $(html);\n\n    dialog.dialog({ width: 400, modal: true, buttons: buttons });\n  }\n\n  var operation = new Operation({ action: \"query review filters\",\n                                  url: query_url,\n                                  data: { review_id: review.id },\n                                  wait: \"Listing new reviewers and watchers ...\",\n                                  callback: openDialog });\n\n  operation.execute();\n}\n\nfunction toggleReviewFilters(type, button)\n{\n  var table = $(\"table.reviewfilters.\" + type);\n  var tbody = table.find(\"tbody\");\n  var tfoot = table.find(\"tfoot\");\n\n  if (tbody.hasClass(\"hidden\"))\n  {\n    tbody.removeClass(\"hidden\");\n    tfoot.addClass(\"hidden\");\n    button.button(\"option\", \"label\", \"Hide Custom Filters\");\n  }\n  else\n  {\n    tbody.addClass(\"hidden\");\n    tfoot.removeClass(\"hidden\");\n    button.button(\"option\", \"label\", \"Show Custom Filters\");\n  }\n}\n\nfunction prepareRebase()\n{\n  var rebase_type_dialog;\n\n  function finish()\n  {\n    var inplace = rebase_type_dialog.find(\"input#inplace:checked\").size() != 0;\n\n    if (inplace)\n    {\n      var operation = new Operation({ action: \"prepare rebase\",\n                                      url: \"preparerebase\",\n                                      data: { review_id: review.id }});\n\n      if (operation.execute())\n      {\n        rebase_type_dialog.dialog(\"close\");\n\n        var finished =\n          $(\"<div title='Rebase Prepared!'>\"\n          +   \"<p>\"\n          +     \"You may now push the 
rebased branch, using \\\"git push -f\\\".  \"\n          +     \"Any attempt to push changes to this review by other users will \"\n          +     \"be rejected until you've completed the rebase, or aborted it.\"\n          +   \"</p>\"\n          +   \"<p>\"\n          +     \"<b>Note:</b> Remember that one commit on the rebased branch must \"\n          +     \"reference a tree that is identical to the one referenced by the \"\n          +     \"current head of the review branch.  If this is not the case, your \"\n          +     \"push will be rejected.\"\n          +   \"</p>\"\n          + \"</div>\");\n\n        finished.dialog({ width: 400,\n                          modal: true,\n                          close: function() { location.reload(); },\n                          buttons: { Close: function () { finished.dialog(\"close\"); }}\n                        });\n      }\n    }\n    else\n    {\n      rebase_type_dialog.dialog(\"close\");\n\n      var select_upstream_dialog =\n        $(\"<div class='specifyupstream' title='Specify New Upstream Commit'>\"\n        +   \"<p>\"\n        +     \"Unless you squashed the whole branch into a single commit, please specify \"\n        +     \"the new upstream commit onto which the review branch is rebased, either by \"\n        +     \"entering a SHA-1 sum or by selecting one of the suggested tags:\"\n        +   \"</p>\"\n        +   \"<p>\"\n        +     \"<label><input name='single' type='checkbox'>Branch squashed into a single commit.</label>\"\n        +   \"</p>\"\n        +   \"<p>\"\n        +     \"<b>SHA-1:</b><input name='sha1' size=40 spellcheck='false'>\"\n        +   \"</p>\"\n        +   \"<p>\"\n        +     \"<b>Tag:</b>\"\n        +     \"<select disabled>\"\n        +       \"<option value='none'>Fetching suggestions...</option>\"\n        +     \"</select>\"\n        +   \"</p>\"\n        + \"</div>\");\n\n      var select_upstream_dialog_closed = false;\n\n      function 
populateSuggestedUpstreams(result)\n      {\n        if (result)\n        {\n          var upstreams = result.upstreams.map(\n            function (tag)\n            {\n              return \"<option value='\" + htmlify(tag) + \"'>\" + htmlify(tag) + \"</option>\";\n            });\n\n          var select = select_upstream_dialog.find(\"select\").get(0);\n\n          if (upstreams.length != 0)\n          {\n            select.innerHTML = \"<option value='none'>Found \" + upstreams.length + \" likely upstreams:</option>\" + upstreams.join(\"\");\n            select.disabled = single.checked;\n          }\n          else\n            select.innerHTML = \"<option value='none'>(No likely upstreams found.)</option>\";\n        }\n      }\n\n      var fetch_upstreams = new Operation({ action: \"fetch suggested upstream commits\",\n                                            url: \"suggestupstreams\",\n                                            data: { review_id: review.id },\n                                            callback: populateSuggestedUpstreams });\n\n      fetch_upstreams.execute();\n\n      var single = select_upstream_dialog.find(\"input\").get(0);\n      var sha1 = select_upstream_dialog.find(\"input\").get(1);\n      var tag = select_upstream_dialog.find(\"select\").get(0);\n\n      single.onclick = function ()\n        {\n          sha1.disabled = single.checked;\n          tag.disabled = single.checked || tag.options.length == 1;\n        };\n\n      function finishMove()\n      {\n        var upstream;\n\n        if (single.checked)\n          upstream = \"0000000000000000000000000000000000000000\";\n        else if (tag.value != \"none\" && sha1.value != \"\")\n          alert(\"Ambiguous input! Please leave either SHA-1 or tag empty.\");\n        else if (tag.value == \"none\" && !/^[0-9a-f]{40}$/i.test(sha1.value))\n          alert(\"Invalid input! 
Please specify a full 40-character SHA-1 sum.\");\n        else if (sha1.value != \"\")\n          upstream = sha1.value;\n        else\n          upstream = tag.value;\n\n        if (typeof upstream == \"string\")\n        {\n          var operation = new Operation({ action: \"prepare rebase\",\n                                          url: \"preparerebase\",\n                                          data: { review_id: review.id,\n                                                  new_upstream: upstream }});\n\n          if (operation.execute())\n          {\n            select_upstream_dialog.dialog(\"close\");\n\n            var finished =\n              $(\"<div title='Rebase Prepared!'>\"\n              +   \"<p>\"\n              +     \"You may now push the rebased branch, using \\\"git push -f\\\".  \"\n              +     \"Any attempt to push changes to this review by other users will \"\n              +     \"be rejected until you've completed the rebase, or aborted it.\"\n              +   \"</p>\"\n              +   \"<p>\"\n              +     \"<b>Important:</b> Remember not to push any new changes to the \"\n              +     \"review with this push; such changes will be very difficult to \"\n              +     \"see or review.\"\n              +   \"</p>\"\n              + \"</div>\");\n\n            finished.dialog({ width: 400,\n                              modal: true,\n                              close: function() { location.reload(); },\n                              buttons: { Close: function () { finished.dialog(\"close\"); }}\n                            });\n          }\n        }\n      }\n\n      select_upstream_dialog.dialog({ width: 400,\n                                      modal: true,\n                                      buttons: { Continue: function () { finishMove(); },\n                                                 Cancel: function () { select_upstream_dialog.dialog(\"close\"); }},\n                                    
  close: function () { select_upstream_dialog_closed = true; }\n                                    });\n    }\n\n    return true;\n  }\n\n  function start(supports_move)\n  {\n    rebase_type_dialog =\n      $(\"<div title='Prepare Rebase'>\"\n      +   \"<p>Please select rebase type:</p>\"\n      +   \"<dl>\"\n      +     \"<dt><label><input id='inplace' type='radio' name='rebasetype' checked>History Rewrite / In-place</label></dt>\"\n      +     \"<dd>Rebase on top of the same upstream commit that only changes the history on the branch.</dd>\"\n      +     \"<dt><label><input id='move' type='radio' name='rebasetype'\" + (supports_move ? \"\" : \" disabled\") + \">New Upstream / Move</label></dt>\"\n      +     \"<dd>\" + (supports_move ? \"\" : \"<div class='notsupported'>[Not supported for this review!]</div>\") + \"Rebase on top of a different upstream commit.  Can also change the history on the branch in the process.</dd>\"\n      +   \"</dl>\"\n      + \"</div>\");\n\n    rebase_type_dialog.dialog({ width: 400,\n                                modal: true,\n                                buttons: { Continue: function () { finish(); },\n                                           Cancel: function () { rebase_type_dialog.dialog(\"close\"); }}\n                              });\n  }\n\n  var operation = new Operation({ action: \"check rebase possibility\",\n                                  url: \"checkrebase\",\n                                  data: { review_id: review.id }});\n\n  var result = operation.execute();\n\n  if (result)\n    start(result.available == \"both\");\n}\n\nfunction cancelRebase()\n{\n  var operation = new Operation({ action: \"cancel rebase\",\n                                  url: \"cancelrebase\",\n                                  data: { review_id: review.id }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction revertRebase(rebase_id)\n{\n  var confirm_dialog = $(\"<div title='Please Confirm'><p>Are you sure you 
want to revert the rebase?</p></div>\");\n\n  function finish()\n  {\n    var operation = new Operation({ action: \"revert rebase\",\n                                    url: \"revertrebase\",\n                                    data: { review_id: review.id,\n                                            rebase_id: rebase_id }});\n\n    if (operation.execute())\n    {\n      confirm_dialog.dialog(\"close\");\n      location.reload();\n    }\n  }\n\n  confirm_dialog.dialog({ width: 400,\n                          modal: true,\n                          buttons: { \"Revert Rebase\": function () { finish(); },\n                                     \"Do Nothing\": function () { confirm_dialog.dialog(\"close\"); }}\n                        });\n}\n\nfunction excludeRecipient(user_id)\n{\n  var operation = new Operation({ action: \"exclude recipient\",\n                                  url: \"addrecipientfilter\",\n                                  data: { review_id: review.id,\n                                          user_id: user_id,\n                                          include: false }});\n\n  if (operation.execute())\n    location.reload();\n}\n\nfunction includeRecipient(user_id)\n{\n  var operation = new Operation({ action: \"include recipient\",\n                                  url: \"addrecipientfilter\",\n                                  data: { review_id: review.id,\n                                          user_id: user_id,\n                                          include: true }});\n\n  if (operation.execute())\n    location.reload();\n}\n\n$(document).ready(function ()\n  {\n    $(\"button.archive\").click(archiveBranch);\n    $(\"button.resurrect\").click(resurrectBranch);\n\n    $(\"tr.commit td.summary\").each(function (index, element)\n      {\n        var users = $(element).attr(\"critic-reviewers\");\n        if (users)\n        {\n          users = users.split(\",\");\n\n          $(element).find(\"a.commit\").tooltip({\n            
items: 'a.commit',\n            content: function ()\n              {\n                var html = \"<div class='summary-tooltip'><div class='header'>Needs review from</div>\";\n\n                for (var index = 0; index < users.length; ++index)\n                {\n                  var match = /([^:]+):(current|absent|retired)/.exec(users[index]);\n                  var fullname = match[1];\n                  var status = match[2];\n                  if (status != \"retired\")\n                  {\n                    html += \"<div class='reviewer'>\" + htmlify(fullname);\n                    if (status == \"absent\")\n                      html += \"<span class='absent'> (absent)</span>\";\n                    html += \"</div>\";\n                  }\n                }\n\n                return $(html + \"</div>\");\n              },\n            track: true,\n            hide: false\n          });\n        }\n      });\n\n    $(\"td.straggler.no-email\").each(function (index, element)\n      {\n        $(element).tooltip({\n          items: 'td.straggler.no-email',\n          content: function ()\n            {\n              return $(\"<div class='no-email-tooltip'><strong>This user has not enabled the <u>email.activated</u> preference!</strong></div>\");\n            },\n          track: true,\n          hide: false\n        });\n      });\n\n    $(\"a[title]\").tooltip({ fade: 250 });\n\n    var reviewfilters = [];\n\n    $(\"table.shared button.accept\").click(function (ev)\n      {\n        var target = $(ev.currentTarget);\n        var paths = JSON.parse(target.attr(\"critic-paths\"));\n        var user_ids = JSON.parse(target.attr(\"critic-user-ids\"));\n\n        reviewfilters.push({ type: \"watcher\",\n                             user_ids: user_ids,\n                             paths: paths });\n\n        $(\"table.shared td.buttons > span\").css(\"display\", \"inline\");\n\n        
target.parents(\"td.buttons\").children(\"button\").css(\"visibility\", \"hidden\");\n        target.parents(\"tr.reviewers\").children(\"td.willreview\").css(\"text-decoration\", \"line-through\");\n      });\n\n    $(\"table.shared button.deny\").click(function (ev)\n      {\n        var target = $(ev.currentTarget);\n        var paths = JSON.parse(target.attr(\"critic-paths\"));\n\n        reviewfilters.push({ type: \"watcher\",\n                             user_ids: [user.id],\n                             paths: paths });\n\n        $(\"table.shared td.buttons > span\").css(\"display\", \"inline\");\n\n        target.parents(\"td.buttons\").children(\"button\").css(\"visibility\", \"hidden\");\n        target.parents(\"tr.reviewers\").find(\"td.willreview span.also\").css(\"text-decoration\", \"line-through\");\n      });\n\n    $(\"table.shared button.cancel\").click(function (ev)\n      {\n        location.reload();\n      });\n\n    $(\"table.shared button.confirm\").click(function (ev)\n      {\n        var operation = new Operation({ action: \"add review filters\",\n                                        url: \"addreviewfilters\",\n                                        data: { review_id: review.id,\n                                                filters: reviewfilters }});\n\n        if (operation.execute())\n        {\n          $(\"table.shared td.buttons > span\").css(\"display\", \"none\");\n          reviewfilters = [];\n          location.reload();\n        }\n      });\n\n    $(\"button.preparerebase\").click(prepareRebase);\n    $(\"button.cancelrebase\").click(cancelRebase);\n  });\n"
  },
  {
    "path": "src/resources/showreviewlog.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nbody { font-size: 12px; font-family: sans-serif }\n\ndiv.main > table td { vertical-align: top }\n\ndiv.main > table tr.reviewers td { padding-top: 0.5em }\ndiv.main > table tr.reviewers td.reviewers { font-weight: bold; text-align: right }\ndiv.main > table tr.reviewers td.willreview { text-align: center }\ndiv.main > table tr.reviewers td.files { font-family: monospace; }\ndiv.main > table tr.reviewers td.files span.file { white-space: pre }\ndiv.main > table tr.reviewers td.no-one { text-align: right; color: red; font-weight: bold }\n"
  },
  {
    "path": "src/resources/showtree.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main table.tree {\n    table-layout: fixed\n}\n\ndiv.main table.tree td {\n    padding-left: 1em\n}\n\ndiv.main table td.h1 h1 a {\n    text-decoration: none;\n    color: #222\n}\ndiv.main table td.h1 h1 a.root {\n    font-size: 80%\n}\ndiv.main table td.h1 h1 a:hover {\n    text-decoration: underline;\n    color: #222\n}\ndiv.main table td.h1 h1 span {\n    padding-left: 3px;\n    padding-right: 3px;\n}\n\ndiv.main table.tree thead td.mode,\ndiv.main table.tree thead td.name,\ndiv.main table.tree thead td.size {\n    padding-top: 1em;\n    font-weight: bold;\n    text-decoration: underline\n}\n\ndiv.main table.tree tbody td {\n    padding-top: 0.5em\n}\n\ndiv.main table.tree td.mode {\n    font-family: monospace\n}\ndiv.main table.tree td.name {\n    font-family: monospace\n}\ndiv.main table.tree td.name a {\n    text-decoration: none\n}\ndiv.main table.tree td.size {\n    text-align: right;\n    font-family: monospace\n}\n\ndiv.main table.tree tr.tree td.name {\n    font-weight: bold\n}\n\ndiv.main table.tree td.link {\n    font-family: monospace;\n    color: gray\n}\n\ndiv.main table.tree td.link span {\n    color: #222\n}\n"
  },
  {
    "path": "src/resources/statistics.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntr.line td {\n    padding: 3px 3px 3px 1em\n}\n\ntr.line td.user {\n    text-align: right;\n    font-weight: bold;\n    white-space: pre\n}\n\ntr.line td.value {\n    text-align: right;\n    white-space: pre\n}\n\ntr.line.self td.user,\ntr.line.self td.value {\n    border-bottom: 1px solid #222;\n    border-top: 1px solid #222;\n    background-color: #eec\n}\n\ntr.space td {\n    padding-top: 1em\n}\n\ndiv.main table td.right {\n    width: 90%;\n    text-align: left\n}\n"
  },
  {
    "path": "src/resources/syntax.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n/* C++ syntax highlighting is done using b (bold) elements, with the classes:\n\n   com => comment\n   pp  => pre-processing directive\n   kw  => keyword\n   op  => operator\n   id  => identifier\n   str => string literal\n   ch  => character literal\n   fp  => floating point literal\n   int => integer literal\n */\n\n/* We didn't really want bold: */\ntd.line b { font-weight: normal }\n\n/* Comments. */\ntd.line b.com {\n    font-style: italic;\n    color: darkblue\n}\n\n/* Preprocessing directives. */\ntd.line b.pp {\n    color: maroon\n}\n\n/* Keywords and operators. */\ntd.line b.kw,\ntd.line b.op {\n    font-weight: bold;\n    color: #222\n}\n\n/* Identifiers. */\ntd.line b.id {\n    color: #222\n}\n\n/* String and character literals. */\ntd.line b.str,\ntd.line b.ch {\n    color: blue\n}\n\n/* Numeric literals. */\ntd.line b.fp,\ntd.line b.int {\n    color: green\n}\n"
  },
  {
    "path": "src/resources/tabify.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.playground { white-space: pre }\nb.t:before { content: \"\\2192\" }\nb.t { color: #ccc }\nb.t.ill { color: red; font-weight: bold }\n"
  },
  {
    "path": "src/resources/tabify.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\nvar tabified = true, tab_width_calculated = false, tabify_style_added = false;\n\nfunction calculateTabWidth()\n{\n  if (!tabify_style_added)\n  {\n    $(\"head\").append('<style>' +\n                     '  div.playground { white-space: pre }' +\n                     '  b.t:before { content: \"\\\\2192\" }' +\n                     '  b.t { color: #ccc }' +\n                     '  b.t.ill { color: red; font-weight: bold }' +\n                     '</style>');\n\n    tabify_style_added = true;\n  }\n\n  if (!tab_width_calculated)\n  {\n    document.write(\"<div class='playground sourcefont'><span id='playground-space'> </span></div>\");\n\n    var playground = $(\".playground\");\n    var space_width = $(\"#playground-space\").width();\n\n    if (space_width != 0)\n    {\n      var stylesheet = \"\";\n\n      for (var tabwidth = 2; tabwidth <= 8; ++tabwidth)\n      {\n        // The arrow glyph already occupies one character cell; pad with the\n        // remaining (tabwidth - 1) cells so the tab spans 'tabwidth' columns.\n        var tab_width_extra = (tabwidth - 1) * space_width;\n        var tab_margin_before = (tab_width_extra / 2) << 0;\n        var tab_margin_after = tab_width_extra - tab_margin_before;\n\n        stylesheet += \"b.w\" + tabwidth + \" { padding-left: \" + tab_margin_before + \"px; padding-right: \" + tab_margin_after + \"px }\\n\";\n      }\n\n      $(\"head\").append(\"<style>\" + 
stylesheet + \"</style>\");\n\n      tab_width_calculated = true;\n    }\n\n    playground.remove();\n  }\n}\n"
  },
  {
    "path": "src/resources/third-party/chosen.css",
    "content": "/*!\nChosen, a Select Box Enhancer for jQuery and Prototype\nby Patrick Filler for Harvest, http://getharvest.com\n\nVersion 1.0.0\nFull source at https://github.com/harvesthq/chosen\nCopyright (c) 2011 Harvest http://getharvest.com\n\nMIT License, https://github.com/harvesthq/chosen/blob/master/LICENSE.md\nThis file is generated by `grunt build`, do not edit it by hand.\n*/\n\n/* @group Base */\n.chosen-container {\n  position: relative;\n  display: inline-block;\n  vertical-align: middle;\n  font-size: 13px;\n  zoom: 1;\n  *display: inline;\n  -webkit-user-select: none;\n  -moz-user-select: none;\n  user-select: none;\n}\n.chosen-container .chosen-drop {\n  position: absolute;\n  top: 100%;\n  left: -9999px;\n  z-index: 1010;\n  -webkit-box-sizing: border-box;\n  -moz-box-sizing: border-box;\n  box-sizing: border-box;\n  width: 100%;\n  border: 1px solid #aaa;\n  border-top: 0;\n  background: #fff;\n  box-shadow: 0 4px 5px rgba(0, 0, 0, 0.15);\n}\n.chosen-container.chosen-with-drop .chosen-drop {\n  left: 0;\n}\n.chosen-container a {\n  cursor: pointer;\n}\n\n/* @end */\n/* @group Single Chosen */\n.chosen-container-single .chosen-single {\n  position: relative;\n  display: block;\n  overflow: hidden;\n  padding: 0 0 0 8px;\n  height: 23px;\n  border: 1px solid #aaa;\n  border-radius: 5px;\n  background-color: #fff;\n  background: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(20%, #ffffff), color-stop(50%, #f6f6f6), color-stop(52%, #eeeeee), color-stop(100%, #f4f4f4));\n  background: -webkit-linear-gradient(top, #ffffff 20%, #f6f6f6 50%, #eeeeee 52%, #f4f4f4 100%);\n  background: -moz-linear-gradient(top, #ffffff 20%, #f6f6f6 50%, #eeeeee 52%, #f4f4f4 100%);\n  background: -o-linear-gradient(top, #ffffff 20%, #f6f6f6 50%, #eeeeee 52%, #f4f4f4 100%);\n  background: linear-gradient(top, #ffffff 20%, #f6f6f6 50%, #eeeeee 52%, #f4f4f4 100%);\n  background-clip: padding-box;\n  box-shadow: 0 0 3px white inset, 0 1px 1px rgba(0, 0, 0, 0.1);\n  
color: #444;\n  text-decoration: none;\n  white-space: nowrap;\n  line-height: 24px;\n}\n.chosen-container-single .chosen-default {\n  color: #999;\n}\n.chosen-container-single .chosen-single span {\n  display: block;\n  overflow: hidden;\n  margin-right: 26px;\n  text-overflow: ellipsis;\n  white-space: nowrap;\n}\n.chosen-container-single .chosen-single-with-deselect span {\n  margin-right: 50px;\n}\n.chosen-container-single .chosen-single abbr {\n  position: absolute;\n  top: 6px;\n  right: 26px;\n  display: block;\n  width: 12px;\n  height: 12px;\n  background: url('chosen-sprite.png') -42px 1px no-repeat;\n  font-size: 1px;\n}\n.chosen-container-single .chosen-single abbr:hover {\n  background-position: -42px -10px;\n}\n.chosen-container-single.chosen-disabled .chosen-single abbr:hover {\n  background-position: -42px -10px;\n}\n.chosen-container-single .chosen-single div {\n  position: absolute;\n  top: 0;\n  right: 0;\n  display: block;\n  width: 18px;\n  height: 100%;\n}\n.chosen-container-single .chosen-single div b {\n  display: block;\n  width: 100%;\n  height: 100%;\n  background: url('chosen-sprite.png') no-repeat 0px 2px;\n}\n.chosen-container-single .chosen-search {\n  position: relative;\n  z-index: 1010;\n  margin: 0;\n  padding: 3px 4px;\n  white-space: nowrap;\n}\n.chosen-container-single .chosen-search input[type=\"text\"] {\n  -webkit-box-sizing: border-box;\n  -moz-box-sizing: border-box;\n  box-sizing: border-box;\n  margin: 1px 0;\n  padding: 4px 20px 4px 5px;\n  width: 100%;\n  height: auto;\n  outline: 0;\n  border: 1px solid #aaa;\n  background: white url('chosen-sprite.png') no-repeat 100% -20px;\n  background: url('chosen-sprite.png') no-repeat 100% -20px, -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(1%, #eeeeee), color-stop(15%, #ffffff));\n  background: url('chosen-sprite.png') no-repeat 100% -20px, -webkit-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat 100% -20px, 
-moz-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat 100% -20px, -o-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat 100% -20px, linear-gradient(#eeeeee 1%, #ffffff 15%);\n  font-size: 1em;\n  font-family: sans-serif;\n  line-height: normal;\n  border-radius: 0;\n}\n.chosen-container-single .chosen-drop {\n  margin-top: -1px;\n  border-radius: 0 0 4px 4px;\n  background-clip: padding-box;\n}\n.chosen-container-single.chosen-container-single-nosearch .chosen-search {\n  position: absolute;\n  left: -9999px;\n}\n\n/* @end */\n/* @group Results */\n.chosen-container .chosen-results {\n  position: relative;\n  overflow-x: hidden;\n  overflow-y: auto;\n  margin: 0 4px 4px 0;\n  padding: 0 0 0 4px;\n  max-height: 240px;\n  -webkit-overflow-scrolling: touch;\n}\n.chosen-container .chosen-results li {\n  display: none;\n  margin: 0;\n  padding: 5px 6px;\n  list-style: none;\n  line-height: 15px;\n  -webkit-touch-callout: none;\n}\n.chosen-container .chosen-results li.active-result {\n  display: list-item;\n  cursor: pointer;\n}\n.chosen-container .chosen-results li.disabled-result {\n  display: list-item;\n  color: #ccc;\n  cursor: default;\n}\n.chosen-container .chosen-results li.highlighted {\n  background-color: #3875d7;\n  background-image: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(20%, #3875d7), color-stop(90%, #2a62bc));\n  background-image: -webkit-linear-gradient(#3875d7 20%, #2a62bc 90%);\n  background-image: -moz-linear-gradient(#3875d7 20%, #2a62bc 90%);\n  background-image: -o-linear-gradient(#3875d7 20%, #2a62bc 90%);\n  background-image: linear-gradient(#3875d7 20%, #2a62bc 90%);\n  color: #fff;\n}\n.chosen-container .chosen-results li.no-results {\n  display: list-item;\n  background: #f4f4f4;\n}\n.chosen-container .chosen-results li.group-result {\n  display: list-item;\n  font-weight: bold;\n  cursor: default;\n}\n.chosen-container .chosen-results 
li.group-option {\n  padding-left: 15px;\n}\n.chosen-container .chosen-results li em {\n  font-style: normal;\n  text-decoration: underline;\n}\n\n/* @end */\n/* @group Multi Chosen */\n.chosen-container-multi .chosen-choices {\n  position: relative;\n  overflow: hidden;\n  -webkit-box-sizing: border-box;\n  -moz-box-sizing: border-box;\n  box-sizing: border-box;\n  margin: 0;\n  padding: 0;\n  width: 100%;\n  height: auto !important;\n  height: 1%;\n  border: 1px solid #aaa;\n  background-color: #fff;\n  background-image: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(1%, #eeeeee), color-stop(15%, #ffffff));\n  background-image: -webkit-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background-image: -moz-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background-image: -o-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background-image: linear-gradient(#eeeeee 1%, #ffffff 15%);\n  cursor: text;\n}\n.chosen-container-multi .chosen-choices li {\n  float: left;\n  list-style: none;\n}\n.chosen-container-multi .chosen-choices li.search-field {\n  margin: 0;\n  padding: 0;\n  white-space: nowrap;\n}\n.chosen-container-multi .chosen-choices li.search-field input[type=\"text\"] {\n  margin: 1px 0;\n  padding: 5px;\n  height: 15px;\n  outline: 0;\n  border: 0 !important;\n  background: transparent !important;\n  box-shadow: none;\n  color: #666;\n  font-size: 100%;\n  font-family: sans-serif;\n  line-height: normal;\n  border-radius: 0;\n}\n.chosen-container-multi .chosen-choices li.search-field .default {\n  color: #999;\n}\n.chosen-container-multi .chosen-choices li.search-choice {\n  position: relative;\n  margin: 3px 0 3px 5px;\n  padding: 3px 20px 3px 5px;\n  border: 1px solid #aaa;\n  border-radius: 3px;\n  background-color: #e4e4e4;\n  background-image: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(20%, #f4f4f4), color-stop(50%, #f0f0f0), color-stop(52%, #e8e8e8), color-stop(100%, #eeeeee));\n  background-image: -webkit-linear-gradient(#f4f4f4 20%, #f0f0f0 
50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: -moz-linear-gradient(#f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: -o-linear-gradient(#f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: linear-gradient(#f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-clip: padding-box;\n  box-shadow: 0 0 2px white inset, 0 1px 0 rgba(0, 0, 0, 0.05);\n  color: #333;\n  line-height: 13px;\n  cursor: default;\n}\n.chosen-container-multi .chosen-choices li.search-choice .search-choice-close {\n  position: absolute;\n  top: 4px;\n  right: 3px;\n  display: block;\n  width: 12px;\n  height: 12px;\n  background: url('chosen-sprite.png') -42px 1px no-repeat;\n  font-size: 1px;\n}\n.chosen-container-multi .chosen-choices li.search-choice .search-choice-close:hover {\n  background-position: -42px -10px;\n}\n.chosen-container-multi .chosen-choices li.search-choice-disabled {\n  padding-right: 5px;\n  border: 1px solid #ccc;\n  background-color: #e4e4e4;\n  background-image: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(20%, #f4f4f4), color-stop(50%, #f0f0f0), color-stop(52%, #e8e8e8), color-stop(100%, #eeeeee));\n  background-image: -webkit-linear-gradient(top, #f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: -moz-linear-gradient(top, #f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: -o-linear-gradient(top, #f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  background-image: linear-gradient(top, #f4f4f4 20%, #f0f0f0 50%, #e8e8e8 52%, #eeeeee 100%);\n  color: #666;\n}\n.chosen-container-multi .chosen-choices li.search-choice-focus {\n  background: #d4d4d4;\n}\n.chosen-container-multi .chosen-choices li.search-choice-focus .search-choice-close {\n  background-position: -42px -10px;\n}\n.chosen-container-multi .chosen-results {\n  margin: 0;\n  padding: 0;\n}\n.chosen-container-multi .chosen-drop .result-selected {\n  display: list-item;\n  color: #ccc;\n  
cursor: default;\n}\n\n/* @end */\n/* @group Active  */\n.chosen-container-active .chosen-single {\n  border: 1px solid #5897fb;\n  box-shadow: 0 0 5px rgba(0, 0, 0, 0.3);\n}\n.chosen-container-active.chosen-with-drop .chosen-single {\n  border: 1px solid #aaa;\n  -moz-border-radius-bottomright: 0;\n  border-bottom-right-radius: 0;\n  -moz-border-radius-bottomleft: 0;\n  border-bottom-left-radius: 0;\n  background-image: -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(20%, #eeeeee), color-stop(80%, #ffffff));\n  background-image: -webkit-linear-gradient(#eeeeee 20%, #ffffff 80%);\n  background-image: -moz-linear-gradient(#eeeeee 20%, #ffffff 80%);\n  background-image: -o-linear-gradient(#eeeeee 20%, #ffffff 80%);\n  background-image: linear-gradient(#eeeeee 20%, #ffffff 80%);\n  box-shadow: 0 1px 0 #fff inset;\n}\n.chosen-container-active.chosen-with-drop .chosen-single div {\n  border-left: none;\n  background: transparent;\n}\n.chosen-container-active.chosen-with-drop .chosen-single div b {\n  background-position: -18px 2px;\n}\n.chosen-container-active .chosen-choices {\n  border: 1px solid #5897fb;\n  box-shadow: 0 0 5px rgba(0, 0, 0, 0.3);\n}\n.chosen-container-active .chosen-choices li.search-field input[type=\"text\"] {\n  color: #111 !important;\n}\n\n/* @end */\n/* @group Disabled Support */\n.chosen-disabled {\n  opacity: 0.5 !important;\n  cursor: default;\n}\n.chosen-disabled .chosen-single {\n  cursor: default;\n}\n.chosen-disabled .chosen-choices .search-choice .search-choice-close {\n  cursor: default;\n}\n\n/* @end */\n/* @group Right to Left */\n.chosen-rtl {\n  text-align: right;\n}\n.chosen-rtl .chosen-single {\n  overflow: visible;\n  padding: 0 8px 0 0;\n}\n.chosen-rtl .chosen-single span {\n  margin-right: 0;\n  margin-left: 26px;\n  direction: rtl;\n}\n.chosen-rtl .chosen-single-with-deselect span {\n  margin-left: 38px;\n}\n.chosen-rtl .chosen-single div {\n  right: auto;\n  left: 3px;\n}\n.chosen-rtl .chosen-single abbr {\n  right: 
auto;\n  left: 26px;\n}\n.chosen-rtl .chosen-choices li {\n  float: right;\n}\n.chosen-rtl .chosen-choices li.search-field input[type=\"text\"] {\n  direction: rtl;\n}\n.chosen-rtl .chosen-choices li.search-choice {\n  margin: 3px 5px 3px 0;\n  padding: 3px 5px 3px 19px;\n}\n.chosen-rtl .chosen-choices li.search-choice .search-choice-close {\n  right: auto;\n  left: 4px;\n}\n.chosen-rtl.chosen-container-single-nosearch .chosen-search,\n.chosen-rtl .chosen-drop {\n  left: 9999px;\n}\n.chosen-rtl.chosen-container-single .chosen-results {\n  margin: 0 0 4px 4px;\n  padding: 0 4px 0 0;\n}\n.chosen-rtl .chosen-results li.group-option {\n  padding-right: 15px;\n  padding-left: 0;\n}\n.chosen-rtl.chosen-container-active.chosen-with-drop .chosen-single div {\n  border-right: none;\n}\n.chosen-rtl .chosen-search input[type=\"text\"] {\n  padding: 4px 5px 4px 20px;\n  background: white url('chosen-sprite.png') no-repeat -30px -20px;\n  background: url('chosen-sprite.png') no-repeat -30px -20px, -webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(1%, #eeeeee), color-stop(15%, #ffffff));\n  background: url('chosen-sprite.png') no-repeat -30px -20px, -webkit-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat -30px -20px, -moz-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat -30px -20px, -o-linear-gradient(#eeeeee 1%, #ffffff 15%);\n  background: url('chosen-sprite.png') no-repeat -30px -20px, linear-gradient(#eeeeee 1%, #ffffff 15%);\n  direction: rtl;\n}\n.chosen-rtl.chosen-container-single .chosen-single div b {\n  background-position: 6px 2px;\n}\n.chosen-rtl.chosen-container-single.chosen-with-drop .chosen-single div b {\n  background-position: -12px 2px;\n}\n\n/* @end */\n/* @group Retina compatibility */\n@media only screen and (-webkit-min-device-pixel-ratio: 2), only screen and (min-resolution: 144dpi) {\n  .chosen-rtl .chosen-search input[type=\"text\"],\n  .chosen-container-single 
.chosen-single abbr,\n  .chosen-container-single .chosen-single div b,\n  .chosen-container-single .chosen-search input[type=\"text\"],\n  .chosen-container-multi .chosen-choices .search-choice .search-choice-close,\n  .chosen-container .chosen-results-scroll-down span,\n  .chosen-container .chosen-results-scroll-up span {\n    background-image: url('chosen-sprite@2x.png') !important;\n    background-size: 52px 37px !important;\n    background-repeat: no-repeat !important;\n  }\n}\n/* @end */\n"
  },
  {
    "path": "src/resources/third-party/chosen.js",
    "content": "/*!\nChosen, a Select Box Enhancer for jQuery and Prototype\nby Patrick Filler for Harvest, http://getharvest.com\n\nVersion 1.0.0\nFull source at https://github.com/harvesthq/chosen\nCopyright (c) 2011 Harvest http://getharvest.com\n\nMIT License, https://github.com/harvesthq/chosen/blob/master/LICENSE.md\nThis file is generated by `grunt build`, do not edit it by hand.\n*/\n\n(function() {\n  var $, AbstractChosen, Chosen, SelectParser, _ref,\n    __hasProp = {}.hasOwnProperty,\n    __extends = function(child, parent) { for (var key in parent) { if (__hasProp.call(parent, key)) child[key] = parent[key]; } function ctor() { this.constructor = child; } ctor.prototype = parent.prototype; child.prototype = new ctor(); child.__super__ = parent.prototype; return child; };\n\n  SelectParser = (function() {\n    function SelectParser() {\n      this.options_index = 0;\n      this.parsed = [];\n    }\n\n    SelectParser.prototype.add_node = function(child) {\n      if (child.nodeName.toUpperCase() === \"OPTGROUP\") {\n        return this.add_group(child);\n      } else {\n        return this.add_option(child);\n      }\n    };\n\n    SelectParser.prototype.add_group = function(group) {\n      var group_position, option, _i, _len, _ref, _results;\n      group_position = this.parsed.length;\n      this.parsed.push({\n        array_index: group_position,\n        group: true,\n        label: this.escapeExpression(group.label),\n        children: 0,\n        disabled: group.disabled\n      });\n      _ref = group.childNodes;\n      _results = [];\n      for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n        option = _ref[_i];\n        _results.push(this.add_option(option, group_position, group.disabled));\n      }\n      return _results;\n    };\n\n    SelectParser.prototype.add_option = function(option, group_position, group_disabled) {\n      if (option.nodeName.toUpperCase() === \"OPTION\") {\n        if (option.text !== \"\") {\n          if 
(group_position != null) {\n            this.parsed[group_position].children += 1;\n          }\n          this.parsed.push({\n            array_index: this.parsed.length,\n            options_index: this.options_index,\n            value: option.value,\n            text: option.getAttribute(\"data-text\") || option.text,\n            html: option.getAttribute(\"data-html\") || option.innerHTML,\n            selected: option.selected,\n            disabled: group_disabled === true ? group_disabled : option.disabled,\n            group_array_index: group_position,\n            classes: option.className,\n            style: option.style.cssText\n          });\n        } else {\n          this.parsed.push({\n            array_index: this.parsed.length,\n            options_index: this.options_index,\n            empty: true\n          });\n        }\n        return this.options_index += 1;\n      }\n    };\n\n    SelectParser.prototype.escapeExpression = function(text) {\n      var map, unsafe_chars;\n      if ((text == null) || text === false) {\n        return \"\";\n      }\n      if (!/[\\&\\<\\>\\\"\\'\\`]/.test(text)) {\n        return text;\n      }\n      map = {\n        \"<\": \"&lt;\",\n        \">\": \"&gt;\",\n        '\"': \"&quot;\",\n        \"'\": \"&#x27;\",\n        \"`\": \"&#x60;\"\n      };\n      unsafe_chars = /&(?!\\w+;)|[\\<\\>\\\"\\'\\`]/g;\n      return text.replace(unsafe_chars, function(chr) {\n        return map[chr] || \"&amp;\";\n      });\n    };\n\n    return SelectParser;\n\n  })();\n\n  SelectParser.select_to_array = function(select) {\n    var child, parser, _i, _len, _ref;\n    parser = new SelectParser();\n    _ref = select.childNodes;\n    for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n      child = _ref[_i];\n      parser.add_node(child);\n    }\n    return parser.parsed;\n  };\n\n  AbstractChosen = (function() {\n    function AbstractChosen(form_field, options) {\n      this.form_field = form_field;\n      this.options 
= options != null ? options : {};\n      if (!AbstractChosen.browser_is_supported()) {\n        return;\n      }\n      this.is_multiple = this.form_field.multiple;\n      this.set_default_text();\n      this.set_default_values();\n      this.setup();\n      this.set_up_html();\n      this.register_observers();\n    }\n\n    AbstractChosen.prototype.set_default_values = function() {\n      var _this = this;\n      this.click_test_action = function(evt) {\n        return _this.test_active_click(evt);\n      };\n      this.activate_action = function(evt) {\n        return _this.activate_field(evt);\n      };\n      this.active_field = false;\n      this.mouse_on_container = false;\n      this.results_showing = false;\n      this.result_highlighted = null;\n      this.allow_single_deselect = (this.options.allow_single_deselect != null) && (this.form_field.options[0] != null) && this.form_field.options[0].text === \"\" ? this.options.allow_single_deselect : false;\n      this.disable_search_threshold = this.options.disable_search_threshold || 0;\n      this.disable_search = this.options.disable_search || false;\n      this.enable_split_word_search = this.options.enable_split_word_search != null ? this.options.enable_split_word_search : true;\n      this.group_search = this.options.group_search != null ? this.options.group_search : true;\n      this.search_contains = this.options.search_contains || false;\n      this.single_backstroke_delete = this.options.single_backstroke_delete != null ? this.options.single_backstroke_delete : true;\n      this.max_selected_options = this.options.max_selected_options || Infinity;\n      this.inherit_select_classes = this.options.inherit_select_classes || false;\n      this.display_selected_options = this.options.display_selected_options != null ? this.options.display_selected_options : true;\n      return this.display_disabled_options = this.options.display_disabled_options != null ? 
this.options.display_disabled_options : true;\n    };\n\n    AbstractChosen.prototype.set_default_text = function() {\n      if (this.form_field.getAttribute(\"data-placeholder\")) {\n        this.default_text = this.form_field.getAttribute(\"data-placeholder\");\n      } else if (this.is_multiple) {\n        this.default_text = this.options.placeholder_text_multiple || this.options.placeholder_text || AbstractChosen.default_multiple_text;\n      } else {\n        this.default_text = this.options.placeholder_text_single || this.options.placeholder_text || AbstractChosen.default_single_text;\n      }\n      return this.results_none_found = this.form_field.getAttribute(\"data-no_results_text\") || this.options.no_results_text || AbstractChosen.default_no_result_text;\n    };\n\n    AbstractChosen.prototype.mouse_enter = function() {\n      return this.mouse_on_container = true;\n    };\n\n    AbstractChosen.prototype.mouse_leave = function() {\n      return this.mouse_on_container = false;\n    };\n\n    AbstractChosen.prototype.input_focus = function(evt) {\n      var _this = this;\n      if (this.is_multiple) {\n        if (!this.active_field) {\n          return setTimeout((function() {\n            return _this.container_mousedown();\n          }), 50);\n        }\n      } else {\n        if (!this.active_field) {\n          return this.activate_field();\n        }\n      }\n    };\n\n    AbstractChosen.prototype.input_blur = function(evt) {\n      var _this = this;\n      if (!this.mouse_on_container) {\n        this.active_field = false;\n        return setTimeout((function() {\n          return _this.blur_test();\n        }), 100);\n      }\n    };\n\n    AbstractChosen.prototype.results_option_build = function(options) {\n      var content, data, _i, _len, _ref;\n      content = '';\n      _ref = this.results_data;\n      for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n        data = _ref[_i];\n        if (data.group) {\n          content += 
this.result_add_group(data);\n        } else {\n          content += this.result_add_option(data);\n        }\n        if (options != null ? options.first : void 0) {\n          if (data.selected && this.is_multiple) {\n            this.choice_build(data);\n          } else if (data.selected && !this.is_multiple) {\n            this.single_set_selected_text(this.selected_value(data));\n          }\n        }\n      }\n      return content;\n    };\n\n    AbstractChosen.prototype.result_add_option = function(option) {\n      var classes, option_el;\n      if (!option.search_match) {\n        return '';\n      }\n      if (!this.include_option_in_results(option)) {\n        return '';\n      }\n      classes = [];\n      if (!option.disabled && !(option.selected && this.is_multiple)) {\n        classes.push(\"active-result\");\n      }\n      if (option.disabled && !(option.selected && this.is_multiple)) {\n        classes.push(\"disabled-result\");\n      }\n      if (option.selected) {\n        classes.push(\"result-selected\");\n      }\n      if (option.group_array_index != null) {\n        classes.push(\"group-option\");\n      }\n      if (option.classes !== \"\") {\n        classes.push(option.classes);\n      }\n      option_el = document.createElement(\"li\");\n      option_el.className = classes.join(\" \");\n      option_el.style.cssText = option.style;\n      option_el.setAttribute(\"data-option-array-index\", option.array_index);\n      option_el.innerHTML = this.get_search_text() ? 
option.search_text : option.html;\n      return this.outerHTML(option_el);\n    };\n\n    AbstractChosen.prototype.result_add_group = function(group) {\n      var group_el;\n      if (!(group.search_match || group.group_match)) {\n        return '';\n      }\n      if (!(group.active_options > 0)) {\n        return '';\n      }\n      group_el = document.createElement(\"li\");\n      group_el.className = \"group-result\";\n      group_el.innerHTML = group.search_text;\n      return this.outerHTML(group_el);\n    };\n\n    AbstractChosen.prototype.results_update_field = function() {\n      this.set_default_text();\n      if (!this.is_multiple) {\n        this.results_reset_cleanup();\n      }\n      this.result_clear_highlight();\n      this.results_build();\n      if (this.results_showing) {\n        return this.winnow_results();\n      }\n    };\n\n    AbstractChosen.prototype.reset_single_select_options = function() {\n      var result, _i, _len, _ref, _results;\n      _ref = this.results_data;\n      _results = [];\n      for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n        result = _ref[_i];\n        if (result.selected) {\n          _results.push(result.selected = false);\n        } else {\n          _results.push(void 0);\n        }\n      }\n      return _results;\n    };\n\n    AbstractChosen.prototype.results_toggle = function() {\n      if (this.results_showing) {\n        return this.results_hide();\n      } else {\n        return this.results_show();\n      }\n    };\n\n    AbstractChosen.prototype.results_search = function(evt) {\n      if (this.results_showing) {\n        return this.winnow_results();\n      } else {\n        return this.results_show();\n      }\n    };\n\n    AbstractChosen.prototype.winnow_results = function() {\n      var escapedSearchText, option, regex, regexAnchor, results, results_group, searchText, startpos, text, zregex, _i, _len, _ref;\n      this.no_results_clear();\n      results = 0;\n      searchText = 
this.get_search_text();\n      escapedSearchText = searchText.replace(/[-[\\]{}()*+?.,\\\\^$|#\\s]/g, \"\\\\$&\");\n      regexAnchor = this.search_contains ? \"\" : \"^\";\n      regex = new RegExp(regexAnchor + escapedSearchText, 'i');\n      zregex = new RegExp(escapedSearchText, 'i');\n      _ref = this.results_data;\n      for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n        option = _ref[_i];\n        option.search_match = false;\n        results_group = null;\n        if (this.include_option_in_results(option)) {\n          if (option.group) {\n            option.group_match = false;\n            option.active_options = 0;\n          }\n          if ((option.group_array_index != null) && this.results_data[option.group_array_index]) {\n            results_group = this.results_data[option.group_array_index];\n            if (results_group.active_options === 0 && results_group.search_match) {\n              results += 1;\n            }\n            results_group.active_options += 1;\n          }\n          if (!(option.group && !this.group_search)) {\n            option.search_text = option.group ? 
option.label : option.text;\n            option.search_match = this.search_string_match(option.search_text, regex);\n            if (option.search_match && !option.group) {\n              results += 1;\n            }\n            if (option.search_match) {\n              if (searchText.length) {\n                startpos = option.search_text.search(zregex);\n                text = option.search_text.substr(0, startpos + searchText.length) + '</em>' + option.search_text.substr(startpos + searchText.length);\n                option.search_text = text.substr(0, startpos) + '<em>' + text.substr(startpos);\n              }\n              if (results_group != null) {\n                results_group.group_match = true;\n              }\n            } else if ((option.group_array_index != null) && this.results_data[option.group_array_index].search_match) {\n              option.search_match = true;\n            }\n          }\n        }\n      }\n      this.result_clear_highlight();\n      if (results < 1 && searchText.length) {\n        this.update_results_content(\"\");\n        return this.no_results(searchText);\n      } else {\n        this.update_results_content(this.results_option_build());\n        return this.winnow_results_set_highlight();\n      }\n    };\n\n    AbstractChosen.prototype.search_string_match = function(search_string, regex) {\n      var part, parts, _i, _len;\n      if (regex.test(search_string)) {\n        return true;\n      } else if (this.enable_split_word_search && (search_string.indexOf(\" \") >= 0 || search_string.indexOf(\"[\") === 0)) {\n        parts = search_string.replace(/\\[|\\]/g, \"\").split(\" \");\n        if (parts.length) {\n          for (_i = 0, _len = parts.length; _i < _len; _i++) {\n            part = parts[_i];\n            if (regex.test(part)) {\n              return true;\n            }\n          }\n        }\n      }\n    };\n\n    AbstractChosen.prototype.choices_count = function() {\n      var option, _i, _len, 
_ref;\n      if (this.selected_option_count != null) {\n        return this.selected_option_count;\n      }\n      this.selected_option_count = 0;\n      _ref = this.form_field.options;\n      for (_i = 0, _len = _ref.length; _i < _len; _i++) {\n        option = _ref[_i];\n        if (option.selected) {\n          this.selected_option_count += 1;\n        }\n      }\n      return this.selected_option_count;\n    };\n\n    AbstractChosen.prototype.choices_click = function(evt) {\n      evt.preventDefault();\n      if (!(this.results_showing || this.is_disabled)) {\n        return this.results_show();\n      }\n    };\n\n    AbstractChosen.prototype.keyup_checker = function(evt) {\n      var stroke, _ref;\n      stroke = (_ref = evt.which) != null ? _ref : evt.keyCode;\n      this.search_field_scale();\n      switch (stroke) {\n        case 8:\n          if (this.is_multiple && this.backstroke_length < 1 && this.choices_count() > 0) {\n            return this.keydown_backstroke();\n          } else if (!this.pending_backstroke) {\n            this.result_clear_highlight();\n            return this.results_search();\n          }\n          break;\n        case 13:\n          evt.preventDefault();\n          if (this.results_showing) {\n            return this.result_select(evt);\n          }\n          break;\n        case 27:\n          if (this.results_showing) {\n            this.results_hide();\n          }\n          return true;\n        case 9:\n        case 38:\n        case 40:\n        case 16:\n        case 91:\n        case 17:\n          break;\n        default:\n          return this.results_search();\n      }\n    };\n\n    AbstractChosen.prototype.container_width = function(mode, initial) {\n      if (mode === \"collapsed\" && this.options.collapsed_width) {\n        return this.options.collapsed_width;\n      } else if (mode === \"expanded\" && this.options.expanded_width) {\n        return this.options.expanded_width;\n      } else if (initial) {\n   
     if (this.options.width != null) {\n          return this.options.width;\n        } else {\n          return \"\" + this.form_field.offsetWidth + \"px\";\n        }\n      }\n    };\n\n    AbstractChosen.prototype.include_option_in_results = function(option) {\n      if (this.is_multiple && (!this.display_selected_options && option.selected)) {\n        return false;\n      }\n      if (!this.display_disabled_options && option.disabled) {\n        return false;\n      }\n      if (option.empty) {\n        return false;\n      }\n      return true;\n    };\n\n    AbstractChosen.prototype.search_results_touchstart = function(evt) {\n      this.touch_started = true;\n      return this.search_results_mouseover(evt);\n    };\n\n    AbstractChosen.prototype.search_results_touchmove = function(evt) {\n      this.touch_started = false;\n      return this.search_results_mouseout(evt);\n    };\n\n    AbstractChosen.prototype.search_results_touchend = function(evt) {\n      if (this.touch_started) {\n        return this.search_results_mouseup(evt);\n      }\n    };\n\n    AbstractChosen.prototype.outerHTML = function(element) {\n      var tmp;\n      if (element.outerHTML) {\n        return element.outerHTML;\n      }\n      tmp = document.createElement(\"div\");\n      tmp.appendChild(element);\n      return tmp.innerHTML;\n    };\n\n    AbstractChosen.browser_is_supported = function() {\n      if (window.navigator.appName === \"Microsoft Internet Explorer\") {\n        return document.documentMode >= 8;\n      }\n      if (/iP(od|hone)/i.test(window.navigator.userAgent)) {\n        return false;\n      }\n      if (/Android/i.test(window.navigator.userAgent)) {\n        if (/Mobile/i.test(window.navigator.userAgent)) {\n          return false;\n        }\n      }\n      return true;\n    };\n\n    AbstractChosen.default_multiple_text = \"Select Some Options\";\n\n    AbstractChosen.default_single_text = \"Select an Option\";\n\n    AbstractChosen.default_no_result_text 
= \"No results match\";\n\n    return AbstractChosen;\n\n  })();\n\n  $ = jQuery;\n\n  $.fn.extend({\n    chosen: function(options) {\n      if (!AbstractChosen.browser_is_supported()) {\n        return this;\n      }\n      return this.each(function(input_field) {\n        var $this, chosen;\n        $this = $(this);\n        chosen = $this.data('chosen');\n        if (options === 'destroy' && chosen) {\n          chosen.destroy();\n        } else if (!chosen) {\n          $this.data('chosen', new Chosen(this, options));\n        }\n      });\n    }\n  });\n\n  Chosen = (function(_super) {\n    __extends(Chosen, _super);\n\n    function Chosen() {\n      _ref = Chosen.__super__.constructor.apply(this, arguments);\n      return _ref;\n    }\n\n    Chosen.prototype.setup = function() {\n      this.form_field_jq = $(this.form_field);\n      this.current_selectedIndex = this.form_field.selectedIndex;\n      return this.is_rtl = this.form_field_jq.hasClass(\"chosen-rtl\");\n    };\n\n    Chosen.prototype.set_up_html = function() {\n      var container_classes, container_props;\n      container_classes = [\"chosen-container\"];\n      container_classes.push(\"chosen-container-\" + (this.is_multiple ? 
\"multi\" : \"single\"));\n      if (this.inherit_select_classes && this.form_field.className) {\n        container_classes.push(this.form_field.className);\n      }\n      if (this.is_rtl) {\n        container_classes.push(\"chosen-rtl\");\n      }\n      container_props = {\n        'class': container_classes.join(' '),\n        'title': this.form_field.title\n      };\n      if (this.form_field.id.length) {\n        container_props.id = this.form_field.id.replace(/[^\\w]/g, '_') + \"_chosen\";\n      }\n      this.main_container = $(\"<div />\", container_props);\n      this.drop_container = $(\"<div />\", container_props);\n      this.container = $([this.main_container.get(0), this.drop_container.get(0)]);\n      this.container.css({\n        width: this.container_width(\"collapsed\", true)\n      });\n      if (this.is_multiple) {\n        this.main_container.html('<ul class=\"chosen-choices\"><li class=\"search-field\"><input type=\"text\" value=\"' + this.default_text + '\" class=\"default\" autocomplete=\"off\" style=\"width:25px;\" /></li></ul>');\n        this.drop_container.html('<div class=\"chosen-drop\"><ul class=\"chosen-results\"></ul></div>');\n      } else {\n        this.main_container.html('<a class=\"chosen-single chosen-default\" tabindex=\"-1\"><span>' + this.default_text + '</span><div><b></b></div></a>');\n        this.drop_container.html('<div class=\"chosen-drop\"><div class=\"chosen-search\"><input type=\"text\" autocomplete=\"off\" /></div><ul class=\"chosen-results\"></ul></div>');\n      }\n      this.form_field_jq.hide().after(this.main_container);\n      this.dropdown = this.drop_container.first();\n      this.search_field = this.container.find('input').first();\n      this.search_results = this.drop_container.find('ul.chosen-results').first();\n      this.search_field_scale();\n      this.search_no_results = this.container.find('li.no-results').first();\n      if (this.is_multiple) {\n        this.search_choices = 
this.container.find('ul.chosen-choices').first();\n        this.search_container = this.container.find('li.search-field').first();\n      } else {\n        this.search_container = this.container.find('div.chosen-search').first();\n        this.selected_item = this.container.find('.chosen-single').first();\n      }\n      this.drop_container.css({\n        position: \"absolute\",\n        left: \"-9999px\",\n        top: 0\n      });\n      $(\"body\").append(this.drop_container);\n      this.results_build();\n      this.set_tab_index();\n      this.set_label_behavior();\n      return this.form_field_jq.trigger(\"chosen:ready\", {\n        chosen: this\n      });\n    };\n\n    Chosen.prototype.register_observers = function() {\n      var _this = this;\n      this.container.bind('mousedown.chosen', function(evt) {\n        _this.container_mousedown(evt);\n      });\n      this.container.bind('mouseup.chosen', function(evt) {\n        _this.container_mouseup(evt);\n      });\n      this.container.bind('mouseenter.chosen', function(evt) {\n        _this.mouse_enter(evt);\n      });\n      this.container.bind('mouseleave.chosen', function(evt) {\n        _this.mouse_leave(evt);\n      });\n      this.search_results.bind('mouseup.chosen', function(evt) {\n        _this.search_results_mouseup(evt);\n      });\n      this.search_results.bind('mouseover.chosen', function(evt) {\n        _this.search_results_mouseover(evt);\n      });\n      this.search_results.bind('mouseout.chosen', function(evt) {\n        _this.search_results_mouseout(evt);\n      });\n      this.search_results.bind('mousewheel.chosen DOMMouseScroll.chosen', function(evt) {\n        _this.search_results_mousewheel(evt);\n      });\n      this.search_results.bind('touchstart.chosen', function(evt) {\n        _this.search_results_touchstart(evt);\n      });\n      this.search_results.bind('touchmove.chosen', function(evt) {\n        _this.search_results_touchmove(evt);\n      });\n      
this.search_results.bind('touchend.chosen', function(evt) {\n        _this.search_results_touchend(evt);\n      });\n      this.form_field_jq.bind(\"chosen:updated.chosen\", function(evt) {\n        _this.results_update_field(evt);\n      });\n      this.form_field_jq.bind(\"chosen:activate.chosen\", function(evt) {\n        _this.activate_field(evt);\n      });\n      this.form_field_jq.bind(\"chosen:open.chosen\", function(evt) {\n        _this.container_mousedown(evt);\n      });\n      this.form_field_jq.bind(\"chosen:close.chosen\", function(evt) {\n        _this.input_blur(evt);\n      });\n      this.search_field.bind('blur.chosen', function(evt) {\n        _this.input_blur(evt);\n      });\n      this.search_field.bind('keyup.chosen', function(evt) {\n        _this.keyup_checker(evt);\n      });\n      this.search_field.bind('keydown.chosen', function(evt) {\n        _this.keydown_checker(evt);\n      });\n      this.search_field.bind('focus.chosen', function(evt) {\n        _this.input_focus(evt);\n      });\n      if (this.is_multiple) {\n        return this.search_choices.bind('click.chosen', function(evt) {\n          _this.choices_click(evt);\n        });\n      } else {\n        return this.container.bind('click.chosen', function(evt) {\n          evt.preventDefault();\n        });\n      }\n    };\n\n    Chosen.prototype.destroy = function() {\n      $(document).unbind(\"click.chosen\", this.click_test_action);\n      if (this.search_field[0].tabIndex) {\n        this.form_field_jq[0].tabIndex = this.search_field[0].tabIndex;\n      }\n      this.container.remove();\n      this.form_field_jq.removeData('chosen');\n      return this.form_field_jq.show();\n    };\n\n    Chosen.prototype.search_field_disabled = function() {\n      this.is_disabled = this.form_field_jq[0].disabled;\n      if (this.is_disabled) {\n        this.container.addClass('chosen-disabled');\n        this.search_field[0].disabled = true;\n        if (!this.is_multiple) {\n          
this.selected_item.unbind(\"focus.chosen\", this.activate_action);\n        }\n        return this.close_field();\n      } else {\n        this.container.removeClass('chosen-disabled');\n        this.search_field[0].disabled = false;\n        if (!this.is_multiple) {\n          return this.selected_item.bind(\"focus.chosen\", this.activate_action);\n        }\n      }\n    };\n\n    Chosen.prototype.container_mousedown = function(evt) {\n      if (!this.is_disabled) {\n        if (evt && evt.type === \"mousedown\" && !this.results_showing) {\n          evt.preventDefault();\n        }\n        if (!((evt != null) && ($(evt.target)).hasClass(\"search-choice-close\"))) {\n          if (!this.active_field) {\n            if (this.is_multiple) {\n              this.search_field.val(\"\");\n            }\n            $(document).bind('click.chosen', this.click_test_action);\n            this.results_show();\n          } else if (!this.is_multiple && evt && (($(evt.target)[0] === this.selected_item[0]) || $(evt.target).parents(\"a.chosen-single\").length)) {\n            evt.preventDefault();\n            this.results_toggle();\n          }\n          return this.activate_field();\n        }\n      }\n    };\n\n    Chosen.prototype.container_mouseup = function(evt) {\n      if (evt.target.nodeName === \"ABBR\" && !this.is_disabled) {\n        return this.results_reset(evt);\n      }\n    };\n\n    Chosen.prototype.search_results_mousewheel = function(evt) {\n      var delta;\n      if (evt.originalEvent) {\n        delta = -evt.originalEvent.wheelDelta || evt.originalEvent.detail;\n      }\n      if (delta != null) {\n        evt.preventDefault();\n        if (evt.type === 'DOMMouseScroll') {\n          delta = delta * 40;\n        }\n        return this.search_results.scrollTop(delta + this.search_results.scrollTop());\n      }\n    };\n\n    Chosen.prototype.blur_test = function(evt) {\n      if (!this.active_field && 
this.container.hasClass(\"chosen-container-active\")) {\n        return this.close_field();\n      }\n    };\n\n    Chosen.prototype.close_field = function() {\n      $(document).unbind(\"click.chosen\", this.click_test_action);\n      this.active_field = false;\n      this.results_hide();\n      this.container.removeClass(\"chosen-container-active\");\n      this.clear_backstroke();\n      this.show_search_field_default();\n      return this.search_field_scale();\n    };\n\n    Chosen.prototype.activate_field = function() {\n      this.container.addClass(\"chosen-container-active\");\n      this.active_field = true;\n      this.search_field.val(this.search_field.val());\n      return this.search_field.focus();\n    };\n\n    Chosen.prototype.test_active_click = function(evt) {\n      var active_container;\n      active_container = $(evt.target).closest('.chosen-container');\n      if (active_container.length && this.container[0] === active_container[0]) {\n        return this.active_field = true;\n      } else {\n        return this.close_field();\n      }\n    };\n\n    Chosen.prototype.results_build = function() {\n      this.parsing = true;\n      this.selected_option_count = null;\n      this.results_data = SelectParser.select_to_array(this.form_field);\n      if (this.is_multiple) {\n        this.search_choices.find(\"li.search-choice\").remove();\n      } else if (!this.is_multiple) {\n        this.single_set_selected_text(null, true);\n        if (this.disable_search || this.form_field.options.length <= this.disable_search_threshold) {\n          this.search_field[0].readOnly = true;\n          this.container.addClass(\"chosen-container-single-nosearch\");\n        } else {\n          this.search_field[0].readOnly = false;\n          this.container.removeClass(\"chosen-container-single-nosearch\");\n        }\n      }\n      this.update_results_content(this.results_option_build({\n        first: true\n      }));\n      this.search_field_disabled();\n      
this.show_search_field_default();\n      this.search_field_scale();\n      return this.parsing = false;\n    };\n\n    Chosen.prototype.result_do_highlight = function(el) {\n      var high_bottom, high_top, maxHeight, visible_bottom, visible_top;\n      if (el.length) {\n        this.result_clear_highlight();\n        this.result_highlight = el;\n        this.result_highlight.addClass(\"highlighted\");\n        maxHeight = parseInt(this.search_results.css(\"maxHeight\"), 10);\n        visible_top = this.search_results.scrollTop();\n        visible_bottom = maxHeight + visible_top;\n        high_top = this.result_highlight.position().top + this.search_results.scrollTop();\n        high_bottom = high_top + this.result_highlight.outerHeight();\n        if (high_bottom >= visible_bottom) {\n          return this.search_results.scrollTop((high_bottom - maxHeight) > 0 ? high_bottom - maxHeight : 0);\n        } else if (high_top < visible_top) {\n          return this.search_results.scrollTop(high_top);\n        }\n      }\n    };\n\n    Chosen.prototype.result_clear_highlight = function() {\n      if (this.result_highlight) {\n        this.result_highlight.removeClass(\"highlighted\");\n      }\n      return this.result_highlight = null;\n    };\n\n    Chosen.prototype.results_realign = function() {\n      return this.drop_container.css({\n        position: \"absolute\",\n        left: this.main_container.offset().left,\n        top: this.main_container.offset().top + this.main_container.height(),\n        width: this.main_container.width()\n      });\n    };\n\n    Chosen.prototype.results_show = function() {\n      var new_width, self,\n        _this = this;\n      if (this.is_multiple && this.max_selected_options <= this.choices_count()) {\n        this.form_field_jq.trigger(\"chosen:maxselected\", {\n          chosen: this\n        });\n        return false;\n      }\n      this.container.addClass(\"chosen-with-drop\");\n      new_width = 
this.container_width(\"expanded\");\n      if (new_width) {\n        this.container.css({\n          width: new_width\n        });\n      }\n      this.form_field_jq.trigger(\"chosen:showing_dropdown\", {\n        chosen: this\n      });\n      this.results_realign();\n      this.results_showing = true;\n      this.search_field.focus();\n      this.search_field.val(this.search_field.val());\n      self = this;\n      $(window).resize(function() {\n        if (self.results_showing) {\n          return self.results_realign();\n        }\n      });\n      return this.winnow_results();\n    };\n\n    Chosen.prototype.update_results_content = function(content) {\n      return this.search_results.html(content);\n    };\n\n    Chosen.prototype.results_hide = function() {\n      var new_width;\n      if (this.results_showing) {\n        this.result_clear_highlight();\n        new_width = this.container_width(\"collapsed\");\n        if (new_width) {\n          this.container.css({\n            width: new_width\n          });\n        }\n        this.container.removeClass(\"chosen-with-drop\");\n        this.form_field_jq.trigger(\"chosen:hiding_dropdown\", {\n          chosen: this\n        });\n      }\n      return this.results_showing = false;\n    };\n\n    Chosen.prototype.set_tab_index = function(el) {\n      var ti;\n      if (this.form_field.tabIndex) {\n        ti = this.form_field.tabIndex;\n        this.form_field.tabIndex = -1;\n        return this.search_field[0].tabIndex = ti;\n      }\n    };\n\n    Chosen.prototype.set_label_behavior = function() {\n      var _this = this;\n      this.form_field_label = this.form_field_jq.parents(\"label\");\n      if (!this.form_field_label.length && this.form_field.id.length) {\n        this.form_field_label = $(\"label[for='\" + this.form_field.id + \"']\");\n      }\n      if (this.form_field_label.length > 0) {\n        return this.form_field_label.bind('click.chosen', function(evt) {\n          if (_this.is_multiple) 
{\n            return _this.container_mousedown(evt);\n          } else {\n            return _this.activate_field();\n          }\n        });\n      }\n    };\n\n    Chosen.prototype.show_search_field_default = function() {\n      if (this.is_multiple && this.choices_count() < 1 && !this.active_field) {\n        this.search_field.val(this.default_text);\n        return this.search_field.addClass(\"default\");\n      } else {\n        this.search_field.val(\"\");\n        return this.search_field.removeClass(\"default\");\n      }\n    };\n\n    Chosen.prototype.search_results_mouseup = function(evt) {\n      var target;\n      target = $(evt.target).hasClass(\"active-result\") ? $(evt.target) : $(evt.target).parents(\".active-result\").first();\n      if (target.length) {\n        this.result_highlight = target;\n        this.result_select(evt);\n        return this.search_field.focus();\n      }\n    };\n\n    Chosen.prototype.search_results_mouseover = function(evt) {\n      var target;\n      target = $(evt.target).hasClass(\"active-result\") ? 
$(evt.target) : $(evt.target).parents(\".active-result\").first();\n      if (target.length) {\n        return this.result_do_highlight(target);\n      }\n    };\n\n    Chosen.prototype.search_results_mouseout = function(evt) {\n      if ($(evt.target).hasClass(\"active-result\") || $(evt.target).parents('.active-result').length) {\n        return this.result_clear_highlight();\n      }\n    };\n\n    Chosen.prototype.choice_build = function(item) {\n      var choice, close_link,\n        _this = this;\n      choice = $('<li></li>', {\n        \"class\": \"search-choice\"\n      }).html(\"<span></span>\");\n      choice.find(\"span\").text(item.text);\n      if (item.disabled) {\n        choice.addClass('search-choice-disabled');\n      } else {\n        close_link = $('<a />', {\n          \"class\": 'search-choice-close',\n          'data-option-array-index': item.array_index\n        });\n        close_link.bind('click.chosen', function(evt) {\n          return _this.choice_destroy_link_click(evt);\n        });\n        choice.append(close_link);\n      }\n      return this.search_container.before(choice);\n    };\n\n    Chosen.prototype.choice_destroy_link_click = function(evt) {\n      evt.preventDefault();\n      evt.stopPropagation();\n      if (!this.is_disabled) {\n        return this.choice_destroy($(evt.target));\n      }\n    };\n\n    Chosen.prototype.choice_destroy = function(link) {\n      if (this.result_deselect(link[0].getAttribute(\"data-option-array-index\"))) {\n        this.show_search_field_default();\n        if (this.is_multiple && this.choices_count() > 0 && this.search_field.val().length < 1) {\n          this.results_hide();\n        }\n        link.parents('li').first().remove();\n        return this.search_field_scale();\n      }\n    };\n\n    Chosen.prototype.results_reset = function() {\n      this.reset_single_select_options();\n      this.form_field.options[0].selected = true;\n      this.single_set_selected_text(null, true);\n      
this.show_search_field_default();\n      this.results_reset_cleanup();\n      this.form_field_jq.trigger(\"change\");\n      if (this.active_field) {\n        return this.results_hide();\n      }\n    };\n\n    Chosen.prototype.results_reset_cleanup = function() {\n      this.current_selectedIndex = this.form_field.selectedIndex;\n      return this.selected_item.find(\"abbr\").remove();\n    };\n\n    Chosen.prototype.result_select = function(evt) {\n      var high, item;\n      if (this.result_highlight) {\n        high = this.result_highlight;\n        this.result_clear_highlight();\n        if (this.is_multiple && this.max_selected_options <= this.choices_count()) {\n          this.form_field_jq.trigger(\"chosen:maxselected\", {\n            chosen: this\n          });\n          return false;\n        }\n        if (this.is_multiple) {\n          high.removeClass(\"active-result\");\n        } else {\n          this.reset_single_select_options();\n        }\n        item = this.results_data[high[0].getAttribute(\"data-option-array-index\")];\n        item.selected = true;\n        this.form_field.options[item.options_index].selected = true;\n        this.selected_option_count = null;\n        if (this.is_multiple) {\n          this.choice_build(item);\n        } else {\n          this.single_set_selected_text(this.selected_value(item));\n        }\n        if (!((evt.metaKey || evt.ctrlKey) && this.is_multiple)) {\n          this.results_hide();\n        }\n        this.search_field.val(\"\");\n        if (this.is_multiple || this.form_field.selectedIndex !== this.current_selectedIndex) {\n          this.form_field_jq.trigger(\"change\", {\n            'selected': this.form_field.options[item.options_index].value\n          });\n        }\n        this.current_selectedIndex = this.form_field.selectedIndex;\n        return this.search_field_scale();\n      }\n    };\n\n    Chosen.prototype.single_set_selected_text = function(value, set_default) {\n      if 
(set_default) {\n        this.selected_item.addClass(\"chosen-default\");\n        if (value === null) {\n          value = {\n            text: this.default_text\n          };\n        }\n      } else {\n        this.single_deselect_control_build();\n        this.selected_item.removeClass(\"chosen-default\");\n      }\n      if (value.text) {\n        return this.selected_item.find(\"span\").text(value.text);\n      } else {\n        return this.selected_item.find(\"span\").html(value.html);\n      }\n    };\n\n    Chosen.prototype.result_deselect = function(pos) {\n      var result_data;\n      result_data = this.results_data[pos];\n      if (!this.form_field.options[result_data.options_index].disabled) {\n        result_data.selected = false;\n        this.form_field.options[result_data.options_index].selected = false;\n        this.selected_option_count = null;\n        this.result_clear_highlight();\n        if (this.results_showing) {\n          this.winnow_results();\n        }\n        this.form_field_jq.trigger(\"change\", {\n          deselected: this.form_field.options[result_data.options_index].value\n        });\n        this.search_field_scale();\n        return true;\n      } else {\n        return false;\n      }\n    };\n\n    Chosen.prototype.single_deselect_control_build = function() {\n      if (!this.allow_single_deselect) {\n        return;\n      }\n      if (!this.selected_item.find(\"abbr\").length) {\n        this.selected_item.find(\"span\").first().after(\"<abbr class=\\\"search-choice-close\\\"></abbr>\");\n      }\n      return this.selected_item.addClass(\"chosen-single-with-deselect\");\n    };\n\n    Chosen.prototype.get_search_text = function() {\n      if (this.search_field.val() === this.default_text) {\n        return \"\";\n      } else {\n        return $('<div/>').text($.trim(this.search_field.val())).html();\n      }\n    };\n\n    Chosen.prototype.winnow_results_set_highlight = function() {\n      var do_high, 
selected_results;\n      selected_results = !this.is_multiple ? this.search_results.find(\".result-selected.active-result\") : [];\n      do_high = selected_results.length ? selected_results.first() : this.search_results.find(\".active-result\").first();\n      if (do_high != null) {\n        return this.result_do_highlight(do_high);\n      }\n    };\n\n    Chosen.prototype.no_results = function(terms) {\n      var no_results_html;\n      no_results_html = $('<li class=\"no-results\">' + this.results_none_found + ' \"<span></span>\"</li>');\n      no_results_html.find(\"span\").first().html(terms);\n      return this.search_results.append(no_results_html);\n    };\n\n    Chosen.prototype.no_results_clear = function() {\n      return this.search_results.find(\".no-results\").remove();\n    };\n\n    Chosen.prototype.keydown_arrow = function() {\n      var next_sib;\n      if (this.results_showing && this.result_highlight) {\n        next_sib = this.result_highlight.nextAll(\"li.active-result\").first();\n        if (next_sib) {\n          return this.result_do_highlight(next_sib);\n        }\n      } else {\n        return this.results_show();\n      }\n    };\n\n    Chosen.prototype.keyup_arrow = function() {\n      var prev_sibs;\n      if (!this.results_showing && !this.is_multiple) {\n        return this.results_show();\n      } else if (this.result_highlight) {\n        prev_sibs = this.result_highlight.prevAll(\"li.active-result\");\n        if (prev_sibs.length) {\n          return this.result_do_highlight(prev_sibs.first());\n        } else {\n          if (this.choices_count() > 0) {\n            this.results_hide();\n          }\n          return this.result_clear_highlight();\n        }\n      }\n    };\n\n    Chosen.prototype.keydown_backstroke = function() {\n      var next_available_destroy;\n      if (this.pending_backstroke) {\n        this.choice_destroy(this.pending_backstroke.find(\"a\").first());\n        return this.clear_backstroke();\n      } 
else {\n        next_available_destroy = this.search_container.siblings(\"li.search-choice\").last();\n        if (next_available_destroy.length && !next_available_destroy.hasClass(\"search-choice-disabled\")) {\n          this.pending_backstroke = next_available_destroy;\n          if (this.single_backstroke_delete) {\n            return this.keydown_backstroke();\n          } else {\n            return this.pending_backstroke.addClass(\"search-choice-focus\");\n          }\n        }\n      }\n    };\n\n    Chosen.prototype.clear_backstroke = function() {\n      if (this.pending_backstroke) {\n        this.pending_backstroke.removeClass(\"search-choice-focus\");\n      }\n      return this.pending_backstroke = null;\n    };\n\n    Chosen.prototype.keydown_checker = function(evt) {\n      var stroke, _ref1;\n      stroke = (_ref1 = evt.which) != null ? _ref1 : evt.keyCode;\n      this.search_field_scale();\n      if (stroke !== 8 && this.pending_backstroke) {\n        this.clear_backstroke();\n      }\n      switch (stroke) {\n        case 8:\n          this.backstroke_length = this.search_field.val().length;\n          break;\n        case 9:\n          if (this.results_showing && !this.is_multiple) {\n            this.result_select(evt);\n          }\n          this.mouse_on_container = false;\n          break;\n        case 13:\n          evt.preventDefault();\n          break;\n        case 38:\n          evt.preventDefault();\n          this.keyup_arrow();\n          break;\n        case 40:\n          evt.preventDefault();\n          this.keydown_arrow();\n          break;\n      }\n    };\n\n    Chosen.prototype.search_field_scale = function() {\n      var div, f_width, h, style, style_block, styles, w, _i, _len;\n      if (this.is_multiple) {\n        h = 0;\n        w = 0;\n        style_block = \"position:absolute; left: -1000px; top: -1000px; display:none;\";\n        styles = ['font-size', 'font-style', 'font-weight', 'font-family', 'line-height', 
'text-transform', 'letter-spacing'];\n        for (_i = 0, _len = styles.length; _i < _len; _i++) {\n          style = styles[_i];\n          style_block += style + \":\" + this.search_field.css(style) + \";\";\n        }\n        div = $('<div />', {\n          'style': style_block\n        });\n        div.text(this.search_field.val());\n        $('body').append(div);\n        w = div.width() + 25;\n        div.remove();\n        f_width = this.container.outerWidth();\n        if (w > f_width - 10) {\n          w = f_width - 10;\n        }\n        return this.search_field.css({\n          'width': w + 'px'\n        });\n      }\n    };\n\n    Chosen.prototype.selected_value = function(item) {\n      if (this.options.generate_selected_value) {\n        return this.options.generate_selected_value(item);\n      } else {\n        return {\n          text: item.text\n        };\n      }\n    };\n\n    return Chosen;\n\n  })(AbstractChosen);\n\n}).call(this);\n"
  },
  {
    "path": "src/resources/third-party/jquery-ui-1.10.2.custom.css",
    "content": "/*! jQuery UI - v1.10.2 - 2013-03-22\n* http://jqueryui.com\n* Includes: jquery.ui.core.css, jquery.ui.resizable.css, jquery.ui.selectable.css, jquery.ui.autocomplete.css, jquery.ui.button.css, jquery.ui.dialog.css, jquery.ui.menu.css, jquery.ui.tooltip.css\n* To view and modify this theme, visit http://jqueryui.com/themeroller/?ffDefault=Helvetica%2CArial%2Csans-serif&fwDefault=normal&fsDefault=1.1em&cornerRadius=6px&bgColorHeader=cb842e&bgTextureHeader=glass&bgImgOpacityHeader=25&borderColorHeader=d49768&fcHeader=ffffff&iconColorHeader=ffffff&bgColorContent=f4f0ec&bgTextureContent=inset_soft&bgImgOpacityContent=100&borderColorContent=e0cfc2&fcContent=1e1b1d&iconColorContent=c47a23&bgColorDefault=ede4d4&bgTextureDefault=glass&bgImgOpacityDefault=70&borderColorDefault=cdc3b7&fcDefault=3f3731&iconColorDefault=f08000&bgColorHover=f5f0e5&bgTextureHover=glass&bgImgOpacityHover=100&borderColorHover=f5ad66&fcHover=a46313&iconColorHover=f08000&bgColorActive=f4f0ec&bgTextureActive=highlight_hard&bgImgOpacityActive=100&borderColorActive=e0cfc2&fcActive=b85700&iconColorActive=f35f07&bgColorHighlight=f5f5b5&bgTextureHighlight=highlight_hard&bgImgOpacityHighlight=75&borderColorHighlight=d9bb73&fcHighlight=060200&iconColorHighlight=cb672b&bgColorError=fee4bd&bgTextureError=highlight_hard&bgImgOpacityError=65&borderColorError=f8893f&fcError=592003&iconColorError=ff7519&bgColorOverlay=aaaaaa&bgTextureOverlay=flat&bgImgOpacityOverlay=75&opacityOverlay=30&bgColorShadow=aaaaaa&bgTextureShadow=flat&bgImgOpacityShadow=75&opacityShadow=30&thicknessShadow=8px&offsetTopShadow=-8px&offsetLeftShadow=-8px&cornerRadiusShadow=8px\n* Copyright 2013 jQuery Foundation and other contributors Licensed MIT */.ui-helper-hidden{display:none}.ui-helper-hidden-accessible{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.ui-helper-reset{margin:0;padding:0;border:0;outline:0;line-height:1.3;text-decoration:none;font-size:100%;list-style:none}.ui-helper-clearfix:before,.ui-helper-clearfix:after{content:\"\";display:table;border-collapse:collapse}.ui-helper-clearfix:after{clear:both}.ui-helper-clearfix{min-height:0}.ui-helper-zfix{width:100%;height:100%;top:0;left:0;position:absolute;opacity:0;filter:Alpha(Opacity=0)}.ui-front{z-index:100}.ui-state-disabled{cursor:default!important}.ui-icon{display:block;text-indent:-99999px;overflow:hidden;background-repeat:no-repeat}.ui-widget-overlay{position:fixed;top:0;left:0;width:100%;height:100%}.ui-resizable{position:relative}.ui-resizable-handle{position:absolute;font-size:.1px;display:block}.ui-resizable-disabled .ui-resizable-handle,.ui-resizable-autohide .ui-resizable-handle{display:none}.ui-resizable-n{cursor:n-resize;height:7px;width:100%;top:-5px;left:0}.ui-resizable-s{cursor:s-resize;height:7px;width:100%;bottom:-5px;left:0}.ui-resizable-e{cursor:e-resize;width:7px;right:-5px;top:0;height:100%}.ui-resizable-w{cursor:w-resize;width:7px;left:-5px;top:0;height:100%}.ui-resizable-se{cursor:se-resize;width:12px;height:12px;right:1px;bottom:1px}.ui-resizable-sw{cursor:sw-resize;width:9px;height:9px;left:-5px;bottom:-5px}.ui-resizable-nw{cursor:nw-resize;width:9px;height:9px;left:-5px;top:-5px}.ui-resizable-ne{cursor:ne-resize;width:9px;height:9px;right:-5px;top:-5px}.ui-selectable-helper{position:absolute;z-index:100;border:1px dotted 
#000}.ui-autocomplete{position:absolute;top:0;left:0;cursor:default}.ui-button{display:inline-block;position:relative;padding:0;line-height:normal;margin-right:.1em;cursor:pointer;vertical-align:middle;text-align:center;overflow:visible}.ui-button,.ui-button:link,.ui-button:visited,.ui-button:hover,.ui-button:active{text-decoration:none}.ui-button-icon-only{width:2.2em}button.ui-button-icon-only{width:2.4em}.ui-button-icons-only{width:3.4em}button.ui-button-icons-only{width:3.7em}.ui-button .ui-button-text{display:block;line-height:normal}.ui-button-text-only .ui-button-text{padding:.4em 1em}.ui-button-icon-only .ui-button-text,.ui-button-icons-only .ui-button-text{padding:.4em;text-indent:-9999999px}.ui-button-text-icon-primary .ui-button-text,.ui-button-text-icons .ui-button-text{padding:.4em 1em .4em 2.1em}.ui-button-text-icon-secondary .ui-button-text,.ui-button-text-icons .ui-button-text{padding:.4em 2.1em .4em 1em}.ui-button-text-icons .ui-button-text{padding-left:2.1em;padding-right:2.1em}input.ui-button{padding:.4em 1em}.ui-button-icon-only .ui-icon,.ui-button-text-icon-primary .ui-icon,.ui-button-text-icon-secondary .ui-icon,.ui-button-text-icons .ui-icon,.ui-button-icons-only .ui-icon{position:absolute;top:50%;margin-top:-8px}.ui-button-icon-only .ui-icon{left:50%;margin-left:-8px}.ui-button-text-icon-primary .ui-button-icon-primary,.ui-button-text-icons .ui-button-icon-primary,.ui-button-icons-only .ui-button-icon-primary{left:.5em}.ui-button-text-icon-secondary .ui-button-icon-secondary,.ui-button-text-icons .ui-button-icon-secondary,.ui-button-icons-only .ui-button-icon-secondary{right:.5em}.ui-buttonset{margin-right:7px}.ui-buttonset .ui-button{margin-left:0;margin-right:-.3em}input.ui-button::-moz-focus-inner,button.ui-button::-moz-focus-inner{border:0;padding:0}.ui-dialog{position:absolute;top:0;left:0;padding:.2em;outline:0}.ui-dialog .ui-dialog-titlebar{padding:.4em 1em;position:relative}.ui-dialog .ui-dialog-title{float:left;margin:.1em 
0;white-space:nowrap;width:90%;overflow:hidden;text-overflow:ellipsis}.ui-dialog .ui-dialog-titlebar-close{position:absolute;right:.3em;top:50%;width:21px;margin:-10px 0 0 0;padding:1px;height:20px}.ui-dialog .ui-dialog-content{position:relative;border:0;padding:.5em 1em;background:0;overflow:auto}.ui-dialog .ui-dialog-buttonpane{text-align:left;border-width:1px 0 0;background-image:none;margin-top:.5em;padding:.3em 1em .5em .4em}.ui-dialog .ui-dialog-buttonpane .ui-dialog-buttonset{float:right}.ui-dialog .ui-dialog-buttonpane button{margin:.5em .4em .5em 0;cursor:pointer}.ui-dialog .ui-resizable-se{width:12px;height:12px;right:-5px;bottom:-5px;background-position:16px 16px}.ui-draggable .ui-dialog-titlebar{cursor:move}.ui-menu{list-style:none;padding:2px;margin:0;display:block;outline:0}.ui-menu .ui-menu{margin-top:-3px;position:absolute}.ui-menu .ui-menu-item{margin:0;padding:0;width:100%}.ui-menu .ui-menu-divider{margin:5px -2px 5px -2px;height:0;font-size:0;line-height:0;border-width:1px 0 0}.ui-menu .ui-menu-item a{text-decoration:none;display:block;padding:2px .4em;line-height:1.5;min-height:0;font-weight:400}.ui-menu .ui-menu-item a.ui-state-focus,.ui-menu .ui-menu-item a.ui-state-active{font-weight:400;margin:-1px}.ui-menu .ui-state-disabled{font-weight:400;margin:.4em 0 .2em;line-height:1.5}.ui-menu .ui-state-disabled a{cursor:default}.ui-menu-icons{position:relative}.ui-menu-icons .ui-menu-item a{position:relative;padding-left:2em}.ui-menu .ui-icon{position:absolute;top:.2em;left:.2em}.ui-menu .ui-menu-icon{position:static;float:right}.ui-tooltip{padding:8px;position:absolute;z-index:9999;max-width:300px;-webkit-box-shadow:0 0 5px #aaa;box-shadow:0 0 5px #aaa}body .ui-tooltip{border-width:2px}.ui-widget{font-family:Helvetica,Arial,sans-serif;font-size:1.1em}.ui-widget .ui-widget{font-size:1em}.ui-widget input,.ui-widget select,.ui-widget textarea,.ui-widget button{font-family:Helvetica,Arial,sans-serif;font-size:1em}.ui-widget-content{border:1px solid 
#e0cfc2;background:#f4f0ec url(images/ui-bg_inset-soft_100_f4f0ec_1x100.png) 50% bottom repeat-x;color:#1e1b1d}.ui-widget-content a{color:#1e1b1d}.ui-widget-header{border:1px solid #d49768;background:#cb842e url(images/ui-bg_glass_25_cb842e_1x400.png) 50% 50% repeat-x;color:#fff;font-weight:bold}.ui-widget-header a{color:#fff}.ui-state-default,.ui-widget-content .ui-state-default,.ui-widget-header .ui-state-default{border:1px solid #cdc3b7;background:#ede4d4 url(images/ui-bg_glass_70_ede4d4_1x400.png) 50% 50% repeat-x;font-weight:normal;color:#3f3731}.ui-state-default a,.ui-state-default a:link,.ui-state-default a:visited{color:#3f3731;text-decoration:none}.ui-state-hover,.ui-widget-content .ui-state-hover,.ui-widget-header .ui-state-hover,.ui-state-focus,.ui-widget-content .ui-state-focus,.ui-widget-header .ui-state-focus{border:1px solid #f5ad66;background:#f5f0e5 url(images/ui-bg_glass_100_f5f0e5_1x400.png) 50% 50% repeat-x;font-weight:normal;color:#a46313}.ui-state-hover a,.ui-state-hover a:hover,.ui-state-hover a:link,.ui-state-hover a:visited{color:#a46313;text-decoration:none}.ui-state-active,.ui-widget-content .ui-state-active,.ui-widget-header .ui-state-active{border:1px solid #e0cfc2;background:#f4f0ec url(images/ui-bg_highlight-hard_100_f4f0ec_1x100.png) 50% 50% repeat-x;font-weight:normal;color:#b85700}.ui-state-active a,.ui-state-active a:link,.ui-state-active a:visited{color:#b85700;text-decoration:none}.ui-state-highlight,.ui-widget-content .ui-state-highlight,.ui-widget-header .ui-state-highlight{border:1px solid #d9bb73;background:#f5f5b5 url(images/ui-bg_highlight-hard_75_f5f5b5_1x100.png) 50% top repeat-x;color:#060200}.ui-state-highlight a,.ui-widget-content .ui-state-highlight a,.ui-widget-header .ui-state-highlight a{color:#060200}.ui-state-error,.ui-widget-content .ui-state-error,.ui-widget-header .ui-state-error{border:1px solid #f8893f;background:#fee4bd url(images/ui-bg_highlight-hard_65_fee4bd_1x100.png) 50% top 
repeat-x;color:#592003}.ui-state-error a,.ui-widget-content .ui-state-error a,.ui-widget-header .ui-state-error a{color:#592003}.ui-state-error-text,.ui-widget-content .ui-state-error-text,.ui-widget-header .ui-state-error-text{color:#592003}.ui-priority-primary,.ui-widget-content .ui-priority-primary,.ui-widget-header .ui-priority-primary{font-weight:bold}.ui-priority-secondary,.ui-widget-content .ui-priority-secondary,.ui-widget-header .ui-priority-secondary{opacity:.7;filter:Alpha(Opacity=70);font-weight:normal}.ui-state-disabled,.ui-widget-content .ui-state-disabled,.ui-widget-header .ui-state-disabled{opacity:.35;filter:Alpha(Opacity=35);background-image:none}.ui-state-disabled .ui-icon{filter:Alpha(Opacity=35)}.ui-icon{width:16px;height:16px}.ui-icon,.ui-widget-content .ui-icon{background-image:url(images/ui-icons_c47a23_256x240.png)}.ui-widget-header .ui-icon{background-image:url(images/ui-icons_ffffff_256x240.png)}.ui-state-default .ui-icon{background-image:url(images/ui-icons_f08000_256x240.png)}.ui-state-hover .ui-icon,.ui-state-focus .ui-icon{background-image:url(images/ui-icons_f08000_256x240.png)}.ui-state-active .ui-icon{background-image:url(images/ui-icons_f35f07_256x240.png)}.ui-state-highlight .ui-icon{background-image:url(images/ui-icons_cb672b_256x240.png)}.ui-state-error .ui-icon,.ui-state-error-text .ui-icon{background-image:url(images/ui-icons_ff7519_256x240.png)}.ui-icon-blank{background-position:16px 16px}.ui-icon-carat-1-n{background-position:0 0}.ui-icon-carat-1-ne{background-position:-16px 0}.ui-icon-carat-1-e{background-position:-32px 0}.ui-icon-carat-1-se{background-position:-48px 0}.ui-icon-carat-1-s{background-position:-64px 0}.ui-icon-carat-1-sw{background-position:-80px 0}.ui-icon-carat-1-w{background-position:-96px 0}.ui-icon-carat-1-nw{background-position:-112px 0}.ui-icon-carat-2-n-s{background-position:-128px 0}.ui-icon-carat-2-e-w{background-position:-144px 0}.ui-icon-triangle-1-n{background-position:0 
-16px}.ui-icon-triangle-1-ne{background-position:-16px -16px}.ui-icon-triangle-1-e{background-position:-32px -16px}.ui-icon-triangle-1-se{background-position:-48px -16px}.ui-icon-triangle-1-s{background-position:-64px -16px}.ui-icon-triangle-1-sw{background-position:-80px -16px}.ui-icon-triangle-1-w{background-position:-96px -16px}.ui-icon-triangle-1-nw{background-position:-112px -16px}.ui-icon-triangle-2-n-s{background-position:-128px -16px}.ui-icon-triangle-2-e-w{background-position:-144px -16px}.ui-icon-arrow-1-n{background-position:0 -32px}.ui-icon-arrow-1-ne{background-position:-16px -32px}.ui-icon-arrow-1-e{background-position:-32px -32px}.ui-icon-arrow-1-se{background-position:-48px -32px}.ui-icon-arrow-1-s{background-position:-64px -32px}.ui-icon-arrow-1-sw{background-position:-80px -32px}.ui-icon-arrow-1-w{background-position:-96px -32px}.ui-icon-arrow-1-nw{background-position:-112px -32px}.ui-icon-arrow-2-n-s{background-position:-128px -32px}.ui-icon-arrow-2-ne-sw{background-position:-144px -32px}.ui-icon-arrow-2-e-w{background-position:-160px -32px}.ui-icon-arrow-2-se-nw{background-position:-176px -32px}.ui-icon-arrowstop-1-n{background-position:-192px -32px}.ui-icon-arrowstop-1-e{background-position:-208px -32px}.ui-icon-arrowstop-1-s{background-position:-224px -32px}.ui-icon-arrowstop-1-w{background-position:-240px -32px}.ui-icon-arrowthick-1-n{background-position:0 -48px}.ui-icon-arrowthick-1-ne{background-position:-16px -48px}.ui-icon-arrowthick-1-e{background-position:-32px -48px}.ui-icon-arrowthick-1-se{background-position:-48px -48px}.ui-icon-arrowthick-1-s{background-position:-64px -48px}.ui-icon-arrowthick-1-sw{background-position:-80px -48px}.ui-icon-arrowthick-1-w{background-position:-96px -48px}.ui-icon-arrowthick-1-nw{background-position:-112px -48px}.ui-icon-arrowthick-2-n-s{background-position:-128px -48px}.ui-icon-arrowthick-2-ne-sw{background-position:-144px -48px}.ui-icon-arrowthick-2-e-w{background-position:-160px 
-48px}.ui-icon-arrowthick-2-se-nw{background-position:-176px -48px}.ui-icon-arrowthickstop-1-n{background-position:-192px -48px}.ui-icon-arrowthickstop-1-e{background-position:-208px -48px}.ui-icon-arrowthickstop-1-s{background-position:-224px -48px}.ui-icon-arrowthickstop-1-w{background-position:-240px -48px}.ui-icon-arrowreturnthick-1-w{background-position:0 -64px}.ui-icon-arrowreturnthick-1-n{background-position:-16px -64px}.ui-icon-arrowreturnthick-1-e{background-position:-32px -64px}.ui-icon-arrowreturnthick-1-s{background-position:-48px -64px}.ui-icon-arrowreturn-1-w{background-position:-64px -64px}.ui-icon-arrowreturn-1-n{background-position:-80px -64px}.ui-icon-arrowreturn-1-e{background-position:-96px -64px}.ui-icon-arrowreturn-1-s{background-position:-112px -64px}.ui-icon-arrowrefresh-1-w{background-position:-128px -64px}.ui-icon-arrowrefresh-1-n{background-position:-144px -64px}.ui-icon-arrowrefresh-1-e{background-position:-160px -64px}.ui-icon-arrowrefresh-1-s{background-position:-176px -64px}.ui-icon-arrow-4{background-position:0 -80px}.ui-icon-arrow-4-diag{background-position:-16px -80px}.ui-icon-extlink{background-position:-32px -80px}.ui-icon-newwin{background-position:-48px -80px}.ui-icon-refresh{background-position:-64px -80px}.ui-icon-shuffle{background-position:-80px -80px}.ui-icon-transfer-e-w{background-position:-96px -80px}.ui-icon-transferthick-e-w{background-position:-112px -80px}.ui-icon-folder-collapsed{background-position:0 -96px}.ui-icon-folder-open{background-position:-16px -96px}.ui-icon-document{background-position:-32px -96px}.ui-icon-document-b{background-position:-48px -96px}.ui-icon-note{background-position:-64px -96px}.ui-icon-mail-closed{background-position:-80px -96px}.ui-icon-mail-open{background-position:-96px -96px}.ui-icon-suitcase{background-position:-112px -96px}.ui-icon-comment{background-position:-128px -96px}.ui-icon-person{background-position:-144px -96px}.ui-icon-print{background-position:-160px 
-96px}.ui-icon-trash{background-position:-176px -96px}.ui-icon-locked{background-position:-192px -96px}.ui-icon-unlocked{background-position:-208px -96px}.ui-icon-bookmark{background-position:-224px -96px}.ui-icon-tag{background-position:-240px -96px}.ui-icon-home{background-position:0 -112px}.ui-icon-flag{background-position:-16px -112px}.ui-icon-calendar{background-position:-32px -112px}.ui-icon-cart{background-position:-48px -112px}.ui-icon-pencil{background-position:-64px -112px}.ui-icon-clock{background-position:-80px -112px}.ui-icon-disk{background-position:-96px -112px}.ui-icon-calculator{background-position:-112px -112px}.ui-icon-zoomin{background-position:-128px -112px}.ui-icon-zoomout{background-position:-144px -112px}.ui-icon-search{background-position:-160px -112px}.ui-icon-wrench{background-position:-176px -112px}.ui-icon-gear{background-position:-192px -112px}.ui-icon-heart{background-position:-208px -112px}.ui-icon-star{background-position:-224px -112px}.ui-icon-link{background-position:-240px -112px}.ui-icon-cancel{background-position:0 -128px}.ui-icon-plus{background-position:-16px -128px}.ui-icon-plusthick{background-position:-32px -128px}.ui-icon-minus{background-position:-48px -128px}.ui-icon-minusthick{background-position:-64px -128px}.ui-icon-close{background-position:-80px -128px}.ui-icon-closethick{background-position:-96px -128px}.ui-icon-key{background-position:-112px -128px}.ui-icon-lightbulb{background-position:-128px -128px}.ui-icon-scissors{background-position:-144px -128px}.ui-icon-clipboard{background-position:-160px -128px}.ui-icon-copy{background-position:-176px -128px}.ui-icon-contact{background-position:-192px -128px}.ui-icon-image{background-position:-208px -128px}.ui-icon-video{background-position:-224px -128px}.ui-icon-script{background-position:-240px -128px}.ui-icon-alert{background-position:0 -144px}.ui-icon-info{background-position:-16px -144px}.ui-icon-notice{background-position:-32px 
-144px}.ui-icon-help{background-position:-48px -144px}.ui-icon-check{background-position:-64px -144px}.ui-icon-bullet{background-position:-80px -144px}.ui-icon-radio-on{background-position:-96px -144px}.ui-icon-radio-off{background-position:-112px -144px}.ui-icon-pin-w{background-position:-128px -144px}.ui-icon-pin-s{background-position:-144px -144px}.ui-icon-play{background-position:0 -160px}.ui-icon-pause{background-position:-16px -160px}.ui-icon-seek-next{background-position:-32px -160px}.ui-icon-seek-prev{background-position:-48px -160px}.ui-icon-seek-end{background-position:-64px -160px}.ui-icon-seek-start{background-position:-80px -160px}.ui-icon-seek-first{background-position:-80px -160px}.ui-icon-stop{background-position:-96px -160px}.ui-icon-eject{background-position:-112px -160px}.ui-icon-volume-off{background-position:-128px -160px}.ui-icon-volume-on{background-position:-144px -160px}.ui-icon-power{background-position:0 -176px}.ui-icon-signal-diag{background-position:-16px -176px}.ui-icon-signal{background-position:-32px -176px}.ui-icon-battery-0{background-position:-48px -176px}.ui-icon-battery-1{background-position:-64px -176px}.ui-icon-battery-2{background-position:-80px -176px}.ui-icon-battery-3{background-position:-96px -176px}.ui-icon-circle-plus{background-position:0 -192px}.ui-icon-circle-minus{background-position:-16px -192px}.ui-icon-circle-close{background-position:-32px -192px}.ui-icon-circle-triangle-e{background-position:-48px -192px}.ui-icon-circle-triangle-s{background-position:-64px -192px}.ui-icon-circle-triangle-w{background-position:-80px -192px}.ui-icon-circle-triangle-n{background-position:-96px -192px}.ui-icon-circle-arrow-e{background-position:-112px -192px}.ui-icon-circle-arrow-s{background-position:-128px -192px}.ui-icon-circle-arrow-w{background-position:-144px -192px}.ui-icon-circle-arrow-n{background-position:-160px -192px}.ui-icon-circle-zoomin{background-position:-176px 
-192px}.ui-icon-circle-zoomout{background-position:-192px -192px}.ui-icon-circle-check{background-position:-208px -192px}.ui-icon-circlesmall-plus{background-position:0 -208px}.ui-icon-circlesmall-minus{background-position:-16px -208px}.ui-icon-circlesmall-close{background-position:-32px -208px}.ui-icon-squaresmall-plus{background-position:-48px -208px}.ui-icon-squaresmall-minus{background-position:-64px -208px}.ui-icon-squaresmall-close{background-position:-80px -208px}.ui-icon-grip-dotted-vertical{background-position:0 -224px}.ui-icon-grip-dotted-horizontal{background-position:-16px -224px}.ui-icon-grip-solid-vertical{background-position:-32px -224px}.ui-icon-grip-solid-horizontal{background-position:-48px -224px}.ui-icon-gripsmall-diagonal-se{background-position:-64px -224px}.ui-icon-grip-diagonal-se{background-position:-80px -224px}.ui-corner-all,.ui-corner-top,.ui-corner-left,.ui-corner-tl{border-top-left-radius:6px}.ui-corner-all,.ui-corner-top,.ui-corner-right,.ui-corner-tr{border-top-right-radius:6px}.ui-corner-all,.ui-corner-bottom,.ui-corner-left,.ui-corner-bl{border-bottom-left-radius:6px}.ui-corner-all,.ui-corner-bottom,.ui-corner-right,.ui-corner-br{border-bottom-right-radius:6px}.ui-widget-overlay{background:#aaa url(images/ui-bg_flat_75_aaaaaa_40x100.png) 50% 50% repeat-x;opacity:.3;filter:Alpha(Opacity=30)}.ui-widget-shadow{margin:-8px 0 0 -8px;padding:8px;background:#aaa url(images/ui-bg_flat_75_aaaaaa_40x100.png) 50% 50% repeat-x;opacity:.3;filter:Alpha(Opacity=30);border-radius:8px}"
  },
  {
    "path": "src/resources/third-party/jquery-ui-autocomplete-html.js",
    "content": "/*\n * jQuery UI Autocomplete HTML Extension\n *\n * Copyright 2010, Scott González (http://scottgonzalez.com)\n * Dual licensed under the MIT or GPL Version 2 licenses.\n *\n * http://github.com/scottgonzalez/jquery-ui-extensions\n */\n(function( $ ) {\n\nvar proto = $.ui.autocomplete.prototype,\n\tinitSource = proto._initSource;\n\nfunction filter( array, term ) {\n\tvar matcher = new RegExp( $.ui.autocomplete.escapeRegex(term), \"i\" );\n\treturn $.grep( array, function(value) {\n\t\treturn matcher.test( $( \"<div>\" ).html( value.label || value.value || value ).text() );\n\t});\n}\n\n$.extend( proto, {\n\t_initSource: function() {\n\t\tif ( this.options.html && $.isArray(this.options.source) ) {\n\t\t\tthis.source = function( request, response ) {\n\t\t\t\tresponse( filter( this.options.source, request.term ) );\n\t\t\t};\n\t\t} else {\n\t\t\tinitSource.call( this );\n\t\t}\n\t},\n\n\t_renderItem: function( ul, item) {\n\t\treturn $( \"<li></li>\" )\n\t\t\t.data( \"item.autocomplete\", item )\n\t\t\t.append( $( \"<a></a>\" )[ this.options.html ? \"html\" : \"text\" ]( item.label ) )\n\t\t\t.appendTo( ul );\n\t}\n});\n\n})( jQuery );\n"
  },
  {
    "path": "src/resources/tutorial.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ndiv.main > table {\n    margin-top: 1em;\n    padding: 1em\n}\n\ndiv.main table td.h2 {\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table td.h2 h2 {\n    font-size: 1rem;\n    margin: 1.5rem 0 0.75rem;\n}\n\ndiv.main table td.h2 div {\n    margin: 0 auto;\n    width: 50em\n}\n\ndiv.main table td.toc div {\n    margin: 0 auto 0.5em auto;\n    padding-top: 1em;\n    width: 40em\n}\n\ndiv.main table td.short {\n    padding-top: 1em\n}\n\ndiv.main table td.text {\n    text-align: center;\n    padding-top: 1em\n}\n\ndiv.main table td.text div {\n    text-align: justify;\n    margin-left: auto;\n    margin-right: auto;\n    width: 50em;\n    line-height: 140%;\n    margin-top: 0.5em;\n    margin-bottom: 0.5em\n}\n\ndiv.main table td.text div div.hint {\n    font-style: italic;\n    margin-left: 2em;\n    width: 90%\n}\n\ndiv.main table td.text div div.example {\n    padding-top: 0.1em;\n    padding-bottom: 0.1em;\n    margin-left: 2em;\n    width: 90%\n}\n\ndiv.main table td.text div h3 {\n    border-bottom: 1px solid #cca\n}\n\ndiv.main table td.text div.link {\n    padding-left: 2em;\n    font-weight: bold;\n    font-family: monospace\n}\n\ndiv.main table td.text a {\n    text-decoration: underline;\n    color: #222;\n    white-space: nowrap\n}\n\ndiv.main table td.text ol,\ndiv.main table td.text 
ul,\ndiv.main table td.text dl {\n    width: 90%\n}\n\ndiv.main table td.text ol li,\ndiv.main table td.text ul li {\n    margin-top: 0.5em;\n    margin-bottom: 0.5em\n}\n\ndiv.main table td.text dl {\n    margin-left: 1em\n}\ndiv.main table td.text dl dt {\n    font-family: monospace;\n    margin-top: 0.5em\n}\ndiv.main table td.text dl dd {\n    margin-bottom: 0.5em\n}\n\ndiv.main table td.text code {\n    white-space: nowrap;\n    font-style: normal;\n    background-color: #eec;\n    padding: 0 2px\n}\n\ndiv.main table td.text code.bold {\n    font-family: monospace;\n    font-weight: bold\n}\n\ndiv.main table td.goto div {\n    text-align: right;\n    margin: 0 auto 0.5em auto;\n    width: 50em\n}\n\ndiv.main table td.back {\n    text-align: center;\n    padding-left: 0;\n    padding-top: 2em\n}\n\ndiv.main table td.back div {\n    padding-top: 2em;\n    border-top: 1px solid #cca\n}\n\ntable.repositories {\n    font-family: monospace;\n}\n\ntable.pre {\n    margin: 1em;\n    margin-right: auto;\n    font-family: monospace\n}\n\ntable.pre tr td {\n    white-space: pre;\n    margin: 0;\n    padding: 0\n}\n\ntable.toc {\n    width: 100%;\n}\n\ntable.toc th {\n    text-align: left\n}\n\ntable.toc tr.h2 td {\n    padding-top: 1em\n}\n\ntable.toc tr.h3 td {\n    padding-left: 2em\n}\n\ntable.toc a {\n    color: #222\n}\n"
  },
  {
    "path": "src/resources/tutorial.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\n$(document).ready(function ()\n  {\n    $(\"td.goto a, td.back a\").button();\n  });\n"
  },
  {
    "path": "src/resources/whitespace.css",
    "content": "/* -*- mode: css; indent-tabs-mode: nil -*-\n\n Copyright 2012 Jens Lindström, Opera Software ASA\n\n Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n use this file except in compliance with the License.  You may obtain a copy of\n the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n License for the specific language governing permissions and limitations under\n the License.\n\n*/\n\ntr.line td > i.tailws, tr.line.inserted td i.tailws {\n    background: red !important\n}\n\ntr.line td i.cr {\n    color: #888\n}\ntr.line td i.cr:before {\n    content: \"^M\"\n}\n\ntr.line td i.bom {\n    color: #888\n}\ntr.line td i.bom:before {\n    content: \"BOM\"\n}\n\ntr.line td > i.tailws i.cr {\n    color: white\n}\n\ntr.line td > i.tailws > b.t {\n    color: white\n}\n\ntr.line i.i i.tailws {\n    outline: 2px solid red\n}\n\ntr.line.inserted > td.line.new b.t.ill,\ntr.line.replaced > td.line.new b.t.ill,\ntr.line.modified > td.line b.t.ill {\n    background-color: red;\n    color: white\n}\n"
  },
  {
    "path": "src/reviewing/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n"
  },
  {
    "path": "src/reviewing/comment/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport diff\nimport diff.parse\nimport gitutils\nimport itertools\nimport changeset.load as changeset_load\nimport htmlutils\nimport re\nimport page.utils\n\nfrom htmlutils import jsify\nfrom time import strftime\nfrom reviewing.filters import Filters\nfrom operation import OperationFailure\n\nclass Comment:\n    def __init__(self, chain, batch_id, id, state, user, time, when, comment, code, unread):\n        self.chain = chain\n        self.batch_id = batch_id\n        self.id = id\n        self.state = state\n        self.user = user\n        self.time = time\n        self.when = when\n        self.comment = comment\n        self.code = code\n        self.unread = unread\n\n    def __repr__(self):\n        return \"Comment(%r)\" % self.comment\n\n    def getJSConstructor(self):\n        return \"new Comment(%d, %s, %s, %s, %s)\" % (self.id, self.user.getJSConstructor(), jsify(strftime(\"%Y-%m-%d %H:%M\", self.time.timetuple())), jsify(self.state), jsify(self.comment))\n\n    @staticmethod\n    def fromId(db, id, user):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT chain, batch, uid, time, state, comment, code FROM comments WHERE id=%s\", (id,))\n        row = cursor.fetchone()\n        if not row: return None\n        else:\n            chain_id, batch_id, author_id, 
time, state, comment, code = row\n            author = dbutils.User.fromId(db, author_id)\n            adjusted_time = user.adjustTimestamp(db, time)\n            when = user.formatTimestamp(db, time)\n            cursor.execute(\"SELECT 1 FROM commentstoread WHERE uid=%s AND comment=%s\", (user.id, id))\n            return Comment(CommentChain.fromId(db, chain_id, user), batch_id, id, state, author, adjusted_time, when, comment, code, cursor.fetchone() is not None)\n\nclass CommentChain:\n    def __init__(self, id, user, review, batch_id, type, state, origin=None, file_id=None, first_commit=None, last_commit=None, closed_by=None, addressed_by=None, type_is_draft=False, state_is_draft=False, last_commit_is_draft=False, addressed_by_is_draft=False, leader=None, count=None, unread=None):\n        self.id = id\n        self.user = user\n        self.review = review\n        self.batch_id = batch_id\n        self.type = type\n        self.type_is_draft = type_is_draft\n        self.state = state\n        self.state_is_draft = state_is_draft\n        self.origin = origin\n        self.file_id = file_id\n        self.first_commit = first_commit\n        self.last_commit = last_commit\n        self.last_commit_is_draft = last_commit_is_draft\n\n        self.closed_by = closed_by\n        self.addressed_by = addressed_by\n        self.addressed_by_is_draft = addressed_by_is_draft\n\n        self.lines = None\n        self.lines_by_sha1 = None\n\n        self.__leader = leader\n        self.__count = count\n        self.__unread = unread\n        self.comments = []\n\n    def setLines(self, sha1, offset, count):\n        if not self.lines:\n            self.lines = []\n            self.lines_by_sha1 = {}\n        assert sha1 not in self.lines\n        self.lines.append((sha1, offset, count))\n        self.lines_by_sha1[sha1] = (offset, count)\n        return self\n\n    def loadComments(self, db, user, include_draft_comments=True):\n        if include_draft_comments:\n      
      if self.state == \"draft\":\n                draft_user_id = self.user.id\n            else:\n                draft_user_id = user.id\n        else:\n            draft_user_id = None\n\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT comments.id,\n                                 comments.batch,\n                                 comments.state,\n                                 comments.uid,\n                                 comments.time,\n                                 comments.comment,\n                                 comments.code,\n                                 commentstoread.uid IS NOT NULL AS unread\n                            FROM comments\n                 LEFT OUTER JOIN commentstoread ON (comments.id=commentstoread.comment AND commentstoread.uid=%s)\n                           WHERE comments.chain=%s\n                             AND ((comments.state='draft' AND comments.uid=%s) OR comments.state='current')\n                        ORDER BY comments.batch ASC\"\"\",\n                       (user.id, self.id, draft_user_id))\n        last = None\n        for comment_id, batch_id, comment_state, author_id, time, comment, code, unread in cursor.fetchall():\n            author = dbutils.User.fromId(db, author_id)\n            adjusted_time = user.adjustTimestamp(db, time)\n            when = user.formatTimestamp(db, time)\n            comment = Comment(self, batch_id, comment_id, comment_state, author,\n                              adjusted_time, when, comment, code, unread)\n            if comment_state == 'draft': last = comment\n            else: self.comments.append(comment)\n        if last: self.comments.append(last)\n\n    def when(self):\n        return self.comments[0].when\n\n    def countComments(self):\n        if self.__count is None:\n            self.__count = len(self.comments)\n        return self.__count\n\n    def countUnread(self):\n        if self.__unread is None:\n            self.__unread = 
len(filter(lambda comment: comment.unread, self.comments))\n        return self.__unread\n\n    def title(self, include_time=True):\n        if self.type == \"issue\":\n            result = \"Issue raised by %s\" % (self.user.fullname)\n        else:\n            result = \"Note by %s\" % (self.user.fullname)\n        if include_time:\n            result += \" at %s\" % self.when()\n        return result\n\n    def leader(self, max_length=80, text=False):\n        if self.__leader is None: self.__leader = self.comments[0].comment.split(\"\\n\", 1)[0]\n        if len(self.__leader) > max_length:\n            if text: return self.__leader[:max_length - 5] + \"[...]\"\n            else: return htmlutils.htmlify(self.__leader[:max_length - 3]) + \"[&#8230;]\"\n        else:\n            if text: return self.__leader\n            else: return htmlutils.htmlify(self.__leader)\n\n    def getJSConstructor(self, sha1=None):\n        if self.closed_by: closed_by = self.closed_by.getJSConstructor()\n        else: closed_by = \"null\"\n\n        if self.addressed_by: addressed_by = jsify(self.addressed_by.sha1)\n        else: addressed_by = \"null\"\n\n        comments = \", \".join(map(Comment.getJSConstructor, self.comments))\n\n        if sha1:\n            offset, count = self.lines_by_sha1[sha1]\n            if self.file_id:\n                lines = \"new CommentLines(%d, %s, %d, %d)\" % (self.file_id, jsify(sha1), offset, offset + count - 1)\n            else:\n                lines = \"new CommentLines(null, %s, %d, %d)\" % (jsify(sha1), offset, offset + count - 1)\n        else:\n            lines = \"null\"\n\n        return \"new CommentChain(%d, %s, %s, %s, %s, %s, %s, [%s], %s)\" % (self.id, self.user.getJSConstructor(), jsify(self.type), \"true\" if self.type_is_draft else \"false\", jsify(self.state), closed_by, addressed_by, comments, lines)\n\n    def __nonzero__(self):\n        return bool(self.comments)\n\n    def __eq__(self, other):\n        return other is 
not None and self.id == other.id\n\n    def __ne__(self, other):\n        return other is None or self.id != other.id\n\n    def __repr__(self):\n        return \"CommentChain(%d)\" % self.id\n\n    def __len__(self):\n        return len(self.comments)\n\n    def __getitem__(self, index):\n        return self.comments[index]\n\n    @staticmethod\n    def fromReview(db, review, user):\n        cursor = db.cursor()\n        cursor.execute(\"\"\"SELECT commentchains.id, commentchains.batch,\n                                 users.id, users.name, users.fullname, users.status,\n                                 useremails.email, useremails.verified,\n                                 commentchains.type, drafttype.to_type,\n                                 commentchains.state, draftstate.to_state,\n                                 SUBSTR(comments.comment, 1, 81),\n                                 chaincomments(commentchains.id),\n                                 chainunread(commentchains.id, %s)\n                            FROM commentchains\n                            JOIN users ON (users.id=commentchains.uid)\n                            JOIN useremails ON (useremails.id=users.email)\n                            JOIN comments ON (comments.id=commentchains.first_comment)\n                 LEFT OUTER JOIN commentchainchanges AS drafttype ON (drafttype.chain=commentchains.id\n                                                                  AND drafttype.uid=%s\n                                                                  AND drafttype.to_type IS NOT NULL\n                                                                  AND drafttype.state='draft')\n                 LEFT OUTER JOIN commentchainchanges AS draftstate ON (draftstate.chain=commentchains.id\n                                                                   AND draftstate.uid=%s\n                                                                   AND draftstate.to_state IS NOT NULL\n                       
                                            AND draftstate.state='draft')\n                           WHERE commentchains.review=%s\n                             AND (commentchains.state!='draft' or commentchains.uid=%s)\n                        ORDER BY commentchains.id ASC\"\"\",\n                       (user.id, user.id, user.id, review.id, user.id,))\n        chains = []\n        for chain_id, batch_id, user_id, user_name, user_fullname, user_status, user_email, user_email_verified, chain_type, draft_type, chain_state, draft_state, leader, count, unread in cursor:\n            if draft_type is not None: chain_type = draft_type\n            if draft_state is not None: chain_state = draft_state\n\n            if \"\\n\" in leader: leader = leader[:leader.index(\"\\n\")]\n\n            chains.append(CommentChain(chain_id, dbutils.User(user_id, user_name, user_fullname, user_status, user_email, user_email_verified), review, batch_id, chain_type, chain_state, leader=leader, count=count, unread=unread))\n\n        return chains\n\n    @staticmethod\n    def fromId(db, id, user, review=None, skip=None):\n        cursor = db.cursor()\n        cursor.execute(\"SELECT review, batch, uid, type, state, origin, file, first_commit, last_commit, closed_by, addressed_by FROM commentchains WHERE id=%s\", [id])\n        row = cursor.fetchone()\n        if not row:\n            return None\n        else:\n            review_id, batch_id, user_id, type, state, origin, file_id, first_commit_id, last_commit_id, closed_by_id, addressed_by_id = row\n            type_is_draft = False\n            state_is_draft = False\n            last_commit_is_draft = False\n            addressed_by_is_draft = False\n\n            if user is not None:\n                cursor.execute(\"\"\"SELECT from_type, to_type,\n                                         from_state, to_state,\n                                         from_last_commit, to_last_commit,\n                                         
from_addressed_by, to_addressed_by\n                                    FROM commentchainchanges\n                                   WHERE chain=%s\n                                     AND uid=%s\n                                     AND state='draft'\"\"\",\n                               [id, user.id])\n\n                for from_type, to_type, from_state, to_state, from_last_commit_id, to_last_commit_id, from_addressed_by_id, to_addressed_by_id in cursor:\n                    if from_state == state:\n                        state = to_state\n                        state_is_draft = True\n                        if to_state != \"open\":\n                            closed_by_id = user.id\n                    if from_type == type:\n                        type = to_type\n                        type_is_draft = True\n                    if from_last_commit_id == last_commit_id:\n                        # Apply the draft change: assigning from_last_commit_id\n                        # back to last_commit_id was a no-op.\n                        last_commit_id = to_last_commit_id\n                        last_commit_is_draft = True\n                    if from_addressed_by_id == addressed_by_id:\n                        addressed_by_id = to_addressed_by_id\n                        addressed_by_is_draft = True\n\n            if review is None:\n                review = dbutils.Review.fromId(db, review_id)\n            else:\n                assert review.id == review_id\n\n            first_commit = last_commit = addressed_by = None\n\n            if not skip or 'commits' not in skip:\n                if first_commit_id: first_commit = gitutils.Commit.fromId(db, review.repository, first_commit_id)\n                if last_commit_id: last_commit = gitutils.Commit.fromId(db, review.repository, last_commit_id)\n                if addressed_by_id: addressed_by = gitutils.Commit.fromId(db, review.repository, addressed_by_id)\n\n            if closed_by_id: closed_by = dbutils.User.fromId(db, closed_by_id)\n            else: closed_by = None\n\n            chain = CommentChain(id, dbutils.User.fromId(db, 
user_id), review,\n                                 batch_id, type, state, origin, file_id,\n                                 first_commit, last_commit, closed_by, addressed_by,\n                                 type_is_draft=type_is_draft,\n                                 state_is_draft=state_is_draft,\n                                 last_commit_is_draft=last_commit_is_draft,\n                                 addressed_by_is_draft=addressed_by_is_draft)\n\n            if not skip or 'lines' not in skip:\n                if chain.state == \"draft\":\n                    draft_user_id = chain.user.id\n                elif user is not None:\n                    draft_user_id = user.id\n                else:\n                    draft_user_id = None\n\n                cursor.execute(\"\"\"SELECT sha1, first_line, last_line\n                                    FROM commentchainlines\n                                   WHERE chain=%s\n                                     AND (state='current' OR uid=%s)\"\"\",\n                               (id, draft_user_id))\n\n                for sha1, first_line, last_line in cursor.fetchall():\n                    chain.setLines(sha1, first_line, last_line - first_line + 1)\n\n            return chain\n\ndef loadCommentChains(db, review, user, file=None, changeset=None, commit=None, local_comments_only=False):\n    result = []\n    cursor = db.cursor()\n\n    chain_ids = None\n\n    if file is None and changeset is None and commit is None:\n        cursor.execute(\"SELECT id FROM commentchains WHERE review=%s AND file IS NULL\", [review.id])\n    elif commit is not None:\n        cursor.execute(\"\"\"SELECT DISTINCT id\n                            FROM commentchains\n                           WHERE review=%s\n                             AND file IS NULL\n                             AND first_commit=%s\n                             AND ((state!='draft' OR uid=%s)\n                               AND state!='empty')\n           
             GROUP BY id\"\"\",\n                       [review.id, commit.getId(db), user.id])\n    elif local_comments_only:\n        cursor.execute(\"\"\"SELECT DISTINCT commentchains.id\n                            FROM commentchains\n                            JOIN commentchainlines ON (commentchainlines.chain=commentchains.id)\n                            JOIN fileversions ON (fileversions.file=commentchains.file)\n                           WHERE commentchains.review=%s\n                             AND commentchains.file=%s\n                             AND commentchains.state!='empty'\n                             AND ((commentchains.first_commit=%s AND commentchains.last_commit=%s)\n                               OR commentchains.addressed_by=%s)\n                             AND fileversions.changeset=%s\n                             AND (commentchainlines.sha1=fileversions.old_sha1\n                               OR commentchainlines.sha1=fileversions.new_sha1)\n                             AND (commentchainlines.state='current'\n                               OR commentchainlines.uid=%s)\n                        ORDER BY commentchains.id ASC\"\"\",\n                       (review.id, file.id, changeset.parent.getId(db), changeset.child.getId(db), changeset.child.getId(db), changeset.id, user.id))\n    else:\n        chain_ids = set()\n\n        if file is not None: files = [file]\n        else: files = changeset.files\n\n        for file in files:\n            cursor.execute(\"\"\"SELECT id\n                                FROM commentchains\n                                JOIN commentchainlines ON (commentchainlines.chain=commentchains.id)\n                               WHERE commentchains.review=%s\n                                 AND commentchains.file=%s\n                                 AND commentchains.state!='empty'\n                                 AND (commentchains.state!='draft' OR commentchains.uid=%s)\n                                 
AND (commentchainlines.sha1=%s\n                                   OR commentchainlines.sha1=%s)\n                                 AND (commentchainlines.state='current'\n                                   OR commentchainlines.uid=%s)\"\"\",\n                           (review.id, file.id, user.id, file.old_sha1, file.new_sha1, user.id))\n\n            for (chain_id,) in cursor.fetchall():\n                chain_ids.add(chain_id)\n\n    if chain_ids is None:\n        chain_ids = set()\n\n        for (chain_id,) in cursor.fetchall():\n            chain_ids.add(chain_id)\n\n    for chain_id in sorted(chain_ids):\n        chain = CommentChain.fromId(db, chain_id, user, review=review)\n        chain.loadComments(db, user)\n        result.append(chain)\n\n    return result\n\ndef createCommentChain(db, user, review, chain_type, commit=None, origin=None, file=None, parent=None, child=None, offset=None, count=None):\n    import reviewing.comment.propagate\n\n    if chain_type == \"issue\" and review.state != \"open\":\n        raise OperationFailure(code=\"reviewclosed\",\n                               title=\"Review is closed!\",\n                               message=\"You need to reopen the review before you can raise new issues.\")\n\n    cursor = db.cursor()\n\n    if file is not None:\n        if origin == \"old\":\n            commit = parent\n        else:\n            commit = child\n\n        propagation = reviewing.comment.propagate.Propagation(db)\n\n        if not propagation.setCustom(review, commit, file.id, offset, offset + count - 1):\n            raise OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"It's not possible to create a comment here.\")\n\n        propagation.calculateInitialLines()\n\n        cursor.execute(\"\"\"INSERT INTO commentchains (review, uid, type, origin, file, first_commit, last_commit)\n                               VALUES 
(%s, %s, %s, %s, %s, %s, %s)\n                            RETURNING id\"\"\",\n                       (review.id, user.id, chain_type, origin, file.id,\n                        parent.getId(db) if parent else None,\n                        child.getId(db) if child else None))\n\n        chain_id = cursor.fetchone()[0]\n        commentchainlines_values = []\n\n        for sha1, (first_line, last_line) in propagation.new_lines.items():\n            commentchainlines_values.append((chain_id, user.id, sha1, first_line, last_line))\n\n        cursor.executemany(\"\"\"INSERT INTO commentchainlines (chain, uid, sha1, first_line, last_line)\n                                   VALUES (%s, %s, %s, %s, %s)\"\"\",\n                           commentchainlines_values)\n    elif commit is not None:\n        if offset + count > len(commit.message.splitlines()):\n            raise OperationFailure(code=\"invalidoperation\",\n                                   title=\"Invalid operation\",\n                                   message=\"It's not possible to create a comment here.\")\n\n        cursor.execute(\"\"\"INSERT INTO commentchains (review, uid, type, first_commit, last_commit)\n                               VALUES (%s, %s, %s, %s, %s)\n                            RETURNING id\"\"\",\n                       (review.id, user.id, chain_type, commit.getId(db), commit.getId(db)))\n        chain_id = cursor.fetchone()[0]\n\n        cursor.execute(\"\"\"INSERT INTO commentchainlines (chain, uid, sha1, first_line, last_line)\n                               VALUES (%s, %s, %s, %s, %s)\"\"\",\n                       (chain_id, user.id, commit.sha1, offset, offset + count - 1))\n    else:\n        cursor.execute(\"\"\"INSERT INTO commentchains (review, uid, type)\n                               VALUES (%s, %s, %s)\n                            RETURNING id\"\"\",\n                       (review.id, user.id, chain_type))\n        chain_id = cursor.fetchone()[0]\n\n    commentchainusers = 
set([user.id] + map(int, review.owners))\n\n    cursor.executemany(\"INSERT INTO commentchainusers (chain, uid) VALUES (%s, %s)\", [(chain_id, user_id) for user_id in commentchainusers])\n\n    return chain_id\n\ndef createComment(db, user, chain_id, comment, first=False):\n    cursor = db.cursor()\n\n    cursor.execute(\"INSERT INTO comments (chain, uid, time, state, comment) VALUES (%s, %s, now(), 'draft', %s) RETURNING id\", (chain_id, user.id, comment))\n    comment_id = cursor.fetchone()[0]\n\n    if first:\n        cursor.execute(\"UPDATE commentchains SET first_comment=%s WHERE id=%s\", (comment_id, chain_id))\n\n    return comment_id\n\ndef validateCommentChain(db, review, origin, parent, child, file, offset, count):\n    \"\"\"\n    Check whether the commented lines are changed by later commits in the\n    review.\n\n    If they are, a diff.Changeset object representing the first changeset that\n    modifies those lines is returned.  If they are not, None is returned.\n    \"\"\"\n\n    import reviewing.comment.propagate\n\n    if origin == \"old\":\n        commit = parent\n    else:\n        commit = child\n\n    propagation = reviewing.comment.propagate.Propagation(db)\n\n    if not propagation.setCustom(review, commit, file.id, offset, offset + count - 1):\n        return \"invalid\", {}\n\n    propagation.calculateInitialLines()\n\n    if propagation.active:\n        if commit.getFileSHA1(file.path) != review.branch.getHead(db).getFileSHA1(file.path):\n            return \"transferred\", {}\n        else:\n            return \"clean\", {}\n    else:\n        addressed_by = propagation.addressed_by[0]\n\n        return \"modified\", { \"parent_sha1\": addressed_by.parent.sha1,\n                             \"child_sha1\": addressed_by.child.sha1,\n                             \"offset\": addressed_by.location.first_line }\n\ndef propagateCommentChains(db, user, review, commits, replayed_rebases={}):\n    import reviewing.comment.propagate\n\n    cursor 
= db.cursor()\n    cursor.execute(\"\"\"SELECT id, uid, type, state, file\n                        FROM commentchains\n                       WHERE review=%s\n                         AND file IS NOT NULL\"\"\",\n                   (review.id,))\n\n    chains_by_file = {}\n\n    for chain_id, chain_user_id, chain_type, chain_state, file_id in cursor:\n        chains_by_file.setdefault(file_id, {})[chain_id] = (chain_user_id, chain_type, chain_state)\n\n    commentchainlines_values = []\n    addressed_values = []\n\n    for file_id, chains in chains_by_file.items():\n        file_path = dbutils.describe_file(db, file_id)\n        file_sha1 = review.branch.getHead(db).getFileSHA1(file_path)\n\n        cursor.execute(\"\"\"SELECT chain, first_line, last_line\n                            FROM commentchainlines\n                           WHERE chain=ANY (%s)\n                             AND sha1=%s\"\"\",\n                       (chains.keys(), file_sha1))\n\n        for chain_id, first_line, last_line in cursor:\n            assert len(commits.getHeads()) == 1\n\n            head = commits.getHeads().pop()\n\n            if head in replayed_rebases:\n                head = replayed_rebases[head]\n\n            propagation = reviewing.comment.propagate.Propagation(db)\n            propagation.setExisting(review, chain_id, review.branch.getHead(db), file_id, first_line, last_line)\n            propagation.calculateAdditionalLines(commits, head)\n\n            chain_user_id, chain_type, chain_state = chains[chain_id]\n            lines_state = \"draft\" if chain_state == \"draft\" else \"current\"\n\n            for sha1, (first_line, last_line) in propagation.new_lines.items():\n                commentchainlines_values.append((chain_id, chain_user_id, lines_state, sha1, first_line, last_line))\n\n            if chain_type == \"issue\" and chain_state in (\"open\", \"draft\") and not propagation.active:\n                
addressed_values.append((propagation.addressed_by[0].child.getId(db), chain_id))\n\n    cursor.executemany(\"\"\"INSERT INTO commentchainlines (chain, uid, state, sha1, first_line, last_line)\n                          VALUES (%s, %s, %s, %s, %s, %s)\"\"\",\n                       commentchainlines_values)\n\n    if addressed_values:\n        cursor.executemany(\"UPDATE commentchains SET state='addressed', addressed_by=%s WHERE id=%s AND state='open'\", addressed_values)\n        cursor.executemany(\"UPDATE commentchains SET addressed_by=%s WHERE id=%s AND state='draft'\", addressed_values)\n\n        print \"Addressed issues:\"\n        for commit_id, chain_id in addressed_values:\n            chain = CommentChain.fromId(db, chain_id, user, review=review)\n            if chain.state == 'addressed':\n                chain.loadComments(db, user)\n                title = \"  %s: \" % chain.title(False)\n                print \"%s%s\" % (title, chain.leader(max_length=80 - len(title), text=True))\n"
  },
  {
    "path": "src/reviewing/comment/propagate.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport diff\n\nfrom changeset.utils import createChangeset\n\nFORWARD  = 1\nBACKWARD = 2\n\nclass Location:\n    def __init__(self, first_line, last_line, active=True):\n        self.first_line = first_line\n        self.last_line = last_line\n        self.active = active\n\n    def copy(self):\n        return Location(self.first_line, self.last_line, self.active)\n\n    def __iadd__(self, delta):\n        self.first_line += delta\n        self.last_line += delta\n        return self\n\n    def __len__(self):\n        return 2\n\n    def __getitem__(self, index):\n        if index == 0: return self.first_line\n        elif index == 1: return self.last_line\n        else: raise IndexError\n\n    def __eq__(self, other):\n        return tuple(self) == tuple(other)\n\n    def apply(self, changes, direction):\n        \"\"\"\n        Apply a set of changes and adjust the location accordingly.\n\n        Process a set of changes in the form of a list of objects with the\n        attributes delete_offset, delete_count, insert_offset and insert_count\n        (such as diff.Chunk objects) sorted on ascending offsets.  
If any of the\n        changes overlap this location, the location's 'active' attribute is set\n        to False, otherwise the 'first_line' and 'last_line' attributes are\n        adjusted to keep the location referencing the same lines.\n\n        If the 'direction' argument is FORWARD, this location is interpreted as\n        a location in the old version (before the changes) and is adjusted to a\n        location in the new version (after the changes.)  If the argument is\n        BACKWARD, this location is interpreted as a location in the new version\n        (after the changes) and is adjusted to a location in the old version\n        (before the changes.)\n\n        Returns True if the location is still active.\n        \"\"\"\n\n        delta = 0\n\n        # The only difference between the two loops is that uses of\n        # delete_offset/delete_count and insert_offset/insert_count are\n        # mirrored.\n        if direction == FORWARD:\n            for change in changes:\n                if change.delete_offset + change.delete_count <= self.first_line:\n                    # Change is before (and does not overlap) the location.\n                    delta += change.insert_count - change.delete_count\n                elif change.delete_offset <= self.last_line:\n                    # Change overlaps the location.\n                    self.active = False\n                    break\n                else:\n                    # Change is after the location, meaning, since changes come\n                    # in ascending offset order, that all other changes are also\n                    # after the location.\n                    break\n        else:\n            for change in changes:\n                if change.insert_offset + change.insert_count <= self.first_line:\n                    # Change is before (and does not overlap) the location.\n                    delta += change.delete_count - change.insert_count\n                elif change.insert_offset <= 
self.last_line:\n                    # Change overlaps the location.\n                    self.active = False\n                    break\n                else:\n                    # Change is after the location, meaning, since changes come\n                    # in ascending offset order, that all other changes are also\n                    # after the location.\n                    break\n\n        # Apply 'delta' to the location if it's still active.\n        if self.active: self += delta\n\n        return self.active\n\nclass AddressedBy(object):\n    def __init__(self, parent, child, location):\n        self.parent = parent\n        self.child = child\n        self.location = location\n\nclass Propagation:\n    def __init__(self, db):\n        self.db = db\n        self.review = None\n        self.head = None\n        self.rebases = None\n        self.initial_commit = None\n        self.file_path = None\n        self.file_id = None\n        self.location = None\n        self.active = None\n        self.all_lines = None\n        self.new_lines = None\n\n    def setCustom(self, review, commit, file_id, first_line, last_line):\n        \"\"\"\n        Initialize for propagation of a custom location.\n\n        This mode of operation is used to propagate a new comment chain to all\n        relevant commits currently part of the review.\n\n        Returns False if creating a comment at the specified location is not\n        supported, typically because the commit is not being reviewed in the\n        review.\n        \"\"\"\n\n        assert first_line > 0\n        assert last_line >= first_line\n\n        if not review.containsCommit(self.db, commit, True):\n            return False\n\n        self.review = review\n        self.rebases = review.getReviewRebases(self.db)\n        self.initial_commit = commit\n        self.addressed_by = []\n        self.file_path = dbutils.describe_file(self.db, file_id)\n        self.file_id = file_id\n        self.location 
= Location(first_line, last_line)\n        self.active = True\n\n        file_entry = commit.getFileEntry(self.file_path)\n\n        if file_entry is None:\n            # File doesn't exist (in the given commit.)\n            return False\n\n        diff_file = diff.File(new_sha1=file_entry.sha1,\n                              new_mode=file_entry.mode,\n                              repository=review.repository)\n        diff_file.loadNewLines()\n\n        if last_line > diff_file.newCount():\n            # Range of lines is out of bounds.\n            return False\n\n        self.all_lines = { file_entry.sha1: (first_line, last_line) }\n        self.new_lines = { file_entry.sha1: (first_line, last_line) }\n\n        return True\n\n    def setExisting(self, review, chain_id, commit, file_id, first_line, last_line, reopening=False):\n        \"\"\"\n        Initialize for propagation of an existing comment chain.\n\n        This initializes the location to where the comment chain is located in\n        the most recent commit in the review.  
If the comment chain is not\n        present in the most recent commit in the review, this function returns\n        False.\n\n        This mode of operation is used to update existing comment chains when\n        adding new commits to a review.\n        \"\"\"\n\n        self.review = review\n        self.rebases = review.getReviewRebases(self.db)\n        self.initial_commit = commit\n        self.addressed_by = []\n        self.file_path = dbutils.describe_file(self.db, file_id)\n        self.file_id = file_id\n        self.location = Location(first_line, last_line)\n        self.active = True\n        self.all_lines = {}\n        self.new_lines = {}\n\n        cursor = self.db.cursor()\n        cursor.execute(\"\"\"SELECT sha1, first_line, last_line\n                            FROM commentchainlines\n                           WHERE chain=%s\"\"\",\n                       (chain_id,))\n\n        for file_sha1, first_line, last_line in cursor:\n            self.all_lines[file_sha1] = (first_line, last_line)\n\n        if reopening:\n            self.__setLines(commit.getFileSHA1(self.file_path), self.location)\n\n        return True\n\n    def calculateInitialLines(self):\n        \"\"\"\n        Calculate the initial set of line mappings for a comment chain.\n\n        Propagates the initial location both backward and forward through all\n        current commits in the review.  If, through forward propagation, the\n        location becomes inactive, the 'active' attribute is set to False.  
In\n        any case, the 'all_lines' attribute will map each file SHA-1 to a pair of\n        line numbers (first_line, last_line) for each location found during the\n        propagation.\n\n        Returns the value of the 'active' attribute.\n        \"\"\"\n\n        self.head = self.review.branch.getHead(self.db)\n\n        self.__propagate(self.review.getCommitSet(self.db))\n        return self.active\n\n    def calculateAdditionalLines(self, commits, head):\n        \"\"\"\n        Calculate an additional set of line mappings when adding new commits.\n\n        If this propagation object is not active (because the comment chain\n        it represents is not present in the most recent commit in the review)\n        then nothing happens.\n\n        Returns the value of the 'active' attribute.\n        \"\"\"\n\n        self.head = head\n\n        self.__propagate(commits)\n        return self.active\n\n    def __propagate(self, commits):\n        cursor = self.db.cursor()\n\n        def propagateBackward(commit, location, processed):\n            parents = commits.getParents(commit)\n            recurse = []\n\n            if not parents:\n                rebase = self.rebases.fromNewHead(commit)\n                if rebase:\n                    parents.add(rebase.old_head)\n                else:\n                    for parent_sha1 in commit.parents:\n                        rebase = self.rebases.fromNewHead(parent_sha1)\n                        if rebase:\n                            parents.add(rebase.old_head)\n\n            for parent in parents - processed:\n                changes, removed, added = self.__getChanges(parent, commit)\n                if added:\n                    pass\n                elif changes:\n                    parent_location = location.copy()\n                    if parent_location.apply(changes, BACKWARD):\n                        file_sha1 = parent.getFileSHA1(self.file_path)\n                        assert file_sha1\n                
        self.__setLines(file_sha1, parent_location)\n                        recurse.append((parent, parent_location))\n                else:\n                    recurse.append((parent, location))\n\n            processed.add(commit)\n\n            for parent, parent_location in recurse:\n                propagateBackward(parent, parent_location, processed)\n\n        def propagateForward(commit, location, processed):\n            if commit == self.head:\n                self.active = True\n\n            children = commits.getChildren(commit)\n            recurse = []\n\n            if not children:\n                rebase = self.rebases.fromOldHead(commit)\n                if rebase:\n                    children.update([rebase.new_head])\n\n            if not children:\n                assert not commits or commit in commits.getHeads() or self.rebases.fromNewHead(commit)\n\n            for child in children - processed:\n                changes, removed, added = self.__getChanges(commit, child)\n                if removed:\n                    self.addressed_by.append(AddressedBy(commit, child, location))\n                elif changes:\n                    child_location = location.copy()\n                    if child_location.apply(changes, FORWARD):\n                        file_sha1 = child.getFileSHA1(self.file_path)\n                        assert file_sha1\n                        self.__setLines(file_sha1, child_location)\n                        recurse.append((child, child_location))\n                    else:\n                        self.addressed_by.append(AddressedBy(commit, child, location))\n                else:\n                    recurse.append((child, location))\n\n            processed.add(commit)\n\n            for child, child_location in recurse:\n                propagateForward(child, child_location, processed)\n\n            # If we started propagation in the middle of, or at the end of, the\n            # commit-set, this call does 
the main backward propagation.  After\n            # that, it will do extra backward propagation via other parents of\n            # merge commits encountered during forward propagation.\n            #\n            # For non-merge commits, 'processed' will always contain the single\n            # parent of 'commit', and propagateBackward() will find no parent\n            # commits to process, leaving this call a no-op.\n            propagateBackward(commit, location, processed)\n\n        # Will be set to True again if propagation reaches the head of the\n        # commit-set.\n        self.active = False\n\n        propagateForward(self.initial_commit, self.location, set())\n\n    def __getChanges(self, from_commit, to_commit):\n        changesets = createChangeset(self.db,\n                                     user=None,\n                                     repository=self.review.repository,\n                                     from_commit=from_commit,\n                                     to_commit=to_commit,\n                                     filtered_file_ids=set([self.file_id]),\n                                     do_highlight=False)\n\n        assert len(changesets) == 1\n\n        if changesets[0].files:\n            changed_file = changesets[0].files[0]\n            assert changed_file.id == self.file_id\n            removed = changed_file.new_sha1 == \"0\" * 40\n            added = changed_file.old_sha1 == \"0\" * 40\n            return changesets[0].files[0].chunks, removed, added\n        else:\n            return None, False, False\n\n    def __setLines(self, file_sha1, lines):\n        if file_sha1 not in self.all_lines:\n            self.all_lines[file_sha1] = self.new_lines[file_sha1] = tuple(lines)\n        else:\n            assert self.all_lines[file_sha1] == tuple(lines)\n"
  },
  {
    "path": "src/reviewing/filters.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport time\nimport re\n\nclass PatternError(Exception):\n    def __init__(self, pattern, message):\n        self.pattern = pattern\n        self.message = message\n\n    def __str__(self):\n        return \"%s: %s\" % (self.pattern, self.message)\n\ndef sanitizePath(path):\n    return re.sub(\"//+\", \"/\", path.strip().lstrip(\"/\")) or \"/\"\n\ndef validatePattern(pattern):\n    if re.search(r\"[^/]\\*\\*\", pattern):\n        raise PatternError(pattern, \"** not at beginning of path or path component\")\n    elif re.search(r\"\\*\\*$\", pattern):\n        raise PatternError(pattern, \"** at end of path\")\n    elif re.search(r\"\\*\\*[^/]\", pattern):\n        raise PatternError(pattern, \"** not at end of path component\")\n\ndef validPattern(pattern):\n    try:\n        validatePattern(pattern)\n        return True\n    except PatternError:\n        return False\n\ndef compilePattern(pattern):\n    wildcards = { \"**/\": \"(?:[^/]+/)*\",\n                  \"**\": \"(?:[^/]+(?:/|$))*\",\n                  \"*\": \"[^/]*\",\n                  \"?\": \"[^/]\" }\n\n    def escape(match):\n        return \"\\\\\" + match.group(0)\n\n    def replacement(match):\n        return wildcards[match.group(0)]\n\n    pattern = re.sub(r\"[[{()+^$.\\\\|]\", escape, pattern)\n\n    return 
re.compile(\"^\" + re.sub(\"\\\\*\\\\*(?:/|$)|\\\\*|\\\\?\", replacement, pattern) + \"$\")\n\ndef hasWildcard(string):\n    return \"*\" in string or \"?\" in string\n\nclass Path(object):\n    def __init__(self, path):\n        path = path.lstrip(\"/\")\n\n        self.path = path\n\n        if hasWildcard(path):\n            validatePattern(path)\n\n        if path.endswith(\"/\"):\n            self.regexp = compilePattern(path + \"**/*\")\n        else:\n            self.regexp = compilePattern(path)\n\n        if not path:\n            self.dirname, self.filename = \"\", None\n        elif \"/\" in path:\n            self.dirname, self.filename = path.rsplit(\"/\", 1)\n            if not self.filename:\n                self.filename = None\n        else:\n            self.dirname, self.filename = \"\", path\n\n        if hasWildcard(self.dirname):\n            components = self.dirname.split(\"/\")\n            for index, component in enumerate(components):\n                if hasWildcard(component):\n                    self.fixedDirname = \"/\".join(components[:index])\n                    self.wildDirname = \"/\".join(components[index:])\n                    self.wildDirnameRegExp = compilePattern(self.wildDirname)\n                    break\n        else:\n            self.fixedDirname = self.dirname\n            self.wildDirname = None\n\n        if self.filename and hasWildcard(self.filename):\n            self.filenameRegExp = compilePattern(self.filename)\n        else:\n            self.filenameRegExp = None\n\n    def __repr__(self):\n        return \"Path(%r)\" % self.path\n\n    def match(self, path):\n        return bool(self.regexp.match(path))\n\n    @staticmethod\n    def cmp(pathA, pathB):\n        # Filters that select individual files rank above filters that\n        # select directories (even if the actual name of the file contains\n        # wildcards.)\n        if pathA.endswith(\"/\") and not pathB.endswith(\"/\"):\n            return 
-1\n        elif not pathA.endswith(\"/\") and pathB.endswith(\"/\"):\n            return 1\n\n        # Filters with more slashes in them rank higher than filters with fewer\n        # slashes (but \"**/\" doesn't count as a slash, since it might match zero\n        # slashes in practice.)\n        specificityA = pathA.count(\"/\") - len(re.findall(r\"\\*\\*/\", pathA))\n        specificityB = pathB.count(\"/\") - len(re.findall(r\"\\*\\*/\", pathB))\n        if specificityA < specificityB:\n            return -1\n        elif specificityA > specificityB:\n            return 1\n\n        # Filters with fewer wildcards in them rank higher than filters with\n        # more wildcards.\n        wildcardsA = len(re.findall(\"\\\\*\\\\*|\\\\*|\\\\?\", pathA))\n        wildcardsB = len(re.findall(\"\\\\*\\\\*|\\\\*|\\\\?\", pathB))\n        if wildcardsA < wildcardsB:\n            return 1\n        elif wildcardsA > wildcardsB:\n            return -1\n\n        # Fall back to lexicographical ordering.  
The filters probably won't\n        # match the same files anyway, and if they do, well, at least this\n        # way it's stable and predictable.\n        return cmp(pathA, pathB)\n\nclass Filters:\n    def __init__(self):\n        # Pseudo-types:\n        #   data: dict(user_id -> tuple(filter_type, delegate))\n        #   file: tuple(file_id, data)\n        #   tree: tuple(dict(dirname -> tree), dict(filename -> file))\n\n        self.files = {}          # dict(path -> file)\n        self.directories = {}    # dict(dirname -> tree)\n        self.root = ({}, {})     # tree\n        self.data = {}           # dict(file_id -> data)\n        self.active_filters = {} # dict(user_id -> set(filter_id))\n        self.matched_files = {}  # dict(filter_id -> list(file_id))\n\n        # Note: The same per-file 'data' objects are referenced by all of\n        # 'self.files', 'self.root' and 'self.data'.\n\n        self.directories[\"\"] = self.root\n\n    def setFiles(self, db, file_ids=None, review=None):\n        assert (file_ids is None) != (review is None)\n\n        cursor = db.cursor()\n\n        if file_ids is None:\n            cursor.execute(\"SELECT DISTINCT file FROM reviewfiles WHERE review=%s\", (review.id,))\n            file_ids = [file_id for (file_id,) in cursor]\n\n        cursor.execute(\"SELECT id, path FROM files WHERE id=ANY (%s)\", (file_ids,))\n\n        for file_id, path in cursor:\n            data = {}\n\n            self.files[path] = (file_id, data)\n            self.data[file_id] = data\n\n            if \"/\" in path:\n                dirname, filename = path.rsplit(\"/\", 1)\n\n                def find_tree(dirname):\n                    tree = self.directories.get(dirname)\n                    if tree:\n                        return tree\n                    tree = self.directories[dirname] = ({}, {})\n                    if \"/\" in dirname:\n                        dirname, basename = dirname.rsplit(\"/\", 1)\n                        
find_tree(dirname)[0][basename] = tree\n                    else:\n                        self.root[0][dirname] = tree\n                    return tree\n\n                tree = find_tree(dirname)\n            else:\n                filename = path\n                tree = self.root\n\n            tree[1][filename] = self.files[path]\n\n    def addFilter(self, user_id, path, filter_type, delegate, filter_id):\n        def files_in_tree(components, tree):\n            for dirname, child_tree in tree[0].items():\n                for f in files_in_tree(components + [dirname], child_tree):\n                    yield f\n            dirname = \"/\".join(components) + \"/\" if components else \"\"\n            for filename, (file_id, data) in tree[1].items():\n                yield dirname, filename, file_id, data\n\n        if not path:\n            dirname, filename = \"\", None\n            components = []\n        elif \"/\" in path:\n            dirname, filename = path.rsplit(\"/\", 1)\n            if not dirname:\n                dirname = \"\"\n                components = []\n            else:\n                components = dirname.split(\"/\")\n            if not filename:\n                filename = None\n        else:\n            dirname, filename = \"\", path\n            components = []\n\n        def hasWildcard(string):\n            return \"*\" in string or \"?\" in string\n\n        file_ids = []\n        files = []\n\n        if hasWildcard(path):\n            tree = self.root\n\n            for index, component in enumerate(components):\n                if hasWildcard(component):\n                    wild_dirname = \"/\".join(components[index:]) + \"/\"\n                    break\n                else:\n                    tree = tree[0].get(component)\n                    if not tree:\n                        return\n            else:\n                wild_dirname = None\n\n            re_filename = compilePattern(filename or \"*\")\n\n            if 
wild_dirname:\n                re_dirname = compilePattern(wild_dirname)\n\n                for dirname, filename, file_id, data in files_in_tree([], tree):\n                    if re_dirname.match(dirname) and re_filename.match(filename):\n                        file_ids.append(file_id)\n                        files.append(data)\n            else:\n                for filename, (file_id, data) in tree[1].items():\n                    if re_filename.match(filename):\n                        file_ids.append(file_id)\n                        files.append(data)\n        else:\n            if filename:\n                if path in self.files:\n                    file_id, data = self.files[path]\n                    file_ids.append(file_id)\n                    files.append(data)\n                else:\n                    return\n            else:\n                if dirname in self.directories:\n                    for _, _, file_id, data in files_in_tree([dirname], self.directories[dirname]):\n                        file_ids.append(file_id)\n                        files.append(data)\n                else:\n                    return\n\n        self.matched_files[filter_id] = file_ids\n\n        if filter_type == \"ignored\":\n            for data in files:\n                if user_id in data:\n                    del data[user_id]\n        elif filter_type in (\"reviewer\", \"watcher\"):\n            if files:\n                self.active_filters.setdefault(user_id, set()).add(filter_id)\n                for data in files:\n                    data[user_id] = (filter_type, delegate)\n\n    def addFilters(self, filters):\n        def compareFilters(filterA, filterB):\n            return Path.cmp(filterA[1], filterB[1])\n\n        def add_filter_id(filter_data):\n            if len(filter_data) == 4:\n                return tuple(filter_data) + (None,)\n            return filter_data\n\n        sorted_filters = sorted(map(add_filter_id, filters), 
cmp=compareFilters)\n\n        for user_id, path, filter_type, delegate, filter_id in sorted_filters:\n            self.addFilter(user_id, path, filter_type, delegate, filter_id)\n\n    class Review:\n        def __init__(self, review_id, applyfilters, applyparentfilters, repository):\n            self.id = review_id\n            self.applyfilters = applyfilters\n            self.applyparentfilters = applyparentfilters\n            self.repository = repository\n\n    def load(self, db, repository=None, review=None, recursive=False, user=None,\n             added_review_filters=[], removed_review_filters=[]):\n        assert (repository is None) != (review is None)\n\n        cursor = db.cursor()\n\n        if user is not None: user_filter = \" AND uid=%d\" % user.id\n        else: user_filter = \"\"\n\n        def loadGlobal(repository, recursive):\n            if recursive and repository.parent:\n                loadGlobal(repository.parent, recursive)\n\n            cursor.execute(\"\"\"SELECT filters.uid, filters.path, filters.type, filters.delegate, filters.id\n                                FROM filters\n                                JOIN users ON (users.id=filters.uid)\n                               WHERE filters.repository=%%s\n                                 AND users.status!='retired'\n                                     %s\"\"\" % user_filter,\n                           (repository.id,))\n            self.addFilters(cursor)\n\n        def loadReview(review):\n            cursor.execute(\"\"\"SELECT reviewfilters.uid, reviewfilters.path, reviewfilters.type, NULL\n                                FROM reviewfilters\n                                JOIN users ON (users.id=reviewfilters.uid)\n                               WHERE reviewfilters.review=%%s\n                                 AND users.status!='retired'\n                                     %s\"\"\" % user_filter,\n                           (review.id,))\n            if added_review_filters 
or removed_review_filters:\n                review_filters = set(cursor.fetchall())\n                review_filters -= set(map(tuple, removed_review_filters))\n                review_filters |= set(map(tuple, added_review_filters))\n                self.addFilters(list(review_filters))\n            else:\n                self.addFilters(cursor)\n\n        if review:\n            if review.applyfilters:\n                loadGlobal(review.repository, review.applyparentfilters)\n            loadReview(review)\n        else:\n            loadGlobal(repository, recursive)\n\n    def getUserFileAssociation(self, user_id, file_id):\n        user_id = int(user_id)\n        file_id = int(file_id)\n\n        data = self.data.get(file_id)\n        if not data:\n            return None\n\n        data = data.get(user_id)\n        if not data:\n            return None\n\n        return data[0]\n\n    def isReviewer(self, user_id, file_id):\n        return self.getUserFileAssociation(user_id, file_id) == 'reviewer'\n\n    def isWatcher(self, user_id, file_id):\n        return self.getUserFileAssociation(user_id, file_id) == 'watcher'\n\n    def isRelevant(self, user_id, file_id):\n        return self.getUserFileAssociation(user_id, file_id) in ('reviewer', 'watcher')\n\n    def listUsers(self, file_id):\n        return self.data.get(file_id, {})\n\n    def getRelevantFiles(self):\n        relevant = {}\n\n        for file_id, data in self.data.items():\n            for user_id, (filter_type, _) in data.items():\n                if filter_type in ('reviewer', 'watcher'):\n                    relevant.setdefault(user_id, set()).add(file_id)\n\n        return relevant\n\n    def getActiveFilters(self, user):\n        return self.active_filters.get(user.id, set())\n\ndef getMatchedFiles(repository, paths):\n    paths = [Path(path) for path in sorted(paths, cmp=Path.cmp, reverse=True)]\n\n    common_fixedDirname = None\n    for path in paths:\n        if path.fixedDirname is None:\n  
          common_fixedDirname = []\n            break\n        elif common_fixedDirname is None:\n            common_fixedDirname = path.fixedDirname.split(\"/\")\n        else:\n            for index, component in enumerate(path.fixedDirname.split(\"/\")):\n                if index == len(common_fixedDirname):\n                    break\n                elif common_fixedDirname[index] != component:\n                    del common_fixedDirname[index:]\n                    break\n            else:\n                del common_fixedDirname[index:]\n    common_fixedDirname = \"/\".join(common_fixedDirname)\n\n    args = [\"ls-tree\", \"-r\", \"--name-only\", \"HEAD\"]\n\n    if common_fixedDirname:\n        args.append(common_fixedDirname + \"/\")\n\n    matched = dict((path.path, []) for path in paths)\n\n    if repository.isEmpty():\n        return matched\n\n    filenames = repository.run(*args).splitlines()\n\n    if len(paths) == 1 and not paths[0].wildDirname and not paths[0].filename:\n        return { paths[0].path: filenames }\n\n    for filename in filenames:\n        for path in paths:\n            if path.match(filename):\n                matched[path.path].append(filename)\n                break\n\n    return matched\n\ndef countMatchedFiles(repository, paths):\n    matched = getMatchedFiles(repository, paths)\n    return dict((path, len(filenames)) for path, filenames in matched.items())\n"
  },
  {
    "path": "src/reviewing/html.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport htmlutils\nimport gitutils\nimport dbutils\nimport log.html\nimport diff\nimport diff.context\nimport changeset.html as changeset_html\nimport changeset.utils as changeset_utils\nimport page.utils\nimport page.showcommit\nimport linkify\n\ndef renderComments(db, target, user, chain, position, linkify):\n    repository = chain.review.repository\n\n    div_chain = target.div(\"comments %s\" % position)\n\n    for comment in chain.comments:\n        div_comment = div_chain.div(\"comment%s\" % (comment.state == \"draft\" and \" draft\" or \"\"), id=\"c%dc%d\" % (chain.id, comment.id))\n\n        div_header = div_comment.div(\"header\")\n        div_header.span(\"author\").text(\"%s <%s>\" % (comment.user.fullname, comment.user.email))\n        div_header.text(\" posted \")\n        div_header.span(\"time\").text(comment.when)\n\n        div_text = div_comment.div(\"text\", id=\"c%dtext\" % comment.id).preformatted()\n        div_text.text(comment.comment, linkify=linkify, repository=repository)\n\n    if chain.type == \"issue\" and chain.state not in (\"draft\", \"open\"):\n        div_resolution = div_chain.div(\"resolution\")\n\n        if chain.state == \"addressed\":\n            div_resolution.text(\"Addressed by \").a(href=\"showcommit?review=%d&sha1=%s\" % (chain.review.id, 
chain.addressed_by.sha1)).text(chain.addressed_by.sha1[:8])\n\n            if chain.closed_by:\n                div_resolution.text(\" (by %s)\" % chain.closed_by.fullname)\n        else:\n            div_resolution.text(\"Resolved by \" + chain.closed_by.fullname)\n\n    div_buttons = div_chain.div(\"buttons\")\n\n    if (chain.state == \"closed\" or chain.addressed_by) and chain.type == \"issue\":\n        div_buttons.button(\"reopen\", onclick=\"commentChainById[%d].reopen(true);\" % chain.id).text(\"Reopen Issue\")\n\n    if chain.type == \"issue\":\n        if chain.state == \"open\":\n            if not chain.type_is_draft:\n                div_buttons.button(\"resolve\", onclick=\"commentChainById[%d].resolve(null);\" % chain.id).text(\"Resolve Issue\")\n        if chain.type_is_draft or user.getPreference(db, \"ui.convertIssueToNote\"):\n            div_buttons.button(\"morph\", onclick=\"commentChainById[%d].morph(null, this);\" % chain.id).text(\"Convert %sto Note\" % (\"back \" if chain.type_is_draft else \"\"))\n    else:\n        div_buttons.button(\"morph\", onclick=\"commentChainById[%d].morph(null, this);\" % chain.id).text(\"Convert %sto Issue\" % (\"back \" if chain.type_is_draft else \"\"))\n\n    if chain.comments[-1].state == \"draft\":\n        div_buttons.button(\"edit\", onclick=\"commentChainById[%d].editComment(commentChainById[%d].comments[%d], null);\" % (chain.id, chain.id, len(chain.comments) - 1)).text(\"Edit\")\n        div_buttons.button(\"delete\", onclick=\"commentChainById[%d].deleteComment(commentChainById[%d].comments[%d], null);\" % (chain.id, chain.id, len(chain.comments) - 1)).text(\"Delete\")\n        reply_hidden = \" hidden\"\n    else:\n        reply_hidden = \"\"\n\n    div_buttons.button(\"reply\" + reply_hidden, onclick=\"commentChainById[%d].reply(null);\" % chain.id).text(\"Reply\")\n\n    div_buttons.span(\"buttonscope buttonscope-comment\")\n\ndef getCodeCommentChainChangeset(db, chain, original=False):\n    if 
(chain.state != \"addressed\" or original) and chain.first_commit == chain.last_commit:\n        # Comment against a single version of the file, not against a diff.\n        return None, None\n    elif chain.state == \"addressed\" and not original:\n        parent = gitutils.Commit.fromSHA1(db, chain.review.repository, chain.addressed_by.parents[0])\n        child = chain.addressed_by\n    else:\n        parent = chain.first_commit\n        child = chain.last_commit\n\n    return parent, child\n\ndef renderCodeCommentChain(db, target, user, review, chain, context_lines=3, compact=False, tabify=False, original=False, changeset=None, linkify=False):\n    repository = review.repository\n\n    old_sha1 = None\n    new_sha1 = None\n\n    old = 1\n    new = 2\n\n    cursor = db.cursor()\n\n    file_id = chain.file_id\n    file_path = dbutils.describe_file(db, file_id)\n\n    if (chain.state != \"addressed\" or original) and chain.first_commit == chain.last_commit:\n        sha1 = chain.first_commit.getFileSHA1(file_path)\n\n        cursor.execute(\"SELECT first_line, last_line FROM commentchainlines WHERE chain=%s AND sha1=%s\", (chain.id, sha1))\n        first_line, last_line = cursor.fetchone()\n\n        file = diff.File(file_id, file_path, sha1, sha1, review.repository, chunks=[])\n        file.loadNewLines(True)\n\n        start = max(1, first_line - context_lines)\n        end = min(file.newCount(), last_line + context_lines)\n        count = end + 1 - start\n\n        lines = file.newLines(True)\n        lines = [diff.Line(diff.Line.CONTEXT, start + index, lines[start + index - 1], start + index, lines[start + index - 1]) for index in range(count)]\n\n        file.macro_chunks = [diff.MacroChunk([], lines)]\n\n        use = new\n        display_type = \"new\"\n        commit_url_component = \"sha1=%s\" % chain.first_commit.sha1\n    else:\n        if chain.state == \"addressed\" and not original and review.containsCommit(db, chain.addressed_by):\n            
parent = gitutils.Commit.fromSHA1(db, review.repository, chain.addressed_by.parents[0])\n            child = chain.addressed_by\n            use = old\n        else:\n            parent = chain.first_commit\n            child = chain.last_commit\n\n            if parent == child:\n                if chain.origin == \"old\":\n                    cursor.execute(\"\"\"SELECT changesets.child\n                                        FROM changesets, reviewchangesets\n                                       WHERE changesets.parent=%s\n                                         AND reviewchangesets.changeset=changesets.id\n                                         AND reviewchangesets.review=%s\"\"\",\n                                   [child.getId(db), review.id])\n\n                    try:\n                        child = gitutils.Commit.fromId(db, repository, cursor.fetchone()[0])\n                    except:\n                        parent = gitutils.Commit.fromSHA1(db, repository, child.parents[0])\n                else:\n                    parent = gitutils.Commit.fromSHA1(db, repository, child.parents[0])\n\n            if chain.origin == \"old\": use = old\n            else: use = new\n\n        if parent.sha1 in child.parents and len(child.parents) == 1:\n            commit = child\n            from_commit = None\n            to_commit = None\n        else:\n            commit = None\n            from_commit = parent\n            to_commit = child\n\n        if changeset:\n            assert ((changeset.parent == from_commit and changeset.child == to_commit)\n                    if commit is None else\n                    (changeset.parent.sha1 == commit.parents[0] and changeset.child == commit))\n            assert changeset.getFile(file_id)\n        else:\n            changeset = changeset_utils.createChangeset(db, user, repository, commit=commit, from_commit=from_commit, to_commit=to_commit, filtered_file_ids=set((file_id,)))[0]\n\n        file = 
changeset.getFile(file_id)\n\n        if not file:\n            if chain.state == \"addressed\" and not original:\n                renderCodeCommentChain(db, target, user, review, chain, context_lines, compact, tabify, original=True)\n                return\n            else:\n                # A bare 'raise' here had no active exception to re-raise;\n                # raise an explicit error instead.\n                raise Exception(\"file not present in changeset: %s\" % file_path)\n\n        # Commit so that the diff and its analysis, written to the database by createChangeset(),\n        # can be reused later.\n        db.commit()\n\n        old_sha1 = file.old_sha1\n        new_sha1 = file.new_sha1\n\n        if use == old and old_sha1 == '0' * 40: use = new\n        elif use == new and new_sha1 == '0' * 40: use = old\n\n        if use == old: sha1 = old_sha1\n        else: sha1 = new_sha1\n\n        cursor.execute(\"SELECT first_line, last_line FROM commentchainlines WHERE chain=%s AND sha1=%s\", [chain.id, sha1])\n\n        first_line, last_line = cursor.fetchone()\n\n        def readChunks():\n            return [diff.Chunk(delete_offset, delete_count, insert_offset, insert_count, analysis=analysis, is_whitespace=is_whitespace)\n                    for delete_offset, delete_count, insert_offset, insert_count, analysis, is_whitespace\n                    in cursor.fetchall()]\n\n        first_context_line = first_line - context_lines\n        last_context_line = last_line + context_lines\n\n        def includeChunk(chunk):\n            if use == old: chunk_first_line, chunk_last_line = chunk.delete_offset, chunk.delete_offset + chunk.delete_count - 1\n            else: chunk_first_line, chunk_last_line = chunk.insert_offset, chunk.insert_offset + chunk.insert_count - 1\n\n            return chunk_last_line >= first_context_line and chunk_first_line <= last_context_line\n\n        def lineFilter(line):\n            if use == old:\n                linenr = line.old_offset\n                if linenr == first_context_line and line.type == diff.Line.INSERTED:\n                    return False\n            else:\n                linenr = 
line.new_offset\n                if linenr == first_context_line and line.type == diff.Line.DELETED:\n                    return False\n\n            return first_context_line <= linenr <= last_context_line\n\n        file.loadOldLines(True)\n        file.loadNewLines(True)\n\n        context = diff.context.ContextLines(file, file.chunks, [(chain, use == old)])\n        file.macro_chunks = context.getMacroChunks(context_lines, highlight=True, lineFilter=lineFilter)\n\n        # Raising a plain string is invalid; wrap the diagnostic in an Exception.\n        try: macro_chunk = file.macro_chunks[0]\n        except IndexError: raise Exception(repr((parent.sha1, child.sha1)))\n\n        display_type = \"both\"\n\n        if chain.state != \"addressed\":\n            first_line_type = macro_chunk.lines[0].type\n            if first_line_type == diff.Line.CONTEXT or (use == old and first_line_type == diff.Line.DELETED) or (use == new and first_line_type == diff.Line.INSERTED):\n                for line in macro_chunk.lines[1:]:\n                    if first_line_type != line.type:\n                        break\n                else:\n                    display_type = \"old\" if use == old else \"new\"\n\n        commit_url_component = \"from=%s&to=%s\" % (parent.sha1, child.sha1)\n\n    def renderHeaderLeft(db, target, file):\n        target.span(\"comment-chain-title\").a(href=\"/showcomment?chain=%d\" % chain.id).text(chain.title())\n\n    def renderHeaderRight(db, target, file):\n        side = use == old and \"o\" or \"n\"\n        uri = \"showcommit?%s&review=%d&file=%d#f%d%s%d\" % (commit_url_component, review.id, file.id, file.id, side, first_line)\n        target.span(\"filename\").a(href=uri).text(file.path)\n\n    def renderCommentsLocal(db, target, **kwargs):\n        if display_type == \"both\":\n            if use == old: position = \"left\"\n            else: position = \"right\"\n        else:\n            position = \"center\"\n\n        renderComments(db, target, user, chain, position, linkify)\n\n    def lineId(base):\n        return \"c%d%s\" % 
(chain.id, base)\n\n    def lineCellId(base):\n        return \"c%d%s\" % (chain.id, base)\n\n    target.addInternalScript(\"commentChainById[%d] = %s;\" % (chain.id, chain.getJSConstructor(sha1)), here=True)\n\n    changeset_html.renderFile(db, target, user, review, file, options={ \"support_expand\": False, \"display_type\": display_type, \"header_left\": renderHeaderLeft, \"header_right\": renderHeaderRight, \"content_after\": renderCommentsLocal, \"show\": True, \"expand\": True, \"line_id\": lineId, \"line_cell_id\": lineCellId, \"compact\": compact, \"tabify\": tabify, \"include_deleted\": True })\n\n    data = (chain.id, file_id, use == old and \"o\" or \"n\", first_line,\n            chain.id, file_id, use == old and \"o\" or \"n\", last_line,\n            htmlutils.jsify(chain.type), htmlutils.jsify(chain.state),\n            chain.id)\n\n    target.addInternalScript(\"\"\"$(document).ready(function ()\n  {\n    var markers = new CommentMarkers(null);\n    markers.setLines(document.getElementById('c%df%d%s%d'), document.getElementById('c%df%d%s%d'));\n    markers.setType(%s, %s);\n    commentChainById[%d].markers = markers;\n  });\"\"\" % data, here=True)\n\ndef renderReviewCommentChain(db, target, user, review, chain, linkify=False, message=None):\n    target.addInternalScript(\"commentChainById[%d] = %s;\" % (chain.id, chain.getJSConstructor()), here=True)\n\n    table = target.table(\"file show expanded first\", width=\"60%\", align=\"center\", cellspacing=0, cellpadding=0)\n\n    columns = table.colgroup()\n    columns.col(\"edge\")\n    columns.col(\"linenr\")\n    columns.col(\"line\")\n    columns.col(\"middle\")\n    columns.col(\"middle\")\n    columns.col(\"line\")\n    columns.col(\"linenr\")\n    columns.col(\"edge\")\n\n    table.thead().tr().td(\"left\", colspan=8, align=\"left\").span(\"comment-chain-title\").a(href=\"/showcomment?chain=%d\" % chain.id).text(chain.title())\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n\n    
if message:\n        row = table.tbody(\"content\").tr(\"content\")\n        row.td(colspan=2).text()\n        row.td(\"excuse\", colspan=4).innerHTML(message)\n        row.td(colspan=2).text()\n\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n\n    row = table.tbody(\"content\").tr(\"content\")\n    row.td(colspan=2).text()\n    renderComments(db, row.td(colspan=4), user, chain, \"center\", linkify)\n    row.td(colspan=2).text()\n\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n    table.tfoot().tr().td(\"left\", colspan=8, align=\"left\").text()\n\ndef renderCommitCommentChain(db, target, user, review, chain, linkify=False):\n    target.addInternalScript(\"commentChainById[%d] = %s;\" % (chain.id, chain.getJSConstructor()), here=True)\n\n    table = target.table(\"file show expanded first\", width=\"60%\", align=\"center\", cellspacing=0, cellpadding=0)\n\n    columns = table.colgroup()\n    columns.col(\"edge\")\n    columns.col(\"linenr\")\n    columns.col(\"line\")\n    columns.col(\"middle\")\n    columns.col(\"middle\")\n    columns.col(\"line\")\n    columns.col(\"linenr\")\n    columns.col(\"edge\")\n\n    table.thead().tr().td(\"left\", colspan=8, align=\"left\").span(\"comment-chain-title\").a(href=\"/showcomment?chain=%d\" % chain.id).text(chain.title())\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n\n    row = table.tbody(\"content\").tr(\"content\")\n    row.td(colspan=2).text()\n    page.showcommit.renderCommitInfo(db, row.td(\"content\", colspan=4), user, review.repository, review, chain.first_commit, minimal=True)\n    row.td(colspan=2).text()\n\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n\n    row = table.tbody(\"content\").tr(\"content\")\n    row.td(colspan=2).text()\n    renderComments(db, row.td(colspan=4), user, chain, \"center\", linkify)\n    row.td(colspan=2).text()\n\n    table.tbody('spacer').tr('spacer').td(colspan='8').text()\n    table.tfoot().tr().td(\"left\", 
colspan=8, align=\"left\").text()\n\ndef renderCommentChain(db, target, user, review, chain, context_lines=3, compact=False, tabify=False, original=False, changeset=None, linkify=False):\n    chain.loadComments(db, user)\n\n    target.addExternalStylesheet(\"resource/changeset.css\")\n    target.addExternalStylesheet(\"resource/comment.css\")\n    target.addExternalStylesheet(\"resource/review.css\")\n    target.addExternalScript(\"resource/changeset.js\")\n    target.addExternalScript(\"resource/comment.js\")\n    target.addExternalScript(\"resource/review.js\")\n\n    target = target.div(\"comment-chain\", id=\"c%d\" % chain.id)\n\n    if chain.file_id:\n        renderCodeCommentChain(db, target, user, review, chain, context_lines, compact, tabify, original, changeset, linkify)\n    elif chain.first_commit:\n        renderCommitCommentChain(db, target, user, review, chain, linkify)\n    else:\n        renderReviewCommentChain(db, target, user, review, chain, linkify)\n"
  },
  {
    "path": "src/reviewing/mail.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nimport configuration\nimport textutils\nimport diff\nimport mailutils\nfrom mailutils import sendPendingMails, generateMessageId\n\nimport changeset.text as changeset_text\nimport changeset.utils as changeset_utils\nimport changeset.load as changeset_load\nimport log.commitset as log_commitset\n\nimport reviewing.comment as review_comment\nimport utils as review_utils\n\nimport time\n\ndef sendMail(db, review, message_id, from_user, to_user, recipients, subject, body,\n             parent_message_id=None, headers=None):\n    if headers is None:\n        headers = {}\n    else:\n        headers = headers.copy()\n\n    headers[\"OperaCritic-URL\"] = review.getURL(db, to_user, separator=\", \")\n    headers[\"OperaCritic-Association\"] = review.getUserAssociation(db, to_user)\n    headers[\"OperaCritic-Repository\"] = review.repository.getURL(db, to_user)\n\n    list_id = to_user.getPreference(db, \"email.listId\", repository=review.repository)\n    if list_id:\n        headers[\"List-Id\"] = \"<%s.%s>\" % (list_id, configuration.base.HOSTNAME)\n\n    recipients = list(recipients)\n    review_association = review.getUserAssociation(db, to_user)\n\n    if to_user.getPreference(db, \"email.enableAssociationRecipients\", repository=review.repository):\n        system_user, 
_, system_domain = configuration.base.SYSTEM_USER_EMAIL.partition(\"@\")\n        for association in review_association.split(\", \"):\n            recipients.append(mailutils.User(\n                    \"is-%(association)s <%(user)s+%(association)s@%(hostname)s>\"\n                    % { \"association\": association,\n                        \"user\": system_user,\n                        \"hostname\": system_domain }))\n\n    return mailutils.queueMail(from_user, to_user, recipients, subject, body,\n                               message_id=message_id,\n                               parent_message_id=parent_message_id,\n                               headers=headers)\n\nclass MailDisabled(Exception):\n    pass\n\ndef generateSubjectLine(db, user, review, item):\n    subject_format = user.getPreference(db, \"email.subjectLine.%s\" % item)\n\n    if not subject_format.strip():\n        raise MailDisabled\n\n    data = { \"id\": \"r/%d\" % review.id,\n             \"summary\": review.summary,\n             \"progress\": str(review.getReviewState(db)),\n             \"branch\": review.branch.name }\n\n    try:\n        return subject_format % data\n    except Exception as exception:\n        return \"%s (format: %r)\" % (str(exception), subject_format)\n\ndef getReviewMessageId(db, to_user, review, files):\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT messageid\n                        FROM reviewmessageids\n                       WHERE uid=%s\n                         AND review=%s\"\"\",\n                   (to_user.id, review.id))\n    for (message_id,) in cursor:\n        return message_id\n    else:\n        filename, message_id = sendReviewPlaceholder(\n            db, to_user, review, generateMessageId(len(files) + 1))\n        if filename:\n            files.append(filename)\n        return message_id\n\ndef renderChainInMail(db, to_user, chain, focus_comment, new_state, new_type, line_length, context_lines):\n    result = \"\"\n    hr = \"-\" 
* line_length\n    urls = to_user.getCriticURLs(db)\n    url = \"\\n\".join([\"  %s/showcomment?chain=%d\" % (url, chain.id) for url in urls])\n\n    cursor = db.cursor()\n\n    if chain.file_id:\n        path = dbutils.describe_file(db, chain.file_id)\n\n        if chain.first_commit == chain.last_commit or chain.origin == 'old':\n            entry = chain.first_commit.getFileEntry(path)\n        else:\n            entry = chain.last_commit.getFileEntry(path)\n\n        sha1 = entry.sha1\n        mode = entry.mode\n\n        first_line, count = chain.lines_by_sha1[sha1]\n\n        context = changeset_utils.getCodeContext(db, sha1, first_line, minimized=True)\n        if context: result += \"%s in %s, %s:\\n%s\\n%s\\n\" % (chain.type.capitalize(), path, context, url, hr)\n        else: result += \"%s in %s:\\n%s\\n%s\\n\" % (chain.type.capitalize(), path, url, hr)\n\n        file = diff.File(id=chain.file_id, path=path, new_mode=mode, new_sha1=sha1, repository=chain.review.repository)\n        file.loadNewLines()\n        lines = file.newLines(False)\n\n        last_line = first_line + count - 1\n        first_line = max(1, first_line - context_lines)\n        last_line = min(last_line + context_lines, len(lines))\n        width = len(str(last_line))\n\n        for offset, line in enumerate(lines[first_line - 1:last_line]):\n            result += \"%s|%s\\n\" % (str(first_line + offset).rjust(width), line)\n\n        result += hr + \"\\n\"\n    elif chain.first_commit:\n        result += \"%s in commit %s by %s:\\n%s\\n%s\\n\" % (chain.type.capitalize(), chain.first_commit.sha1[:8], chain.first_commit.author.name, url, hr)\n\n        first_line, count = chain.lines_by_sha1[chain.first_commit.sha1]\n        last_line = first_line + count - 1\n        lines = chain.first_commit.message.splitlines()\n\n        for line in lines[first_line:last_line + 1]:\n            result += \"  %s\\n\" % line\n\n        result += hr + \"\\n\"\n    else:\n        result += \"General 
%s:\\n%s\\n%s\\n\" % (chain.type, url, hr)\n\n    mode = to_user.getPreference(db, \"email.updatedReview.quotedComments\")\n\n    def formatComment(comment):\n        return \"%s at %s:\\n%s\\n\" % (comment.user.fullname, comment.when, textutils.reflow(comment.comment, line_length, indent=2))\n\n    assert not focus_comment or focus_comment == chain.comments[-1], \"focus comment (#%d) is not last in chain (#%d) as expected\" % (focus_comment.id, chain.id)\n\n    if not focus_comment or len(chain.comments) > 1:\n        if focus_comment: comments = chain.comments[:-1]\n        else: comments = chain.comments\n\n        result = \"\\n\".join([\"> \" + line for line in result.splitlines()]) + \"\\n\"\n\n        quote1 = \"\"\n        notshown = \"\"\n        quote2 = \"\"\n\n        if mode == \"first\":\n            quote1 = formatComment(comments[0])\n            if len(comments) > 1:\n                notshown = \"[%d comment%s not shown]\" % (len(comments) - 1, \"s\" if len(comments) > 2 else \"\")\n        elif mode == \"firstlast\":\n            quote1 = formatComment(comments[0])\n            if len(comments) > 2:\n                notshown = \"[%d comment%s not shown]\" % (len(comments) - 2, \"s\" if len(comments) > 3 else \"\")\n            if len(comments) > 1:\n                quote2 = formatComment(comments[-1])\n        elif mode == \"last\":\n            if len(comments) > 1:\n                notshown = \"[%d comment%s not shown]\" % (len(comments) - 1, \"s\" if len(comments) > 2 else \"\")\n            quote2 = formatComment(comments[-1])\n        else:\n            for comment in comments:\n                quote1 += formatComment(comment)\n\n        if quote1:\n            result += \"\\n\".join([\"> \" + line for line in quote1.splitlines()]) + \"\\n\"\n        if notshown:\n            result += notshown + \"\\n\"\n        if quote2:\n            result += \"\\n\".join([\"> \" + line for line in quote2.splitlines()]) + \"\\n\"\n\n        if 
focus_comment:\n            result += \"\\n\"\n\n    if focus_comment:\n        result += formatComment(focus_comment)\n\n    if new_type == \"issue\":\n        result += \"\\nCONVERTED TO ISSUE!\\n\"\n    elif new_type == \"note\":\n        result += \"\\nCONVERTED TO NOTE!\\n\"\n\n    if new_state == \"closed\":\n        result += \"\\nISSUE RESOLVED!\\n\"\n    elif new_state == \"addressed\":\n        result += \"\\nISSUE ADDRESSED!\\n\"\n    elif new_state == \"open\":\n        result += \"\\nISSUE REOPENED!\\n\"\n    elif chain.state == \"closed\":\n        result += \"\\n(This issue is resolved.)\\n\"\n    elif chain.state == \"addressed\":\n        result += \"\\n(This issue is addressed.)\\n\"\n\n    return result\n\ndef checkEmailEnabled(db, to_user):\n    \"\"\"Check whether we should send emails to the user.\"\"\"\n    if to_user.email_verified is False:\n        # Email address needs verification before use.\n        raise MailDisabled\n    if not to_user.getPreference(db, \"email.activated\"):\n        # User has requested that no emails be sent.\n        raise MailDisabled\n\ndef sendReviewCreated(db, from_user, to_user, recipients, review):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, 'newReview')\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.owner.fullname': review.owners[0].fullname,\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    body += 
\"\"\"%(review.owner.fullname)s has requested a review of the changes on the branch\n  %(review.branch.name)s\nin the repository\n  %(review.branch.repository)s\n\n\n\"\"\" % data\n\n    all_reviewers = to_user.getPreference(db, \"email.newReview.displayReviewers\")\n    all_watchers = to_user.getPreference(db, \"email.newReview.displayWatchers\")\n\n    if all_reviewers or all_watchers:\n        if all_reviewers:\n            if review.reviewers:\n                body += \"The users assigned to review the changes on the review branch are:\\n\"\n\n                for reviewer in review.reviewers:\n                    body += \"  \" + reviewer.fullname + \"\\n\"\n\n                body += \"\\n\"\n            else:\n                body += \"\"\"No reviewers have been identified for the changes in this review.  This means\nthe review is currently stuck; it cannot finish unless there are reviewers.\n\n\"\"\"\n\n        if all_watchers and review.watchers:\n            body += \"The following additional users are following the review:\\n\"\n\n            for watcher in review.watchers:\n                body += \"  \" + watcher.fullname + \"\\n\"\n\n            body += \"\\n\"\n\n        body += \"\\n\"\n\n    if review.description:\n        body += \"\"\"Description:\n%s\n\n\n\"\"\" % textutils.reflow(review.description, line_length, indent=2)\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"SELECT file, SUM(deleted), SUM(inserted)\n                        FROM fullreviewuserfiles\n                       WHERE review=%s\n                         AND assignee=%s\n                    GROUP BY file\"\"\",\n                   (review.id, to_user.id))\n    pending_files_lines = cursor.fetchall()\n\n    if pending_files_lines:\n        body += renderFiles(db, to_user, review, \"These changes were assigned to you:\", pending_files_lines, showcommit_link=True)\n\n    all_commits = to_user.getPreference(db, \"email.newReview.displayCommits\")\n\n    if all_commits:\n      
  body += \"The commits requested to be reviewed are:\\n\\n\"\n\n        contextLines = to_user.getPreference(db, \"email.newReview.diff.contextLines\")\n        diffMaxLines = to_user.getPreference(db, \"email.newReview.diff.maxLines\")\n\n        displayStats = to_user.getPreference(db, \"email.newReview.displayStats\")\n        statsMaxLines = to_user.getPreference(db, \"email.newReview.stats.maxLines\")\n\n        if contextLines < 0: contextLines = 0\n\n        # FIXME: The order here is essentially random.  We shouldn't depend on\n        # it, and reversing it doesn't make much sense...\n        commits = list(reversed(review.branch.getCommits(db)))\n\n        if diffMaxLines == 0: diffs = None\n        else:\n            diffs = {}\n            lines = 0\n\n            for commit in commits:\n                if len(commit.parents) == 1:\n                    cursor.execute(\"\"\"SELECT id\n                                        FROM reviewchangesets\n                                        JOIN changesets ON (id=changeset)\n                                       WHERE review=%s\n                                         AND child=%s\"\"\", (review.id, commit.getId(db)))\n\n                    (changeset_id,) = cursor.fetchone()\n\n                    diff = changeset_text.unified(db, changeset_load.loadChangeset(db, review.repository, changeset_id), contextLines)\n                    diffs[commit] = diff\n                    lines += diff.count(\"\\n\")\n                    if lines > diffMaxLines:\n                        diffs = None\n                        break\n\n        if not displayStats or statsMaxLines == 0: stats = None\n        else:\n            stats = {}\n            lines = 0\n\n            for commit in commits:\n                commit_stats = review.repository.run(\"show\", \"--oneline\", \"--stat\", commit.sha1).split('\\n', 1)[1]\n                stats[commit] = commit_stats\n                lines += commit_stats.count('\\n')\n           
     if lines > statsMaxLines:\n                    stats = None\n                    break\n\n        for index, commit in enumerate(commits):\n            if index > 0: body += \"\\n\\n\\n\"\n\n            body += \"\"\"Commit: %(sha1)s\nAuthor: %(author.fullname)s <%(author.email)s> at %(author.time)s\n\n%(message)s\n\"\"\" % { 'sha1': commit.sha1,\n        'author.fullname': commit.author.getFullname(db),\n        'author.email': commit.author.email,\n        'author.time': time.strftime(\"%Y-%m-%d %H:%M:%S\", commit.author.time),\n        'message': textutils.reflow(commit.message.strip(), line_length, indent=2) }\n\n            if stats and commit in stats:\n                body += \"---\\n\" + stats[commit]\n\n            if diffs and commit in diffs:\n                body += \"\\n\" + diffs[commit]\n\n    message_id = generateMessageId()\n\n    cursor.execute(\"INSERT INTO reviewmessageids (uid, review, messageid) VALUES (%s, %s, %s)\",\n                   [to_user.id, review.id, message_id])\n\n    return [sendMail(db, review, message_id, from_user,\n                     to_user, recipients, subject, body)]\n\ndef renderFiles(db, to_user, review, title, files_lines, commits=None, relevant_only=False, relevant_files=None, showcommit_link=False):\n    result = \"\"\n    if files_lines:\n        files = []\n\n        for file_id, delete_count, insert_count in files_lines:\n            if not relevant_only or file_id in relevant_files:\n                files.append((dbutils.describe_file(db, file_id), delete_count, insert_count))\n\n        if files:\n            paths = []\n            deleted = []\n            inserted = []\n\n            for path, delete_count, insert_count in sorted(files):\n                paths.append(path)\n                deleted.append(delete_count)\n                inserted.append(insert_count)\n\n            paths = diff.File.eliminateCommonPrefixes(paths, text=True)\n\n            len_paths = max(map(len, paths))\n            
len_deleted = max(map(len, map(str, deleted)))\n            len_inserted = max(map(len, map(str, inserted)))\n\n            result += title + \"\\n\"\n\n            for path, delete_count, insert_count in zip(paths, deleted, inserted):\n                if delete_count == 0 and insert_count == 0:\n                    result += \"  %s  binary file\\n\" % path.ljust(len_paths)\n                else:\n                    delete_field = delete_count > 0 and \"-%d\" % delete_count or \"\"\n                    insert_field = insert_count > 0 and \"+%d\" % insert_count or \"\"\n                    result += \"  %s  %s %s\\n\" % (path.ljust(len_paths), delete_field.rjust(len_deleted + 1), insert_field.rjust(len_inserted + 1))\n\n            if commits:\n                if len(commits) == 1:\n                    result += \"from this commit:\\n\"\n                else:\n                    result += \"from these commits:\\n\"\n\n                for commit_id in commits:\n                    commit = gitutils.Commit.fromId(db, review.repository, commit_id)\n                    result += \"  %s %s\\n\" % (commit.sha1[:8], commit.niceSummary())\n\n            if showcommit_link:\n                urls = to_user.getCriticURLs(db)\n\n                try:\n                    from_sha1, to_sha1 = showcommit_link\n                    url_format = \"  %%s/showcommit?review=%%d&from=%s&to=%s&filter=pending\\n\" % (from_sha1, to_sha1)\n                except (TypeError, ValueError):\n                    # showcommit_link may be passed as True instead of a (from, to) tuple.\n                    url_format = \"  %s/showcommit?review=%d&filter=pending\\n\"\n\n                result += \"\\nTo review all these changes:\\n\"\n                for url in urls:\n                    result += url_format % (url, review.id)\n\n            result += \"\\n\\n\"\n    return result\n\ndef sendReviewPlaceholder(db, to_user, review, message_id=None):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, 
to_user, review, 'newishReview')\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    why = \"This message is sent to you when you become associated with a review after the review was initially requested.  It is then sent instead of the regular \\\"New Review\\\" message, for the purpose of using as the reference/in-reply-to message for other messages sent about this review.\"\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.owner.fullname': review.owners[0].fullname,\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'hr': hr,\n             'why': textutils.reflow(why, line_length) }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n%(why)s\n\n\n%(hr)s\n\n\n\"\"\" % data\n\n    body += \"\"\"%(review.owner.fullname)s has requested a review of the changes on the branch\n  %(review.branch.name)s\nin the repository\n  %(review.branch.repository)s\n\n\n\"\"\" % data\n\n    all_reviewers = to_user.getPreference(db, \"email.newReview.displayReviewers\")\n    all_watchers = to_user.getPreference(db, \"email.newReview.displayWatchers\")\n\n    if all_reviewers or all_watchers:\n        if all_reviewers:\n            if review.reviewers:\n                body += \"The users assigned to review the changes on the review branch are:\\n\"\n\n                for reviewer in review.reviewers:\n                    body += \"  \" + reviewer.fullname + \"\\n\"\n\n                body += \"\\n\"\n            else:\n                body += \"\"\"No reviewers have been identified for the changes in this review.  
This means\nthe review is currently stuck; it cannot finish unless there are reviewers.\n\n\"\"\"\n\n        if all_watchers and review.watchers:\n            body += \"The following additional users are following the review:\\n\"\n\n            for watcher in review.watchers:\n                body += \"  \" + watcher.fullname + \"\\n\"\n\n            body += \"\\n\"\n\n        body += \"\\n\"\n\n    if review.description:\n        body += \"\"\"Description:\n%s\n\n\n\"\"\" % textutils.reflow(review.description, line_length, indent=2)\n\n    if message_id is None:\n        message_id = generateMessageId()\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"INSERT INTO reviewmessageids (uid, review, messageid)\n                           VALUES (%s, %s, %s)\"\"\",\n                   (to_user.id, review.id, message_id))\n\n    return (sendMail(db, review, message_id, review.owners[0], to_user, [to_user],\n                     subject, body),\n            message_id)\n\ndef sendReviewBatch(db, from_user, to_user, recipients, review, batch_id, was_accepted, is_accepted, profiler=None):\n    if profiler: profiler.check(\"generate mail: start\")\n\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, \"updatedReview.submittedChanges\")\n    except MailDisabled:\n        return []\n\n    if from_user == to_user and to_user.getPreference(db, \"email.ignoreOwnChanges\"):\n        return []\n\n    cursor = db.cursor()\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    relevant_only = to_user not in review.owners and to_user != from_user and to_user.getPreference(db, \"email.updatedReview.relevantChangesOnly\")\n\n    if relevant_only:\n        cursor.execute(\"SELECT type FROM reviewusers WHERE review=%s AND uid=%s\", (review.id, to_user.id))\n        if cursor.fetchone()[0] == 'manual': relevant_only = False\n\n    if 
profiler: profiler.check(\"generate mail: prologue\")\n\n    if relevant_only:\n        relevant_files = review.getRelevantFiles(db, to_user)\n    else:\n        relevant_files = None\n\n    if profiler: profiler.check(\"generate mail: get relevant files\")\n\n    cursor.execute(\"SELECT comment FROM batches WHERE id=%s\", [batch_id])\n    batch_chain_id = cursor.fetchone()[0]\n\n    if profiler: profiler.check(\"generate mail: batch chain\")\n\n    cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE reviewfilechanges.batch=%s\n                         AND reviewfilechanges.to_state='reviewed'\n                    GROUP BY reviewfiles.file\"\"\",\n                       (batch_id,))\n    reviewed_files_lines = cursor.fetchall()\n\n    if profiler: profiler.check(\"generate mail: reviewed files/lines\")\n\n    cursor.execute(\"\"\"SELECT DISTINCT changesets.child\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                        JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                       WHERE reviewfilechanges.batch=%s\n                         AND reviewfilechanges.to_state='reviewed'\"\"\",\n                       (batch_id,))\n    reviewed_commits = [commit_id for (commit_id,) in cursor]\n\n    if profiler: profiler.check(\"generate mail: reviewed commits\")\n\n    cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                       WHERE reviewfilechanges.batch=%s\n                         AND reviewfilechanges.to_state='pending'\n                    GROUP BY 
reviewfiles.file\"\"\",\n                   (batch_id,))\n    unreviewed_files_lines = cursor.fetchall()\n\n    if profiler: profiler.check(\"generate mail: unreviewed files/lines\")\n\n    cursor.execute(\"\"\"SELECT DISTINCT changesets.child\n                        FROM reviewfiles\n                        JOIN reviewfilechanges ON (reviewfilechanges.file=reviewfiles.id)\n                        JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                       WHERE reviewfilechanges.batch=%s\n                         AND reviewfilechanges.to_state='pending'\"\"\",\n                       (batch_id,))\n    unreviewed_commits = [commit_id for (commit_id,) in cursor]\n\n    if profiler: profiler.check(\"generate mail: unreviewed commits\")\n\n    reviewed_files = renderFiles(db, to_user, review, \"Reviewed Files:\", reviewed_files_lines, reviewed_commits, relevant_only, relevant_files)\n    unreviewed_files = renderFiles(db, to_user, review, \"Unreviewed Files:\", unreviewed_files_lines, unreviewed_commits, relevant_only, relevant_files)\n\n    if profiler: profiler.check(\"generate mail: render files\")\n\n    context_lines = to_user.getPreference(db, \"email.comment.contextLines\")\n\n    comment_ids = set()\n\n    def isRelevantComment(chain):\n        if chain.file_id is None or chain.file_id in relevant_files: return True\n\n        cursor.execute(\"SELECT 1 FROM commentchainusers WHERE chain=%s AND uid=%s\", (chain.id, to_user.id))\n        return cursor.fetchone() is not None\n\n    def fetchNewCommentChains():\n        chains = []\n        for (chain_id,) in cursor.fetchall():\n            if chain_id != batch_chain_id:\n                chain = review_comment.CommentChain.fromId(db, chain_id, from_user, review=review)\n                if not relevant_only or isRelevantComment(chain):\n                    chain.loadComments(db, from_user)\n                    chains.append((chain, None, None))\n        return chains\n\n    def 
fetchAdditionalCommentChains():\n        chains = []\n        for chain_id, comment_id, new_state, new_type in cursor.fetchall():\n            if comment_id is not None or new_state is not None or new_type is not None:\n                chain = review_comment.CommentChain.fromId(db, chain_id, from_user, review=review)\n                if not relevant_only or isRelevantComment(chain):\n                    chain.loadComments(db, from_user)\n                    chains.append((chain, new_state, new_type))\n        return chains\n\n    cursor.execute(\"SELECT id FROM commentchains WHERE batch=%s AND type='issue' ORDER BY id ASC\", [batch_id])\n    new_issues = fetchNewCommentChains()\n\n    if profiler: profiler.check(\"generate mail: new issues\")\n\n    cursor.execute(\"SELECT id FROM commentchains WHERE batch=%s AND type='note' ORDER BY id ASC\", [batch_id])\n    new_notes = fetchNewCommentChains()\n\n    if profiler: profiler.check(\"generate mail: new notes\")\n\n    cursor.execute(\"\"\"SELECT commentchains.id, comments.id, commentchainchanges.to_state, commentchainchanges.to_type\n                        FROM commentchains\n             LEFT OUTER JOIN comments ON (commentchains.id=comments.chain\n                                      AND comments.batch=%s)\n             LEFT OUTER JOIN commentchainchanges ON (commentchains.id=commentchainchanges.chain\n                                                 AND commentchainchanges.batch=%s)\n                       WHERE commentchains.review=%s\n                         AND commentchains.batch!=%s\"\"\",\n                   [batch_id, batch_id, review.id, batch_id])\n    additional_comments = fetchAdditionalCommentChains()\n\n    if profiler: profiler.check(\"generate mail: additional comments\")\n\n    if is_accepted != was_accepted and not reviewed_files and not unreviewed_files and not new_issues and not new_notes and not additional_comments:\n        return []\n\n    data = { 'review.id': review.id,\n             
'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'hr': \"-\" * line_length }\n\n    header = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    if batch_chain_id is not None:\n        batch_chain = review_comment.CommentChain.fromId(db, batch_chain_id, from_user, review=review)\n    else:\n        batch_chain = None\n\n    data[\"batch.author.fullname\"] = from_user.fullname\n\n    first_name = from_user.getFirstName()\n\n    if batch_chain is not None:\n        batch_chain.loadComments(db, from_user)\n\n        comment_ids.add(batch_chain.comments[0].id)\n\n        remark = \"\"\"%s'%s comment:\n%s\n\n\n\"\"\" % (first_name, first_name[-1] != 's' and 's' or '', textutils.reflow(batch_chain.comments[0].comment, line_length, indent=2))\n    else:\n        remark = \"\"\n\n    body = header\n    body += textutils.reflow(\"%(batch.author.fullname)s has submitted a batch of changes to the review.\" % data, line_length)\n    body += \"\\n\\n\\n\"\n    body += remark\n\n    if not was_accepted and is_accepted:\n        state_change = textutils.reflow(\"The review is now ACCEPTED!\", line_length) + \"\\n\\n\\n\"\n    elif was_accepted and not is_accepted:\n        state_change = textutils.reflow(\"The review is NO LONGER ACCEPTED!\", line_length) + \"\\n\\n\\n\"\n    else:\n        state_change = \"\"\n\n    body += state_change\n    body += reviewed_files\n    body += unreviewed_files\n\n    def renderCommentChains(chains):\n        result = \"\"\n        if chains:\n            for chain, new_state, new_type in chains:\n                for focus_comment in chain.comments:\n                    if focus_comment.batch_id == batch_id:\n                        break\n                else:\n                    focus_comment = None\n                if 
focus_comment is not None or new_state is not None or new_type is not None:\n                    result += renderChainInMail(db, to_user, chain, focus_comment, new_state, new_type, line_length, context_lines) + \"\\n\\n\"\n                if focus_comment is not None:\n                    comment_ids.add(focus_comment.id)\n        return result\n\n    body += renderCommentChains(new_issues)\n    body += renderCommentChains(new_notes)\n\n    if profiler: profiler.check(\"generate mail: render new comment chains\")\n\n    comment_threading = to_user.getPreference(db, \"email.updatedReview.commentThreading\")\n\n    send_main_mail = state_change or reviewed_files or unreviewed_files or new_issues or new_notes\n\n    if not comment_threading:\n        send_main_mail = send_main_mail or additional_comments\n        body += renderCommentChains(additional_comments)\n\n        if profiler: profiler.check(\"generate mail: render additional comments\")\n\n    review_message_id = [None]\n    files = []\n\n    def localGenerateMessageId():\n        return generateMessageId(len(files) + 1)\n\n    def localGetReviewMessageId():\n        if review_message_id[0] is None:\n            review_message_id[0] = getReviewMessageId(db, to_user, review, files)\n        return review_message_id[0]\n\n    if send_main_mail:\n        message_id = localGenerateMessageId()\n\n        cursor.executemany(\"INSERT INTO commentmessageids (uid, comment, messageid) VALUES (%s, %s, %s)\",\n                           [(to_user.id, comment_id, message_id) for comment_id in comment_ids])\n\n        files.append(sendMail(\n            db, review, message_id, from_user, to_user, recipients, subject, body,\n            parent_message_id=localGetReviewMessageId()))\n\n    if comment_threading:\n        threads = {}\n\n        for chain, new_state, new_type in additional_comments:\n            if chain.comments[-1].batch_id == batch_id:\n                parent_comment_id = chain.comments[-2].id\n            
else:\n                parent_comment_id = chain.comments[-1].id\n\n            cursor.execute(\"\"\"SELECT messageid\n                                FROM commentmessageids\n                               WHERE comment=%s\n                                 AND uid=%s\"\"\",\n                           [parent_comment_id, to_user.id])\n            row = cursor.fetchone()\n\n            if row:\n                parent_message_id = row[0]\n            else:\n                parent_message_id = localGetReviewMessageId()\n\n            threads.setdefault(parent_message_id, []).append((chain, new_state, new_type))\n\n        for parent_message_id, chains in threads.items():\n            comment_ids = set()\n\n            body = header + remark + renderCommentChains(chains)\n\n            message_id = localGenerateMessageId()\n\n            cursor.executemany(\"INSERT INTO commentmessageids (uid, comment, messageid) VALUES (%s, %s, %s)\",\n                               [(to_user.id, comment_id, message_id) for comment_id in comment_ids])\n\n            files.append(sendMail(\n                db, review, message_id, from_user, to_user, recipients, subject, body,\n                parent_message_id=parent_message_id))\n\n    if profiler: profiler.check(\"generate mail: finished\")\n\n    return files\n\ndef sendReviewAddedCommits(db, from_user, to_user, recipients, review, commits, changesets, tracked_branch=False):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, \"updatedReview.commitsPushed\")\n    except MailDisabled:\n        return []\n\n    if from_user == to_user and to_user.getPreference(db, \"email.ignoreOwnChanges\"):\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n    relevant_only = to_user not in review.owners and to_user != from_user and to_user.getPreference(db, 
\"email.updatedReview.relevantChangesOnly\")\n\n    cursor = db.cursor()\n\n    if relevant_only:\n        cursor.execute(\"SELECT type FROM reviewusers WHERE review=%s AND uid=%s\", (review.id, to_user.id))\n        if cursor.fetchone()[0] == 'manual': relevant_only = False\n\n    all_commits = dict((commit.sha1, commit) for commit in commits)\n    changeset_for_commit = {}\n\n    for changeset in changesets:\n        # We don't include diffs for merge commits in mails.\n        if len(changeset.child.parents) == 1:\n            if changeset.child in all_commits:\n                changeset_for_commit[changeset.child] = changeset\n            else:\n                # An added changeset where the child isn't part of the added\n                # commits will be a changeset between a \"replayed rebase\" commit\n                # and the new head commit, generated when doing a non-fast-\n                # forward rebase.  The relevant commit from such a changeset is\n                # the first (and only) parent.\n                changeset_for_commit[changeset.parent] = changeset\n\n    if relevant_only:\n        relevant_files = review.getRelevantFiles(db, to_user)\n        relevant_commits = set()\n\n        for changeset in changesets:\n            for file in changeset.files:\n                if file.id in relevant_files:\n                    if changeset.child in all_commits:\n                        relevant_commits.add(changeset.child)\n                    else:\n                        # \"Replayed rebase\" commit; see comment above.\n                        relevant_commits.add(all_commits[changeset.parent])\n                    break\n            else:\n                cursor.execute(\"SELECT id FROM commentchains WHERE review=%s AND state='addressed' AND addressed_by=%s\", (review.id, changeset.child.getId(db)))\n                for chain_id in cursor.fetchall():\n                    cursor.execute(\"SELECT 1 FROM commentchainusers WHERE chain=%s AND 
uid=%s\", (chain_id, to_user.id))\n                    if cursor.fetchone():\n                        relevant_commits.add(changeset.child)\n                        break\n\n        if not relevant_commits:\n            return []\n    else:\n        relevant_commits = None\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    commitset = log_commitset.CommitSet(commits)\n\n    if tracked_branch:\n        body += \"The automatic tracking of\\n  %s\\n\" % tracked_branch\n        body += textutils.reflow(\"has updated the review by pushing %sadditional commit%s to the branch\" % (\"an \" if len(commits) == 1 else \"\", \"s\" if len(commits) > 1 else \"\"), line_length)\n    else:\n        body += textutils.reflow(\"%s has updated the review by pushing %sadditional commit%s to the branch\" % (from_user.fullname, \"an \" if len(commits) == 1 else \"\", \"s\" if len(commits) > 1 else \"\"), line_length)\n\n    body += \"\\n  %s\\n\" % review.branch.name\n    body += textutils.reflow(\"in the repository\", line_length)\n    body += \"\\n  %s\\n\\n\\n\" % review.repository.getURL(db, to_user)\n\n    cursor.execute(\"\"\"SELECT file, SUM(deleted), SUM(inserted)\n                        FROM fullreviewuserfiles\n                       WHERE review=%%s\n                         AND changeset IN (%s)\n                         AND state='pending'\n                         AND assignee=%%s\n                    GROUP BY file\"\"\" % \",\".join([\"%s\"] * len(changesets)),\n                   [review.id] + [changeset.id for changeset in changesets] + [to_user.id])\n    pending_files_lines = cursor.fetchall()\n\n    if pending_files_lines:\n      
  heads = commitset.getHeads()\n        tails = commitset.getFilteredTails(review.repository)\n\n        if len(heads) == 1 and len(tails) == 1:\n            showcommit_link = (tails.pop()[:8], heads.pop().sha1[:8])\n        else:\n            showcommit_link = False\n\n        body += renderFiles(db, to_user, review, \"These changes were assigned to you:\", pending_files_lines, showcommit_link=showcommit_link)\n\n    all_commits = to_user.getPreference(db, \"email.updatedReview.displayCommits\")\n    context_lines = to_user.getPreference(db, \"email.comment.contextLines\")\n\n    if all_commits:\n        body += \"The additional commit%s requested to be reviewed are:\\n\\n\" % (\"s\" if len(commits) > 1 else \"\")\n\n        contextLines = to_user.getPreference(db, \"email.updatedReview.diff.contextLines\")\n        diffMaxLines = to_user.getPreference(db, \"email.updatedReview.diff.maxLines\")\n\n        displayStats = to_user.getPreference(db, \"email.updatedReview.displayStats\")\n        statsMaxLines = to_user.getPreference(db, \"email.updatedReview.stats.maxLines\")\n\n        if contextLines < 0: contextLines = 0\n\n        if diffMaxLines == 0: diffs = None\n        else:\n            diffs = {}\n            lines = 0\n\n            for commit in commits:\n                if commit in changeset_for_commit:\n                    diff = changeset_text.unified(db, changeset_for_commit[commit], contextLines)\n                    diffs[commit] = diff\n                    lines += diff.count(\"\\n\")\n                    if lines > diffMaxLines:\n                        diffs = None\n                        break\n\n        if not displayStats or statsMaxLines == 0: stats = None\n        else:\n            stats = {}\n            lines = 0\n\n            for commit in commits:\n                commit_stats = review.repository.run(\"show\", \"--oneline\", \"--stat\", commit.sha1).split('\\n', 1)[1]\n                stats[commit] = commit_stats\n                
lines += commit_stats.count('\\n')\n                if lines > statsMaxLines:\n                    stats = None\n                    break\n\n        for index, commit in enumerate(commits):\n            if index > 0: body += \"\\n\\n\\n\"\n\n            body += \"\"\"Commit: %(sha1)s\nAuthor: %(author.fullname)s <%(author.email)s> at %(author.time)s\n\n%(message)s\n\"\"\" % { 'sha1': commit.sha1,\n        'author.fullname': commit.author.getFullname(db),\n        'author.email': commit.author.email,\n        'author.time': time.strftime(\"%Y-%m-%d %H:%M:%S\", commit.author.time),\n        'message': textutils.reflow(commit.message.strip(), line_length, indent=2) }\n\n            if stats and commit in stats:\n                body += \"---\\n\" + stats[commit]\n\n            if diffs and commit in diffs:\n                body += \"\\n\" + diffs[commit]\n\n            cursor.execute(\"SELECT id FROM commentchains WHERE review=%s AND state='addressed' AND addressed_by=%s\", (review.id, commit.getId(db)))\n            rows = cursor.fetchall()\n\n            if rows:\n                for (chain_id,) in rows:\n                    chain = review_comment.CommentChain.fromId(db, chain_id, to_user, review=review)\n                    chain.loadComments(db, to_user, include_draft_comments=False)\n                    body += \"\\n\\n\" + renderChainInMail(db, to_user, chain, None, \"addressed\", None, line_length, context_lines)\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, from_user, to_user, recipients, subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n\ndef sendPing(db, from_user, to_user, recipients, review, note):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, 
review, \"pingedReview\")\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'from.fullname': from_user.fullname,\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    body += \"\"\"%(from.fullname)s has pinged the review!\n\n\n\"\"\" % data\n\n    if note:\n        body += \"\"\"Additional information from %s:\n%s\n\n\n\"\"\" % (from_user.getFirstName(), textutils.reflow(note, line_length, indent=2))\n\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                        FROM reviewfiles\n                        JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                       WHERE reviewfiles.review=%s\n                         AND reviewfiles.state='pending'\n                         AND reviewuserfiles.uid=%s\n                    GROUP BY reviewfiles.file\"\"\",\n                   (review.id, to_user.id))\n    pending_files_lines = cursor.fetchall()\n\n    cursor.execute(\"\"\"SELECT DISTINCT changesets.child\n                        FROM reviewfiles\n                        JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                        JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                       WHERE reviewfiles.review=%s\n                         AND reviewfiles.state='pending'\n                         AND reviewuserfiles.uid=%s\"\"\",\n                   (review.id, to_user.id))\n    pending_commits = cursor.fetchall()\n\n    body += renderFiles(db, to_user, review, 
\"These pending changes are assigned to you:\", pending_files_lines, pending_commits, showcommit_link=True)\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, from_user, to_user, recipients, subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n\ndef sendAssignmentsChanged(db, from_user, to_user, review, added_filters, removed_filters, unassigned, assigned):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, \"updatedReview.assignmentsChanged\")\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'from.fullname': from_user.fullname,\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    body += \"\"\"%(from.fullname)s has modified the assignments in the review.\n\n\n\"\"\" % data\n\n    def renderPaths(items):\n        return \"  \\n\".join(diff.File.eliminateCommonPrefixes(sorted(map(lambda item: item[1], items)), text=True)) + \"\\n\"\n\n    if added_filters or removed_filters:\n        if added_filters:\n            added_reviewer = filter(lambda item: 
item[0] == \"reviewer\", added_filters)\n            added_watcher = filter(lambda item: item[0] == \"watcher\", added_filters)\n        else:\n            added_reviewer = None\n            added_watcher = None\n\n        if removed_filters:\n            removed_reviewer = filter(lambda item: item[0] == \"reviewer\", removed_filters)\n            removed_watcher = filter(lambda item: item[0] == \"watcher\", removed_filters)\n        else:\n            removed_reviewer = None\n            removed_watcher = None\n\n        if added_reviewer:\n            body += \"You are now reviewing the following paths:\\n  %s\\n\" % renderPaths(added_reviewer)\n        if added_watcher:\n            body += \"You are now watching the following paths:\\n  %s\\n\" % renderPaths(added_watcher)\n        if removed_reviewer:\n            body += \"You are no longer reviewing the following paths:\\n  %s\\n\" % renderPaths(removed_reviewer)\n        if removed_watcher:\n            body += \"You are no longer watching the following paths:\\n  %s\\n\" % renderPaths(removed_watcher)\n\n        body += \"\\n\"\n\n    if unassigned:\n        body += renderFiles(db, to_user, review, \"The following changes are no longer assigned to you:\", unassigned)\n\n    if assigned:\n        body += renderFiles(db, to_user, review, \"The following changes are now assigned to you:\", assigned)\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, from_user, to_user, [to_user], subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n\ndef sendFiltersApplied(db, from_user, to_user, review, globalfilters, parentfilters, assigned):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, 
\"updatedReview.parentFiltersApplied\")\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'from.fullname': from_user.fullname,\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    if globalfilters:\n        what = \"global filters\"\n    else:\n        what = \"global filters from upstream repositories\"\n\n    text = (\"%s has modified the assignments in the review by making %s apply, \"\n            \"which they previously did not.  This had the effect that you are \"\n            \"now a %s the review.\"\n            % (from_user.fullname,\n               what,\n               \"reviewer of changes in\" if assigned else \"watcher of\"))\n\n    body += \"\"\"%s\n\n\n\"\"\" % textutils.reflow(text, line_length)\n\n    if assigned:\n        body += renderFiles(db, to_user, review, \"The following changes are now assigned to you:\", assigned)\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, from_user, to_user, [to_user], subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n\ndef sendReviewRebased(db, from_user, to_user, recipients, review, new_upstream, rebased_commits, onto_branch=None):\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n        subject = generateSubjectLine(db, to_user, review, \"updatedReview.reviewRebased\")\n    except MailDisabled:\n        return 
[]\n\n    if from_user == to_user and to_user.getPreference(db, \"email.ignoreOwnChanges\"):\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'review.branch.name': review.branch.name,\n             'review.branch.repository': review.repository.getURL(db, to_user),\n             'from.fullname': from_user.fullname,\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    if new_upstream:\n        data[\"new_upstream\"] = new_upstream.oneline(db, decorate=True)\n        text = \"\"\"\\\n%(from.fullname)s has rebased the review branch onto:\n\n%(new_upstream)s\"\"\" % data\n    else:\n        text = \"%(from.fullname)s has rewritten the history on the review branch.\" % data\n\n    body += \"\"\"%s\n\n\n\"\"\" % textutils.reflow(text, line_length)\n\n    body += \"\"\"The new branch log is:\n\n\"\"\"\n\n    for commit in rebased_commits:\n        body += commit.oneline(db) + \"\\n\"\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, from_user, to_user, recipients, subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n\ndef sendExtensionOutput(db, user_id, batch_id, output):\n    to_user = dbutils.User.fromId(db, user_id)\n\n    cursor = db.cursor()\n    cursor.execute(\"SELECT review, uid FROM batches WHERE id=%s\", (batch_id,))\n\n    review_id, batch_user_id = cursor.fetchone()\n\n    review = dbutils.Review.fromId(db, review_id)\n    batch_user = dbutils.User.fromId(db, batch_user_id)\n\n    # First check if we can/should send emails to the user at all.\n    try:\n        checkEmailEnabled(db, to_user)\n 
       subject = generateSubjectLine(db, to_user, review, \"extensionOutput\")\n    except MailDisabled:\n        return []\n\n    line_length = to_user.getPreference(db, \"email.lineLength\")\n    hr = \"-\" * line_length\n\n    data = { 'review.id': review.id,\n             'review.url': review.getURL(db, to_user, 2),\n             'batch.user.fullname': batch_user.fullname,\n             'hr': hr }\n\n    body = \"\"\"%(hr)s\nThis is an automatic message generated by the review at:\n%(review.url)s\n%(hr)s\n\n\n\"\"\" % data\n\n    text = \"A batch of changes submitted by %(batch.user.fullname)s has been processed by your installed extensions.\" % data\n\n    body += \"\"\"%s\n\n\n\"\"\" % textutils.reflow(text, line_length)\n\n    body += \"The extensions generated the following output:\\n%s\" % output\n\n    files = []\n\n    parent_message_id = getReviewMessageId(db, to_user, review, files)\n    message_id = generateMessageId(len(files) + 1)\n\n    files.append(sendMail(\n        db, review, message_id, to_user, to_user, [to_user], subject, body,\n        parent_message_id=parent_message_id))\n\n    return files\n"
  },
  {
    "path": "src/reviewing/rebase.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport time\n\nimport configuration\nimport gitutils\n\ndef timestamp(ts):\n    return time.strftime(\"%Y-%m-%d %H:%M:%S\", ts)\n\ndef createEquivalentMergeCommit(db, review, user, old_head, old_upstream, new_head, new_upstream, onto_branch=None):\n    repository = review.repository\n\n    old_upstream_name = repository.findInterestingTag(db, old_upstream.sha1) or old_upstream.sha1\n    new_upstream_name = repository.findInterestingTag(db, new_upstream.sha1) or new_upstream.sha1\n\n    if onto_branch:\n        merged_thing = \"branch '%s'\" % onto_branch\n    else:\n        merged_thing = \"commit '%s'\" % new_upstream_name\n\n    commit_message = \"\"\"\\\nMerge %(merged_thing)s into %(review.branch.name)s\n\nThis commit was generated automatically by Critic as an equivalent merge\nto the rebase of the commits\n\n  %(old_upstream_name)s..%(old_head.sha1)s\n\nonto the %(merged_thing)s.\"\"\" % { \"merged_thing\": merged_thing,\n                                  \"review.branch.name\": review.branch.name,\n                                  \"old_upstream_name\": old_upstream_name,\n                                  \"old_head.sha1\": old_head.sha1 }\n\n    merge_sha1 = repository.run('commit-tree', new_head.tree, '-p', old_head.sha1, '-p', new_upstream.sha1,\n                                
input=commit_message,\n                                env=gitutils.getGitEnvironment()).strip()\n\n    merge = gitutils.Commit.fromSHA1(db, repository, merge_sha1)\n    gituser_id = merge.author.getGitUserId(db)\n\n    cursor = db.cursor()\n    cursor.execute(\"\"\"INSERT INTO commits (sha1, author_gituser, commit_gituser, author_time, commit_time)\n                           VALUES (%s, %s, %s, %s, %s)\n                        RETURNING id\"\"\",\n                   (merge.sha1, gituser_id, gituser_id, timestamp(merge.author.time), timestamp(merge.committer.time)))\n    merge.id = cursor.fetchone()[0]\n\n    cursor.executemany(\"INSERT INTO edges (parent, child) VALUES (%s, %s)\",\n                       [(old_head.getId(db), merge.id),\n                        (new_upstream.getId(db), merge.id)])\n\n    # Need to commit the transaction to make the new commit available\n    # to other database sessions right away, specifically so that the\n    # changeset service can see it.\n    db.commit()\n\n    return merge\n\ndef replayRebase(db, review, user, old_head, old_upstream, new_head, new_upstream, onto_branch=None):\n    repository = review.repository\n\n    old_upstream_name = repository.findInterestingTag(db, old_upstream.sha1) or old_upstream.sha1\n\n    if onto_branch:\n        new_upstream_name = \"branch '%s'\" % onto_branch\n    else:\n        new_upstream_name = \"commit '%s'\" % (repository.findInterestingTag(db, new_upstream.sha1) or new_upstream.sha1)\n\n    commit_message = \"\"\"\\\nRebased %(review.branch.name)s onto %(new_upstream_name)s\n\nThis commit was generated automatically by Critic to \"replay\" the\nrebase of the commits\n\n  %(old_upstream_name)s..%(old_head.sha1)s\n\nonto the %(new_upstream_name)s.\"\"\" % { \"review.branch.name\": review.branch.name,\n                                       \"old_head.sha1\": old_head.sha1,\n                                       \"old_upstream_name\": old_upstream_name,\n                                   
    \"new_upstream_name\": new_upstream_name }\n\n    original_sha1 = repository.run(\n        'commit-tree', old_head.tree, '-p', old_upstream.sha1,\n        env=gitutils.getGitEnvironment(),\n        input=commit_message).strip()\n\n    with repository.workcopy(original_sha1) as workcopy:\n        with repository.temporaryref(new_upstream) as new_upstream_ref, \\\n                repository.temporaryref(original_sha1) as original_ref:\n            workcopy.run(\"fetch\", \"--quiet\", \"origin\",\n                         \"%s:refs/heads/temporary\" % new_upstream_ref,\n                         \"%s:refs/heads/original\" % original_ref)\n\n        workcopy.run(\"checkout\", \"refs/heads/temporary\")\n\n        returncode, stdout, stderr = workcopy.run(\n            \"cherry-pick\", \"refs/heads/original\",\n            env=gitutils.getGitEnvironment(),\n            check_errors=False)\n\n        # If the rebase produced conflicts, just stage and commit them:\n        if returncode != 0:\n            # Reset any submodule gitlinks with conflicts: since we don't\n            # have the submodules checked out, \"git commit --all\" below\n            # may fail to index them.\n            for line in stdout.splitlines():\n                if line.startswith(\"CONFLICT (submodule):\"):\n                    submodule_path = line.split()[-1]\n                    workcopy.run(\"reset\", \"--\", submodule_path, check_errors=False)\n\n            # Then stage and commit the result, with conflict markers and all.\n            workcopy.run(\"commit\", \"--all\", \"--reuse-message=%s\" % original_sha1,\n                         env=gitutils.getGitEnvironment())\n\n        rebased_sha1 = workcopy.run(\"rev-parse\", \"HEAD\").strip()\n\n        workcopy.run(\"push\", \"origin\", \"HEAD:refs/keepalive/\" + rebased_sha1)\n\n    return gitutils.Commit.fromSHA1(db, repository, rebased_sha1)\n"
  },
  {
    "path": "src/reviewing/utils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport dbutils\nimport gitutils\nfrom dbutils import *\nfrom itertools import izip, repeat, chain\nimport htmlutils\nimport configuration\n\nimport mail\nimport diff\nimport changeset.utils as changeset_utils\nimport changeset.load as changeset_load\nimport reviewing.comment\nimport reviewing.filters\nimport log.commitset as log_commitset\nimport extensions.role.filterhook\n\nfrom operation import OperationError, OperationFailure\nfrom filters import Filters\n\ndef getFileIdsFromChangesets(changesets):\n    file_ids = set()\n    for changeset in changesets:\n        file_ids.update(changed_file.id for changed_file in changeset.files)\n    return file_ids\n\ndef getReviewersAndWatchers(db, repository, commits=None, changesets=None, reviewfilters=None,\n                            applyfilters=True, applyparentfilters=False):\n    \"\"\"getReviewersAndWatchers(db, commits=None, changesets=None) -> tuple\n\nReturns a tuple containing two dictionaries, each mapping file IDs to\ndictionaries mapping user IDs to sets of changeset IDs.  The first dictionary\ndefines the reviewers of\neach file, the second dictionary defines the watchers of\neach file.  
For any changes in a file for which no reviewer is identified, None\nis used as a key in the dictionary instead of a real user ID.\"\"\"\n\n    if changesets is None:\n        changesets = []\n        changeset_utils.createChangesets(db, repository, commits)\n        for commit in commits:\n            changesets.extend(changeset_utils.createChangeset(db, None, repository, commit, do_highlight=False))\n\n    cursor = db.cursor()\n\n    filters = Filters()\n    filters.setFiles(db, list(getFileIdsFromChangesets(changesets)))\n\n    if applyfilters:\n        filters.load(db, repository=repository, recursive=applyparentfilters)\n\n    if reviewfilters:\n        filters.addFilters(reviewfilters)\n\n    reviewers = {}\n    watchers = {}\n\n    for changeset in changesets:\n        author_user_ids = changeset.child.author.getUserIds(db) if changeset.child else set()\n\n        cursor.execute(\"SELECT DISTINCT file FROM fileversions WHERE changeset=%s\", (changeset.id,))\n\n        for (file_id,) in cursor:\n            reviewers_found = False\n\n            for user_id, (filter_type, delegate) in filters.listUsers(file_id).items():\n                if filter_type == 'reviewer':\n                    if user_id not in author_user_ids:\n                        reviewer_user_ids = [user_id]\n                    elif delegate:\n                        reviewer_user_ids = []\n                        for delegate_user_name in delegate.split(\",\"):\n                            delegate_user = dbutils.User.fromName(db, delegate_user_name)\n                            reviewer_user_ids.append(delegate_user.id)\n                    else:\n                        reviewer_user_ids = []\n\n                    for reviewer_user_id in reviewer_user_ids:\n                        reviewers.setdefault(file_id, {}).setdefault(reviewer_user_id, set()).add(changeset.id)\n                        reviewers_found = True\n                else:\n                    watchers.setdefault(file_id, 
{}).setdefault(user_id, set()).add(changeset.id)\n\n            if not reviewers_found:\n                reviewers.setdefault(file_id, {}).setdefault(None, set()).add(changeset.id)\n\n    return reviewers, watchers\n\ndef getReviewedReviewers(db, review):\n    \"\"\"getReviewedReviewers(db, review) -> dictionary\n\nReturns a dictionary, like the ones returned by getReviewersAndWatchers(), but\nwith details about all reviewed changes in the review.\"\"\"\n\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT reviewfiles.reviewer, reviewfiles.changeset, reviewfiles.file\n                        FROM reviewfiles\n                       WHERE reviewfiles.review=%s\n                         AND reviewfiles.state='reviewed'\"\"\",\n                   (review.id,))\n\n    reviewers = {}\n\n    for user_id, changeset_id, file_id in cursor.fetchall():\n        reviewers.setdefault(file_id, {}).setdefault(user_id, set()).add(changeset_id)\n\n    return reviewers\n\ndef getPendingReviewers(db, review):\n    \"\"\"getPendingReviewers(db, review) -> dictionary\n\nReturns a dictionary, like the ones returned by getReviewersAndWatchers(), but\nwith details about remaining unreviewed changes in the review.  
Changes not\nassigned to a reviewer are handled the same way.\"\"\"\n\n    cursor = db.cursor()\n\n    cursor.execute(\"\"\"SELECT reviewuserfiles.uid, reviewfiles.changeset, reviewfiles.file\n                        FROM reviewfiles\n             LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                       WHERE reviewfiles.review=%s\n                         AND reviewfiles.state='pending'\"\"\",\n                   (review.id,))\n\n    reviewers = {}\n\n    for user_id, changeset_id, file_id in cursor.fetchall():\n        reviewers.setdefault(file_id, {}).setdefault(user_id, set()).add(changeset_id)\n\n    return reviewers\n\ndef collectReviewTeams(reviewers):\n    \"\"\"collectReviewTeams(reviewers) -> dictionary\n\nTakes a dictionary as returned by getReviewersAndWatchers() or\ngetPendingReviewers() and transforms it into a dictionary mapping sets of users\nto sets of files that those groups of users share review responsibilities for.\nThe same user may appear in any number of sets, as may the same file.\n\nIf None appears as a key in the returned dictionary, the files it is mapped to\ncontain changes with no assigned reviewer.\"\"\"\n\n    teams = {}\n\n    for file_id, file_reviewers in reviewers.items():\n        if None in file_reviewers:\n            teams.setdefault(None, set()).add(file_id)\n        team = frozenset(filter(None, file_reviewers.keys()))\n        if team: teams.setdefault(team, set()).add(file_id)\n\n    return teams\n\ndef assignChanges(db, user, review, commits=None, changesets=None, update=False):\n    cursor = db.cursor()\n\n    if changesets is None:\n        assert commits is not None\n\n        changesets = []\n\n        for commit in commits:\n            changesets.extend(changeset_utils.createChangeset(db, user, review.repository, commit))\n\n    applyfilters = review.applyfilters\n    applyparentfilters = review.applyparentfilters\n\n    reviewers, watchers = getReviewersAndWatchers(db, 
review.repository, changesets=changesets, reviewfilters=review.getReviewFilters(db),\n                                                  applyfilters=applyfilters, applyparentfilters=applyparentfilters)\n\n    cursor.execute(\"SELECT uid FROM reviewusers WHERE review=%s\", (review.id,))\n\n    reviewusers = set([user_id for (user_id,) in cursor])\n    reviewusers_values = set()\n    reviewuserfiles_values = set()\n\n    reviewuserfiles_existing = {}\n\n    if update:\n        cursor.execute(\"\"\"SELECT reviewuserfiles.uid, reviewfiles.changeset, reviewfiles.file\n                            FROM reviewfiles\n                            JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                           WHERE reviewfiles.review=%s\"\"\", (review.id,))\n        for user_id, changeset_id, file_id in cursor:\n            reviewuserfiles_existing[(user_id, changeset_id, file_id)] = True\n\n    new_reviewers = set()\n    new_watchers = set()\n\n    cursor.execute(\"\"\"SELECT DISTINCT uid\n                        FROM reviewfiles\n                        JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                       WHERE review=%s\"\"\", (review.id,))\n    old_reviewers = set([user_id for (user_id,) in cursor])\n\n    for file_id, file_users in reviewers.items():\n        for user_id, user_changesets in file_users.items():\n            if user_id:\n                new_reviewers.add(user_id)\n\n                if user_id not in reviewusers:\n                    reviewusers.add(user_id)\n                    reviewusers_values.add((review.id, user_id))\n                for changeset_id in user_changesets:\n                    if (user_id, changeset_id, file_id) not in reviewuserfiles_existing:\n                        reviewuserfiles_values.add((user_id, review.id, changeset_id, file_id))\n\n    for file_id, file_users in watchers.items():\n        for user_id, user_changesets in file_users.items():\n            if user_id:\n    
            if user_id not in reviewusers:\n                    new_watchers.add(user_id)\n                    reviewusers.add(user_id)\n                    reviewusers_values.add((review.id, user_id))\n\n    new_reviewers -= old_reviewers\n    new_watchers -= old_reviewers | new_reviewers\n\n    cursor.executemany(\"INSERT INTO reviewusers (review, uid) VALUES (%s, %s)\", reviewusers_values)\n    cursor.executemany(\"INSERT INTO reviewuserfiles (file, uid) SELECT id, %s FROM reviewfiles WHERE review=%s AND changeset=%s AND file=%s\", reviewuserfiles_values)\n\n    if configuration.extensions.ENABLED:\n        cursor.execute(\"\"\"SELECT id, uid, extension, path\n                            FROM extensionhookfilters\n                           WHERE repository=%s\"\"\",\n                       (review.repository.id,))\n\n        rows = cursor.fetchall()\n\n        if rows:\n            if commits is None:\n                commits = set()\n                for changeset in changesets:\n                    commits.add(changeset.child)\n                commits = list(commits)\n\n            filters = Filters()\n            filters.setFiles(db, list(getFileIdsFromChangesets(changesets)))\n\n            for filter_id, user_id, extension_id, path in rows:\n                filters.addFilter(user_id, path, None, None, filter_id)\n\n            for filter_id, file_ids in filters.matched_files.items():\n                extensions.role.filterhook.queueFilterHookEvent(\n                    db, filter_id, review, user, commits, file_ids)\n\n    return new_reviewers, new_watchers\n\ndef createChangesetsForCommits(db, commits, silent_if_empty=set(), full_merges=set(), replayed_rebases={}):\n    repository = commits[0].repository\n    changesets = []\n    silent_commits = set()\n    silent_changesets = set()\n\n    simple_commits = []\n    for commit in commits:\n        if commit not in full_merges and commit not in replayed_rebases:\n            simple_commits.append(commit)\n    
if simple_commits:\n        changeset_utils.createChangesets(db, repository, simple_commits)\n\n    # No user object is in scope in this function, so pass None as the\n    # requesting user when creating changesets (as getReviewersAndWatchers()\n    # does); previously the undefined name 'user' raised a NameError here.\n    for commit in commits:\n        if commit in full_merges:\n            commit_changesets = changeset_utils.createFullMergeChangeset(\n                db, None, repository, commit, do_highlight=False)\n        elif commit in replayed_rebases:\n            commit_changesets = changeset_utils.createChangeset(\n                db, None, repository,\n                from_commit=commit, to_commit=replayed_rebases[commit],\n                conflicts=True, do_highlight=False)\n        else:\n            commit_changesets = changeset_utils.createChangeset(\n                db, None, repository, commit, do_highlight=False)\n\n        if commit in silent_if_empty:\n            for commit_changeset in commit_changesets:\n                if commit_changeset.files:\n                    break\n            else:\n                silent_commits.add(commit)\n                silent_changesets.update(commit_changesets)\n\n        changesets.extend(commit_changesets)\n\n    return changesets, silent_commits, silent_changesets\n\ndef addCommitsToReview(db, user, review, commits, new_review=False, commitset=None, pending_mails=None, silent_if_empty=set(), full_merges=set(), replayed_rebases={}, tracked_branch=False):\n    cursor = db.cursor()\n\n    if not new_review:\n        import index\n\n        new_commits = log_commitset.CommitSet(commits)\n        old_commits = log_commitset.CommitSet(review.branch.getCommits(db))\n        merges = new_commits.getMerges()\n\n        for merge in merges:\n            # We might have stripped it in a previous pass.\n            if merge not in new_commits: continue\n\n            tails = filter(lambda sha1: sha1 not in old_commits and sha1 not in merge.parents, new_commits.getTailsFrom(merge))\n\n            if tails:\n                if tracked_branch:\n                    raise index.IndexException(\"\"\"\\\nMerge %s adds merged-in commits.  
Please push the merge manually\nand follow the instructions.\"\"\" % merge.sha1[:8])\n\n                cursor.execute(\"SELECT id, confirmed, tail FROM reviewmergeconfirmations WHERE review=%s AND uid=%s AND merge=%s\", (review.id, user.id, merge.getId(db)))\n\n                row = cursor.fetchone()\n\n                if not row or not row[1]:\n                    if not row:\n                        cursor.execute(\"INSERT INTO reviewmergeconfirmations (review, uid, merge) VALUES (%s, %s, %s) RETURNING id\", (review.id, user.id, merge.getId(db)))\n                        confirmation_id = cursor.fetchone()[0]\n\n                        merged = set()\n\n                        for tail_sha1 in tails:\n                            children = new_commits.getChildren(tail_sha1)\n\n                            while children:\n                                child = children.pop()\n                                if child not in merged and new_commits.isAncestorOf(child, merge):\n                                    merged.add(child)\n                                    children.update(new_commits.getChildren(child) - merged)\n\n                        merged_values = [(confirmation_id, commit.getId(db)) for commit in merged]\n                        cursor.executemany(\"INSERT INTO reviewmergecontributions (id, merged) VALUES (%s, %s)\", merged_values)\n                        db.commit()\n                    else:\n                        confirmation_id = row[0]\n\n                    message = \"Merge %s adds merged-in commits:\" % merge.sha1[:8]\n\n                    for tail_sha1 in tails:\n                        for parent_sha1 in merge.parents:\n                            if parent_sha1 in new_commits:\n                                parent = new_commits.get(parent_sha1)\n                                if tail_sha1 in new_commits.getTailsFrom(parent):\n                                    message += \"\\n  %s..%s\" % (tail_sha1[:8], parent_sha1[:8])\n\n     
               message += \"\"\"\nPlease confirm that this is intended by loading:\n  %s/confirmmerge?id=%d\"\"\" % (dbutils.getURLPrefix(db, user), confirmation_id)\n\n                    raise index.IndexException(message)\n                elif row[2] is not None:\n                    if row[2] == merge.getId(db):\n                        cursor.execute(\"SELECT merged FROM reviewmergecontributions WHERE id=%s\",\n                                       (row[0],))\n\n                        for (merged_id,) in cursor:\n                            merged = gitutils.Commit.fromId(db, review.repository, merged_id)\n                            if merged.sha1 in merge.parents:\n                                new_commits = new_commits.without([merged])\n                                break\n                    else:\n                        tail = gitutils.Commit.fromId(db, review.repository, row[2])\n                        cut = [gitutils.Commit.fromSHA1(db, review.repository, sha1)\n                               for sha1 in tail.parents if sha1 in new_commits]\n                        new_commits = new_commits.without(cut)\n\n        if commitset:\n            commitset &= set(new_commits)\n            commits = [commit for commit in commits if commit in commitset]\n\n    changesets, silent_commits, silent_changesets = \\\n        createChangesetsForCommits(db, commits, silent_if_empty, full_merges, replayed_rebases)\n\n    if not new_review:\n        print \"Adding %d commit%s to the review at:\\n  %s\" % (len(commits), len(commits) > 1 and \"s\" or \"\", review.getURL(db))\n\n    reviewchangesets_values = [(review.id, changeset.id) for changeset in changesets]\n\n    cursor.executemany(\"\"\"INSERT INTO reviewchangesets (review, changeset) VALUES (%s, %s)\"\"\", reviewchangesets_values)\n    cursor.executemany(\"\"\"INSERT INTO reviewfiles (review, changeset, file, deleted, inserted)\n                               SELECT reviewchangesets.review, 
reviewchangesets.changeset, fileversions.file,\n                                      COALESCE(SUM(chunks.deleteCount), 0), COALESCE(SUM(chunks.insertCount), 0)\n                                 FROM reviewchangesets\n                                 JOIN fileversions USING (changeset)\n                      LEFT OUTER JOIN chunks USING (changeset, file)\n                                WHERE reviewchangesets.review=%s\n                                  AND reviewchangesets.changeset=%s\n                             GROUP BY reviewchangesets.review, reviewchangesets.changeset, fileversions.file\"\"\",\n                       reviewchangesets_values)\n\n    new_reviewers, new_watchers = assignChanges(db, user, review, changesets=changesets)\n\n    cursor.execute(\"SELECT include FROM reviewrecipientfilters WHERE review=%s AND uid IS NULL\", (review.id,))\n\n    # fetchone() returns None when there is no matching row; subscripting\n    # None raises TypeError, so catch only that (not a bare except).\n    try:\n        opt_out = cursor.fetchone()[0] is True\n    except TypeError:\n        opt_out = True\n\n    if not new_review:\n        for user_id in new_reviewers:\n            new_reviewuser = dbutils.User.fromId(db, user_id)\n            print \"Added reviewer: %s <%s>\" % (new_reviewuser.fullname, new_reviewuser.email)\n\n            if opt_out:\n                # If the user has opted out from receiving e-mails about this\n                # review while only watching it, clear the opt-out now that the\n                # user becomes a reviewer.\n                cursor.execute(\"DELETE FROM reviewrecipientfilters WHERE review=%s AND uid=%s AND include=FALSE\", (review.id, user_id))\n\n        for user_id in new_watchers:\n            new_reviewuser = dbutils.User.fromId(db, user_id)\n            print \"Added watcher:  %s <%s>\" % (new_reviewuser.fullname, new_reviewuser.email)\n\n        review.incrementSerial(db)\n\n        reviewing.comment.propagateCommentChains(db, user, review, new_commits, replayed_rebases)\n\n    if pending_mails is None: pending_mails = []\n\n    notify_commits = filter(lambda commit: commit not in 
silent_commits, commits)\n    notify_changesets = filter(lambda changeset: changeset not in silent_changesets, changesets)\n\n    if not new_review and notify_changesets:\n        recipients = review.getRecipients(db)\n        for to_user in recipients:\n            pending_mails.extend(mail.sendReviewAddedCommits(\n                    db, user, to_user, recipients, review, notify_commits,\n                    notify_changesets, tracked_branch=tracked_branch))\n\n    mail.sendPendingMails(pending_mails)\n\n    review.reviewers.extend([User.fromId(db, user_id) for user_id in new_reviewers])\n\n    for user_id in new_watchers:\n        review.watchers[User.fromId(db, user_id)] = \"automatic\"\n\n    return True\n\ndef createReview(db, user, repository, commits, branch_name, summary, description, from_branch_name=None, via_push=False, reviewfilters=None, applyfilters=True, applyparentfilters=False, recipientfilters=None):\n    cursor = db.cursor()\n\n    if via_push:\n        applyparentfilters = bool(user.getPreference(db, 'review.applyUpstreamFilters'))\n\n    branch = dbutils.Branch.fromName(db, repository, branch_name)\n\n    if branch is not None:\n        raise OperationFailure(\n            code=\"branchexists\",\n            title=\"Invalid review branch name\",\n            message=\"\"\"\\\n<p>There is already a branch named <code>%s</code> in the repository.  
You have\nto select a different name.</p>\n\n<p>If you believe the existing branch was created during an earlier (failed)\nattempt to create this review, you can try to delete it from the repository\nusing the command</p>\n\n<pre>  git push &lt;remote&gt; :%s</pre>\n\n<p>and then press the \"Submit Review\" button on this page again.</p>\"\"\"\n            % (htmlutils.htmlify(branch_name), htmlutils.htmlify(branch_name)),\n            is_html=True)\n\n    if not commits:\n        raise OperationFailure(\n            code=\"nocommits\",\n            title=\"No commits specified\",\n            message=\"You need at least one commit to create a review.\")\n\n    commitset = log_commitset.CommitSet(commits)\n    heads = commitset.getHeads()\n\n    if len(heads) != 1:\n        # There is really no plausible way for this error to occur.\n        raise OperationFailure(\n            code=\"disconnectedtree\",\n            title=\"Disconnected tree\",\n            message=(\"The specified commits do not form a single connected \"\n                     \"tree.  
Creating a review of them is not supported.\"))\n\n    head = heads.pop()\n\n    if len(commitset.getTails()) != 1:\n        tail_id = None\n    else:\n        tail_id = gitutils.Commit.fromSHA1(db, repository, commitset.getTails().pop()).getId(db)\n\n    if not via_push:\n        try:\n            repository.createBranch(branch_name, head.sha1)\n        except gitutils.GitCommandError as error:\n            raise OperationFailure(\n                code=\"branchfailed\",\n                title=\"Failed to create review branch\",\n                message=(\"<p><b>Output from git:</b></p>\"\n                         \"<code style='padding-left: 1em'>%s</code>\"\n                         % htmlutils.htmlify(error.output)),\n                is_html=True)\n\n    createChangesetsForCommits(db, commits)\n\n    try:\n        cursor.execute(\"\"\"INSERT INTO branches (repository, name, head, tail, type)\n                               VALUES (%s, %s, %s, %s, 'review')\n                            RETURNING id\"\"\",\n                       (repository.id, branch_name, head.getId(db), tail_id))\n\n        branch_id = cursor.fetchone()[0]\n        reachable_values = [(branch_id, commit.getId(db)) for commit in commits]\n\n        cursor.executemany(\"\"\"INSERT INTO reachable (branch, commit)\n                                   VALUES (%s, %s)\"\"\",\n                           reachable_values)\n\n        from_branch_id = None\n        if from_branch_name is not None:\n            cursor.execute(\"\"\"SELECT id\n                                FROM branches\n                               WHERE repository=%s\n                                 AND name=%s\"\"\",\n                           (repository.id, from_branch_name))\n            row = cursor.fetchone()\n            if row:\n                from_branch_id = row[0]\n\n        cursor.execute(\"\"\"INSERT INTO reviews (type, branch, origin, state,\n                                               summary, description,\n      
                                         applyfilters, applyparentfilters)\n                               VALUES ('official', %s, %s, 'open',\n                                       %s, %s,\n                                       %s, %s)\n                            RETURNING id\"\"\",\n                       (branch_id, from_branch_id,\n                        summary, description,\n                        applyfilters, applyparentfilters))\n\n        review = dbutils.Review.fromId(db, cursor.fetchone()[0])\n\n        cursor.execute(\"\"\"INSERT INTO reviewusers (review, uid, owner)\n                               VALUES (%s, %s, TRUE)\"\"\",\n                       (review.id, user.id))\n\n        if reviewfilters is not None:\n            cursor.executemany(\"\"\"INSERT INTO reviewfilters (review, uid, path, type, creator)\n                                       VALUES (%s, %s, %s, %s, %s)\"\"\",\n                               [(review.id, filter_user_id, filter_path, filter_type, user.id)\n                                for filter_user_id, filter_path, filter_type, filter_delegate in reviewfilters])\n\n        is_opt_in = False\n\n        if recipientfilters is not None:\n            cursor.executemany(\n                \"\"\"INSERT INTO reviewrecipientfilters (review, uid, include)\n                        VALUES (%s, %s, %s)\"\"\",\n                [(review.id, filter_user_id, filter_include)\n                 for filter_user_id, filter_include in recipientfilters])\n\n            for filter_user_id, filter_include in recipientfilters:\n                if filter_user_id is None and not filter_include:\n                    is_opt_in = True\n\n        addCommitsToReview(db, user, review, commits, new_review=True)\n\n        # Reload to get list of changesets added by addCommitsToReview().\n        review = dbutils.Review.fromId(db, review.id)\n\n        pending_mails = []\n        recipients = review.getRecipients(db)\n        for to_user in recipients:\n    
        pending_mails.extend(mail.sendReviewCreated(db, user, to_user, recipients, review))\n\n        if not is_opt_in:\n            recipient_by_id = dict((to_user.id, to_user) for to_user in recipients)\n\n            cursor.execute(\"\"\"SELECT userpreferences.uid, userpreferences.repository,\n                                     userpreferences.filter, userpreferences.integer\n                                FROM userpreferences\n                     LEFT OUTER JOIN filters ON (filters.id=userpreferences.filter)\n                               WHERE userpreferences.item='review.defaultOptOut'\n                                 AND userpreferences.uid=ANY (%s)\n                                 AND (userpreferences.filter IS NULL\n                                   OR filters.repository=%s)\n                                 AND (userpreferences.repository IS NULL\n                                   OR userpreferences.repository=%s)\"\"\",\n                           (recipient_by_id.keys(), repository.id, repository.id))\n\n            user_settings = {}\n            has_filter_settings = False\n\n            for user_id, repository_id, filter_id, integer in cursor:\n                settings = user_settings.setdefault(user_id, [None, None, {}])\n                value = bool(integer)\n\n                if repository_id is None and filter_id is None:\n                    settings[0] = value\n                elif repository_id is not None:\n                    settings[1] = value\n                else:\n                    settings[2][filter_id] = value\n                    has_filter_settings = True\n\n            if has_filter_settings:\n                filters = Filters()\n                filters.setFiles(db, review=review)\n\n            for user_id, (global_default, repository_default, filter_settings) in user_settings.items():\n                to_user = recipient_by_id[user_id]\n                opt_out = None\n\n                if repository_default is not 
None:\n                    opt_out = repository_default\n                elif global_default is not None:\n                    opt_out = global_default\n\n                if filter_settings:\n                    # Policy:\n                    #\n                    # If all of the user's filters that matched files in the\n                    # review have review.defaultOptOut enabled, then opt out.\n                    # When determining this, any review filters of the user's\n                    # that match files in the review count as filters that don't\n                    # have the review.defaultOptOut enabled.\n                    #\n                    # If any of the user's filters that matched files in the\n                    # review have review.defaultOptOut disabled, then don't opt\n                    # out.  When determining this, review filters are ignored.\n                    #\n                    # Otherwise, ignore the filter settings, and go with either\n                    # the user's per-repository or global setting (as set\n                    # above.)\n\n                    filters.load(db, review=review, user=to_user)\n\n                    # A set of filter ids.  If None is in the set, the user has\n                    # one or more review filters in the review.  
(These do not\n                    # have ids.)\n                    active_filters = filters.getActiveFilters(to_user)\n\n                    for filter_id in active_filters:\n                        if filter_id is None:\n                            continue\n                        elif filter_id in filter_settings:\n                            if not filter_settings[filter_id]:\n                                opt_out = False\n                                break\n                        else:\n                            break\n                    else:\n                        if None not in active_filters:\n                            opt_out = True\n\n                if opt_out:\n                    cursor.execute(\"\"\"INSERT INTO reviewrecipientfilters (review, uid, include)\n                                           VALUES (%s, %s, FALSE)\"\"\",\n                                   (review.id, to_user.id))\n\n        db.commit()\n\n        mail.sendPendingMails(pending_mails)\n\n        return review\n    except:\n        if not via_push:\n            repository.run(\"branch\", \"-D\", branch_name)\n        raise\n\ndef getDraftItems(db, user, review):\n    return \"approved=%(reviewedNormal)d,disapproved=%(unreviewedNormal)d,approvedBinary=%(reviewedBinary)d,disapprovedBinary=%(unreviewedBinary)d,comments=%(writtenComments)d,reopened=%(reopenedIssues)d,closed=%(resolvedIssues)d,morphed=%(morphedChains)d\" % review.getDraftStatus(db, user)\n\ndef renderDraftItems(db, user, review, target):\n    items = review.getDraftStatus(db, user)\n\n    target.addExternalStylesheet(\"resource/review.css\")\n    target.addExternalScript(\"resource/review.js\")\n\n    div = target.div(id='draftStatus')\n\n    if any(items.values()):\n        div.span('draft').text(\"Draft: \")\n\n        approved = items.pop(\"reviewedNormal\", None)\n        if approved:\n            div.text(' ')\n            div.span('approved').text(\"reviewed %d line%s\" % (approved, approved > 1 
and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        disapproved = items.pop(\"unreviewedNormal\", None)\n        if disapproved:\n            div.text(' ')\n            div.span('disapproved').text(\"unreviewed %d line%s\" % (disapproved, disapproved > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        approved = items.pop(\"reviewedBinary\", None)\n        if approved:\n            div.text(' ')\n            div.span('approved-binary').text(\"reviewed %d binary file%s\" % (approved, approved > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        disapproved = items.pop(\"unreviewedBinary\", None)\n        if disapproved:\n            div.text(' ')\n            div.span('disapproved-binary').text(\"unreviewed %d binary file%s\" % (disapproved, disapproved > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        comments = items.pop(\"writtenComments\", None)\n        if comments:\n            div.text(' ')\n            div.span('comments').text(\"wrote %d comment%s\" % (comments, comments > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        reopened = items.pop(\"reopenedIssues\", None)\n        if reopened:\n            div.text(' ')\n            div.span('reopened').text(\"reopened %d issue%s\" % (reopened, reopened > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        closed = items.pop(\"resolvedIssues\", None)\n        if closed:\n            div.text(' ')\n            div.span('closed').text(\"resolved %d issue%s\" % (closed, closed > 1 and \"s\" or \"\"))\n\n            if any(items.values()): div.text(',')\n\n        morphed = items.pop(\"morphedChains\", None)\n        if morphed:\n            div.text(' ')\n            div.span('closed').text(\"morphed %d comment%s\" % (morphed, morphed > 1 and \"s\" or \"\"))\n\n            if any(items.values()): 
div.text(',')\n\n        div.text(' ')\n        buttons = div.span(\"buttons\")\n        buttons.button(onclick='previewChanges();').text(\"Preview\")\n        buttons.button(onclick='submitChanges();').text(\"Submit\")\n        buttons.button(onclick='cancelChanges();').text(\"Abort\")\n\n        return True\n    else:\n        return False\n\ndef addReviewFilters(db, creator, user, review, reviewer_paths, watcher_paths):\n    cursor = db.cursor()\n\n    cursor.execute(\"INSERT INTO reviewassignmentstransactions (review, assigner) VALUES (%s, %s) RETURNING id\", (review.id, creator.id))\n    transaction_id = cursor.fetchone()[0]\n\n    def add(filter_type, paths):\n        for path in paths:\n            cursor.execute(\"\"\"SELECT id, type\n                                FROM reviewfilters\n                               WHERE review=%s\n                                 AND uid=%s\n                                 AND path=%s\"\"\",\n                           (review.id, user.id, path))\n\n            row = cursor.fetchone()\n\n            if row:\n                old_filter_id, old_filter_type = row\n\n                if old_filter_type == filter_type:\n                    continue\n                else:\n                    cursor.execute(\"\"\"DELETE FROM reviewfilters\n                                            WHERE id=%s\"\"\",\n                                   (old_filter_id,))\n                    cursor.execute(\"\"\"INSERT INTO reviewfilterchanges (transaction, uid, path, type, created)\n                                           VALUES (%s, %s, %s, %s, false)\"\"\",\n                                   (transaction_id, user.id, path, old_filter_type))\n\n            cursor.execute(\"\"\"INSERT INTO reviewfilters (review, uid, path, type, creator)\n                                   VALUES (%s, %s, %s, %s, %s)\"\"\",\n                           (review.id, user.id, path, filter_type, creator.id))\n            cursor.execute(\"\"\"INSERT INTO 
reviewfilterchanges (transaction, uid, path, type, created)\n                                   VALUES (%s, %s, %s, %s, true)\"\"\",\n                           (transaction_id, user.id, path, filter_type))\n\n    add(\"reviewer\", reviewer_paths)\n    add(\"watcher\", watcher_paths)\n\n    filters = Filters()\n    filters.setFiles(db, review=review)\n    filters.load(db, review=review, user=user)\n\n    if user not in review.reviewers and user not in review.watchers and user not in review.owners:\n        cursor.execute(\"\"\"INSERT INTO reviewusers (review, uid, type)\n                          VALUES (%s, %s, 'manual')\"\"\",\n                       (review.id, user.id,))\n\n    delete_files = set()\n    insert_files = set()\n\n    if watcher_paths:\n        # Unassign changes currently assigned to the affected user.\n        cursor.execute(\"\"\"SELECT reviewfiles.id, reviewfiles.file\n                            FROM reviewfiles\n                            JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                           WHERE reviewfiles.review=%s\n                             AND reviewuserfiles.uid=%s\"\"\",\n                       (review.id, user.id))\n\n        for review_file_id, file_id in cursor:\n            if not filters.isReviewer(user.id, file_id):\n                delete_files.add(review_file_id)\n\n    if reviewer_paths:\n        # Assign changes currently not assigned to the affected user.\n        cursor.execute(\"\"\"SELECT reviewfiles.id, reviewfiles.file\n                            FROM reviewfiles\n                            JOIN changesets ON (changesets.id=reviewfiles.changeset)\n                            JOIN commits ON (commits.id=changesets.child)\n                            JOIN gitusers ON (gitusers.id=commits.author_gituser)\n                 LEFT OUTER JOIN usergitemails ON (usergitemails.email=gitusers.email\n                                               AND usergitemails.uid=%s)\n                 
LEFT OUTER JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id\n                                                 AND reviewuserfiles.uid=%s)\n                           WHERE reviewfiles.review=%s\n                             AND usergitemails.uid IS NULL\n                             AND reviewuserfiles.uid IS NULL\"\"\",\n                       (user.id, user.id, review.id))\n\n        for review_file_id, file_id in cursor:\n            if filters.isReviewer(user.id, file_id):\n                insert_files.add(review_file_id)\n\n    if delete_files:\n        cursor.executemany(\"DELETE FROM reviewuserfiles WHERE file=%s AND uid=%s\",\n                           izip(delete_files, repeat(user.id)))\n        cursor.executemany(\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned) VALUES (%s, %s, %s, false)\",\n                           izip(repeat(transaction_id), delete_files, repeat(user.id)))\n\n    if insert_files:\n        cursor.executemany(\"INSERT INTO reviewuserfiles (file, uid) VALUES (%s, %s)\",\n                           izip(insert_files, repeat(user.id)))\n        cursor.executemany(\"INSERT INTO reviewassignmentchanges (transaction, file, uid, assigned) VALUES (%s, %s, %s, true)\",\n                           izip(repeat(transaction_id), insert_files, repeat(user.id)))\n\n    return generateMailsForAssignmentsTransaction(db, transaction_id)\n\ndef parseReviewFilters(db, data):\n    reviewfilters = []\n\n    for filter_data in data:\n        filter_user = dbutils.User.fromName(db, filter_data[\"username\"])\n        filter_type = filter_data[\"type\"]\n        filter_path = reviewing.filters.sanitizePath(filter_data[\"path\"])\n\n        # Make sure the path doesn't contain any invalid wild-cards.\n        try:\n            reviewing.filters.validatePattern(filter_path)\n        except reviewing.filters.PatternError as error:\n            raise OperationFailure(code=\"invalidpattern\",\n                                   
title=\"Invalid filter pattern\",\n                                   message=\"Problem: %s\" % error.message)\n\n        reviewfilters.append((filter_user.id, filter_path, filter_type, None))\n\n    return reviewfilters\n\ndef parseRecipientFilters(db, data):\n    mode = data.get(\"mode\", \"opt-out\")\n    included = data.get(\"included\", [])\n    excluded = data.get(\"excluded\", [])\n\n    recipientfilters = []\n\n    if mode == \"opt-in\":\n        recipientfilters.append((None, False))\n        filter_usernames = included\n        filter_include = True\n    else:\n        filter_usernames = excluded\n        filter_include = False\n\n    for filter_username in filter_usernames:\n        filter_user = dbutils.User.fromName(db, filter_username)\n        if not filter_user:\n            raise OperationError(\"no such user: '%s'\" % filter_username)\n        recipientfilters.append((filter_user.id, filter_include))\n\n    return recipientfilters\n\ndef queryFilters(db, user, review, globalfilters=False, parentfilters=False):\n    cursor = db.cursor()\n\n    if globalfilters:\n        cursor.execute(\"UPDATE reviews SET applyfilters=TRUE WHERE id=%s\", (review.id,))\n        review.applyfilters = True\n    if parentfilters:\n        cursor.execute(\"UPDATE reviews SET applyparentfilters=TRUE WHERE id=%s\", (review.id,))\n        review.applyparentfilters = True\n\n    cursor.execute(\"\"\"SELECT changeset\n                        FROM reviewchangesets\n                       WHERE review=%s\"\"\",\n                   (review.id,))\n\n    # TODO: This two-phase creation of Changeset objects is a bit silly.\n    changesets = [diff.Changeset.fromId(db, review.repository, changeset_id)\n                  for (changeset_id,) in cursor]\n    changeset_load.loadChangesets(\n        db, review.repository, changesets, load_chunks=False)\n\n    return assignChanges(db, user, review, changesets=changesets, update=True)\n\ndef applyFilters(db, user, review, 
globalfilters=False, parentfilters=False):\n    new_reviewers, new_watchers = queryFilters(db, user, review, globalfilters, parentfilters)\n\n    pending_mails = []\n    cursor = db.cursor()\n\n    for user_id in new_reviewers:\n        new_reviewer = dbutils.User.fromId(db, user_id)\n\n        cursor.execute(\"\"\"SELECT reviewfiles.file, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                            FROM reviewfiles\n                            JOIN reviewuserfiles ON (reviewuserfiles.file=reviewfiles.id)\n                           WHERE reviewfiles.review=%s\n                             AND reviewuserfiles.uid=%s\n                        GROUP BY reviewfiles.file\"\"\",\n                       (review.id, user_id))\n\n        pending_mails.extend(mail.sendFiltersApplied(\n                db, user, new_reviewer, review, globalfilters, parentfilters, cursor.fetchall()))\n\n    for user_id in new_watchers:\n        new_watcher = dbutils.User.fromId(db, user_id)\n        pending_mails.extend(mail.sendFiltersApplied(\n                db, user, new_watcher, review, globalfilters, parentfilters, None))\n\n    review.incrementSerial(db)\n\n    db.commit()\n\n    mail.sendPendingMails(pending_mails)\n\ndef generateMailsForBatch(db, batch_id, was_accepted, is_accepted, profiler=None):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT review, uid FROM batches WHERE id=%s\", (batch_id,))\n\n    review_id, user_id = cursor.fetchone()\n\n    review = dbutils.Review.fromId(db, review_id)\n    from_user = dbutils.User.fromId(db, user_id)\n\n    pending_mails = []\n\n    recipients = review.getRecipients(db)\n    for to_user in recipients:\n        pending_mails.extend(mail.sendReviewBatch(db, from_user, to_user, recipients, review, batch_id, was_accepted, is_accepted, profiler=profiler))\n\n    return pending_mails\n\ndef generateMailsForAssignmentsTransaction(db, transaction_id):\n    cursor = db.cursor()\n    cursor.execute(\"SELECT review, assigner, 
note FROM reviewassignmentstransactions WHERE id=%s\", (transaction_id,))\n\n    review_id, assigner_id, note = cursor.fetchone()\n\n    review = dbutils.Review.fromId(db, review_id)\n    assigner = dbutils.User.fromId(db, assigner_id)\n\n    cursor.execute(\"\"\"SELECT uid, path, type, created\n                        FROM reviewfilterchanges\n                       WHERE transaction=%s\"\"\",\n                   (transaction_id,))\n\n    by_user = {}\n\n    for reviewer_id, path, filter_type, created in cursor:\n        added_filters, removed_filters, unassigned, assigned = by_user.setdefault(reviewer_id, ([], [], [], []))\n        if created: added_filters.append((filter_type, path or \"/\"))\n        else: removed_filters.append((filter_type, path or \"/\"))\n\n    cursor.execute(\"\"\"SELECT reviewassignmentchanges.uid, reviewassignmentchanges.assigned, reviewfiles.file, SUM(reviewfiles.deleted), SUM(reviewfiles.inserted)\n                        FROM reviewfiles\n                        JOIN reviewassignmentchanges ON (reviewassignmentchanges.file=reviewfiles.id)\n                       WHERE reviewassignmentchanges.transaction=%s\n                    GROUP BY reviewassignmentchanges.uid, reviewassignmentchanges.assigned, reviewfiles.file\"\"\",\n                   (transaction_id,))\n\n    for reviewer_id, was_assigned, file_id, deleted, inserted in cursor:\n        added_filters, removed_filters, unassigned, assigned = by_user.setdefault(reviewer_id, (None, None, [], []))\n\n        if was_assigned: assigned.append((file_id, deleted, inserted))\n        else: unassigned.append((file_id, deleted, inserted))\n\n    pending_mails = []\n\n    for reviewer_id, (added_filters, removed_filters, unassigned, assigned) in by_user.items():\n        reviewer = dbutils.User.fromId(db, reviewer_id)\n        if assigner != reviewer:\n            pending_mails.extend(mail.sendAssignmentsChanged(db, assigner, reviewer, review, added_filters, removed_filters, unassigned, 
assigned))\n\n    return pending_mails\n\ndef retireUser(db, user):\n    cursor = db.cursor()\n\n    # Set the user's status to 'retired'.\n    cursor.execute(\"\"\"UPDATE users\n                         SET status='retired'\n                       WHERE id=%s\"\"\",\n                   (user.id,))\n\n    # Delete any assignments of unreviewed (pending) changes to the user.  We're\n    # leaving assignments of reviewed changes in-place; no particular need to\n    # drop historical data.\n    #\n    # Deleting even this risks dropping some historical data, specifically\n    # changes involving files being marked as reviewed, and then unmarked again.\n    # But having \"active\" assignments to users that aren't going to review them\n    # complicates a whole bunch of queries, so to keep things simple, we can\n    # sacrifice a little history.\n    cursor.execute(\"\"\"DELETE FROM reviewuserfiles\n                            WHERE uid=%s\n                              AND file IN (SELECT id\n                                             FROM reviewfiles\n                                            WHERE state='pending')\"\"\",\n                   (user.id,))\n"
  },
  {
    "path": "src/run_unittest.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport argparse\n\nparser = argparse.ArgumentParser()\n\nparser.add_argument(\"--coverage\", action=\"store_true\")\nparser.add_argument(\"test_module\")\nparser.add_argument(\"test_arguments\", nargs=argparse.REMAINDER)\n\narguments = parser.parse_args()\n\nif arguments.coverage:\n    from coverage import call\nelse:\n    def call(_, fn, *args, **kwargs):\n        fn(*args, **kwargs)\n\ndef execute(path, argv):\n    \"\"\"Load |path| and call main() in it with |argv| as arguments\n\n       If there is no main(), instead assume that each argument in |argv| is the\n       name of a test, and run each test by calling a function named the same as\n       the test, with no arguments.\"\"\"\n\n    module = {}\n    execfile(path, module)\n    if \"main\" in module:\n        module[\"main\"](argv)\n        return\n\n    for test in argv:\n        if test in module:\n            module[test]()\n\ntry:\n    call(\"unittest\", execute, arguments.test_module, arguments.test_arguments)\n    sys.exit(0)\nexcept Exception:\n    import traceback\n    traceback.print_exc()\n    sys.exit(1)\n"
  },
  {
    "path": "src/syntaxhighlight/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport os.path\nimport bz2\n\nimport htmlutils\nimport textutils\nimport configuration\nimport diff.parse\n\nLANGUAGES = set()\n\nclass TokenTypes:\n    Operator = 1\n    Identifier = 2\n    Keyword = 3\n    Character = 4\n    String = 5\n    Comment = 6\n    Integer = 7\n    Float = 8\n    Preprocessing = 9\n\nTokenClassNames = {\n    TokenTypes.Operator: \"op\",\n    TokenTypes.Identifier: \"id\",\n    TokenTypes.Keyword: \"kw\",\n    TokenTypes.Character: \"chr\",\n    TokenTypes.String: \"str\",\n    TokenTypes.Comment: \"com\",\n    TokenTypes.Integer: \"int\",\n    TokenTypes.Float: \"fp\",\n    TokenTypes.Preprocessing: \"pp\",\n}\n\nclass HighlightRequested(Exception):\n    pass\n\ndef generateHighlightPath(sha1, language, mode=\"legacy\"):\n    if mode == \"json\":\n        suffix = \".json\"\n    else:\n        suffix = \"\"\n    return os.path.join(configuration.services.HIGHLIGHT[\"cache_dir\"], sha1[:2], sha1[2:] + \".\" + language + suffix)\n\ndef isHighlighted(sha1, language, mode=\"legacy\"):\n    path = generateHighlightPath(sha1, language, mode)\n    return os.path.isfile(path) or os.path.isfile(path + \".bz2\")\n\ndef wrap(raw_source, mode):\n    if mode == \"json\":\n        return \"\\n\".join(textutils.json_encode([[None, line]])\n                         for line in 
diff.parse.splitlines(raw_source))\n    return htmlutils.htmlify(raw_source)\n\ndef readHighlight(repository, sha1, path, language, request=False, mode=\"legacy\"):\n    from request import requestHighlights\n\n    async = mode == \"json\"\n    source = None\n\n    if language:\n        path = generateHighlightPath(sha1, language, mode)\n\n        if os.path.isfile(path):\n            os.utime(path, None)\n            source = open(path).read()\n        elif os.path.isfile(path + \".bz2\"):\n            os.utime(path + \".bz2\", None)\n            source = bz2.BZ2File(path + \".bz2\", \"r\").read()\n        elif request:\n            requestHighlights(repository, { sha1: (path, language) }, mode, async=async)\n            if mode == \"json\":\n                raise HighlightRequested()\n            return readHighlight(repository, sha1, path, language, False, mode)\n\n    if not source:\n        source = wrap(textutils.decode(repository.fetch(sha1).data), mode)\n\n    return source\n\n# Import for side-effects: these modules add strings to the LANGUAGES set to\n# indicate which languages they support highlighting.\nimport cpp\nimport generic\n"
  },
  {
    "path": "src/syntaxhighlight/clexer.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n# Simple C/C++ lexer\n#\n# Module Contents\n# ===============\n#\n# split(source[, include_ws=True, include_comments=True])\n#\n#   Returns an iterator that returns the tokens in the C/C++ source as\n#   individual strings.  If 'include_ws' is true, sequences of whitespace\n#   (including linebreaks) separating tokens are returned as well.  If\n#   'include_comments' is true, comments are returned as individual tokens as\n#   well.\n#\n#   Preprocessor lines are returned as single tokens, starting with the first\n#   character of the line and ending before the linebreak character that ends\n#   the preprocessor directive; including any backslashes and linebreak\n#   characters following backslashes.\n#\n#   Character and string literal tokens are returned exactly as they occurred in\n#   the source string, with any escape sequences (including escaped linebreaks)\n#   preserved.\n#\n#   Escaped linebreaks outside of preprocessor directives and string literals\n#   are not handled; in practice the backslash will vanish and the linebreak\n#   will be returned as a whitespace token (possibly combined with any following\n#   whitespace.)  
If whitespace preceded the backslash, there will be two\n#   separate whitespace tokens, split where the backslash was.\n#\n# tokenize(tokens[, filename=\"<unknown>\"])\n#\n#   Returns an iterator that returns each string returned by the iterable\n#   'tokens' converted into a Token object.  The tokens have line and column\n#   numbers calculated assuming that the token sequence contains all tokens and\n#   whitespace sequences from a single file.  If not, the line and column\n#   numbers will not be correct.  The supplied 'filename' is also stored in each\n#   token.\n#\n# group(tokens[, groups={ '(':')', '{':'}', '[':']' }])\n#\n#   Returns a list containing all tokens returned by the iterable 'tokens',\n#   grouped into sublists according to 'groups', which should be a dictionary\n#   mapping group start tokens to group end tokens.\n#\n#   Each sublist created will start with a token from the set of keys in\n#   'groups' and end with the corresponding end token.  Every token returned by\n#   the iterable 'tokens' will occur exactly once in the tree of lists returned\n#   by this function.\n#\n#   Throws a CLexerException if an unexpected group end token is encountered, or\n#   if the sequence of tokens ends inside a group.\n#\n# group1(tokens, end[, groups={ '(':')', '{':'}', '[':']' }])\n#\n#   Like group(), but returns a tuple containing a single group (ending with the\n#   token 'end') and the actual token ending the group (since it's not included\n#   in the group.)  Typically 'tokens' will be an iterator that has just\n#   returned the corresponding start token.  Upon return, if 'tokens' is an\n#   iterator, it will just have returned a token identical to 'end'.\n#\n#   The returned list contains neither the group start nor the end token, and is\n#   grouped as if group() had been used to group that particular sequence of\n#   tokens.  
(It is necessary to do this grouping anyway to identify the token\n#   that actually ends the group while ignoring sub-groupings using the same\n#   pair of start/end tokens.)\n#\n# partition(tokens, separator)\n#\n#   Splits the sequence of tokens returned by the iterable 'tokens' by the token\n#   'separator'.  Normally, the token sequence 'tokens' should be grouped using\n#   group() first, so that occurrences of 'separator' inside groups are ignored.\n#   The return value is a list of lists of tokens (or lists as created by\n#   group()).\n#\n#   If 'tokens' returns zero tokens, an empty list is returned.\n#\n# flatten(tokens)\n#\n#   Returns an iterator that returns each token returned by the iterable\n#   'tokens' while reversing the type of grouping done by group()/group1().\n#\n#   This means that flatten(group(tokens)) is a no-op.\n#\n# join(tokens[, insertSpaces=True])\n#\n#   Returns a string produced by concatenating all tokens returned by the\n#   iterable 'tokens'.  If 'insertSpaces' is true, a single space is inserted\n#   between each token unless there were whitespace strings in the token sequence\n#   there already.  Automatically flattens the token sequence first.\n#\n# CLexerException\n#\n#   Exception type thrown by the function group().  Inherits the built-in\n#   Exception type and adds absolutely nothing.  (The function group1() instead\n#   throws CLexerGroupingException, which also records the tokens consumed so\n#   far, retrievable via its tokens() method.)\n#\n# Token Objects\n# =============\n#\n# Token objects are created by the function tokenize(), but can also be\n# constructed manually using the constructor\n#\n# Token(value[, filename=\"<unknown>\", line=0, column=0])\n#\n# Token instances are comparable and hashable, are true (non-zero) unless they\n# represent whitespace or comments, and can be converted back to string form by\n# str().  In addition they support the following methods:\n#\n# filename()\n#\n#   Returns the filename from which the token stems.  
In practice, either the\n#   filename passed to tokenize() or to the Token constructor.\n#\n# line()\n#\n#   Returns the line number (first line=1) at which the token occurred.\n#\n# column()\n#\n#   Returns the column number (first column=0) at which the token occurred.\n#\n# isidentifier()\n#\n#   Returns true if the token is an identifier.\n#\n# isspace()\n#\n#   Returns true if the token is whitespace.\n#\n# iscomment()\n#\n#   Returns true if the token is a comment.\n#\n# isppdirective()\n#\n#   Returns true if the token is a preprocessor directive.\n#\n# reduced()\n#\n#   Returns a string where special tokens (whitespace, comments and preprocessor\n#   directives) are converted to the shortest possible sequence of whitespace\n#   that preserves the line and column number of following tokens.\n\nimport re\nimport sys\nimport itertools\nimport traceback\n\ndef rejoin(items, escape):\n    if escape:\n        fixed = []\n        for item in sorted(items, key=len, reverse=True):\n            for ch in \"(){}[]*+?|.^$\": item = item.replace(ch, \"\\\\\" + ch)\n            fixed.append(item)\n        items = fixed\n    return \"|\".join(items)\n\nOPERATORS_AND_PUNCTUATORS = [ \"(\", \")\", \"{\", \"}\", \"[\", \"]\", \"<\", \">\", \"<=\", \">=\", \"<<\", \">>\", \"<<=\", \">>=\",\n                              \"+\", \"-\", \"*\", \"/\", \"%\", \"+=\", \"-=\", \"*=\", \"/=\", \"%=\", \"&&\", \"||\",\n                              \"&\", \"|\", \"^\", \"!\", \",\", \".\", \"::\", \":\", \";\", \"=\", \"==\", \"!=\",\n                              \"&=\", \"|=\", \"^=\", \"++\", \"--\", \"~\", \"?\", \"->\", \"->*\", \".*\", \"##\", \"#\" ];\n\nKEYWORDS = set([ \"asm\", \"do\", \"if\", \"return\", \"typedef\",\n                 \"auto\", \"double\", \"inline\", \"short\", \"typeid\",\n                 \"bool\", \"dynamic_cast\", \"int\", \"signed\", \"typename\",\n                 \"break\", \"else\", \"long\", \"sizeof\", \"union\",\n                 \"case\", 
\"enum\", \"mutable\", \"static\", \"unsigned\",\n                 \"catch\", \"explicit\", \"namespace\", \"static_cast\", \"using\",\n                 \"char\", \"export\", \"new\", \"struct\", \"virtual\",\n                 \"class\", \"extern\", \"operator\", \"switch\", \"void\",\n                 \"const\", \"false\", \"private\", \"template\", \"volatile\",\n                 \"const_cast\", \"float\", \"protected\", \"this\", \"wchar_t\",\n                 \"continue\", \"for\", \"public\", \"throw\", \"while\",\n                 \"default\", \"friend\", \"register\", \"true\",\n                 \"delete\", \"goto\", \"reinterpret_cast\", \"try\" ])\n\nCONFLICT_MARKER = \"^(?:<<<<<<<|=======|>>>>>>>)[^\\n]*$\"\nINT_LITERAL = \"(?:0|[1-9][0-9]*|0x[0-9a-fA-F]+)[lLuU]*(?![0-9a-zA-Z_\\\\.])\"\nFLOAT_LITERAL = \"(?:(?:[0-9]*\\\\.[0-9]+|[0-9]+\\\\.)(?:[eE][+-]?[0-9]+)?|[0-9]+[eE][+-]?[0-9]+)[fFlL]*(?![0-9a-zA-Z_])\"\nIDENTIFIER = \"[a-zA-Z_][a-zA-Z0-9_]*\"\nMULTILINE_COMMENT = \"/\\\\*.*?\\\\*/\"\nSINGLELINE_COMMENT = \"//.*?(?=\\n|$)\"\nSTRING_LITERAL = '\"(?:\\\\\\\\.|[^\"\\n])*\"'\nCHARACTER_LITERAL = \"'(?:\\\\\\\\.|[^'\\n])*'\"\nWIDE_STRING_LITERAL = 'L\"(?:\\\\\\\\.|[^\"\\n])*\"'\nWIDE_CHARACTER_LITERAL = \"L'(?:\\\\\\\\.|[^'\\n])*'\"\nPREPROCESSOR_DIRECTIVE = \"^[ \\t]*#(?:%s|%s|%s|%s|%s|%s|\\\\\\\\\\n|[^\\n])*\" % (MULTILINE_COMMENT, SINGLELINE_COMMENT, STRING_LITERAL, CHARACTER_LITERAL, WIDE_STRING_LITERAL, WIDE_CHARACTER_LITERAL)\nOPERATOR_OR_PUNCTUATOR = rejoin(OPERATORS_AND_PUNCTUATORS, escape=True)\nWHITESPACE = \"\\\\s+\"\nBYTE_ORDER_MARK = u\"\\ufeff\".encode(\"utf-8\")\n\nSUBPATTERNS = [FLOAT_LITERAL, INT_LITERAL, WIDE_STRING_LITERAL, WIDE_CHARACTER_LITERAL, IDENTIFIER, MULTILINE_COMMENT, SINGLELINE_COMMENT, PREPROCESSOR_DIRECTIVE, STRING_LITERAL, CHARACTER_LITERAL]\n\nRE_CTOKENS = re.compile(rejoin([CONFLICT_MARKER, IDENTIFIER, FLOAT_LITERAL, MULTILINE_COMMENT, SINGLELINE_COMMENT, PREPROCESSOR_DIRECTIVE, OPERATOR_OR_PUNCTUATOR, INT_LITERAL, 
STRING_LITERAL, CHARACTER_LITERAL, BYTE_ORDER_MARK, \".\"], escape=False), re.DOTALL | re.MULTILINE)\nRE_CTOKENS_INCLUDE_WS = re.compile(rejoin([CONFLICT_MARKER, IDENTIFIER, FLOAT_LITERAL, MULTILINE_COMMENT, SINGLELINE_COMMENT, PREPROCESSOR_DIRECTIVE, OPERATOR_OR_PUNCTUATOR, INT_LITERAL, STRING_LITERAL, CHARACTER_LITERAL, BYTE_ORDER_MARK, WHITESPACE, \".\"], escape=False), re.DOTALL | re.MULTILINE)\n\nRE_IDENTIFIER = re.compile(IDENTIFIER)\nRE_INT_LITERAL = re.compile(\"^\" + INT_LITERAL + \"$\")\nRE_FLOAT_LITERAL = re.compile(\"^\" + FLOAT_LITERAL + \"$\")\n\nclass CLexerException(Exception):\n    def __init__(self, message):\n        Exception.__init__(self, message)\n\nclass CLexerGroupingException(Exception):\n    def __init__(self, message, tokens):\n        Exception.__init__(self, message)\n        self.__tokens = tokens\n\n    def tokens(self):\n        return self.__tokens\n\ndef iskeyword(value): return (str(value)[0].isalpha() or str(value)[0] == \"_\") and str(value) in KEYWORDS\ndef isidentifier(value): return (str(value)[0].isalpha() or str(value)[0] == \"_\") and str(value) not in KEYWORDS\ndef isspace(value): return str(value).isspace()\ndef iscomment(value): return str(value)[0:2] in (\"/*\", \"//\")\ndef isppdirective(value): return str(value).lstrip(\" \\t\").startswith(\"#\")\ndef isconflictmarker(value):\n    value = str(value)\n    return value.startswith(\"<<<<<<<\") or value.startswith(\"=======\") or value.startswith(\">>>>>>>\")\ndef isint(value): return RE_INT_LITERAL.match(str(value)) is not None\ndef isfloat(value): return RE_FLOAT_LITERAL.match(str(value)) is not None\ndef isbyteordermark(value): return str(value) == BYTE_ORDER_MARK\n\ndef split(input, include_ws=True, include_comments=True):\n    if include_ws: expression = RE_CTOKENS_INCLUDE_WS\n    else: expression = RE_CTOKENS\n    tokens = itertools.imap(lambda match: match.group(0), expression.finditer(input))\n    if include_comments: return tokens\n    else: return 
itertools.ifilter(lambda token: not iscomment(token), tokens)\n\nclass Token:\n    def __init__(self, value, filename=\"<unknown>\", line=0, column=0):\n        self.__value = value\n        self.__filename = filename\n        self.__line = line\n        self.__column = column\n\n    def __cmp__(self, other):\n        return cmp(self.__value, other)\n\n    def __str__(self):\n        return self.__value\n\n    def __repr__(self):\n        return repr(self.__value)\n\n    def __hash__(self):\n        return hash(self.__value)\n\n    def __nonzero__(self):\n        return not (self.isspace() or self.iscomment())\n\n    def filename(self): return self.__filename\n    def line(self): return self.__line\n    def column(self): return self.__column\n\n    def iskeyword(self): return iskeyword(self.__value)\n    def isidentifier(self): return isidentifier(self.__value)\n    def isspace(self): return isspace(self.__value)\n    def iscomment(self): return iscomment(self.__value)\n    def isppdirective(self): return isppdirective(self.__value)\n    def isconflictmarker(self): return isconflictmarker(self.__value)\n    def isstring(self): return self.__value[0] == '\"' or self.__value.startswith('L\"')\n    def ischar(self): return self.__value[0] == \"'\" or self.__value.startswith(\"L'\")\n    def isint(self): return isint(self.__value)\n    def isfloat(self): return isfloat(self.__value)\n    def isbyteordermark(self): return isbyteordermark(self.__value)\n\n    def reduced(self):\n        if self.isspace() or self.iscomment():\n            if self.__value.startswith(\"//\"): return \"\"\n            else:\n                linebreaks = self.__value.count(\"\\n\")\n                if linebreaks:\n                    last = self.__value.rindex(\"\\n\")\n                    return \"\\n\" * linebreaks + \" \" * (len(self.__value) - last - 1)\n                else:\n                    return \" \" * len(self.__value)\n        elif self.isppdirective():\n            return 
\"\\n\" * self.__value.count(\"\\n\")\n        else:\n            return self.__value\n\ndef tokenize(tokens, filename=\"<unknown>\"):\n    line = 1\n    column = 0\n\n    for token in tokens:\n        if isinstance(token, Token):\n            yield token\n            token = str(token)\n        else: yield Token(token, filename, line, column)\n\n        linebreaks = token.count(\"\\n\")\n        if linebreaks:\n            line += linebreaks\n            column = len(token) - 1 - token.rindex(\"\\n\")\n        else:\n            column += len(token)\n\ndef locate(tokens, index):\n    line = 1\n    column = 0\n\n    for token_index, token in enumerate(flatten(tokens)):\n        if index == token_index: break\n        linebreaks = token.count(\"\\n\")\n        if linebreaks:\n            line += linebreaks\n            column = len(token) - 1 - token.rindex(\"\\n\")\n        else:\n            column += len(token)\n\n    return line, column\n\nDEFAULT_GROUP = {'(': ')', '{': '}', '[': ']'}\nDEFAULT_GROUP_REVERSE = {')': '(', '}': '{', ']': '['}\n\ndef group(tokens, groups=None):\n    if groups is None:\n        groups = DEFAULT_GROUP\n        reverse = DEFAULT_GROUP_REVERSE\n    else:\n        reverse = dict([(end, start) for start, end in groups.items()])\n\n    stack = [('<EOF>', [], -1)]\n    currentEnd = stack[-1][0]\n    currentList = stack[-1][1]\n\n    for index, token in enumerate(tokens):\n        if token in groups:\n            stack.append((groups[token], [token], index))\n            currentList = stack[-1][1]\n            currentEnd = stack[-1][0]\n        elif token == currentEnd:\n            currentList.append(token)\n            stack.pop()\n            stack[-1][1].append(currentList)\n            currentEnd = stack[-1][0]\n            currentList = stack[-1][1]\n        elif token in reverse:\n            if isinstance(token, Token):\n                line, column = token.line(), token.column()\n            else:\n                line, column = 
locate(tokens, index)\n            raise CLexerException(\"%d:%d: expected '%s', got '%s'\" % (line, column, currentEnd, token))\n        else:\n            currentList.append(token)\n\n    if len(stack) > 1:\n        token = stack[-1][1][0]\n        if isinstance(token, Token):\n            line, column = token.line(), token.column()\n        else:\n            line, column = locate(tokens, stack[-1][2])\n        raise CLexerException(\"%d:%d: unmatched group opener '%s'\" % (line, column, token))\n\n    return currentList\n\ndef group1(iterable, end, groups=None):\n    if groups is None:\n        groups = DEFAULT_GROUP\n        reverse = DEFAULT_GROUP_REVERSE\n    else:\n        reverse = dict([(end, start) for start, end in groups.items()])\n\n    stack = [('<EOF>', [])]\n    currentEnd = stack[-1][0]\n    currentList = stack[-1][1]\n\n    for token in iterable:\n        if token in groups:\n            stack.append((groups[token], [token]))\n            currentList = stack[-1][1]\n            currentEnd = stack[-1][0]\n        elif token == end and len(stack) == 1:\n            return currentList, token\n        elif token == currentEnd:\n            stack.pop()\n            currentList.append(token)\n            stack[-1][1].append(currentList)\n            currentEnd = stack[-1][0]\n            currentList = stack[-1][1]\n        elif token in reverse:\n            currentList.append(token)\n            while len(stack) > 1:\n                stack.pop()\n                stack[-1][1].append(currentList)\n                currentList = stack[-1][1]\n            raise CLexerGroupingException(\"expected '%s', got '%s'\" % (currentEnd, token), flatten(currentList))\n        else:\n            currentList.append(token)\n\n    while len(stack) > 1:\n        stack.pop()\n        stack[-1][1].append(currentList)\n        currentList = stack[-1][1]\n\n    token = stack[-1][1][0]\n\n    raise CLexerGroupingException(\"unmatched group opener '%s'\" % token, 
list(flatten(currentList)))\n\ndef partition(tokens, separator):\n    current = []\n    partitions = [current]\n    try:\n        tokens = iter(tokens)\n        while True:\n            token = tokens.next()\n            if token == separator:\n                current = []\n                partitions.append(current)\n            else:\n                current.append(token)\n    except StopIteration:\n        if len(partitions) == 1 and not partitions[0]: return []\n        else: return partitions\n\ndef flatten(tokens):\n    tokens = iter(tokens)\n\n    try:\n        while True:\n            token = tokens.next()\n            if isinstance(token, list):\n                tokens = itertools.chain(token, tokens)\n            else:\n                yield token\n    except StopIteration:\n        pass\n\ndef join(tokens, insertSpaces=True):\n    if insertSpaces:\n        result = \"\"\n        lastWasSpace = True\n        for token in flatten(tokens):\n            if not lastWasSpace and not token.isspace(): result += \" \"\n            result += str(token)\n            lastWasSpace = token.isspace()\n        return result\n    else:\n        return \"\".join(itertools.imap(str, flatten(tokens)))\n\n# Run regression tests if we're the main script and not being imported as a module.\nif __name__ == \"__main__\":\n    # The token expression does not match whitespace.\n    assert not RE_CTOKENS.match(\" \")\n    assert not RE_CTOKENS.match(\"\\t\")\n    assert not RE_CTOKENS.match(\"\\r\")\n    assert not RE_CTOKENS.match(\"\\n\")\n\n    def testToken(token, subpattern, rest=\"\", isOperator=False):\n        wholeMatch = RE_CTOKENS.match(token)\n        assert wholeMatch\n        assert wholeMatch.group(0) + rest == token\n\n        subMatch = re.match(subpattern, token, re.DOTALL | re.MULTILINE)\n        assert subMatch\n        assert subMatch.group(0) + rest == token\n\n        for other in SUBPATTERNS:\n            if other != subpattern and not (isOperator and other 
== PREPROCESSOR_DIRECTIVE and (token == \"#\" or token == \"##\")):\n                assert not re.match(other, token, re.DOTALL | re.MULTILINE)\n\n    for operatorOrPunctuator in OPERATORS_AND_PUNCTUATORS:\n        testToken(operatorOrPunctuator, OPERATOR_OR_PUNCTUATOR, isOperator=True)\n\n    testToken(\"0\", INT_LITERAL)\n    testToken(\"0u\", INT_LITERAL)\n    testToken(\"0l\", INT_LITERAL)\n    testToken(\"0ul\", INT_LITERAL)\n    testToken(\"0lu\", INT_LITERAL)\n\n    testToken(\"1\", INT_LITERAL)\n    testToken(\"4711\", INT_LITERAL)\n\n    testToken(\"0x0\", INT_LITERAL)\n    testToken(\"0xffffu\", INT_LITERAL)\n\n    testToken(\"0.\", FLOAT_LITERAL)\n    testToken(\"123.f\", FLOAT_LITERAL)\n    testToken(\"123.f\", FLOAT_LITERAL)\n    testToken(\"123.l\", FLOAT_LITERAL)\n    testToken(\"123.e1\", FLOAT_LITERAL)\n    testToken(\"123.e1f\", FLOAT_LITERAL)\n    testToken(\"123.e1l\", FLOAT_LITERAL)\n    testToken(\".0\", FLOAT_LITERAL)\n    testToken(\".123\", FLOAT_LITERAL)\n    testToken(\".123f\", FLOAT_LITERAL)\n    testToken(\".123l\", FLOAT_LITERAL)\n    testToken(\".123e1\", FLOAT_LITERAL)\n    testToken(\".123e1f\", FLOAT_LITERAL)\n    testToken(\".123e1l\", FLOAT_LITERAL)\n    testToken(\"0.0\", FLOAT_LITERAL)\n    testToken(\"123.123\", FLOAT_LITERAL)\n    testToken(\"123.123f\", FLOAT_LITERAL)\n    testToken(\"123.123l\", FLOAT_LITERAL)\n    testToken(\"123.123e1\", FLOAT_LITERAL)\n    testToken(\"123.123e1f\", FLOAT_LITERAL)\n    testToken(\"123.123e1l\", FLOAT_LITERAL)\n    testToken(\"0e1\", FLOAT_LITERAL)\n    testToken(\"123e1\", FLOAT_LITERAL)\n    testToken(\"123e1f\", FLOAT_LITERAL)\n    testToken(\"123e1l\", FLOAT_LITERAL)\n    testToken(\"123e100\", FLOAT_LITERAL)\n    testToken(\"123e+100\", FLOAT_LITERAL)\n    testToken(\"123e-100\", FLOAT_LITERAL)\n\n    testToken(\"i\", IDENTIFIER)\n    testToken(\"this_or_that\", IDENTIFIER)\n    testToken(\"_\", IDENTIFIER)\n    testToken(\"__i\", IDENTIFIER)\n    testToken(\"i1\", IDENTIFIER)\n\n   
 testToken(\"/**/\", MULTILINE_COMMENT)\n    testToken(\"/***/\", MULTILINE_COMMENT)\n    testToken(\"/****/\", MULTILINE_COMMENT)\n    testToken(\"/*****/\", MULTILINE_COMMENT)\n    testToken(\"/*foo*/\", MULTILINE_COMMENT)\n    testToken(\"/*foo\\nfoo\\nfoo*/\", MULTILINE_COMMENT)\n\n    testToken(\"//\", SINGLELINE_COMMENT)\n    testToken(\"///\", SINGLELINE_COMMENT)\n    testToken(\"////\", SINGLELINE_COMMENT)\n    testToken(\"//foo\", SINGLELINE_COMMENT)\n    testToken(\"//\\n\", SINGLELINE_COMMENT, \"\\n\")\n    testToken(\"///\\n\", SINGLELINE_COMMENT, \"\\n\")\n    testToken(\"////\\n\", SINGLELINE_COMMENT, \"\\n\")\n    testToken(\"//bar\\n\", SINGLELINE_COMMENT, \"\\n\")\n\n    testToken(\"#\", PREPROCESSOR_DIRECTIVE)\n    testToken(\"#foo\", PREPROCESSOR_DIRECTIVE)\n    testToken(\" #\", PREPROCESSOR_DIRECTIVE)\n    testToken(\" #foo\", PREPROCESSOR_DIRECTIVE)\n    testToken(\"#\\n\", PREPROCESSOR_DIRECTIVE, \"\\n\")\n    testToken(\"#foo\\n\", PREPROCESSOR_DIRECTIVE, \"\\n\")\n    testToken(\" #\\n\", PREPROCESSOR_DIRECTIVE, \"\\n\")\n    testToken(\" #foo\\n\", PREPROCESSOR_DIRECTIVE, \"\\n\")\n\n    testToken('\"\"', STRING_LITERAL)\n    testToken('\"foo\"', STRING_LITERAL)\n    testToken('\"\\\\\"\\\\\"\\\\\"\"', STRING_LITERAL)\n\n    testToken(\"''\", CHARACTER_LITERAL)\n    testToken(\"'foo'\", CHARACTER_LITERAL)\n    testToken(\"'\\\\'\\\\'\\\\''\", CHARACTER_LITERAL)\n"
  },
  {
    "path": "src/syntaxhighlight/context.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\n\nimport syntaxhighlight\nimport configuration\n\ndef importCodeContexts(db, sha1, language):\n    codecontexts_path = syntaxhighlight.generateHighlightPath(sha1, language) + \".ctx\"\n\n    if os.path.isfile(codecontexts_path):\n        contexts_values = []\n\n        for line in open(codecontexts_path):\n            line = line.strip()\n\n            first_line, last_line, context = line.split(\" \", 2)\n            if len(context) > configuration.services.HIGHLIGHT[\"max_context_length\"]:\n                context = context[:configuration.services.HIGHLIGHT[\"max_context_length\"] - 3] + \"...\"\n            contexts_values.append((sha1, context, int(first_line), int(last_line)))\n\n        cursor = db.cursor()\n        cursor.execute(\"DELETE FROM codecontexts WHERE sha1=%s\", [sha1])\n        cursor.executemany(\"INSERT INTO codecontexts (sha1, context, first_line, last_line) VALUES (%s, %s, %s, %s)\", contexts_values)\n        db.commit()\n\n        os.unlink(codecontexts_path)\n\n        return len(contexts_values)\n    else:\n        return 0\n"
  },
  {
    "path": "src/syntaxhighlight/cpp.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport syntaxhighlight\nimport syntaxhighlight.clexer\nimport htmlutils\nimport configuration\n\nfrom syntaxhighlight import TokenTypes\n\nclass HighlightCPP:\n    def highlightToken(self, token):\n        if token.iskeyword():\n            self.outputter.writeSingleline(TokenTypes.Keyword, str(token))\n        elif token.isidentifier():\n            self.outputter.writeSingleline(TokenTypes.Identifier, str(token))\n        elif token.iscomment():\n            if str(token)[0:2] == \"/*\":\n                self.outputter.writeMultiline(TokenTypes.Comment, str(token))\n            else:\n                self.outputter.writeSingleline(TokenTypes.Comment, str(token))\n        elif token.isppdirective():\n            self.outputter.writeMultiline(TokenTypes.Preprocessing, str(token))\n        elif token.isspace():\n            self.outputter.writePlain(str(token))\n        elif token.isconflictmarker():\n            self.outputter.writePlain(str(token))\n        elif str(token)[0] == '\"':\n            self.outputter.writeSingleline(TokenTypes.String, str(token))\n        elif str(token)[0] == \"'\":\n            self.outputter.writeSingleline(TokenTypes.Character, str(token))\n        elif token.isfloat():\n            self.outputter.writeSingleline(TokenTypes.Float, str(token))\n        elif 
token.isint():\n            self.outputter.writeSingleline(TokenTypes.Integer, str(token))\n        elif token.isbyteordermark():\n            self.outputter.writePlain(u\"\\ufeff\")\n        else:\n            self.outputter.writeSingleline(TokenTypes.Operator, str(token))\n\n    def outputContext(self, tokens, terminator):\n        if not self.contexts: return\n\n        def spaceBetween(first, second):\n            # Never insert spaces around the :: operator.\n            if first == '::' or second == '::':\n                return False\n\n            # Always a space after a comma.\n            if first == ',':\n                return True\n\n            # Always a space before a keyword or identifier, unless preceded by *, & or (.\n            if second.iskeyword() or second.isidentifier():\n                return str(first) not in ('*', '&', '(')\n\n            # Always a space before a * or &, unless preceded by (another) *.\n            if (second == '*' or second == '&') and first != '*':\n                return True\n\n            # Always spaces around equal signs.\n            if first == '=' or second == '=':\n                return True\n\n            # No spaces between by default.\n            return False\n\n        first_line = tokens[-1].line() + 1\n        last_line = terminator.line()\n\n        if last_line - first_line >= configuration.services.HIGHLIGHT[\"min_context_length\"]:\n            previous = tokens[0]\n            context = str(previous)\n\n            for token in tokens[1:]:\n                if token.isspace() or token.iscomment(): continue\n                if spaceBetween(previous, token): context += \" \"\n                context += str(token)\n                previous = token\n\n            self.contexts.write(\"%d %d %s\\n\" % (first_line, last_line, context))\n\n    def processTokens(self, tokens):\n        currentContexts = []\n        nextContext = []\n        nextContextClosed = False\n        level = 0\n\n        for 
token in tokens:\n            self.highlightToken(token)\n\n            if token.isspace() or token.iscomment() or token.isppdirective() or token.isconflictmarker():\n                pass\n            elif token.iskeyword():\n                if str(token) in (\"if\", \"else\", \"for\", \"while\", \"do\", \"switch\", \"return\", \"break\", \"continue\"):\n                    nextContext = None\n                    nextContextClosed = True\n                elif not nextContextClosed:\n                    nextContext.append(token)\n            elif token.isidentifier():\n                if not nextContextClosed:\n                    nextContext.append(token)\n            elif token == '{':\n                if nextContext:\n                    currentContexts.append([nextContext, level])\n                    nextContext = []\n                    nextContextClosed = False\n                level += 1\n            elif token == '}':\n                level -= 1\n                if currentContexts and currentContexts[-1][1] == level:\n                    thisContext = currentContexts.pop()\n                    self.outputContext(thisContext[0], token)\n                nextContext = []\n                nextContextClosed = False\n            elif nextContext:\n                if token == ',' and not nextContextClosed:\n                    nextContext = None\n                    nextContextClosed = True\n                elif token == ':':\n                    nextContextClosed = True\n                elif token == ';':\n                    nextContext = []\n                    nextContextClosed = False\n                elif token == '(':\n                    if not nextContextClosed:\n                        nextContext.append(token)\n                        try:\n                            group, token = syntaxhighlight.clexer.group1(tokens, ')')\n                            group = list(syntaxhighlight.clexer.flatten(group)) + [token]\n                            
nextContext.extend(group)\n                            for token in group: self.highlightToken(token)\n                        except syntaxhighlight.clexer.CLexerGroupingException as error:\n                            for token in error.tokens(): self.highlightToken(token)\n                            nextContext = []\n                            nextContextClosed = False\n                elif not nextContextClosed:\n                    nextContext.append(token)\n\n    def __call__(self, source, outputter, contexts_path):\n        source = source.encode(\"utf-8\")\n        self.outputter = outputter\n        if contexts_path: self.contexts = open(contexts_path, \"w\")\n        else: self.contexts = None\n        self.processTokens(syntaxhighlight.clexer.tokenize(syntaxhighlight.clexer.split(source)))\n        if contexts_path: self.contexts.close()\n\n    @staticmethod\n    def create(language):\n        if language == \"c++\": return HighlightCPP()\n        else: return None\n\nsyntaxhighlight.LANGUAGES.add(\"c++\")\n"
  },
  {
    "path": "src/syntaxhighlight/generate.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport errno\n\nimport syntaxhighlight\nimport gitutils\nimport textutils\nimport htmlutils\nimport diff.parse\n\nfrom syntaxhighlight import TokenClassNames\n\ndef createHighlighter(language):\n    import cpp\n    highlighter = cpp.HighlightCPP.create(language)\n    if highlighter: return highlighter\n\n    import generic\n    highlighter = generic.HighlightGeneric.create(language)\n    if highlighter: return highlighter\n\nclass Outputter(object):\n    def __init__(self, output_file):\n        self.output_file = output_file\n\n    def writeMultiline(self, token_type, content):\n        parts = content.split(\"\\n\")\n        for part in parts[:-1]:\n            if part:\n                self._writePart(token_type, part)\n            self._endLine()\n        if parts[-1]:\n            self._writePart(token_type, parts[-1])\n\n    def writeSingleline(self, token_type, content):\n        assert \"\\n\" not in content\n        self._writePart(token_type, content)\n\n    def writePlain(self, content):\n        parts = content.split(\"\\n\")\n        for part in parts[:-1]:\n            if part:\n                self._writePlain(part)\n            self._endLine()\n        if parts[-1]:\n            self._writePlain(parts[-1])\n\n    def flush(self):\n        self._flush()\n        
self.output_file.close()\n\nclass HTMLOutputter(Outputter):\n    def _writePart(self, token_type, content):\n        self.output_file.write(\n            \"<b class='%s'>%s</b>\"\n            % (TokenClassNames[token_type], htmlutils.htmlify(content)))\n\n    def _writePlain(self, content):\n        self.output_file.write(htmlutils.htmlify(content))\n\n    def _endLine(self):\n        self.output_file.write(\"\\n\")\n\n    def _flush(self):\n        pass\n\nclass JSONOutputter(Outputter):\n    def __init__(self, output_file):\n        super(JSONOutputter, self).__init__(output_file)\n        self.line = []\n\n    def _writePart(self, token_type, content):\n        if self.line \\\n                and isinstance(self.line[-1], list) \\\n                and self.line[-1][0] == token_type:\n            self.line[-1][1] += content\n        else:\n            self.line.append([token_type, content])\n\n    def _writePlain(self, content):\n        self.line.append([None, content])\n\n    def _endLine(self):\n        self.output_file.write(textutils.json_encode(self.line) + \"\\n\")\n        self.line = []\n\n    def _flush(self):\n        if self.line:\n            self._endLine()\n\ndef generateHighlight(repository_path, sha1, language, mode, output_file=None):\n    highlighter = createHighlighter(language)\n    if not highlighter: return False\n\n    source = gitutils.Repository.readObject(repository_path, \"blob\", sha1)\n    source = textutils.decode(source)\n\n    if output_file:\n        highlighter(source, output_file, None)\n    else:\n        output_path = syntaxhighlight.generateHighlightPath(sha1, language, mode)\n\n        try: os.makedirs(os.path.dirname(output_path), 0750)\n        except OSError as error:\n            if error.errno == errno.EEXIST: pass\n            else: raise\n\n        output_file = open(output_path + \".tmp\", \"w\")\n        contexts_path = output_path + \".ctx\"\n\n        if mode == \"json\":\n            outputter = 
JSONOutputter(output_file)\n        else:\n            outputter = HTMLOutputter(output_file)\n\n        highlighter(source, outputter, contexts_path)\n\n        output_file.close()\n\n        os.chmod(output_path + \".tmp\", 0660)\n        os.rename(output_path + \".tmp\", output_path)\n\n    return True\n"
  },
  {
    "path": "src/syntaxhighlight/generic.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\ntry:\n    import pygments.lexers\n    import pygments.token\nexcept ImportError:\n    LANGUAGES = {}\nelse:\n    LANGUAGES = { \"python\": pygments.lexers.PythonLexer,\n                  \"perl\": pygments.lexers.PerlLexer,\n                  \"java\": pygments.lexers.JavaLexer,\n                  \"ruby\": pygments.lexers.RubyLexer,\n                  \"php\": pygments.lexers.PhpLexer,\n                  \"makefile\": pygments.lexers.MakefileLexer,\n                  \"javascript\": pygments.lexers.JavascriptLexer,\n                  \"sql\": pygments.lexers.SqlLexer,\n                  \"objective-c\": pygments.lexers.ObjectiveCLexer,\n                  \"xml\": pygments.lexers.XmlLexer }\n\nimport htmlutils\n\nfrom syntaxhighlight import TokenTypes\n\nclass HighlightGeneric:\n    def __init__(self, lexer):\n        self.lexer = lexer\n\n    def highlightToken(self, token, value):\n        value = value.encode(\"utf-8\")\n\n        if token in pygments.token.Token.Punctuation or token in pygments.token.Token.Operator:\n            self.outputter.writeSingleline(TokenTypes.Operator, value)\n        elif token in pygments.token.Token.Name or token in pygments.token.Token.String.Symbol:\n            self.outputter.writeSingleline(TokenTypes.Identifier, value)\n        elif token in 
pygments.token.Token.Keyword:\n            self.outputter.writeSingleline(TokenTypes.Keyword, value)\n        elif token in pygments.token.Token.String:\n            self.outputter.writeMultiline(TokenTypes.String, value)\n        elif token in pygments.token.Token.Comment:\n            self.outputter.writeMultiline(TokenTypes.Comment, value)\n        elif token in pygments.token.Token.Number.Integer:\n            self.outputter.writeSingleline(TokenTypes.Integer, value)\n        elif token in pygments.token.Token.Number.Float:\n            self.outputter.writeSingleline(TokenTypes.Float, value)\n        else:\n            self.outputter.writePlain(value)\n\n    def __call__(self, source, outputter, contexts_path):\n        self.outputter = outputter\n\n        blocks = re.split(\"^((?:<<<<<<<|>>>>>>>)[^\\n]*\\n)\", source, flags=re.MULTILINE)\n\n        in_conflict = False\n\n        for index, block in enumerate(blocks):\n            if (index & 1) == 0:\n                if in_conflict:\n                    blocks = re.split(\"^(=======[^\\n]*\\n)\", block, flags=re.MULTILINE)\n                else:\n                    blocks = [block]\n\n                for index, block in enumerate(blocks):\n                    if (index & 1) == 0:\n                        if block:\n                            for token, value in self.lexer.get_tokens(block):\n                                self.highlightToken(token, value)\n                    else:\n                        assert block[0] == \"=\"\n                        self.outputter.writePlain(block)\n            else:\n                assert block[0] == \"<\" or block[0] == \">\"\n                self.outputter.writePlain(block)\n                in_conflict = block[0] == \"<\"\n\n    @staticmethod\n    def create(language):\n        lexer = LANGUAGES.get(language)\n        if lexer: return HighlightGeneric(lexer(stripnl=False))\n        else: return None\n\nimport 
syntaxhighlight\n\nsyntaxhighlight.LANGUAGES.update(LANGUAGES.keys())\n"
  },
  {
    "path": "src/syntaxhighlight/request.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport socket\n\nimport base\nimport configuration\nimport syntaxhighlight\n\nfrom textutils import json_encode, json_decode\n\nclass HighlightBackgroundServiceError(base.ImplementationError):\n    def __init__(self, message):\n        super(HighlightBackgroundServiceError, self).__init__(\n            \"Highlight background service failed: %s\" % message)\n\ndef requestHighlights(repository, sha1s, mode, async=False):\n    requests = [\n        {\n            \"repository_path\": repository.path,\n            \"sha1\": sha1,\n            \"path\": path,\n            \"language\": language,\n            \"mode\": mode\n        }\n        for sha1, (path, language) in sha1s.items()\n        if not syntaxhighlight.isHighlighted(sha1, language, mode)\n    ]\n\n    if not requests:\n        return False\n\n    try:\n        connection = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n        connection.connect(configuration.services.HIGHLIGHT[\"address\"])\n        connection.send(json_encode({\n            \"requests\": requests,\n            \"async\": async\n        }))\n        connection.shutdown(socket.SHUT_WR)\n\n        data = \"\"\n\n        while True:\n            received = connection.recv(4096)\n            if not received: break\n            data += received\n\n        
connection.close()\n    except EnvironmentError as error:\n        raise HighlightBackgroundServiceError(str(error))\n\n    if async:\n        return True\n\n    if not data:\n        raise HighlightBackgroundServiceError(\n            \"returned an invalid response (no response)\")\n\n    try:\n        results = json_decode(data)\n    except ValueError:\n        raise HighlightBackgroundServiceError(\n            \"returned an invalid response (%r)\" % data)\n\n    if type(results) != list:\n        # If not a list, the result is probably an error message.\n        raise HighlightBackgroundServiceError(str(results))\n\n    if len(results) != len(requests):\n        raise HighlightBackgroundServiceError(\"didn't process all requests\")\n\n    return True\n"
  },
  {
    "path": "src/textformatting.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\n\nimport configuration\nimport dbutils\nimport gitutils\nimport htmlutils\n\ndef renderFormatted(db, user, table, lines, toc=False, title_right=None):\n    re_h1 = re.compile(\"^=+$\")\n    re_h2 = re.compile(\"^-+$\")\n    data = { \"configuration.URL\": dbutils.getURLPrefix(db, user),\n             \"configuration.base.HOSTNAME\": configuration.base.HOSTNAME,\n             \"configuration.base.SYSTEM_USER_NAME\": configuration.base.SYSTEM_USER_NAME,\n             \"configuration.base.SYSTEM_GROUP_NAME\": configuration.base.SYSTEM_GROUP_NAME,\n             \"configuration.paths.CONFIG_DIR\": configuration.paths.CONFIG_DIR,\n             \"configuration.paths.INSTALL_DIR\": configuration.paths.INSTALL_DIR,\n             \"configuration.paths.DATA_DIR\": configuration.paths.DATA_DIR,\n             \"configuration.paths.GIT_DIR\": configuration.paths.GIT_DIR }\n\n    references = {}\n    blocks = []\n    block = []\n\n    for line in lines:\n        match = re.match(r'\\[(.*?)\\]: (.*?)(?: \"(.*?)\")?$', line)\n        if match:\n            name, url, title = match.groups()\n            references[name] = (url, title)\n            continue\n\n        if line.strip():\n            block.append(line % data)\n        elif block:\n            blocks.append(block)\n            block = []\n    
else:\n        if block:\n            blocks.append(block)\n\n    text = None\n\n    for block in blocks:\n        def textToId(text):\n            return text.lower().replace(' ', '_')\n\n        if len(block) == 2:\n            if re_h1.match(block[1]):\n                table.setTitle(block[0])\n                h1 = table.tr(\"h1\").td(\"h1\").h1(id=textToId(block[0]))\n                h1.text(block[0])\n                if title_right:\n                    span_right = h1.span(\"right\")\n                    if callable(title_right):\n                        title_right(span_right)\n                    else:\n                        span_right.text(title_right)\n                text = None\n                if toc:\n                    toc = table.tr(\"toc\").td(\"toc\").div().table(\"toc callout\")\n                    toc.tr(\"heading\").th().text(\"Table of contents\")\n                continue\n            elif re_h2.match(block[1]):\n                if toc: toc.tr(\"h2\").td().a(href=\"#\" + textToId(block[0])).text(block[0])\n                table.tr(\"h2\").td(\"h2\").div().h2(id=textToId(block[0])).text(block[0])\n                text = None\n                continue\n\n        if len(block) == 1 and block[0] == \"[repositories]\":\n            text = None\n\n            repositories = table.tr().td().table(\"repositories callout\")\n            headings = repositories.thead().tr()\n            headings.th(\"name\").text(\"Short name\")\n            headings.th(\"path\").text(\"Repository path\")\n\n            repositories.tr().td(colspan=2)\n\n            cursor = db.cursor()\n            cursor.execute(\"SELECT name, path FROM repositories ORDER BY id ASC\")\n\n            for name, path in cursor:\n                row = repositories.tr(\"repository\")\n                row.td(\"name\").text(name)\n                row.td(\"path\").text(gitutils.Repository.constructURL(db, user, path))\n\n            continue\n\n        if not text:\n            text = 
table.tr(\"text\").td(\"text\")\n\n        def translateLinks(text):\n            def linkify(match):\n                config_item, reference_text, reference_name = match.groups()\n\n                if config_item:\n                    url = \"/config?highlight=%s\" % config_item\n                    text = config_item\n                    title = None\n                else:\n                    reference_name = re.sub(r\"\\s+\", \" \", reference_name)\n                    assert reference_name in references, reference_name\n                    url, title = references[reference_name]\n                    text = reference_text\n\n                link = \"<a href=%s\" % htmlutils.htmlify(url, True)\n\n                if title:\n                    link += \" title=%s\" % htmlutils.htmlify(title, True)\n\n                return link + \">%s</a>\" % htmlutils.htmlify(text)\n\n            return re.sub(r\"CONFIG\\(([^)]+)\\)|\\[(.*?)\\]\\n?\\[(.*?)\\]\", linkify, text, flags=re.DOTALL)\n\n        def processText(lines):\n            if isinstance(lines, basestring):\n                lines = [lines]\n            for index, line in enumerate(lines):\n                if line.startswith(\"  http\"):\n                    lines[index] = \"<a href='%s'>%s</a>\" % (line.strip(), line.strip())\n            text = translateLinks(\"\\n\".join(lines))\n\n            # Replace double dashes with &mdash;, but only if they are\n            # surrounded by either spaces or word characters on both sides.\n            #\n            # We don't want to translate the double dashes in a\n            # --command-line-argument used in the text.\n            text = re.sub(r\"(^| )--( |$)\", r\"\\1&mdash;\\2\", text, flags=re.MULTILINE)\n            text = re.sub(r\"(\\w)--(\\w)\", r\"\\1&mdash;\\2\", text)\n\n            return text\n\n        if len(block) > 2 and re_h2.match(block[1]):\n            if toc: toc.tr(\"h3\").td().a(href=\"#\" + textToId(block[0])).text(block[0])\n            div 
= text.div()\n            div.h3(id=textToId(block[0])).text(block[0])\n            block = block[2:]\n\n        if block[0].startswith(\"|\"):\n            pre = text.div().table(\"pre callout\").tr().td().preformatted()\n            contents = \"\\n\".join([line[2:] for line in block])\n            if block[0].startswith(\"||\"):\n                pre.innerHTML(contents)\n            else:\n                pre.text(contents)\n        elif block[0].startswith(\"* \") or block[0].startswith(\"1 \"):\n            if block[0].startswith(\"* \"):\n                items = text.div().ul()\n            else:\n                items = text.div().ol()\n            item = []\n            for line in block:\n                if line[:2] != '  ':\n                    if item:\n                        items.li().text(processText(item), cdata=True)\n                    item = []\n                else:\n                    assert line[:2] == \"  \"\n                item.append(line[2:])\n            if item:\n                items.li().text(processText(item), cdata=True)\n        elif block[0].startswith(\"? \"):\n            items = text.div().dl()\n            term = []\n            definition = None\n            for line in block:\n                if line[:2] == '? 
':\n                    if definition:\n                        items.dt().text(processText(\" \".join(term)), cdata=True)\n                        items.dd().text(processText(definition), cdata=True)\n                        definition = None\n                    term = [line[2:]]\n                elif line[:2] == '= ':\n                    assert term\n                    assert definition is None\n                    definition = [line[2:]]\n                elif definition is None:\n                    term.append(line[2:])\n                else:\n                    definition.append(line[2:])\n            items.dt().text(processText(\" \".join(term)), cdata=True)\n            items.dd().text(processText(definition), cdata=True)\n        elif block[0].startswith(\"  \"):\n            text_data = translateLinks(\"\\n\".join(block))\n            if block[0].startswith(\"  <code>\"):\n                className = \"example\"\n            else:\n                className = \"hint\"\n                text_data = text_data.replace(\"--\", \"&mdash;\")\n            text.div().div(className).text(text_data, cdata=True)\n        else:\n            text.div().text(processText(block), cdata=True)\n"
  },
  {
    "path": "src/textutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport re\nimport json\nimport unicodedata\n\ntry:\n    import configuration\n\n    DEFAULT_ENCODINGS = configuration.base.DEFAULT_ENCODINGS[:]\nexcept ImportError:\n    # This is the default set of default encodings.  We could fail to\n    # import 'configuration' for two principal reasons:\n    #\n    #  1) There's some catastrophic problem with the system.  
Ignoring\n    #     the problem here won't make the least bit of difference.\n    #\n    #  2) We're trying to run unit tests without an installed system.\n    #     This fallback is simply nice in that case.\n\n    DEFAULT_ENCODINGS = [\"utf-8\"]\n\ndef reflow(text, line_length=80, indent=0):\n    if line_length == 0: return text\n\n    paragraphs = re.split(\"\\n\\n+\", text.replace(\"\\r\", \"\"))\n    spaces = \" \" * indent\n\n    for paragraph_index, paragraph in enumerate(paragraphs):\n        lines = paragraph.split(\"\\n\")\n        for line_index, line in enumerate(lines):\n            if (line and line[0] in \" \\t-*\") or line_index < len(lines) - 1 and len(line) < 0.5 * line_length:\n                if indent:\n                    paragraphs[paragraph_index] = \"\\n\".join([spaces + line for line in lines])\n                break\n        else:\n            lines = []\n            line = spaces\n            words = re.split(\"(\\s+)\", paragraph)\n            ws = \"\"\n            for word in words:\n                if not word.strip():\n                    if \"\\n\" in word: ws = \" \"\n                    else: ws = word\n                else:\n                    if len(line) > indent and len(line) + len(ws) + len(word) > line_length:\n                        lines.append(line)\n                        line = spaces\n                    if len(line) > indent: line += ws\n                    line += word\n            if line: lines.append(line)\n            paragraphs[paragraph_index] = \"\\n\".join(lines)\n\n    text = \"\\n\\n\".join(paragraphs)\n\n    if isinstance(text, unicode): return text.encode(\"utf-8\")\n    else: return text\n\ndef indent(string, width=4):\n    prefix = \" \" * width\n    return prefix + (\"\\n\" + prefix).join(string.splitlines())\n\ndef summarize(string, max_length=80, as_html=False):\n    if len(string) <= max_length:\n        return string\n    string = string[:max_length - 5]\n    if as_html:\n        import 
htmlutils\n        return htmlutils.htmlify(string) + \"[&#8230;]\"\n    else:\n        return string + \"[...]\"\n\ndef escape(text):\n    special = { \"\\a\": \"a\",\n                \"\\b\": \"b\",\n                \"\\t\": \"t\",\n                \"\\n\": \"n\",\n                \"\\v\": \"v\",\n                \"\\f\": \"f\",\n                \"\\r\": \"r\" }\n\n    def escape1(match):\n        substring = match.group(0)\n\n        if ord(substring) < 128:\n            if substring in special:\n                return \"\\\\%s\" % special[substring]\n            elif ord(substring) < 32:\n                return \"\\\\x%02x\" % ord(substring)\n            else:\n                return substring\n\n        category = unicodedata.category(substring)\n        if category[0] in \"CZ\" or category == \"So\":\n            if ord(substring) < 256:\n                return \"\\\\x%02x\" % ord(substring)\n            elif ord(substring) < 65536:\n                return \"\\\\u%04x\" % ord(substring)\n            else:\n                return \"\\\\U%08x\" % ord(substring)\n        else:\n            return substring\n\n    text = decode(str(text))\n    escaped = re.sub(\"\\W\", escape1, text, flags=re.UNICODE)\n\n    return escaped.encode(\"utf-8\")\n\njson_encode = json.dumps\n\ndef deunicode(v):\n    if type(v) == unicode: return v.encode(\"utf-8\")\n    elif type(v) == list: return map(deunicode, v)\n    elif type(v) == dict: return dict([(deunicode(a), deunicode(b)) for a, b in v.items()])\n    else: return v\n\ndef json_decode(s):\n    return deunicode(json.loads(s))\n\ndef decode(text):\n    if isinstance(text, unicode):\n        return text\n\n    text = str(text)\n\n    for index, encoding in enumerate(DEFAULT_ENCODINGS):\n        try:\n            decoded = text.decode(encoding)\n        except UnicodeDecodeError:\n            continue\n        except LookupError:\n            del DEFAULT_ENCODINGS[index]\n        else:\n            # Replace characters in the 
surrogate pair range with U+FFFD since\n            # PostgreSQL's UTF-8 decoder won't accept them.\n            return re.sub(u\"[\\ud800-\\udfff]\", u\"\\ufffd\", decoded)\n\n    return text.decode(\"ascii\", errors=\"replace\")\n\ndef encode(text):\n    if isinstance(text, unicode):\n        return text.encode(\"utf-8\")\n    return str(text)\n"
  },
  {
    "path": "src/textutils_unittest.py",
    "content": "def independence():\n    # Simply check that textutils can be imported.\n\n    import textutils\n\n    print \"independence: ok\"\n"
  },
  {
    "path": "src/tutorials/administration.txt",
    "content": "System Administration\n=====================\n\nUpgrading Critic\n----------------\n\nTo install a newer version of Critic, simply check out a newer commit in the\n<code>critic.git</code> repository clone from which Critic was installed, and\nthen run the command\n\n| python upgrade.py\n\nas root.  If the repository clone from which Critic was installed has been\ndeleted, upgrading Critic from a newly cloned repository will usually work as\nwell.  The only case in which this does not work is if the commit that was\ninstalled (or last upgraded to) is not available in the newly cloned repository.\n\nUpgrading Actions\n-----------------\nThe actions taken by the <code>upgrade.py</code> script are roughly these, in\nthis order:\n\n1 Offer to create a backup copy (dump) of Critic's database,\n2 install additional prerequisite software that the new version depends on,\n3 ask questions for new configuration settings in the new version,\n4 copy updated source files (to\n    <code>%(configuration.paths.INSTALL_DIR)s/</code>),\n5 write updated configuration files (to\n    <code>%(configuration.paths.CONFIG_DIR)s/configuration/</code>)\n  and other system files (e.g. Apache site definition file, SysV init script,\n  <code>/usr/bin/criticctl</code>),\n6 restart Apache and Critic background services, and\n7 run migration scripts to update the database schema or perform other one-time\n  system updating tasks.\n\nEach of these steps could be empty, and thus skipped, depending on the changes\nbetween the currently installed version of Critic and the new version.\n\nLocally Modified Files\n----------------------\nIf the upgrade process wants to write an updated version of an installed file,\nit will first check if the currently installed file has been modified since it\nwas installed.  No local modifications are reverted without asking first.  
Note\nthough that modified configuration files can be updated without asking if this\ncan be done in a way that does not conflict with, and that preserves, the local\nmodifications.\n\nMigration Scripts\n-----------------\nSome changes to the system require custom modifications during an upgrade to\nadjust the installed system to match how the new version of Critic would have\nset the system up if it had been installed from scratch.  The most common such\nchanges are changes to the database schema, such as adding new tables or new\ncolumns to existing tables.\n\nThis type of custom modification is performed by running small one-time\nscripts called migration scripts.  When upgrading to a new version of Critic,\nall migration scripts that exist in the new version that didn't exist in the\npreviously installed version are executed as the final step of the upgrade.\n\nIf the upgrade process would run any migration script that modifies the database\nschema, the offer to create a backup copy of Critic's database at the beginning\nof the upgrade process has \"yes\" as its default choice instead of \"no\" as it\notherwise has.  It is strongly recommended that the offer is accepted, since the\nchanges made by such a migration script are generally not reversible in any\nother way than to restore an earlier backup copy of the database.\n\nChange Host Web Server\n----------------------\nCritic's installation/upgrading scripts support a few different host web server\nconfigurations:\n\n* Apache 2.x (with mod_wsgi)\n* nginx (HTTP front-end) + uWSGI (WSGI back-end)\n* uWSGI (as both HTTP front-end and WSGI back-end)\n\nThe Apache configuration used to be the only one supported, so an old Critic\nsystem probably runs that one.  The nginx+uWSGI configuration is the currently\npreferred (and default) one, but the installation script will ask which one to\nuse when a new system is installed.\n\nThe <code>upgrade.py</code> script can be used to switch to a different\nconfiguration, i.e. 
replicate how a new install would have configured the\ndesired new web server configuration.  Doing this requires just a little bit of\nmanual administration work:\n\n1 Stop the current host web server.  This is required so that it stops listening\n  at the HTTP/HTTPS ports, so that the new web server can do so instead.  You\n  can typically do that by running the command\n    <code>service &lt;name&gt; stop</code>\n  as root, where <i>name</i> is one of <code>apache2</code>, <code>nginx</code>\n  and <code>uwsgi</code>.\n2 Edit <code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code>,\n  changing the <code>WEB_SERVER_INTEGRATION</code> setting to the desired value.\n  See the inline documentation in the file for details.\n3 Run <code>python upgrade.py</code> as root.  Note that unless you're also\n  upgrading Critic to a new version, you may need to add the\n  <code>--force</code> command-line argument to force the upgrading script to\n  run despite there being no changes to install.\n\nThe upgrading script will, if necessary, install the new web server, and\nthen install configuration files for it.\n\n  Note that the script may warn that it failed to stop the <i>new</i> web server\n  before the actual upgrade.  This is expected, since the new web server may not\n  have been running, or even been installed, at that time.\n\nAfter this, Critic should be set up and running in the new web server.  You may\nneed to make sure the previous web server doesn't start up again the next time\nthe system boots.  The easiest way to do that is probably to uninstall it\ncompletely.\n\nConfiguring Critic\n------------------\n\nThe installation and upgrading procedure will produce a fully configured Critic\nsystem.  Not all details can be controlled, however, and sometimes it may be\nnecessary to change the installed configuration later on.  
To do this, one needs\nto edit the configuration files under\n  <code>%(configuration.paths.CONFIG_DIR)s/</code>\nand restart the host web server (to reload the web front-end) and Critic's\nbackground services.  The best way to restart Critic is using the\n  <code>criticctl</code>\nutility:\n\n| criticctl restart\n\nBefore restarting, it will run a basic configuration test to make sure the\nconfiguration isn't obviously invalid.  This will not catch all possible errors,\nbut helps to prevent putting the Critic system in an unusable state.\n\nIt is also possible to just run the configuration test:\n\n| criticctl configtest\n\nDo note that while it is necessary to manually restart to immediately make\nchanges effective, some background services spawn child processes to do the\nactual work, and they will pick up changes right away, without an explicit\nrestart.  Likewise, the host web server may spawn additional web front-end child\nprocesses, or stop and restart existing ones, thus picking up changes before\nbeing explicitly restarted.  Due to this fact, it's best to explicitly restart\nimmediately after making a change, so that any problems are detected and can be\naddressed.\n\nGit Repository Access\n---------------------\n\nCritic's Git repositories should normally be accessible to the users of the\nsystem, both to clone and examine locally and to push new commits to.  It is not\nstrictly required that they are -- Critic can be set up such that users only\naccess Git repositories on some other system from which Critic in turn fetches\n-- but it is the typical set-up.\n\nGit supports a variety of schemes (or protocols) for accessing repositories.  In\ngeneral, any access scheme will work with Critic as long as it doesn't involve\ndirectly manipulating the repository on the Critic server in a way that bypasses\nCritic's <code>pre-receive</code> hook.  
Some access schemes require the system\nadministrator to perform administration tasks beyond installing and managing\nCritic, however.\n\ngit://host/path.git\n-------------------\nThe Git scheme, provided by the <code>git daemon</code> command, is typically\nused only for read-only access to Git repositories.  It can support push as well\nbut it will be unauthenticated, which means there will be no way of knowing\nwhich user is responsible for the push.  Critic has no support for starting and\nrunning <code>git daemon</code>.\n\nThis simple command can be used to start a <code>git daemon</code> process that\nprovides read-only access to all of Critic's repositories on\n%(configuration.base.HOSTNAME)s:\n\n| git daemon --export-all --base-path=%(configuration.paths.GIT_DIR)s %(configuration.paths.GIT_DIR)s\n\nThe command needs to be executed as the Critic system user\n(<code>%(configuration.base.SYSTEM_USER_NAME)s</code>), for instance using\n<code>sudo</code>:\n\n| sudo -H -u critic git daemon ...\n\nFor more details about how to run <code>git daemon</code>, see its built-in\ndocumentation:\n\n| git daemon --help\n\nhttp://host/path.git\n--------------------\nThe HTTP (and/or HTTPS) protocol, provided by the <code>git http-backend</code>\ncommand, is supported by Critic and enabled by default.  If the Critic web\nfront-end allows anonymous users, then it also allows unauthenticated read-only\naccess.  Push is only supported from authenticated users.\n\nssh://host%(configuration.paths.GIT_DIR)s/path.git / host:%(configuration.paths.GIT_DIR)s/path.git\n---------------------------------------------------------------------\nAccess to repositories over SSH simply requires users to be able to log into the\nCritic system over SSH and be members of the\n  <code>%(configuration.base.SYSTEM_GROUP_NAME)s</code>\nsystem group; Git takes care of the rest.  
See also the note on\n  <a href=\"#system_users\">System Users</a>\nbelow.\n\n  Hint: To allow users to access Git repositories over SSH without giving them\n  \"shell access,\" set the system users' shell to <code>git-shell</code>.\n\nConfiguration\n-------------\nThere are two settings in Critic relating to repository access.\n\nFirst, there's the system configuration setting\n<code>REPOSITORY_URL_TYPES</code>, set in\n<code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code>, which is\na list of access schemes that are supported on the system.  If this list does\nnot contain the string <code>\"http\"</code>, the built-in support for this access\nscheme is disabled.  The other schemes -- <code>\"git\"</code>, <code>\"ssh\"</code>\nand <code>\"host\"</code> -- are not in themselves affected by the setting.\n\nSecond, there's the user preference setting CONFIG(repository.urlType) that\ndetermines what type of repository URLs the web front-end displays.  Each user\ncan select their preferred URL type among the set of supported schemes as\ndefined by the system configuration setting above.\n\nUsers\n-----\n\nCritic has a user database containing records of the users of the system.  The\nrecord for each user contains the (typically short) user name, the user's full\nname (which is what is normally displayed) and the user's email address, which\nis used when the system sends notifications to the user.\n\nThe user database is populated in different ways depending on how Critic was\nconfigured to identify and authenticate users.\n\nAuthentication mode: Critic\n---------------------------\nIf Critic was configured to authenticate users itself, then users can be added\nto the database manually by a system administrator.  
This can be done using the\n  <code>criticctl</code>\nutility (installed as\n  <code>/usr/bin/criticctl</code>\nby Critic's installation script):\n\n| criticctl adduser --name=USERNAME --email=EMAIL --fullname=FULLNAME\n\n  Note: The program will prompt for the user's password.  If any of the options\n  are omitted, it will also prompt for that information.\n\nIt is also possible to let new users create user records in Critic themselves,\nby changing the system setting\n  <code>ALLOW_USER_REGISTRATION</code>\nin\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code>.\nWhen enabled, a link is added on the sign-in page to a separate page where users\ncan input the necessary information to create a user record.\n\nIt is also possible to let users sign in, and optionally create user records, by\nauthenticating via an external system, such as GitHub or Google, using the OAuth\n2.0 protocol.  For more information about this mechanism, see the [tutorial on\nexternal authentication][extauth].\n\n[extauth]: /tutorial?item=external-authentication\n\nAuthentication mode: Host\n-------------------------\nIf Critic was configured to let the host web server authenticate users, it\ninstead assumes that the\n  <code>REMOTE_USER</code>\nWSGI environment variable is set to the name of the accessing user, and\nautomatically creates a user record in its database if one doesn't already\nexist.  In this scenario, there's typically no reason for the system\nadministrator to manually add users to Critic's database (but it's still\npossible to do so.)  
See\n  <a href=\"/tutorial?item=customization#automatic_user_creation\">the customization tutorial</a>\nfor details on this automatic user creation, and how it can be tweaked.\n\nAnonymous Access\n----------------\nIt is possible to let users access a Critic system without signing in, by\nsetting the system setting\n  <code>ALLOW_ANONYMOUS_USER</code>\nin\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code>\nto True.\n\nWhen Critic was configured to authenticate users itself (authentication mode\n\"critic\") and use HTTP cookies to handle sessions (instead of HTTP\nauthentication), allowing anonymous access simply means the system won't\nredirect to the \"Sign in\" page for most page accesses; instead there will be a\n\"Sign in\" link in the page header.\n\nWhen Critic was configured to let the host web server authenticate users, the\nuser is simply considered anonymous when the\n  <code>REMOTE_USER</code>\nWSGI environment variable is unset.  The host web server must be configured\nappropriately for this to be a meaningful configuration, and there are typically\nlimited use-cases.  One possible use-case is to make parts of the web UI\navailable anonymously by configuring the host web server to require a signed-in\nuser for some paths and not for others.\n\nSystem Users\n------------\nCritic's user database doesn't necessarily have to match the system's user\ndatabase.  
However, if access to the Git repositories is provided over SSH,\nthen, for the purpose of SSH authentication, there need to be system users.\nAnd in this case, for each system user who can push changes to the Git\nrepositories over SSH there must be a matching user in Critic's user database\nwith the same username.\n\nAlso note that system users that should be allowed to access the Git\nrepositories must be members of the\n  <code>%(configuration.base.SYSTEM_GROUP_NAME)s</code>\nsystem group.\n\nGit Authors and Committers\n--------------------------\nCritic's user database doesn't necessarily contain records for every user\noccurring as the author or committer of commits in the Git repositories, but\ncommits can be mapped to Critic users by matching the email addresses in the Git\ncommit meta-data.  Each Critic user can register any number of email addresses\nthat he or she uses or has used when creating Git commits.  By default, the\nuser's primary email address is used to map commits to the user, but the user is\nin no way forced to use this particular email address in Git commits.\n\nCurrently, the only effect of mapping a commit's user references to a Critic\nuser is that a Critic user is never automatically assigned to review commits\nthat he/she is the author of.\n\nDeleting Users\n--------------\nIt is also possible to \"delete\" a user using <code>criticctl</code>:\n\n| criticctl deluser --name=USERNAME\n\nThis doesn't actually delete the user record from the database, since that is\nlikely to be referenced from many places, depending on what the user has done in\nthe system.  
Instead, what deleting a user does is\n\n1 marks the user as \"retired\", which prevents the system from acting as if the\n  user is expected to be reviewing any changes in the future, and\n2 deletes the user's password so that he or she can no longer sign in.\n\n<strong>Important note:</strong> Deleting a user's password only prevents access\nto the system if Critic handles user authentication.  If the web server handles\nit, the user must primarily be disabled in whatever mechanism the web server\nuses to authenticate users.  Critic will <b>not</b> disallow access in this\ncase.  What's more, in this case, if a deleted/retired user signs in, the user's\n\"retired\" status is automatically reverted, thus completely undoing the effects\nof \"deleting\" the user in the first place.\n\nRoles\n-----\n\nUser roles are a very basic access rights scheme, limiting which Critic users\ncan do a small number of things, including adding new repositories, modifying\nother users' information, and adding news items.  Other than that, all users can\naccess all information in the system, modify all reviews and so on.\n\nThe available roles are:\n\n? administrator\n= A user with the <code>administrator</code> role can restart system background\n  services via the\n    <code><a href=\"/services\">%(configuration.URL)s/services</a></code>\n  page, edit other users' information via\n    <code style='white-space: nowrap'>%(configuration.URL)s/home?user=&lt;name&gt;</code>\n  and also enable/disable any tracked branch via\n    <code><a href=\"/repositories\">%(configuration.URL)s/repositories</a></code>.\n  More exceptions are likely to be added in the future.\n\n? repositories\n= A user with the <code>repositories</code> role can add new repositories to the\n  system via the\n    <code><a href=\"/newrepository\">%(configuration.URL)s/newrepository</a></code>\n  page.\n\n? 
newswriter\n= A user with the <code>newswriter</code> role can write news items that\n  appear on the\n    <code><a href=\"/news\">%(configuration.URL)s/news</a></code>\n  page.  The user can also edit existing items.  It is likely that news items\n  will also be added automatically when upgrading the system, to inform about\n  significant changes.\n\n? developer\n= The <code>developer</code> role doesn't really give access to restricted\n  functionality.  Instead it affects how unexpected errors -- such as uncaught\n  Python exceptions -- are handled.  Normally, unexpected errors are presented\n  to the user as just that, an \"unexpected error,\" with a note that a message\n  has been sent to the system administrator, and an email is sent to the system\n  administrator with as much detail as possible about the error.  But if the\n  user triggering the error has the <code>developer</code> role, no email is\n  sent to the system administrator (who is likely to be the same person in\n  practice) and instead the error details -- typically a Python stack-trace --\n  are displayed directly to the user.\n\nAssigning Roles\n---------------\nTo assign a role to a user, use the\n  <code>criticctl</code>\nutility:\n\n| criticctl addrole --name=USERNAME --role=ROLE\n\nand to unassign a role:\n\n| criticctl delrole --name=USERNAME --role=ROLE\n\n  Note: If the options are not provided, <code>criticctl</code> will instead\n  prompt for the missing information.\n\nBackground Services\n-------------------\n\nThe running Critic system is divided into two parts: the web front-end code,\nrunning as a WSGI application (in Apache, using mod_wsgi) and a number of\nbackground services running as daemon processes (started via a SysV init\nscript.)  
The background services are listed on the page\n  <code><a href=\"/services\">%(configuration.URL)s/services</a></code>\nwhere the system administrator (any user with the <code>administrator</code>\nrole) can restart them and view their logs.  The WSGI daemon processes can also\nbe restarted from that page.  By default there are two WSGI daemon processes;\nthis number can be adjusted by modifying the <code>WSGIDaemonProcess</code>\ndirective in the Apache site definition file.\n\nNormally, the system administrator does not need to do anything to any of the\nbackground services; they are restarted automatically when Critic is upgraded or\nif any of them crash.  If any errors or warnings are logged by a background\nservice, an email is sent to the system administrator email address.\n\nBackground services are configured in the configuration file\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/services.py</code>\nwhere the set of services to start is defined, as well as various per-service\nconfiguration settings.  If these settings are changed, the services need to be\nrestarted for the new settings to take effect.\n\nThe following background services are started automatically by Critic:\n\nbranchtracker\n-------------\nThe <code>branchtracker</code> service is responsible for fetching tracked\nbranches from remote Git repositories and pushing them to Critic's repositories.\nIf a Critic repository is created to mirror another Git repository, one such\ntracked branch is automatically set up, typically named <code>master</code>.\nAdditional tracked branches can be added via the\n  <code><a href=\"/repositories\">%(configuration.URL)s/repositories</a></code>\npage by any user with the <code>administrator</code> role.  
Reviews can also be\ncreated to track branches in other Git repositories using the\n  <code><a href=\"/createreview\">%(configuration.URL)s/createreview</a></code>\npage.\n\nchangeset\n---------\nThe <code>changeset</code> service is responsible for processing diffs and\nstoring cached information about them in Critic's database.  The service also\nperiodically scans the database for cached diffs that haven't been accessed for\nsome time, and deletes them.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| CHANGESET[\"max_workers\"] = 4\n| CHANGESET[\"rss_limit\"] = 1024 ** 3\n| CHANGESET[\"purge_at\"] = (2, 15)\n\nThe \"max_workers\" setting decides how many parallel jobs the service can run.\nThe \"rss_limit\" setting is a safe-guard against run-away memory usage.  The\n\"purge_at\" setting is a tuple (HOUR, MINUTE) that defines when during the day\n(in the server's local time) that the service should perform its maintenance\ntasks.\n\ngithook\n-------\nThe <code>githook</code> service performs all the checks and updates needed when\nCritic's repositories are updated by users using <code>git push</code>.  The\nactual <code>pre-receive</code> hook that is installed in Critic's repositories\nis a simple script that connects to the service over a UNIX socket and forwards\nall information about the update to the service.\n\nhighlight\n---------\nThe <code>highlight</code> service is responsible for generating syntax\nhighlighted copies of source file versions and storing them in a cache, as well\nas periodically compressing and eventually deleting old highlighted copies.\n\nAs part of syntax highlighting, \"code contexts\" are in some cases (depending on\nthe language of the source file) calculated and stored in Critic's database.  
A\n\"code context\" is a short string, for instance a function signature, mapped to a\nrange of lines, and used by the diff viewer to give context to the code\nfragments it displays.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| HIGHLIGHT[\"cache_dir\"] = os.path.join(configuration.paths.CACHE_DIR, \"highlight\")\n| HIGHLIGHT[\"min_context_length\"] = 5\n| HIGHLIGHT[\"max_context_length\"] = 256\n| HIGHLIGHT[\"max_workers\"] = 4\n| HIGHLIGHT[\"compact_at\"] = (3, 15)\n\nThe \"cache_dir\" setting defines where cached highlighted copies are stored on\ndisk.  The \"min_context_length\" and \"max_context_length\" settings limit the\nlength of code context strings.  The \"max_workers\" setting defines how many\nparallel jobs the service can run.  The \"compact_at\" setting is a tuple (HOUR,\nMINUTE) that defines when during the day (in the server's local time) that the\nservice should perform its maintenance tasks.\n\nmaildelivery\n------------\nThe <code>maildelivery</code> service is responsible for picking up queued mails\nproduced by other parts of Critic and delivering them to the configured SMTP\nserver.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| MAILDELIVERY[\"timeout\"] = 10\n\nThe \"timeout\" setting defines the (socket) timeout used when contacting the SMTP\nserver.  If a connection or mail delivery attempt times out, the service will\nwait a while and then try again.\n\nmaintenance\n-----------\nThe <code>maintenance</code> service performs miscellaneous maintenance tasks\nperiodically.  Currently it performs these tasks:\n\n* Keeping the UTC offsets in the <code>timezones</code> table in Critic's\n  database up-to-date.  
The offsets are fetched from PostgreSQL's\n  <code>pg_timezone_names</code> table, but accessing this table directly is\n  quite slow, so the offsets in it are read periodically and cached.\n* Performing scheduled [review branch archivals][archival].\n* Running <code>git gc</code> in Critic's Git repositories.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| MAINTENANCE[\"maintenance_at\"] = (4, 0)\n\nThe \"maintenance_at\" setting is a tuple (HOUR, MINUTE) that defines when during\nthe day (in the server's local time) that the service should perform its\nmaintenance tasks.\n\n[archival]: #review_branch_archival\n\nservicemanager\n--------------\nThe <code>servicemanager</code> service is a meta-service that starts all other\nservices, and restarts them if they crash.  If this service is restarted it\nrestarts all other services.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| SERVICEMANAGER[\"services\"] = [...]\n\nThe \"services\" setting is a list of services to start.\n\nwatchdog\n--------\nThe <code>watchdog</code> service is a simple monitoring service.\n\nBy default it only monitors the memory usage of the WSGI daemon processes and\nrestarts them if their memory usage exceeds the configured limits.  
There are\ntwo limits: a soft limit at which it tells the WSGI process to restart itself by\nsending it a <code>SIGINT</code> signal, and a hard limit at which it kills the\nWSGI process by sending it a <code>SIGKILL</code> signal.\n\nIf configured to do so, the service can also monitor overall system load and\nsend emails to the system administrator if the load exceeds configured limits.\n\nThe service has the following additional configuration settings in\n<code>services.py</code> (with the default settings):\n\n| WATCHDOG[\"rss_soft_limit\"] = 1024 ** 3\n| WATCHDOG[\"rss_hard_limit\"] = 2 * WATCHDOG[\"rss_soft_limit\"]\n|\n| # Not set by default:\n| WATCHDOG[\"load1_limit\"] = <1-minute load average limit>\n| WATCHDOG[\"load5_limit\"] = <5-minute load average limit>\n| WATCHDOG[\"load15_limit\"] = <15-minute load average limit>\n\nThe \"rss_soft_limit\" and \"rss_hard_limit\" settings define the memory limits, in\nbytes.  The load average limit settings define the limits at which to start\nwarning about high system load.  Any one of them, or all three, can be set,\neach to a different value.  Note: the configured value is multiplied by the\nnumber of CPUs in the system before being compared to the actual system load.\n\nExtensions\n----------\n\nCritic supports an extension mechanism that allows users of the system to extend\nCritic's functionality in various ways.  Extensions are written in ECMAScript\n(executed in a custom interpreter based on the V8 ECMAScript engine) and have\naccess to a high-level API exposing most information in Critic's database.\nExtensions can for instance be used to integrate Critic and other systems, such\nas build servers or issue tracking systems.\n\nThe extension mechanism is not enabled in a newly installed Critic system.  The\nsystem administrator needs to use the <code>extend.py</code> script to prepare\nand enable the extension mechanism.  
This script installs additional required\nsoftware, downloads (from GitHub) the source code used to build the ECMAScript\ninterpreter, builds and installs it, and finally modifies Critic's configuration\nto enable the extension mechanism.\n\nTo enable the extension mechanism, simply run this command as root:\n\n| python extend.py\n\nFor general information about the extension mechanism and how to create\nextensions, see the\n  <a href=\"/tutorial?item=extensions\">extensions tutorial</a>\nand the\n  <a href=\"/tutorial?item=extensions-api\">extensions API tutorial</a>.\n\nSecurity considerations\n-----------------------\nExtensions are scripts running as the Critic system user, and effectively have\naccess to all information in Critic's database and Git repositories, and to all\nfiles in the system's file systems that the Critic system user has access to.\nIn addition, they can start child processes, and thus do essentially anything\nthe Critic system user can do.  Untrusted users should obviously not be allowed\nto add extensions to a Critic system.\n\nTo add extensions to a Critic system, a user needs shell access to the system,\nor at least some way to create directories and files in their $HOME directory.\nTypically, an untrusted user should not be given shell access to the system\nanyway, so not allowing them to add extensions is somewhat of a secondary\nconcern.\n\n  Hint: As mentioned above, users can be given access to Git repositories over\n  SSH without giving them shell access, by setting the system users' shell to\n  <code>git-shell</code>.\n\nIf users must be given shell access to the system, but not be allowed to add\nextensions to it, it is possible to disable \"user extensions\" -- extensions\nadded by regular users -- by setting the configuration option\n<code>USER_EXTENSIONS_DIR</code> in\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/extensions.py</code>\nto the value <code>None</code>.  
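That is, with this single assignment in that file:\n\n| USER_EXTENSIONS_DIR = None\n\n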
With that configuration, only users with write\naccess to the directory\n  <code>%(configuration.paths.DATA_DIR)s/extensions/</code>\ncan add extensions to the system.\n\nReview branch archival\n----------------------\n\nFor a description of this mechanism, see the [tutorial on the\nsubject][archival-tutorial].\n\nA system administrator can tweak the mechanism by changing the system-wide or\nper-repository defaults for the settings\nCONFIG(review.branchArchiveDelay.closed) and\nCONFIG(review.branchArchiveDelay.dropped).  Individual users can override the\ndefaults, but only to achieve a longer delay, or disable archival entirely, for\nreviews they own.\n\nIf review branch archival is not desired at all, it can also be disabled\nsystem-wide by the system administrator by setting\n<code>ARCHIVE_REVIEW_BRANCHES</code> in\n<code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code> to\n<code>False</code>.\n\n[archival-tutorial]: /tutorial?item=archival\n"
  },
  {
    "path": "src/tutorials/archival.txt",
"content": "Review branch archival\n======================\n\nBackground\n----------\n\nOver time, Critic's repositories may accumulate many effectively obsolete review\nbranch refs belonging to long-since finished and closed, or dropped, reviews.\nThis slows down various Git operations performed on the repository, and also\npollutes your local repositories if you ever fetch from Critic's repository.\n\nTo remedy this problem and keep Critic's repositories clean and tidy, old review\nbranch refs are archived.  This means that the ref is simply deleted from the\nrepository, while making sure that all commits are kept alive in the repository.\n\nDetails\n-------\n\nOnly review branches -- those with an <code>r/</code> prefix -- are archived;\nno other branches are.  Review branches are furthermore only archived when they\nare <em>obsolete</em>.  A review branch is considered obsolete when the review\nis finished and has been closed, or when the review has been dropped.\n\nWhen a review branch becomes obsolete, it's scheduled for archival some time\ninto the future.  By default, if the review is finished, its branch is archived\nno sooner than 7 whole days after the review was closed.  If the review is\ndropped, its branch is archived no sooner than one whole day after the review\nwas dropped.\n\nThe actual archival is done by a nightly maintenance task.\n\nConfiguration\n-------------\n\nThe delays -- <em>7 days</em> and <em>one day</em> -- can be changed via the\nconfiguration settings CONFIG(review.branchArchiveDelay.closed) and\nCONFIG(review.branchArchiveDelay.dropped).  
The settings can be set per\nrepository, meaning a user can configure different delays for different\nrepositories, and also that the system default can be different for different\nrepositories.\n\nWhen the appropriate delay is calculated for a review, the (possibly\nper-repository) delay settings for each owner of the review, as well as the\n(possibly per-repository) system default, are considered.  If any of those\nsettings is zero, the review branch is not scheduled for archival at all.\nOtherwise, the <em>highest</em> setting is used, thus resulting in the longest\ndelay requested by any owner, or the system administrator.\n\nResurrection\n------------\n\nWhen a review branch has been archived it is in no way lost forever.  It can\nalways be <em>resurrected</em>, simply by clicking the <code>Resurrect\nBranch</code> button on the review's front-page.  When a review branch is\nmanually resurrected this way, another archival is always rescheduled with the\nusual delay.\n\nThe review branch is also automatically resurrected if the review is reopened.\nIn that case, no new archival is rescheduled until the review is closed or\ndropped again.\n\nLocal repositories\n------------------\n\nIf you have a local clone of one of Critic's repositories, or have added\nCritic's repository as a remote and fetched from it, you will have local copies\n-- called <em>remote tracking branches</em> and named\n<code>refs/remotes/origin/*</code> or similar -- of all branches that existed in\nCritic's repository when you cloned or fetched from it.  
Git will by default not\nremove these local copies when the corresponding branch ref is deleted in\nCritic's repository.\n\nTo have Git clean up your local repository, you can run the command\n\n| git remote prune $name\n\nwhere <code>$name</code> is the name of your Critic remote, typically\n<code>origin</code> if you cloned Critic's repository directly.\n\nAlternatively, you can make <code>git fetch</code> do this pruning each time you\nfetch from the remote by running the command\n\n| git config remote.$name.prune true\n\nor each time you fetch from any remote using the command\n\n| git config fetch.prune true\n\nWarning\n-------\nIn Git versions prior to 2.0.1, this pruning can take a long time, if there are\nmany refs to prune, and especially if there are many other refs in the\nrepository as well.  The <code>git remote prune</code> command can be stopped\nand restarted any number of times, and can thus be used to incrementally prune\nrefs in chunks, if desired.\n"
  },
  {
    "path": "src/tutorials/checkbranch.txt",
    "content": "Branch Review Status Analyzer\n=============================\n\nCommit Status\n-------------\n\nCritic's automatic branch review status analyzer is designed to answer the\nquestion \"If branch A was merged into branch B, would any unreviewed changes be\nmerged?\"  In the typical use-case, \"A\" would be a task branch, and \"B\" would be\nmaster.\n\nTo analyze a branch, visit the\n  <a href=checkbranch>/checkbranch</a>\npage, enter the name of the branch to check, select the repository in which it\nlives, optionally request that the branch be fetched from the repository's\nupstream repository and then press the \"Check\" button.\n\nThe list of commits that would be merged is calculated using\n  <code>git rev-list</code>;\nthe exact command used is included on the result page.\n\nThe review status of the branch is calculated per commit, and is signalled using\nthe colors green, yellow and red.\n\nGreen\n-----\nA green commit status means that the commit is verified by Critic to have been\nreviewed in an accepted review in Critic.  To be assigned this status, one of\nthese two statements must be true about the commit:\n\n1 The commit is the same as the head commit on the review branch (same SHA-1)\n2 The commit is included on the review branch, and is the parent of another\n  green commit from the same review.\n\nThe meaning of the second point is simply that if the head commit of an accepted\nreview is among the checked commits, then all other commits from that review are\nalso green.  It also means that if the head commit of an accepted review is not\namong the checked commits, then no other commits from the review will be green,\neven if they are exactly the same commits (same SHA-1); this is because the\nchanges that were accepted in the review were exactly all the changes, not just\nsome of them.  
Trying to merge only some of the commits from the review can't be\nverified by Critic as a safe operation, since it might mean that you are\naccidentally trying to merge an incomplete set of changes, for instance missing\nfixes for all review issues.\n\nYellow\n------\nA yellow commit status means that the commit has been manually connected to a\nreview, or had its review status otherwise explained, by a user.  This might\nmean that the commit is a squash of the changes in a review, but that a strong\nSHA-1-based connection can't be made by Critic because the review hasn't been\nrebased, or can't be rebased.  It could also mean that the changes in the commit\nwere reviewed outside of Critic.\n\nRed\n---\nA red commit status either means that Critic hasn't been able to connect the\ncommit to a review at all, or that the review to which the commit has been\nconnected isn't accepted or finished.\n\nThat the commit hasn't been connected to a review does not conclusively mean the\ncommit hasn't been reviewed, it just means Critic can't say anything for sure\nabout it.  Since Critic requires the same SHA-1 to automatically match a review\nto a commit, simply cherry-picking a commit from a review branch to the checked\nbranch will always make it show up as red, even if the cherry-pick was\nconflict-free and the commit message is identical.\n\n\nUsage\n-----\n\nThe only way in which one needs to interact with the branch review status page\nis by manually connecting commits to reviews when Critic can't make the\nconnection automatically, or otherwise explain how a commit was reviewed.  To do\nthis, every non-green commit has an <code>[edit]</code> link in the right-most\ncolumn.  Red commits committed by the user looking at the\n  <code>/checkbranch</code>\npage are highlighted for visibility.\n\nThe <code>[edit]</code> link opens up a dialog.  
This dialog has an input field\nfor entering the ID number of a review, and a text area for writing a comment.\nIf Critic finds any reviews that seem like likely candidates to have contained\nthe changes in the commit, the dialog also contains a drop-down containing those\ncandidate reviews.  Selecting an item in the drop-down simply sets the review ID\ninput field's value.\n\nEntering a comment is optional if a review ID is specified.  If no review ID is\nspecified, for instance because the changes were reviewed outside of Critic, a\ncomment is required.  The comment should just be a very short explanation for\nhow the changes were reviewed.\n\nReview Rebase\n-------------\nIf the ID number of a review is entered, Critic will automatically check if it\nwould be possible to rebase the review to contain this single commit.  Doing so\nwould make the commit appear green and is always preferable to having it appear\nyellow, since green means Critic has verified that the commit contains exactly\nthe same changes as the ones that were approved in the review, and thus\neliminates the possibility of the wrong changes having been pushed to the branch\nby mistake.\n\n<b>Note:</b> It's important to only accept the offer to rebase the review if the\ncommit is a squash of all changes in the review.  If the commit is only part of\nthe review, for instance a cherry-pick of a single bug fix made on a larger task\nbranch, rebasing the review would be a very bad idea.  But since the green\ncommit status is a much stronger guarantee that the right changes made it onto\nthe branch, it's strongly recommended that the review is rebased whenever\npossible, so that the commit appears green.\n"
  },
  {
    "path": "src/tutorials/customization.txt",
    "content": "System Customization\n====================\n\nCritic supports some low-level, and typically very simple, customization of\nvarious aspects of its behavior.  This customization would be done by the Critic\nsystem administrator and is entirely optional; if a Critic system is not\ncustomized at all, the default behavior is typically entirely reasonable.\n\nThe customization mechanism consists of conditional imports of functions from\nmodules in a package named \"customization\", which by default does not exist.  If\nthe system administrator creates such a package in a directory in Critic's\n<code>PYTHONPATH</code>, Critic will automatically start using whatever\ncustomization behaviours it implements.\n\nIn this Critic install, two Critic-specific directories are included in its\n<code>PYTHONPATH</code>: <code>%(configuration.paths.CONFIG_DIR)s/</code> and\n<code>%(configuration.paths.INSTALL_DIR)s/</code>.  To initially set up\ncustomization, create a sub-directory named \"customization\" in one of these\ndirectories, containing an empty file named <code>__init__.py</code>.\nIndividual customizations (described below) are then done by creating additional\n<code>.py</code> files in this directory.\n\nIn all cases, the actual implementation of functions and classes below are only\nexamples meant to indicate how the interfaces could be implemented.  
In other\nwords, only the function names and argument lists in the examples are in any way\n\"normative.\"\n\nAutomatic User Creation\n-----------------------\n\nThis customization affects how users are automatically created when Critic is\nconfigured to let the web server do authentication, and the authenticated user\ndoes not exist in Critic's user database (in other words, the first time a user\nlogs in.)\n\ncustomization/email.py\n----------------------\nThis module should define this function:\n\n| def getUserEmailAddress(username):\n|   return username + \"@example.com\"\n\nThe function <code>customization.email.getUserEmailAddress()</code>, if defined,\nis called to calculate a user's email address from the username.  If the\nfunction returns None, or is not defined, the user is created with no email\naddress.  Either way, the user can change the email address via the /home page\nwhen logged in.\n\nHyperlinks in Text\n------------------\n\nCritic's web front-end does automatic pattern-based \"linkification\" of text\n(such as review descriptions, commit messages and user comments.)  
By default it\nhandles plain http:// and https:// URLs, &lt;URL:...&gt;, SHA-1 commit\nreferences (including BEGIN..END ranges), and r/NNNN review references.\n\nThis mechanism can be extended with other patterns, for instance linking issue\nreferences to an issue tracking system.\n\ncustomization/linktypes.py\n--------------------------\nThis module should import the module <code>linkify</code> (which is part of\nCritic) and then define sub-classes of the class <code>linkify.LinkType</code>,\nand create one or more instances of each such sub-class\n(<code>linkify.LinkType</code>'s <code>__init__()</code> registers the link type\nin a global structure.)\n\nEach sub-class should call <code>linkify.LinkType</code>'s\n<code>__init__()</code> with a single argument that is a string containing a\nregular expression (with no capture groups) that matches the sub-string that\nshould be made into a link.\n\nThe sub-class should also implement the method <code>linkify</code> which will\nbe called with two arguments, a string containing the sub-string that should be\nmade into a link and a context object.  This method should either return a\nstring containing a URL, or None.  If it returns a string, the word is made into\na link to that URL.\n\n| import linkify\n|\n| class IssueLink(linkify.LinkType):\n|   def __init__(self):\n|     super(IssueLink, self).__init__(\"#[0-9]+\")\n|   def linkify(self, word, context):\n|     return \"https://issuetracker.example.com/showIssue?id=\" + word[1:]\n|\n| IssueLink()\n\nThe context argument can safely be ignored.  Notable simple and potentially\nuseful information in it is <code>context.review.id</code>, which is the integer\nid of the review, if the page loaded is connected to a review, and\n<code>context.repository.name</code>, which is the string name of the repository\ncontaining the review or commit.  
Both of these sub-objects\n(<code>context.review</code> and <code>context.repository</code>) can be\n<code>None</code> if link conversion happens in a context where there is no\nassociated review or repository.\n\nBranch Name Patterns\n--------------------\n\nCritic can handle reviews that are configured to track branches in other\nrepositories (not on Critic.)  In such a scenario, a review can be rebased by\nswitching it over to track a new (rebased) branch in the other repository.  If\nbranch names in the other repository follow a certain naming pattern, like\n<code>&lt;basename&gt;/&lt;serial-number&gt;</code>, Critic can be customized to\nfind and suggest appropriate branch names based on that naming pattern when\nrebasing such reviews.\n\ncustomization/branches.py\n-------------------------\nThis module should define one or both of these functions:\n\n| import os\n|\n| def getRebasedBranchPattern(branch_name):\n|   return os.path.dirname(branch_name) + \"/*\"\n|\n| def isRebasedBranchCandidate(current_name, other_name):\n|   return os.path.dirname(current_name) == os.path.dirname(other_name)\n\nWhen searching for plausible new branches for a review to track in the other\nrepository, the other repository is scanned using <code>git ls-remote</code>.\n\nThe <code>getRebasedBranchPattern()</code> function, if defined, is used to\ncalculate a pattern argument to <code>git ls-remote</code>.  If it is not\ndefined, or if it returns None, <code>git ls-remote</code> is called to list all\nbranches in the other repository.  The <code>isRebasedBranchCandidate()</code>\nfunction, if defined, is used to filter the set of branches returned by\n<code>git ls-remote</code> down further.\n\nIf neither function is defined, no scanning of the other repository is done, and\nno branches are suggested to the user.  
This doesn't leave the user entirely\nwithout guidance, however; the input field on the page where the user is asked\nto input the branch name will auto-complete input based on branches that exist\nin the other repository (also using <code>git ls-remote</code>.)\n\nCustom pre-receive Checks\n-------------------------\n\nIt is entirely possible to install custom Git hooks in Critic's repositories,\nbut doing so may conflict with the hooks that Critic installs and depends on for\nits interactions with the repositories.  Currently, Critic only installs a\npre-receive hook, but other hooks may be installed in the future.\n\nIf custom checks should be applied to ref updates, before Critic handles them,\nthe preferred way is using Critic's customization mechanism.\n\ncustomization/githook.py\n------------------------\nThis module should define an exception class, Reject, and a function, update():\n\n| class Reject(Exception):\n|   pass\n|\n| def update(repository_path, ref_name, old_value, new_value):\n|   if ref_name.startswith(\"refs/heads/reserved/\"):\n|     raise Reject(\"Don't push reserved refs!\")\n\nThe function is called with the arguments 'repository_path', which is the\n(absolute) file-system path of the Git repository (any one of Critic's\nrepositories on the system); 'ref_name', which is the full name of the reference\nbeing updated; 'old_value', which is the current SHA-1 of the reference, or\nNone if the reference is being created; and 'new_value', which is the new\nSHA-1 of the reference, or None if the reference is being deleted.\n\nIf a Reject exception (or sub-class thereof) is raised by update(), the update\nis rejected with an error message constructed by applying str() to the exception\nobject.  
For a sub-class of Exception created with a single argument, as in the\nexample above, this means that argument is converted to a string and returned.\n\nIf an exception of any other type is raised, it is ignored, and the update is\nprocessed by Critic (and either rejected or allowed, as usual.)\n\nIf the update() function writes to stdout and the update is not rejected by\neither the update() function itself or Critic's normal handling of the update,\nthe text written is sent to the Git client updating the reference, and included\nin its output, with each line prefixed by the string \"remote: \".\n"
  },
  {
    "path": "src/tutorials/extensions-api.txt",
    "content": "Critic Extensions API\n=====================\n\nBasic Usage\n-----------\n\nThis API is available to extension scripts as the module \"critic\" on which\nconstructors and interface objects are exposed as properties.  The constructor\nfor the User interface is accessible to extension scripts as \"critic.User\", for\ninstance.\n\nBasic Data Types\n----------------\n\nCriticError\n-----------\n| exception CriticError : Error {\n| };\n\nThrown by other functions when called with incorrect arguments or similar.\nNative ECMAScript exceptions such as TypeError or ReferenceError may also be\nthrown.\n\nUser\n----\n| interface Filter {\n|   long       id;\n|   User       user;\n|   Repository repository;\n|   string     path;\n|   string     type;\n|   User[]     delegates;\n| };\n|\n| interface User {\n|   dictionary UserData {\n|     long   id;\n|     string name;\n|     string email;\n|   };\n|\n|   constructor User(UserData data);\n|   constructor User(string name);\n|   constructor User(long id);\n|\n|   readonly attribute long   id;\n|   readonly attribute string name;\n|   readonly attribute string email;\n|   readonly attribute string fullname;\n|\n|   static readonly attribute User current;\n|\n|   boolean|long|string getPreference(string name);\n|\n|   Filter[] getFilters(Repository repository);\n|   Filter addFilter(Repository repository, string path, string type, string? delegates);\n| };\n\nSimple object representing a Critic user.  The getPreference method can be used\nto access the user's preferences; it returns a value of type boolean, long or\nstring depending on the preference.  The getFilters method can be used to list\nthe user's global filters for a given repository and the addFilter method can be\nused to add a new global filter for a user.  
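As a hypothetical sketch (the repository name \"critic\" and the path \"src/\" are\nonly examples):\n\n| var repository = new critic.Repository(\"critic\");\n| var filters = critic.User.current.getFilters(repository);\n| critic.User.current.addFilter(repository, \"src/\", \"reviewer\", null);\n\n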
Note: adding a new filter does not\ncause changes in existing reviews to be assigned to the user.\n\nAn extension can fetch user records from the database simply by constructing\nUser objects, supplying either the user name or a user ID number; \"new\ncritic.User('jl')\" or \"new critic.User(18)\".\n\nThe current user (the user who installed the extension and is currently loading\na page, pushing changes or whatever triggered the extension) is always\naccessible as \"critic.User.current\".\n\nFile\n----\n| interface File {\n|   readonly attribute long      id;\n|   readonly attribute string    path;\n|\n|   string toString();\n|   long valueOf();\n|\n|   static File find(string path);\n|   static File find(long id);\n| };\n\nObject representing a file.  File objects exist in Critic primarily as a means\nto assign system-wide ID numbers to paths, and are used throughout the Critic\nextensions API for representing paths.  The toString() method returns the value\nof the 'path' attribute, and the valueOf() method returns the value of the 'id'\nattribute.\n\nThe 'path' attribute of a File object has no leading forward slash (and of\ncourse no trailing one either.)\n\nThe find() methods will return the same object if called multiple times to find\nthe same file (whether via path or ID.)\n\nGit Repository Access\n---------------------\n\nGitObject\n---------\n| interface GitObject {\n|   readonly attribute string     sha1;\n|   readonly attribute string     type;\n|   readonly attribute Uint8Array data;\n| };\n\nRaw representation of an object fetched from the Git repository.\n\nGitUserTime\n-----------\n| interface GitUserTime {\n|   readonly attribute string fullname;\n|   readonly attribute string email;\n|   readonly attribute Date   time;\n|   readonly attribute User?  
user;\n| };\n\nRepresentation of the information in the \"author\" and \"committer\" fields in a\nGit commit object.\n\nRepository\n----------\n| interface Repository {\n|   constructor Repository(string name);\n|   constructor Repository(long id);\n|\n|   readonly attribute long   id;\n|   readonly attribute string name;\n|   readonly attribute string path;\n|\n|   interface Filter {\n|     readonly attribute User       user;\n|     readonly attribute string     path;\n|     readonly attribute string     type;\n|     readonly attribute User[]     delegates;\n|   };\n|\n|   readonly attribute Filter[] filters;\n|\n|   GitObject fetch(string sha1);\n|   string    revparse(string ref);\n|\n|   Commit    getCommit(string ref);\n|   Commit    getCommit(long id);\n|\n|   Branch    getBranch(string name);\n|\n|   Changeset getChangeset(string ref);\n|   Changeset getChangeset(Commit commit);\n|   Changeset getChangeset(string a, string b);\n|   Changeset getChangeset(Commit a, Commit b);\n|   Changeset getChangeset(long id);\n|\n|   RepositoryWorkCopy\n|             getWorkCopy();\n| };\n\nRepresentation of one of Critic's repositories.  Can be constructed supplying\neither the repository's \"short name\" or its ID.\n\nThe 'filters' attribute returns an array of all user filters registered for the\nrepository.  The array object also has the properties 'users' and 'paths', which\nreturn dictionaries mapping user names and paths to filtered arrays of filters,\nrespectively.  The full array and all the filtered arrays combined contain the\nsame set of filters.  The 'path' attribute on filter objects defines what subset\nof the tree is being filtered.  The value of the 'type' attribute on filter\nobjects is one of the strings \"reviewer\" and \"watcher\".\n\nThe fetch() method can be used to read low-level objects from the Git\nrepository.  Using this function is usually not necessary.  
Its argument must be\na full 40 character SHA-1 sum.\n\nThe revparse() method can be used to interpret a Git ref name and convert it to\na full 40 character SHA-1 sum.  This function runs the command \"git rev-parse\n--verify --quiet <ref>\" and will thus fail for any argument that isn't \"usable\nas a single, valid object name\" according to Git.\n\nThe getCommit() method can be used to fetch Commit objects, either specifying a\nstring (which is interpreted using revparse()) or a Critic commit ID.\n\nThe getBranch() method can be used to fetch Branch objects, either specifying a\nstring (which must be the exact ref name, minus the \"refs/heads/\" prefix) or a\nCritic branch ID.\n\nThe getChangeset() method can be used to fetch an object representing the diff\nbetween two commits.  The arguments can be one commit, two commits or a Critic\nchangeset ID.  Commit arguments can be specified either as strings (which are\ninterpreted using revparse()) or as Commit objects.\n\nThe getWorkCopy() method can be used to create or access a work-copy clone of\nthis repository.\n\nRepositoryWorkCopy\n------------------\n| interface RepositoryWorkCopy {\n|   readonly attribute Repository repository;\n|   readonly attribute string     path;\n|\n|   string run(string command, ..., Object? environment);\n| };\n\nRepresentation of a work-copy clone of a repository.  Work-copies are\nsemi-persistent and can be reused by different invocations of an extension, but\nare never shared between users or between extensions.  An extension should not\ndepend on changes made in a work-copy remaining for more than 24 hours, and if\npossible, shouldn't depend on them to remain between invocations of the\nextension at all.\n\nThe \"path\" attribute is the absolute file system path of the repository, without\na trailing slash.\n\nThe run() method can be used to execute arbitrary git commands in the\nrepository.  
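For instance, a sketch (assuming \"repository\" is a Repository object):\n\n| var workcopy = repository.getWorkCopy();\n| var head = workcopy.run(\"rev-parse\", \"HEAD\");\n\n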
The first argument should be the name of the git command, such as\n\"status\" or \"commit\".  Additional arguments are passed as options to the git\ncommand.  If the last argument to the function is an object, it is not passed as\na regular argument to the git command; instead, its enumerable properties are\nset as environment variables for the git command.  Input can be passed to the\ngit command's stdin by setting the special environment variable \"stdin\" (which\nis not actually set as an environment variable.)\n\nWhen a work-copy clone is reused, it's always cleaned up by running \"git clean\n-x -d -f -f\" and \"git reset --hard HEAD\" in it.\n\nCommit\n------\n| interface Commit {\n|   readonly attribute Repository  repository;\n|   readonly attribute long        id;\n|   readonly attribute string      sha1;\n|   readonly attribute string      tree;\n|   readonly attribute GitUserTime author;\n|   readonly attribute GitUserTime committer;\n|   readonly attribute string      message;\n|   readonly attribute string      summary;\n|   readonly attribute Commit[]    parents;\n|\n|   CommitFileVersion getFile(string path);\n| };\n\nRepresentation of a Git commit.\n\nThe \"tree\" attribute contains the SHA-1 sum of the tree object that the commit\nreferences.\n\nThe \"summary\" attribute contains either the first line of the commit message,\nor, if that line starts with either \"fixup!\" or \"squash!\" and is followed by at\nleast one non-empty line, the first non-empty line following the first line.\n\nThe getFile() method can be used to access an arbitrary file as it appears in\nthe commit.  The file need not have been modified in the commit.\n\nFileVersion\n-----------\n| interface FileVersion : File {\n|   readonly Repository repository;\n|   readonly string     mode;\n|   readonly long       size;\n|   readonly string     sha1;\n|\n|   readonly Uint8Array bytes;\n|   readonly string[]?  
lines;\n| };\n|\n| interface CommitFileVersion : FileVersion {\n|   readonly attribute Commit commit;\n| };\n\nRepresentation of a version of a certain file in a repository.  The \"mode\"\nattribute is a string containing 6 digits, usually \"100644\".  The last three\ndigits represent the access bits (R, W and X) for user, group and others,\nrespectively.  The \"size\" attribute is the size of the file in bytes.  The\n\"sha1\" attribute is the SHA-1 sum of the blob object in the repository.\n\nThe \"bytes\" attribute contains the file version's contents as a Uint8Array\nobject.  The \"lines\" attribute contains the file version's contents as an array\nof lines, or null if the file is heuristically determined to contain binary data\n(the heuristics are the same as git uses.)  The raw contents of the file are\ninterpreted as UTF-8, or, if that does not work, as ISO-8859-1.\n\nCommitSet\n---------\n| interface CommitSet : Array {\n|   constructor CommitSet(Commit[] commits);\n|\n|   readonly attribute long     length;\n|   readonly attribute Object   parents;\n|   readonly attribute Object   children;\n|   readonly attribute Commit[] heads;\n|   readonly attribute Commit[] tails;\n|   readonly attribute Commit[] upstreams;\n| };\n\nRepresentation of a set of commits.  The object is first of all an array of\nCommit objects, in depth-first first-parent-first order, starting from the head\ncommits of the set in unspecified order.  (In sets with multiple heads, parents\ncan occur before their children, so in such sets, the order should not be relied\nupon to be strictly topological.)  
For each commit in the set, the object also\nhas a non-enumerable property whose name is the SHA-1 sum of the commit and\nwhose value is the Commit object.\n\nThe parents and children attributes are objects with one property per commit in\nthe set whose name is the SHA-1 sum of the commit and whose value is an array of\nCommit objects, containing the parents and children of the commit in the set,\nrespectively.  (In other words, if a commit has multiple parents, some included\nin the set and some not included in the set, this array contains only those\nparents that are in the set.  The commit object's parents attribute returns an\narray containing the commit's full set of parents.)\n\nThe heads attribute is an array of Commit objects that contains all commits in\nthe set that have no descendants in the set.  The array also has a\nnon-enumerable property per such commit whose name is the SHA-1 sum of the\ncommit and whose value is the Commit object.\n\nThe tails attribute is an array of Commit objects that contains all commits that\nare parents of commits included in the commit-set but are not included in the\ncommit-set themselves.  The array also has a non-enumerable property per such\ncommit whose name is the SHA-1 sum of the commit and whose value is the Commit\nobject.\n\nThe upstreams attribute is an array of Commit objects that contains all commits\nin the tails array that aren't ancestors of another commit in the tails array.\nIf the commit-set contains the commits on a branch, and the branch has been\nmerged with its upstream branch (e.g. 'master') a number of times, the tails\narray would contain each commit from the upstream branch that was merged in,\nwhereas the upstreams array would only contain one -- the commit that was merged\nin most recently.\n\nNote: A commit-set representing a simple branch of commits without any merges\nwill have a heads array containing a single commit and a tails array containing\na single commit.  
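The heads and tails semantics can be illustrated with a small self-contained sketch in plain ECMAScript.  This is not the actual Critic implementation; the headsAndTails() helper and the commit names are hypothetical, and commits are modelled as plain objects with a sha1 and a list of parent sha1s:

```javascript
// Sketch of CommitSet's heads/tails semantics: a head is a commit with no
// descendants in the set; a tail is a parent of a set commit that is not
// itself in the set.
function headsAndTails(commits) {
  var inSet = {};
  commits.forEach(function (commit) { inSet[commit.sha1] = true; });
  var hasChildInSet = {};
  var tails = {};
  commits.forEach(function (commit) {
    commit.parents.forEach(function (parent) {
      if (inSet[parent])
        hasChildInSet[parent] = true;   // parent has a descendant in the set
      else
        tails[parent] = true;           // parent outside the set => a tail
    });
  });
  var heads = commits.filter(function (commit) {
    return !hasChildInSet[commit.sha1];
  }).map(function (commit) { return commit.sha1; });
  return { heads: heads, tails: Object.keys(tails) };
}

// A simple three-commit branch: C -> B -> A, branched off "base" (not in set).
var result = headsAndTails([
  { sha1: "C", parents: ["B"] },
  { sha1: "B", parents: ["A"] },
  { sha1: "A", parents: ["base"] }
]);
// result.heads is ["C"]; result.tails is ["base"]
```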
Multiple tails indicate that the commit-set includes merges\nthat merge in commits that are not included in the commit-set.  Multiple heads\ntypically indicate that the commit-set consists of multiple disconnected\nsub-trees of commits, such as the commits that make up a review that has been\nrebased one or more times.\n\nBranch\n------\n| dictionary BranchData {\n|   long id;\n|\n|   Repository repository;\n|   string     name;\n| };\n|\n| interface Branch {\n|   constructor Branch(BranchData data);\n|\n|   readonly attribute Repository repository;\n|   readonly attribute long       id;\n|   readonly attribute string     name;\n|   readonly attribute Commit     head;\n|   readonly attribute Branch?    base;\n|   readonly attribute Review?    review;\n|   readonly attribute CommitSet  commits;\n|\n|   RepositoryWorkCopy getWorkCopy();\n| };\n\nRepresentation of Critic's view of a branch.  This is somewhat different from\nGit's normal view of a branch in that a branch is considered to be based on\n(branched from) another branch, and doesn't contain commits it has in common\nwith its base branch.  For each repository, however, there is a root branch\n(typically 'master') that contains the same set of commits in Critic's view of\nit as in Git's normal view.  This root branch has no base branch.\n\nThe review attribute returns a Review object representing the review associated\nwith this branch.  If this branch is a review branch, the associated review is\nthe review whose review branch this is.  Otherwise, the associated review is the\nlatest review created from this branch, if any.  If there is no associated\nreview, the attribute is null.\n\nThe commits attribute returns a CommitSet object containing all commits that\nCritic considers part of the branch.  This set's heads array contains a single\ncommit which is the same as the commit returned from the Branch object's head\nattribute.  
The set's tails array may contain multiple commits, if the branch\ncontains merges.\n\nThe getWorkCopy() method can be used to create a work-copy of the repository\ncontaining this branch.  The work-copy's \"origin\" remote will be the repository\ncontaining this branch.  The work-copy's current branch will have the same name\nas this branch, and will be set up to track this branch in \"origin\".\n\nTrackedBranch\n-------------\n| dictionary FindData {\n|   Branch branch;\n|\n|   string remote;\n|   string name;\n| };\n|\n| interface TrackedBranch {\n|   readonly attribute Branch branch;\n|   readonly attribute Review? review;\n|   readonly attribute string remote;\n|   readonly attribute string name;\n|\n|   readonly attribute boolean disabled;\n|   readonly attribute boolean pending;\n|   readonly attribute boolean updating;\n|\n|   static TrackedBranch find(FindData data);\n| };\n\nRepresentation of a tracked branch.  The branch attribute represents the local\nbranch.  The review attribute is the review connected with the local branch, if\nthere is one, and null otherwise.  The remote and name attributes represent the\nremote repository and branch name.\n\nThe disabled attribute indicates whether the tracking is disabled.  The pending\nattribute indicates whether the branch will be updated ASAP.  The updating\nattribute indicates whether the branch is being updated right now.\n\nChangesets\n----------\n\nChangeset\n---------\n| interface Changeset {\n|   readonly attribute Repository      repository;\n|   readonly attribute Review?         review;\n|   readonly attribute long            id;\n|   readonly attribute Commit          parent;\n|   readonly attribute Commit          child;\n|   readonly attribute ChangesetFile[] files;\n|   readonly attribute Commit[]        commits;\n| };\n\nRepresents the collection of changes to files between two commits; parent and\nchild.  
The files attribute is an array containing all changed files.\n\nMergeChangeset\n--------------\n| interface MergeChangeset {\n|   readonly attribute Repository  repository;\n|   readonly attribute Review?     review;\n|   readonly attribute Commit      commit;\n|   readonly attribute Changeset[] changesets;\n| };\n\nRepresents the relevance-filtered changes in a merge commit.  The changesets\nattribute is an array of Changeset objects, one for each parent of the commit\nreturned by the commit attribute.  Note that these changesets do not contain the\nfull set of changes between the merge commit and its parents; they only contain\nchanges that are deemed likely to have caused conflicts.\n\nChangesetFile\n-------------\n| interface ChangesetFile : File {\n|   readonly attribute Changeset             changeset;\n|   readonly attribute ChangesetFileVersion? oldVersion;\n|   readonly attribute ChangesetFileVersion? newVersion;\n|   readonly attribute ChangesetChunk[]?     chunks;\n|   readonly attribute long                  deleteCount;\n|   readonly attribute long                  insertCount;\n|\n|   readonly attribute boolean?              isReviewed;\n|   readonly attribute User?                 reviewedBy;\n| };\n\nRepresents a file changed in a changeset.  The chunks attribute is an array of\nchunks, each representing a sequence of changed lines.  If a file was added or\nremoved in the changeset, oldVersion or newVersion is null, respectively, and\nthe chunks attribute is null.\n\nThe deleteCount and insertCount attributes are the total number of lines deleted\nand inserted in the file in the changeset, respectively.\n\nIf the parent Changeset object (returned by the changeset attribute) was fetched\nfrom a Review object (using the Review.getChangeset method), the attribute\nisReviewed is set to true if these changes in this file have been marked as\nreviewed, and the reviewedBy attribute is set to the user who (most recently)\nmarked the file as reviewed.  
If the parent Changeset object was not fetched\nfrom a Review object, both properties are null.\n\nChangesetFileVersion\n--------------------\n| interface ChangesetFileVersion : FileVersion {\n|   readonly attribute Changeset     changeset;\n|   readonly attribute ChangesetFile file;\n| };\n\nRepresents a single version of a file changed in the changeset.\n\nChangesetChunk\n--------------\n| interface ChangesetChunk {\n|   readonly attribute Changeset       changeset;\n|   readonly attribute ChangesetFile   file;\n|   readonly attribute long            deleteOffset;\n|   readonly attribute long            deleteCount;\n|   readonly attribute long            insertOffset;\n|   readonly attribute long            insertCount;\n|   readonly attribute ChangesetLine[] lines;\n| };\n\nRepresents a continuous set of changed lines.\n\nThe offset attributes deleteOffset and insertOffset represent the start line\n(zero-based) of the modification; either the first deleted or first inserted\nline.  If deleteCount or insertCount is zero, the line specified by the\ncorresponding offset is the first line after the changes.  A zero deleteCount\nmeans lines were only inserted and a zero insertCount means lines were only\ndeleted; both counts will never be zero.\n\nThe lines attribute is an array of lines from a hypothetical side-by-side diff.\nIt is possible for this array to contain more elements than deleteCount or\ninsertCount.  
The maximum possible number of elements is\ndeleteCount+insertCount, and the minimum possible number of elements is\nMAX(deleteCount,insertCount).\n\nChangesetLine\n-------------\n| interface ChangesetLine {\n|   const long TYPE_CONTEXT    = 0;\n|   const long TYPE_WHITESPACE = 1;\n|   const long TYPE_REPLACED   = 2;\n|   const long TYPE_MODIFIED   = 3;\n|   const long TYPE_DELETED    = 4;\n|   const long TYPE_INSERTED   = 5;\n|\n|   const long OPERATION_REPLACE = 0;\n|   const long OPERATION_DELETE  = 1;\n|   const long OPERATION_INSERT  = 2;\n|\n|   readonly attribute long     type;\n|   readonly attribute long     oldIndex;\n|   readonly attribute string?  oldText;\n|   readonly attribute long     newIndex;\n|   readonly attribute string?  newText;\n|   readonly attribute Array[]? operations;\n| };\n\nRepresentation of a line in a hypothetical side-by-side diff.  The type of the\nline represents the line's role in the diff.  Context lines aren't changed at\nall, whitespace lines have only whitespace changes, replaced lines are lines\nwhere there are few similarities between the old and new versions, modified\nlines are lines where there are enough similarities between the old and new\nversion to assume the line was edited (in which case the operations attribute\ncontains an analysis of how the line was edited), and deleted and inserted lines\nare lines that are only present in the old or new version, respectively.\n(Technically, a replaced line is really just a deleted line and an inserted line\ncollapsed together.)\n\nThe operations attribute returns an array of arrays.  Each array represents a\nmodification of the line.  The first element of each array is the modification\ntype; replace, delete or insert.  
In the case of replace, the array contains an\nadditional four elements; deleteStart, deleteEnd, insertStart and insertEnd; and\nthe meaning is that the characters [deleteStart,deleteEnd) in the old version of\nthe line (oldText) were replaced by the characters [insertStart,insertEnd) in\nthe new version of the line (newText).  In the case of delete or insert, the\narray contains an additional two elements; start and end; and the meaning is\nthat the characters [start,end) were deleted from the old version (delete) or\ninserted into the new version (insert).\n\nReviews\n-------\n\nReview\n------\n| interface Review {\n|   constructor Review(long id);\n|\n|   readonly attribute long           id;\n|   readonly attribute string         state;\n|   readonly attribute User[]         owners;\n|   readonly attribute User?          closedBy;\n|   readonly attribute User?          droppedBy;\n|   readonly attribute string         summary;\n|   readonly attribute string         description;\n|   readonly attribute Repository     repository;\n|   readonly attribute Branch         branch;\n|   readonly attribute CommitSet      commits;\n|   readonly attribute User[]         users;\n|   readonly attribute Object         reviewers;\n|   readonly attribute Object         watchers;\n|   readonly attribute CommentChain[] commentChains;\n|   readonly attribute OldBatch[]     batches;\n|   readonly attribute ReviewFilter[] filters;\n|   readonly attribute ReviewProgress progress;\n|   readonly attribute TrackedBranch  trackedBranch;\n|\n|   OldBatch getBatch(long id);\n|\n|   CommentChain getCommentChain(long id);\n|   Comment getComment(long id);\n|\n|   Changeset|MergeChangeset getChangeset(Commit commit);\n|\n|   NewBatch startBatch(User acting_user);\n|\n|   dictionary PrepareRebaseData {\n|     boolean historyRewrite;\n|     boolean singleCommit;\n|     Commit newUpstream;\n|\n|     string branch;\n|   };\n|\n|   void prepareRebase(PrepareRebaseData data);\n|   void 
cancelRebase();\n| };\n\nThe prepareRebase() method prepares the review to be rebased.  This is the same\noperation as is performed using the \"Prepare Rebase\" button on the review\nfront-page.  Exactly one of data.historyRewrite, data.singleCommit and\ndata.newUpstream needs to be set (true / non-null) to indicate the type of\nrebase.  A history rewrite rebase is only allowed to change the commit history,\nnot the tree.  A move rebase, indicated by data.singleCommit or\ndata.newUpstream, is allowed to change the tree as well.  If data.singleCommit\nis true, the new upstream is implicitly the parent commit of the new head of the\nbranch; otherwise data.newUpstream must be set accordingly.  (See the tutorial\non review rebasing for more details.)\n\nThe cancelRebase() method cancels a previously prepared (but not yet performed)\nrebase.  While a rebase is prepared, no-one but the user who prepared the rebase\nis allowed to push to the review branch, and no other user can prepare a rebase\nof the review.  Thus, if a rebase is prepared but won't be performed, it should\nbe cancelled instead.\n\nReviewFilter\n------------\n| interface ReviewFilter {\n|   readonly attribute Review     review;\n|   readonly attribute string     type;\n|   readonly attribute User       user;\n|   readonly attribute string     path;\n|   readonly attribute User       creator;\n| };\n\nRepresentation of a review filter.  
The value of the type attribute will be\neither \"reviewer\" or \"watcher\".\n\nReviewProgress\n--------------\n| interface ReviewProgress {\n|   readonly attribute boolean accepted;\n|   readonly attribute boolean finished;\n|   readonly attribute boolean dropped;\n|   readonly attribute long    pendingLines;\n|   readonly attribute long    reviewedLines;\n|   readonly attribute long    openIssues;\n|\n|   string toString();\n| };\n\nCommentChain\n------------\n| interface CommentChain {\n|   const long TYPE_ISSUE = 0;\n|   const long TYPE_NOTE  = 1;\n|\n|   const long STATE_OPEN      = 0;\n|   const long STATE_RESOLVED  = 1;\n|   const long STATE_ADDRESSED = 2;\n|\n|   readonly attribute Review    review;\n|   readonly attribute Batch     batch;\n|   readonly attribute long      id;\n|   readonly attribute User      user;\n|   readonly attribute User?     closedBy;\n|   readonly attribute long      type;\n|   readonly attribute long      state;\n|   readonly attribute Comment[] comments;\n|\n|   Comment getComment(long id);\n| };\n\nFileCommentChain\n----------------\n| interface FileCommentChain : CommentChain {\n|   const long ORIGIN_OLD = 0;\n|   const long ORIGIN_NEW = 1;\n|\n|   readonly attribute Changeset     changeset;\n|   readonly attribute Commit?       
addressedBy;\n|   readonly attribute ChangesetFile file;\n|   readonly attribute long          origin;\n|   readonly attribute Object        lines;\n|   readonly attribute string        context;\n|   readonly attribute string        minimizedContext;\n| };\n\nCommitCommentChain\n------------------\n| interface CommitCommentChain : CommentChain {\n|   readonly attribute Commit commit;\n|   readonly attribute long   firstLine;\n|   readonly attribute long   lastLine;\n| };\n\nComment\n-------\n| interface Comment {\n|   readonly attribute CommentChain chain;\n|   readonly attribute Batch        batch;\n|   readonly attribute long         id;\n|   readonly attribute User         user;\n|   readonly attribute Date         time;\n|   readonly attribute string       text;\n| };\n\nBatch\n-----\n| interface Batch {\n|   readonly attribute Review review;\n|   readonly attribute User   user;\n| };\n\nOldBatch\n--------\n| interface OldBatch : Batch {\n|   readonly attribute long           id;\n|   readonly attribute Date           time;\n|   readonly attribute CommentChain?  commentChain;\n|   readonly attribute CommentChain[] issues;\n|   readonly attribute CommentChain[] notes;\n|   readonly attribute Comment[]      replies;\n| };\n\nObject representing a batch of changes submitted earlier.\n\nNewBatch\n--------\n| interface NewBatch : Batch {\n|   readonly attribute long? 
id;\n|\n|   dictionary Location {\n|     long lineIndex;\n|     long lineCount;\n|   };\n|   dictionary FileLocation : Location {\n|     ChangesetFileVersion fileVersion;\n|   };\n|   dictionary CommitLocation : Location {\n|     Commit commit;\n|   };\n|\n|   void raiseIssue(string text, FileLocation data);\n|   void raiseIssue(string text, CommitLocation data);\n|   void raiseIssue(string text);\n|\n|   void writeNote(string text, FileLocation data);\n|   void writeNote(string text, CommitLocation data);\n|   void writeNote(string text);\n|\n|   void addReply(CommentChain chain, string text);\n|   void resolveIssue(CommentChain chain);\n|   void markIssueAddressedBy(CommentChain chain, Commit commit);\n|\n|   void assignChanges(User assignee, Changeset changeset);\n|   void assignChanges(User assignee, ChangesetFile changeset);\n|\n|   void unassignChanges(User assignee, Changeset changeset);\n|   void unassignChanges(User assignee, ChangesetFile changeset);\n|\n|   void addReviewFilter(User user, string type, string path);\n|   void removeReviewFilter(User user, string type, string path);\n|\n|   dictionary FinishData {\n|     string text;\n|     boolean silent;\n|   };\n|\n|   void finish(FinishData? data);\n| };\n\nObject used to collect changes to submit to a review, and then submit them.\nEach batch can only manipulate one review on behalf of one user, and is started\nusing the method Review.startBatch().  Until finish() is called, nothing is\nadded to the database at all.  When finish() is called, everything is added to\nthe database, and mails are sent to relevant users.\n\nReviewSet\n---------\n| interface ReviewSet {\n| };\n\nThe ReviewSet interface is a small utility interface used by the Dashboard\ninterface to represent a set of reviews.  
For each review in the set, the\nReviewSet object has one enumerable property whose name is the review ID and\nwhose value is the Review object.\n\nDashboard\n---------\n| interface Dashboard {\n|   readonly attribute User user;\n|\n|   interface Owned {\n|     readonly attribute Review[] finished;\n|     readonly attribute Review[] accepted;\n|     readonly attribute Review[] pending;\n|     readonly attribute Review[] dropped;\n|   };\n|\n|   interface Active : Review[] {\n|     readonly attribute ReviewSet hasPendingChanges;\n|     readonly attribute ReviewSet hasUnreadComments;\n|     readonly attribute ReviewSet isReviewer;\n|     readonly attribute ReviewSet isWatcher;\n|   };\n|\n|   interface Inactive : Review[] {\n|     readonly attribute ReviewSet isReviewer;\n|     readonly attribute ReviewSet isWatcher;\n|   };\n|\n|   readonly attribute Owned    owned;\n|   readonly attribute Active   active;\n|   readonly attribute Inactive inactive;\n| };\n\nThe Dashboard interface exposes various sets of reviews with which the user is\nassociated, roughly corresponding to those displayed on the built-in dashboard\npage.\n\nThe owned attribute returns a sub-object containing four arrays of reviews owned\nby the user, one array per review state: finished, accepted, pending and\ndropped.  These arrays are non-overlapping.  (The reviews in the finished and\ndropped arrays are not displayed on the built-in dashboard page.)\n\nThe active attribute returns an array of open (accepted or pending) reviews that\nare \"active,\" which actually rather means the user ought to be active; that is,\nin which there is work for the user to do.  This array can contain reviews owned\nby the user.  The array also has four additional attributes that return\nReviewSet objects containing sub-sets of the array.  
The hasPendingChanges set\ncontains reviews in which there are unreviewed changes assigned to the user.\nThe hasUnreadComments set contains reviews in which the user has unread\ncomments.  These sets can overlap.  The isReviewer and isWatcher sets contain\nreviews in which the user is a reviewer (has changes assigned to him, already\nreviewed or not) and those in which the user is not a reviewer.  These sets\ndon't overlap, and together they contain all reviews in the array.\n\nThe inactive attribute returns an array of all open reviews with which the user\nis associated that were not included in the array returned by the active\nattribute.  The isReviewer and isWatcher attributes work the same way as the\ncorresponding attributes on the array returned by the active attribute.\n\nFilters\n-------\n| interface Filters {\n|   dictionary FiltersInit {\n|     Repository   repository;\n|     Review       review;\n|     User         user;\n|   };\n|\n|   constructor Filters(FiltersInit data);\n|\n|   readonly attribute Repository repository;\n|   readonly attribute Review?    review;\n|\n|   boolean isReviewer(User user, File file);\n|   boolean isWatcher(User user, File file);\n|   boolean isRelevant(User user, File file);\n|\n|   Object listUsers(File file);\n| };\n\nThe Filters interface provides easy access to Critic's filtering system,\napplying global filters, inherited global filters and review filters.\n\nIf FiltersInit.review is set, FiltersInit.repository is ignored (the repository\nis inferred from the review instead.)  If FiltersInit.review is not set,\nFiltersInit.repository must be set.\n\nIf FiltersInit.user is set, only filters relating to that user are loaded.  This\nis an optimization for the case that only a single user's filters are of\ninterest.\n\nThe isReviewer() and isWatcher() functions return true if the provided user is a\nreviewer or watcher, respectively, of the provided file.  
The isRelevant()\nfunction returns true if the provided user is either a reviewer or a watcher of\nthe provided file.\n\nThe listUsers() function returns the set of users that either review or watch\nthe provided file.  The returned object has one enumerable property per user in\nthe set whose name is the user ID and whose value is a User object.\n\nPer-user Data Storage\n---------------------\n\nStorage\n-------\n| interface Storage {\n|   constructor Storage(User user);\n|\n|   string get(string key);\n|   void set(string key, string value);\n| };\n\nThe Storage interface provides a simple key/value storage.  The length of the\nkey is limited to 64 characters, the value can be an arbitrarily long string.\nTo store more complex data, the builtin JSON encoder (JSON.stringify()) and JSON\ndecoder (JSON.parse()) can be used.\n\nA storage object accessing data for the current extension and current user is\navailable as \"critic.storage\".  Storage objects accessing data for the current\nextension and an arbitrary user can be constructed using \"new\ncritic.Storage(user)\".\n\nPer-user Log\n------------\n\nLog\n---\n| interface Log {\n|   constructor Log(User user);\n|\n|   dictionary WriteData {\n|     string category;\n|   };\n|\n|   void write(...[, WriteData data]);\n|\n|   dictionary SearchData {\n|     Date|string timeStart;\n|     Date|string timeEnd;\n|     string category;\n|   };\n|\n|   interface Entry {\n|     User user;\n|     Date time;\n|     string category;\n|     string text;\n|   };\n|\n|   Entry[] fetch(SearchData data);\n|   void remove(SearchData data);\n| };\n\nThe Log interface provides a simple per-extension and per-user logging facility.\nEach log entry consists of a timestamp, a category (an arbitrary string of no\nmore than 64 characters) and the text (an arbitrarily long string.)\n\nEntries are added to the log using the write() method.  
If the last parameter to\nit is an object, it is used as a WriteData dictionary instance, and does not\ncontribute to the text logged.  The dictionary can be used to set the category\nof the log entry.  If not set, the category defaults to \"default\".  The rest of\nthe arguments are passed to the built-in format() function.\n\nEntries from the log can be fetched using the fetch() method and removed using\nthe remove() function.  Both functions take an optional SearchData dictionary\nargument, which can be used to select specific entries.  If the timeStart member\nis specified, only entries written since the specified point in time are\nselected.  If the timeEnd member is specified, only entries written before the\nspecified point in time are selected.  Both members can be either a Date object\nfor an absolute time or a string of the form \"N unit(s)\", such as \"1 month\" or\n\"5 hours\", for a relative time.  If the category member is specified, only\nentries with that category are selected.\n\nA log object logging for the current extension and current user is available as\n\"critic.log\".  Log objects logging for the current extension and an arbitrary\nuser can be constructed using \"new critic.Log(user)\".\n\nStatistics\n----------\n\nStatistics\n----------\n| interface Statistics {\n|   constructor Statistics();\n|\n|   void setReview(Review review);\n|\n|   void setInterval(Date start[, Date end]);\n|   void setInterval(string start[, string end]);\n|\n|   void setUser(User user);\n|\n|   void addDirectory(string path);\n|   void addFile(string path);\n|\n|   Object getReviewedLines();\n|   Object getWrittenComments();\n| };\n\nThe Statistics interface allows extraction of simple statistics data.  The\ndatabase queries issued by this interface can take a long time to complete,\nespecially when not filtering per review or user, and if the time interval is\nlarge.  
It may take 30 seconds or more for getReviewedLines() to return.\n\nThe setReview() method restricts the query to a single review.  If no review is\nset, the query applies to all reviews (in all repositories.)\n\nThe setInterval() method restricts the query to a certain period of time.  Start\nand end points are specified as Date objects, or as strings of the form \"N\nunit(s)\", such as \"1 month\", \"2 weeks\" or \"12 hours\".  Supported units are\nseconds, minutes, hours, days, weeks, months and years.  The end point is\noptional; if not set, it defaults to the current time.\n\nThe setUser() method restricts the query to a certain user.\n\nThe addDirectory() method restricts the query to files under one or more\ndirectories.  The addFile() method similarly restricts the query to one or more\nindividual files.\n\nThe getReviewedLines() method executes a query counting the number of lines\nmarked as reviewed by each user.  The returned object has one property per user\nin the query result whose name is the user ID and whose value is an object with\nthe properties deleteCount and insertCount.  A simple enumeration of the query\nresults could look something like\n\n| var data = statistics.getReviewedLines();\n| for (var user_id in data)\n| {\n|   var user = new critic.User({ id: user_id });\n|   var lines = data[user_id];\n|   writeln(\"%%s has reviewed %%d lines\", user.fullname, lines.deleteCount + lines.insertCount);\n| }\n\nNote: the user_id variable in this code contains the user ID, which is really an\ninteger, but since it comes from a property enumeration, the variable's type is\nstring.  Calling critic.User(user_id) directly would not work, since it will\ninterpret its argument as a user name when it's a string, hence the variant\ncritic.User({ id: user_id }).  Using parseInt(user_id) to convert the user ID to\na number would have worked too, of course.\n\nThe getWrittenComments() method executes a query counting the number of comments\nwritten by each user.  
It's similar to getReviewedLines(), except the per-user\ndata objects have the properties raisedIssues, writtenNotes, totalComments and\ntotalCharacters.  The totalComments property is the sum of raisedIssues and\nwrittenNotes, plus the number of replies written.  The totalCharacters property\nis the sum of the lengths of each comment counted by totalComments.\n\nHTML Utilities\n--------------\n\nUtilities for generating pages that look like other Critic pages are available\nin a sub-module accessible as \"critic.html\", for instance\n\"critic.html.writeStandardHeader()\" and \"new critic.html.PaleYellowTable()\".\n\nwriteStandardHeader()\n---------------------\n| dictionary HeaderData {\n|   User   user;\n|   Array  stylesheets;\n|   Array  scripts;\n|   Object links;\n|   Review review;\n| };\n|\n| void writeStandardHeader(string title, HeaderData? data);\n\nWrites a document header.  This will write a DOCTYPE, start the HTML element,\nwrite a HEAD element with TITLE, default external stylesheets and scripts,\nadditional external stylesheets and scripts specified in the data argument, and\nthen start the BODY element and write a TABLE element containing the visible\npage header (the Critic logo and navigation links.)  Additional content written\nafter the call to this function will be parsed into the BODY element.\n\nThe data.user attribute controls which user's preferences and other state (such\nas unread news items) is used to render the page header.  If not specified, it\ndefaults to the user loading the page being generated.\n\nThe data.stylesheets and data.scripts arrays should contain URLs as strings.\n\nThe data.links object is enumerated and one link is added for every enumerable\nproperty.  If the name of the property starts with \"rel=\", a LINK element is\nadded to the HEAD element, otherwise a regular link whose title is the name of\nthe property is added to the page header (next to the \"Home\", \"Dashboard\",\n\"Branches\" etc. links).  
The value of the property is used as the link's HREF\nin both cases.\n\nThe data.review object, if set, adds a standard \"Back To Review\" link, and adds\na draft changes summary and Submit/Preview/Abort buttons to the right side of\nthe header, if the user has any unsubmitted changes in the review.\n\nA call to writeStandardHeader() might look something like\n\n| critic.html.writeStandardHeader(\"Page Title\", {\n|     links: { \"rel=up\": \"/dashboard\",\n|              \"BTS\": \"https://bugs.opera.com/\" },\n|     stylesheets: [\"/extension-resource/HelloWorld/custom.css\"],\n|     scripts: [\"/extension-resource/HelloWorld/custom.js\"]\n|   });\n\nwriteStandardFooter()\n---------------------\n| dictionary FooterData {\n|   User user;\n| };\n|\n| void writeStandardFooter(FooterData? data);\n\nWrites a document footer that adds a little bit of content and closes the BODY\nand HTML elements.\n\nescape()\n--------\n| string escape(string text);\n\nReplaces all characters in the argument string that have special meaning in HTML\nwith HTML entity references, so that it can safely be written as text or in an\nattribute value.  (This is a trivial conversion, of course, and is exposed\nmerely as a convenience.)\n\nPaleYellowTable\n---------------\n| interface PaleYellowTable {\n|   constructor PaleYellowTable(string title);\n|\n|   void addHeading(string heading);\n|\n|   dictionary StandardItem {\n|     string name;\n|     string value;\n|     string description;\n|     Object buttons;\n|   };\n|   dictionary ButtonsItem {\n|     Object buttons;\n|   };\n|   dictionary CustomItem {\n|     string html;\n|   };\n|\n|   void addItem(StandardItem item);\n|   void addItem(ButtonsItem item);\n|   void addItem(CustomItem item);\n|\n|   void write();\n| };\n\nThe PaleYellowTable interface can be used to generate typical Critic \"pale\nyellow tables.\"  The title argument to the constructor is the main title of the\ntable.  
Additional headings can be added using the addHeading() method.\nRegular content in the table is added using the addItem() method.\n\nWhen the addItem() method is called with an argument matching the StandardItem\ndictionary, a fixed-form item is added, which produces a result similar to the\nfirst table on a review front-page, containing basic information about the\nreview.  The buttons attribute in the dictionary is optional; if present, the\nobject's properties are enumerated and a button is added for each, whose title\nis the name of the property, and whose onclick attribute value is the value of\nthe property.  All strings except the value (name, description, and button title\nand onclick handler) will have characters with special meaning in HTML replaced\nby entity references, and thus can't contain HTML styling.  The value string is\noutput as-is, and should be escaped using critic.html.escape() (or similar) if\nits content is unknown and shouldn't be interpreted as HTML.\n\nWhen the addItem() method is called with an argument matching the ButtonsItem\ndictionary, a row of buttons is added.  The object is handled the same way as\nStandardItem.buttons described above.\n\nWhen the addItem() method is called with an argument matching the CustomItem\ndictionary, the value of the html attribute is output as-is (without escaping)\ninside a full-width table cell.\n\nThe write() method writes all HTML for the whole table to stdout using the\nbuilt-in (global) write() function.  This means several PaleYellowTable objects\ncan be constructed and populated in parallel, and then all written (in some\norder) at the end.\n"
  },
  {
    "path": "src/tutorials/extensions.txt",
"content": "Critic Extensions\n=================\n\nCreating an Extension\n---------------------\n\nTo create a Critic extension, you log into the server running Critic using SSH,\nand create a directory named <code>CriticExtensions</code> in your home\ndirectory.  Under this directory, you create a sub-directory with the same name\nas your extension.  Under that sub-directory, you create a file named\n<code>MANIFEST</code>.  You will also need to make sure the directory and all\nfiles under it are accessible to the system group\n  <code>%(configuration.base.SYSTEM_GROUP_NAME)s</code>,\nfor instance by running these commands:\n\n| chmod a+x $HOME\n| chmod --recursive a+rX $HOME/CriticExtensions\n\nOnce the <code>MANIFEST</code> file exists, Critic will consider this extension\nto exist, and it will show up on the extension management page.  Naturally, the\ncontents of the <code>MANIFEST</code> file must follow certain rules, and if\nthey don't, the extension management page will simply state that the extension\nhas a broken manifest.  Extensions with broken manifests are only displayed to\nthe author.\n\n  Hint: The red text that says the extension has a broken manifest is a link to\n  a page that shows what's wrong with the manifest.\n\nCritic extensions are written in ECMAScript, and executed by a standalone\nECMAScript interpreter.  An ECMAScript API for accessing much of the data in\nCritic's database, as well as for performing operations such as raising issues\nand assigning changes to reviewers, is available to the extension script.  See\nthe [extensions API tutorial][api-tutorial] for details on the API.\n\n[api-tutorial]: /tutorial?item=extensions-api\n\nInstalling an Extension\n-----------------------\n\nExtensions won't run just because they are created; they need to be installed to\nbecome active.  
To install an extension, a user simply goes to the [extension\nmanagement page][manageextensions], selects a version of the extension to\ninstall, and clicks the \"Install\" button.  Alternatively, a system administrator\ncan install an extension \"universally\", which activates the extension for all\nusers.\n\nFor an extension to have one or more \"official\" versions, the extension's\ndirectory should be a Git repository, and this repository should have one or\nmore branches named \"version/*\".  For instance, to have a version named\n\"stable\", the repository should contain a branch named \"version/stable\".  When a\nuser installs an official version of an extension, the commit currently\nreferenced by the version's branch is checked out into a directory outside the\nauthor's home directory, and the extension is run from there instead.  If the\nbranch changes, the user can update his installation of the extension via the\n[extension management page][manageextensions].  (This may be automated in the\nfuture.)\n\nIn addition to official versions, every extension can be installed in \"live\"\nmode, which means that the extension runs directly from the extension's\ndirectory in the author's home directory.  This is typically the most convenient\nmode for the author of the extension to use while developing.  Regular users\nwould typically be better off installing an official version, to avoid temporary\nbreakage while the author is developing the extension.\n\nIf the extension's directory isn't a Git repository, or if the repository has no\nbranches named \"version/*\", only the \"live\" mode is available.\n\n[manageextensions]: /manageextensions\n\nThe MANIFEST\n------------\n\nThe format of the MANIFEST file is similar to that of a traditional INI file.\nSections are started by \"[heading]\" lines and key-value pairs are specified by\n\"key=value\" lines.  
In the MANIFEST file, the values are interpreted either as\nplain strings (no escapes, no line-breaks) or, if the value starts and ends with\ndouble quotes, as a JSON string literal (note: not a string containing an\narbitrary JSON-formatted value.)\n\nThe beginning of the file is an implicit top-level section (without header.)\nThis section must specify two keys: \"Author\" and \"Description\".  \"Author\" can be\nspecified multiple times to specify multiple authors.  Additionally, the key\n\"Hidden\", with the value \"true\" or \"yes\", can be used to hide the extension from\nother users; useful for extensions that are too unfinished to be of much\ninterest to anyone but the author.\n\nOther sections in the MANIFEST file define \"roles\" in which the extension\nexecutes.  At least one role must be defined for the MANIFEST file to be\nconsidered valid.  (If no roles are defined, the extension won't do anything, so\nthat'd be rather pointless.)\n\nThe manifest for a simple extension could look something like\n\n| Author = \"Jens Lindstr\\u00f6m\"\n| Description = A simple extension.\n|\n| [Page simplified/r/*]\n| Description = Simplified review front-page: http://critic.example.com/simplified/r/1234\n| Script = main.js\n| Function = pageReviewFrontPage\n|\n| [ProcessCommits]\n| Description = Automatically process commits.\n| Script = main.js\n| Function = processCommits\n\nRoles\n-----\n\nCurrently there are four supported roles: Page, Inject, ProcessCommits and\nFilterHook.  Each role is defined with a separate section.  Every such section\nmust specify three keys, \"Description\", \"Script\" and \"Function\".  The\n\"Description\" should simply describe what the extension does in this role; the\n\"Script\" is a path specifying the .js file to load (the path is relative to the\nextension's directory) and \"Function\" is the name of the script function to\ncall.\n\nPage\n----\nA \"Page\" role simply lets the extension handle URLs loaded from Critic.  
It can\neither define new URLs, or override existing ones.  (Hint: If an extension\noverrides a built-in URL, and you want to access the built-in one, add \"!/\" to\nthe beginning of the URL path to disable all extension URLs.)\n\nThe section header for a page role is \"[Page &lt;glob&gt;]\" where &lt;glob&gt;\nis matched against the path of the URL.  The path has no leading slash, and does\nnot contain the query part.  For instance, if the full URL is\n\"http://host/a/b?d=e\", then the string that the glob should match is \"a/b\".\n\nThe script function specified for the role will be called with three arguments:\n\n1 A string containing the HTTP method (\"GET\" or \"POST\",)\n2 a string containing the path (what the glob was matched against) and\n3 an object representing the query part of the URL.\n\nThe query object is null if no query part was present in the URL.  Otherwise, it\nhas two properties: \"raw\", whose value is the query part as a string, completely\nuninterpreted (without a leading question mark,) and \"params\", whose value is an\nobject with one property for every query parameter, whose values are the decoded\nvalues of the parameters, or null if the parameter did not include a \"=value\"\npart.\n\nExample: the URL \"http://host/path?foo=10&bar=hi%%20ho&damer\" would produce the\nquery object\n\n| {\n|   raw: \"foo=10&bar=hi%%20ho&damer\",\n|   params: {\n|     foo: \"10\",\n|     bar: \"hi ho\",\n|     damer: null\n|   }\n| }\n\nThe script function generates the URL response by writing to stdout, using the\nbuilt-in functions \"write\" and/or \"writeln\".  The first line of output must\ncontain only an HTTP response code as a decimal number, typically \"200\".  The\nfollowing lines define HTTP response headers, for instance \"Content-Type:\ntext/html\".  (The content type defaults to \"text/plain\", and \"; charset=utf-8\"\nis appended to it automatically if no charset is specified.)  
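A minimal sketch of this output protocol might look as follows.  The handler
name and URL are invented for illustration, and while write()/writeln() are
globals provided by Critic's script environment, writeln() is stubbed here so
the sketch is self-contained:

```javascript
// Stand-in for Critic's built-in writeln(); inside Critic each call
// writes one line to stdout, which forms the URL response.
var output = [];
function writeln(line) { output.push(line); }

// Hypothetical handler, registered in the MANIFEST as e.g.
//   [Page hello/*]
//   Script = main.js
//   Function = helloPage
function helloPage(method, path, query) {
  var who = (query && query.params.name) || "world";
  writeln("200");                            // HTTP response code first
  writeln("Content-Type: text/html");        // then any response headers
  writeln("");                               // empty line ends the headers
  writeln("<h1>Hello, " + who + "!</h1>");   // the rest is the response body
}

helloPage("GET", "hello", { raw: "name=Critic", params: { name: "Critic" } });
```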
The response\nheader list is terminated by writing an empty line.\n\nEverything after the first empty line is forwarded as-is to the client.\n\nIf the script function returns without writing a single byte of output, the\nbehavior is as if the page role hadn't existed in the first place.  For a custom\nURL, this means the request fails, since the URL isn't handled.  If the page\nrole was invoked to override a built-in page, the built-in page is rendered\ninstead, thus allowing a page role to conditionally override a built-in page.\n\nAdditional static resources, such as images and external stylesheets or scripts,\nthat are put in a sub-directory named \"resources\" are automatically available\nvia the URL \"/extension-resource/&lt;extension&gt;/&lt;path&gt;\" (for users that\nhave installed the extension.)  For instance, the URL\n%(configuration.URL)s/extension-resource/HelloWorld/hello.txt would return the\nfile /home/&lt;author&gt;/CriticExtensions/HelloWorld/resources/hello.txt for users\nthat have installed the extension named HelloWorld.\n\nInject\n------\nAn \"Inject\" role is similar to a \"Page\" role in form, but instead of handling\nthe URL completely, it can issue simple commands to inject some content onto\nbuilt-in pages.\n\nThe section header for an inject role is \"[Inject &lt;glob&gt;]\" where\n&lt;glob&gt; is matched against the path of the URL, exactly like for a page\nrole.  If the URL is not handled at all by Critic, or if it is one that doesn't\nproduce an HTML document, the role will not be invoked.\n\nThe script function specified for the role will be called with two arguments:\n\n1 A string containing the path (what the glob was matched against) and\n2 an object representing the query part of the URL.\n\nThese arguments are the same as the arguments to the script function specified\nfor a page role, except that the method argument is missing.  
It's missing\nbecause the role can only be invoked on URLs loaded using the GET method, and\nthe argument thus wouldn't be very interesting.\n\nThe script function injects content by writing lines to its standard output\nusing the write() or writeln() functions.  Each line represents one injection.\nEach line must be in the format \"&lt;command&gt; &lt;JSON&gt;\".  The command is\na simple keyword specifying the type of injection.  The parameters to the\ncommand are encoded as JSON and differ from command to command.\n\nThree commands are currently supported:\n\n? script\n= Injects <script src=\"url\"></script> into the page's HEAD element.\n  It will be injected after all other scripts on the page.  The JSON encoded\n  value should be a string containing the URL.\n? stylesheet\n= Injects <link type=\"text/css\" href=\"url\"> into the page's HEAD element.\n  It will be injected after all other stylesheets on the page.  The JSON encoded\n  value should be a string containing the URL.\n? link\n= Controls the set of links in the page header (\"Home\", \"Dashboard\", \"Branches\"\n  et c.)  The JSON encoded value should be an array containing two elements, the\n  title of the link, and the URL it points to or null.  If the title matches one\n  of the built-in links (or, technically, one added by another extension whose\n  inject role was invoked first) the existing link's URL is changed, or the\n  existing link is removed, if the URL was specified as null.  
Otherwise, if the\n  URL was not specified as null, a new link is added to the set of links.\n\nFor example, to inject a custom script and a custom stylesheet, remove the\nbuilt-in \"Branches\" link and add a link to http://www.opera.com/, the script\nfunction could write something like this:\n\n| script \"/extension-resource/HelloWorld/custom.js\"\n| stylesheet \"/extension-resource/HelloWorld/custom.css\"\n| link [\"Branches\", null]\n| link [\"Opera.com\", \"http://www.opera.com/\"]\n\n  Hint: If an inject role fails to execute when loading a page, an error message\n  is inserted into the loaded page as an HTML comment.  So if your injection\n  doesn't seem to happen, check if there are any HTML comments mentioning your\n  extension!\n\nProcessCommits\n--------------\nA \"ProcessCommits\" role lets the extension process commits immediately when they\nare added to a review.  The extension can for instance do pattern-based\ndetection of problems, and raise issues about them automatically.  A\n\"ProcessCommits\" role is invoked when a user that has installed the extension\ncreates a review (via push or the web interface) or pushes additional commits to\nthe review branch.  If the extension is universally installed, the role is\ninvoked whenever any user creates a review or pushes additional commits to a\nreview branch.\n\nA \"ProcessCommits\" role has no additional parameters in the MANIFEST; only the\n\"Script\" and \"Function\" keys are necessary (and, of course, the \"Description\".)\n\nThe script function specified for the role will be called with three arguments:\n\n1 A [Review object][review] representing the review being modified,\n2 a [Changeset object][changeset] representing the collected changes being added\n  to the review and\n3 a [CommitSet object][commitset] containing the added commits.\n\nThe changeset argument is null if it isn't possible to describe the added\nchanges as a single diff.  
This happens when the added commits include a merge\nwith a different branch, in which case a simple diff would include all the\nmerged-in changes, even though those changes aren't added to the review.\n\nIf the script function writes to its standard output, the written text will be\nshown to the user, either as output from the \"git push\" command, or in a dialog\nin the web interface.\n\nFilterHook\n----------\nA \"FilterHook\" role lets the extension define an additional filter type that a\nuser can select in the filter type drop-down in the \"Add Filter\" dialog on their\n[\"Home\" page][home] instead of the built-in types (<code>Reviewer</code>,\n<code>Watcher</code> and <code>Ignored</code>) when they have installed the\nextension.\n\nWhen a user has such a filter, and a set of commits is added to a review\nincluding at least one commit that touches a file matched by the filter, the\nrole is invoked.\n\n  Note: When a review is first created, the initial set of commits to be\n  reviewed is considered to be added to the review as part of creation, which\n  also triggers the filter, if it matches any touched files.\n\nThe section header for a filter hook role is <code>[FilterHook\n&lt;name&gt;]</code>, where <code>&lt;name&gt;</code> should be a simple\nidentifier of the filter, unique to the extension.  The name should contain only\nASCII letters, digits, and underscores.\n\nThe section can contain two additional keys: \"Title\" and \"DataDescription\".  The\n\"Title\" is what is displayed in the filter type drop-down in the \"Add Filter\"\ndialog.  If not present, it defaults to the role's name.  The \"DataDescription\"\nkey, if present, enables an extra input field in the \"Add Filter\" dialog when\nthe filter is selected in the filter type drop-down, where the user can input an\narbitrary string.  
The \"DataDescription\" should, as the name suggests, describe\nto the user what to input, and is displayed above the extra input field in the\n\"Add Filter\" dialog.\n\nThe script function specified for the role will be called with five arguments:\n\n1 A string containing the data that the user input in the extra data input field\n  in the \"Add Filter\" dialog, or <code>null</code> if there was no\n  \"DataDescription\" key in the role's section in the MANIFEST file.\n2 A [Review object][review] representing the review being modified,\n3 A [User object][user] representing the user who added the commits to the\n  review (or created the review.)\n4 A [CommitSet object][commitset] representing all the commits added.\n5 An array of [File objects][file] representing the files touched by the added\n  commits that actually matched the filter.\n\n  Note: The current user (as returned by <code>critic.User.current</code>) when\n  the script function is called is the user whose filter was triggered, not the\n  user that added commits, or the user that authored/hosts the extension.\n\nThe script function is called asynchronously, and unlike a \"ProcessCommits\"\nrole it cannot generate output that ends up in the output of the \"git push\"\ncommand that added commits to the review.  If it wishes to produce output, it\ncan either create issues or notes in the review, or send custom emails using the\n<code>critic.MailTransaction</code> API.\n\n[home]: /home\n\n[review]: /tutorial?item=extensions-api#review\n[user]: /tutorial?item=extensions-api#user\n[changeset]: /tutorial?item=extensions-api#changeset\n[commitset]: /tutorial?item=extensions-api#commitset\n[file]: /tutorial?item=extensions-api#file\n"
  },
  {
    "path": "src/tutorials/external-authentication.txt",
"content": "External Authentication\n=======================\n\nCritic supports letting an external system authenticate users using the [OAuth\n2.0 protocol][oauth2].  This is convenient if the target audience of the Critic\nsystem can be assumed to already have accounts in the external system, as they\nwould not need to set and remember a separate password in the Critic system.\n\nBasic operation\n---------------\n\nAn external authentication provider can be used in two different ways:\n\n* It can be supported as an alternative alongside regular password-based\n  authentication, in which case Critic's \"Sign in\" page will contain a link to\n  the external authentication provider below the username/password fields.\n* It can also be used as the only way of authenticating users, in which case the\n  \"Sign in\" link will redirect directly to the external authentication provider.\n\nThe first mode is achieved by simply enabling one or more external\nauthentication providers.  The second mode is achieved by setting the\n  <code>AUTHENTICATION_MODE</code>\nvariable in\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/base.py</code>\nto the name of the external provider (<code>\"github\"</code> or\n<code>\"google\"</code>).\n\nNote that in the second mode, access to Git repositories over HTTP/HTTPS will\nnot be supported, since Git only supports authentication using HTTP\nauthentication (using the \"Authorization\" request header.)  
Also, if anonymous\naccess is disabled, all unauthenticated accesses to the Critic system redirect\nimmediately to the external authentication provider, which may be confusing\nunless the user is prepared for it.\n\n[oauth2]: http://oauth.net/2/\n\nSupported authentication providers\n----------------------------------\n\nTwo providers of OAuth 2.0 authentication are supported by Critic out of the\nbox: GitHub and Google.\n\nGitHub\n------\nTo use GitHub for authentication, an OAuth application for the Critic instance\nneeds to be created in the [GitHub account settings UI][github].  The\napplication creation form will ask for some information about the application.\nThe \"Authorization callback URL\" value requires particular attention: it must be\nset to\n  <code>%(configuration.URL)s/oauth/github</code>.\n\n  If you're reading this tutorial on a different Critic system than the one\n  being configured, this URL needs to be adjusted accordingly, of course.\n\nAfter creating the application, copy the generated \"Client ID\" and \"Client\nSecret\" into the\n  <code>PROVIDERS[\"github\"]</code>\ndictionary in\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/auth.py</code>,\nupdate the rest of the settings there appropriately, and then restart Critic by\nrunning (as root)\n\n| criticctl restart\n\n[github]: https://github.com/settings/applications/new\n\nGoogle\n------\nTo use Google for authentication, a project needs to be created at the [Google\ndevelopers console][google].  Once created, an OAuth client ID is created in the\nproject's management UI: \"APIs & auth\" &#x2192; \"Credentials\" &#x2192; \"Create\nnew client ID\".  
The \"Authorized Javascript origins\" and \"Authorized redirect\nURI\" values require particular attention: The former must be set to\n  <code>%(configuration.URL)s</code>\nand the latter must be set to\n  <code>%(configuration.URL)s/oauth/google</code>.\n\n  If you're reading this tutorial on a different Critic system than the one\n  being configured, these URLs need to be adjusted accordingly, of course.\n\nAfter creating the project and OAuth client ID, copy the generated \"Client ID\"\nand \"Client Secret\" into the\n  <code>PROVIDERS[\"google\"]</code>\ndictionary in\n  <code>%(configuration.paths.CONFIG_DIR)s/configuration/auth.py</code>,\nupdate the rest of the settings there appropriately, and then restart Critic by\nrunning (as root)\n\n| criticctl restart\n\n[google]: https://cloud.google.com/console/project\n"
  },
  {
    "path": "src/tutorials/filters.txt",
    "content": "Filters\n=======\n\nIntroduction\n------------\n\n\"Filters\" is Critic's mechanism for automatically assigning reviewers for\nchanges when reviews are created or updated, and for allowing users to be\nnotified about code changes without being assigned to review them.\n\nFilter Scope\n------------\n\nThere are two filter scopes: repository and review.\n\nRepository\n----------\nRepository filters are added by each user on the user's\n  <a href=\"/home\">Home</a>\npage.  Such filters apply to all reviews created in their repository, except\nwhen the user who creates the review explicitly requests that no filters should\nbe applied.\n\nReview\n------\nReview filters are specific to a single review, and can be added either by the\nuser whom the filter applies to, the review owner, or any other user.  They can\nbe added when the review is created, or can be added, or removed, at any later\ntime.\n\nA review filter is always given priority over a repository filter, if the two\nconflict in any way.\n\nFilter Type\n-----------\n\nThere are three types of filters:\n\n* A <b>Reviewer</b> filter automatically assigns the user as a reviewer of all\n  changes in files selected by the filter, unless the user is the author of the\n  commit that makes the changes.\n\n* A <b>Watcher</b> filter will automatically \"CC\" the user on any review that\n  changes any of the files selected by the filter, but without assigning the\n  user as a reviewer.\n\n* An <b>Ignored</b> filter does nothing in itself, but can be used to override a\n  Reviewer or Watcher filter for a sub-set of the selected files or in a single\n  review, and thus ignore some changes that the user is not interested in.\n\nDelegates\n---------\nA <b>Reviewer</b> filter can optionally define a list of \"delegates.\"  The\ndelegates are users who should be assigned to review changes authored by the\nuser that has the Reviewer filter, in his stead.\n\nFile Selection\n--------------\n\nEach filter applies 
to a set of files in the repository.  This set of files is\ndefined by a single path.  If the path does not end with a path separator it\nnames a file, and only that file is selected by the filter.  If the path ends\nwith a path separator it names a directory, and selects all files in that\ndirectory and any sub-directory of it.  If the path is \"/\", all files in the\nwhole repository are selected.\n\nWildcards\n---------\nThe path can optionally contain wildcards to name multiple files or directories.\nThe basic functionality is the same; if a wildcard path matches the path of a\nfile, that file is selected, and if it matches the path of a directory, all\nfiles in that directory and in any sub-directory of it are selected.\n\nThree wildcards are supported: <code>?</code>, <code>*</code> and\n<code>**</code>.  The <code>?</code> wildcard matches any character except the\npath separator, <code>/</code>.  The <code>*</code> wildcard matches zero or\nmore of any character except the path separator.  The <code>**</code> wildcard\nmatches zero or more complete path components, and can only occur at the\nbeginning of the path or (alone) between two path separators.\n\nFilter Ordering\n---------------\n\nIt's entirely possible for a user to have multiple filters that select\noverlapping sets of files.  When this happens, only one filter per user will be\napplied for each file, and that filter alone will define the type (Reviewer,\nWatcher or Ignored,) and the set of delegates.  
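The wildcard rules described above can be sketched as a translation into a
regular expression.  This is an illustrative helper, not Critic's actual
matching code, and it only covers the plain file-match and trailing-slash
directory cases (the \"/\" special case and wildcard paths that happen to name
a directory are omitted):

```javascript
// Translate a filter path with ?, * and ** wildcards into a RegExp
// matching repository file paths.  Illustrative sketch only.
function filterPathToRegExp(path) {
  var pattern = path
    .replace(/[.+^$()|[\]\\{}]/g, "\\$&")  // escape RegExp metacharacters
    .replace(/\*\*\//g, "\u0001")          // placeholder for '**/' so later
                                           // steps don't rewrite its expansion
    .replace(/\*/g, "[^/]*")               // '*': anything within one component
    .replace(/\?/g, "[^/]")                // '?': one character, except '/'
    .replace(/\u0001/g, "(?:[^/]+/)*");    // '**/': zero or more components
  if (path.charAt(path.length - 1) === "/")
    pattern += ".*";  // a directory path selects its whole sub-tree
  return new RegExp("^" + pattern + "$");
}
```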
To define which filter among a\nset of matching filters to apply for a file, the filters are ordered according\nto the following rules, in order of priority:\n\n* A review filter wins over a repository filter.\n* A filter whose path does not end with a path separator (single file) wins over\n  a filter whose path does (sub-tree).\n* A filter whose path contains more path separators wins over a filter whose\n  path contains fewer path separators (even if the latter filter contains a\n  wildcard that causes it to match a deeper path in practice.)\n* A filter whose path contains fewer wildcards wins over a filter whose path\n  contains more wildcards.\n* Finally, if all other rules fail to differentiate between two filters, they\n  are ordered lexicographically by path.\n"
  },
  {
    "path": "src/tutorials/rebasing.txt",
"content": "Rebasing a Review\n=================\n\nRebase Types\n------------\n\nCritic's review rebase functionality handles two limited types of rebase:\n\n1 In-place rebase\n2 Move rebase\n\nThe first type is typically used to clean up or rewrite the history of the\nbranch, without changing the branch's upstream commit (the parent commit of the\nfirst commit in the review.)  Typically, such a rebase doesn't change the state\nof the source tree as it is at the head of the branch; it only changes how it\ncame to be that way.  Critic in fact requires that this type of rebase doesn't\nchange the tree.\n\nThe second type is typically used to update to a newer version of the upstream\nbranch.  Because the source tree onto which the review's changes are applied\nchanges with this type of rebase, it typically cannot produce an identical\ntree.  Critic doesn't require that for this type of rebase; instead it only\nrequires the new upstream commit to be a descendant of the old upstream commit,\nand that you tell it in advance which new upstream commit you're rebasing onto.\n\nPreparing to Rebase\n-------------------\n\nTo perform either kind of rebase, you must first visit the review's front-page\nand press the button labelled \"Prepare Rebase\" that is placed at the bottom of\nthe table titled \"Commits.\"  Without being told in advance that you intend to\nrebase a review, Critic will reject all non-fast-forward updates.\n\nPressing the button opens a dialog that asks you to select which type of rebase\nyou're planning.  Once the process of preparing the review for a rebase is\ncompleted, only a valid rebase push is accepted, and only by you.  If other\nusers try to push to the review branch, their pushes will be rejected, even if\nthey are plain fast-forward updates, or correct rebases.  
If you change your\nmind, there's a button labelled \"Cancel Rebase\" on the review front-page where\nthe \"Prepare Rebase\" button used to be.\n\nIn-place Rebase\n---------------\n\nBy selecting the \"History Rewrite / In-place\" alternative in the dialog and\npressing the \"Continue\" button, the process of preparing for an in-place rebase\nis complete, and a dialog telling you that will be displayed.  You can now go\nahead and push the rebased branch using\n\n  <code>git push -f critic HEAD:r/&lt;name&gt;</code>\n\n(the \"-f\" argument is required since Git otherwise rejects all non-fast-forward\nupdates.)\n\nWith this type of rebase you can add new changes to the review as well when\npushing the rebased branch, as long as those new changes are in separate\ncommits.  The push is accepted as long as there is some commit on the rebased\nbranch that references the exact same tree as the current head of the review\nbranch.\n\nMove Rebase\n-----------\n\nBy selecting the \"New Upstream / Move\" alternative in the dialog and pressing\nthe \"Continue\" button, a new dialog is displayed asking you to specify which\nupstream commit you intend to rebase the branch onto.  If you rebase using a\ncommand like\n\n  <code>git rebase --onto &lt;newbase&gt; &lt;upstream&gt;</code>\n\nthen the upstream commit you should specify is whatever \"&lt;newbase&gt;\" resolves\nto.  If you rebase using a command like\n\n  <code>git rebase &lt;upstream&gt;</code>\n\nthen the upstream commit you should specify is whatever \"&lt;upstream&gt;\" resolves\nto.  Critic lets you specify the upstream commit either as a full 40-character\nSHA-1 sum or by selecting a tag that references a suitable commit.  Abbreviated\nSHA-1 sums or branch names are not accepted since it is possible that the commit\nyou intend to rebase onto doesn't exist in Critic's repository yet, or that your\nlocal branch (even 'master') doesn't reference the same commit as Critic's\nbranch with the same name.  
(It could be said that the same problem might exist\nwith tags, but it's much less likely.)\n\nIt's very important that you specify the correct upstream commit, or, if you\nprepared the rebase before you performed it locally, that you actually rebase\nonto the upstream commit you specified.  If the specified upstream commit is not\nreachable from the commit you later push, the push is rejected.  If the\nspecified upstream commit is reachable from the commit you later push, the push\nis accepted, even if you actually rebased on a different upstream commit; the\neffect of this is that commits are added to the review that you didn't mean to\nadd, and that reviewing any conflict resolutions you did while rebasing becomes\nmore difficult.\n\nIt's also very important that you don't add any new changes to the rebased\nbranch before pushing.  Such changes, even if added as separate commits, will\nnot be directly visible in the review.  They will, if at all, show up as some\nsort of conflict resolutions.  Critic simply assumes that the rebased branch you\npush corresponds exactly to the current review branch, only rebased onto another\nupstream commit.\n\nLike with an in-place rebase, to finish the rebase you simply push the rebased\nbranch using\n\n  <code>git push -f critic HEAD:r/&lt;name&gt;</code>\n\nThe push will automatically add a constructed \"equivalent\" merge commit to the\nreview.  This commit has as its parents the old head of the review branch and\nthe new upstream commit that you previously specified, and references the same\ntree as the head of the rebased review branch.  
The constructed merge is exactly\nthe same as you would get if you had merged the new upstream commit into the\nreview branch instead of rebasing onto it, and is used to let reviewers review\nyour resolutions to any conflicts that happened while rebasing the branch.\n\nReview Front-page Additions\n---------------------------\n\nThe branch log on the review front-page is rendered slightly differently once\nthe review branch has been rebased.  In fact, it's no longer strictly speaking a\nbranch log at all.  All commits that had been added to the review before the\nreview branch was rebased are still displayed, with whatever review status they\nhad before.  All issues and notes are still attached to these commits.\n\nThe point at which the rebase took place is signalled by a line that says either\n\n  History rewritten by &lt;name&gt;\n\nor\n\n  Branch rebased by &lt;name&gt;\n\ndepending on the type of rebase.  The first (most recent) such line also\ncontains a link to the actual rebased/cleaned-up log of the branch.\n\nCommits pushed to the review branch after the rebase are displayed above the\nrebase signalling line.  Issues and notes should be transferred from the old\nbranch to the rebased branch automatically unless addressed by rebase, in which\ncase they would be marked as addressed by the constructed merge commit instead.\n(An in-place rebase can't address open issues since the source tree can't be\nmodified.)\n"
  },
  {
    "path": "src/tutorials/reconfiguring.txt",
    "content": "Reconfiguring Critic\n====================\n\nEmail Configuration\n-------------------\n\nWhen Critic sends emails about new or updated reviews, it generates a separate\nemail for every recipient.  (It then lies about whom it sent the email to by\nadding all recipients to the \"To\" header in each email sent.)  You can configure\nwhat changes you wish to receive emails about, to some degree what information\nthose emails should contain, and how the emails should be formatted.  Any such\nconfiguration affects only emails sent to you.  It does not affect emails sent\nto others because of things you do.\n\nEmail Activation\n----------------\nFirst of all you can configure whether emails are sent to you at all, using the\nconfiguration item CONFIG(email.activated).  It's enabled by default, and you\nshould typically not disable it.\n\nSubject Line Formats\n--------------------\nThere are a number of configuration items that control the subject line of\nemails sent.  Their names all start with \"email.subjectLine.\", for instance\nCONFIG(email.subjectLine.newReview).\n\nThey all work the same way.  The value should be a valid\n  <a href=\"http://docs.python.org/library/stdtypes.html#string-formatting\">Python format string</a>\ncontaining \"%%(key)s\" conversion specifiers to insert details about the review\ninto the subject line.  If the format string is invalid, instead of a useful\nsubject line your emails will contain the error message Python produced.\n\nThe following keys can be used in conversion specifiers:\n\n? id\n= The review ID in \"r/NNNN\" format.\n? summary\n= The review title or summary.\n? progress\n= The current progress of the review in \"NN %% and N issues\" format.\n? 
branch\n= The name of the review branch.\n\nReview Association Recipients\n-----------------------------\nCritic always adds a custom header <code>OperaCritic-Association</code> to all\nemails sent about a review, containing a comma-separated list of tokens identifying\nhow the user to whom the email is sent is associated with the review.\n\nThe set of possible tokens is\n  <code>owner</code>,\n  <code>author</code>,\n  <code>reviewer</code>,\n  <code>watcher</code> and\n  <code>none</code>.\n\nThe purpose of this email header is to allow for client-side filtering of emails\nfrom Critic according to relevance or importance.  Some email systems, for\ninstance Gmail, don't support filtering based on custom headers.  To support\nthe same type of filtering in such limited email systems, Critic can optionally\nbe configured to add phony recipients to the email's <code>To</code> header\ninstead.  These recipients are constructed by taking the email address used as\n<code>Sender</code> (not <code>From</code>) -- usually something like\n<code>critic@example.com</code> -- and appending <code>+token</code> to the user\nname part.  In other words, if the user owns the review, all emails about it\nwould include <code>critic+owner@example.com</code> in the <code>To</code>\nheader.\n\nThis workaround is enabled by the CONFIG(email.enableAssociationRecipients)\nsetting.\n"
  },
  {
    "path": "src/tutorials/repository.txt",
    "content": "Repository Viewer\n=================\n\nDisplaying diffs\n----------------\n\nThe core functionality of Critic is of course the display of diffs, since that's\nwhat is reviewed, and reviewing is what it's all about.  But this functionality\nis not limited to reviews.  A diff of any commit in Critic's repository can be\ndisplayed, and also a diff between any two commits in Critic's repository.\n\nThe diff display in Critic is based on the output of the 'git diff' command, but\nis post-processed and analyzed to produce an optimal visualization of the\nchanges made.  It may not always succeed, of course, and when it doesn't, please\n  <a href=\"https://github.com/jensl/critic/issues/new\">report bugs</a>\nabout its failures.\n\n\nSingle commit\n-------------\nTo display a diff of a commit, all you need to specify in the URL is the SHA-1\nsum of the commit.  A simple URL such as\n\n  %(configuration.URL)s/&lt;SHA-1>\n\nis enough.  Critic supports having multiple repositories, and here we didn't\nspecify the repository in the URL.  In this case, Critic searches all of its\nrepositories for a commit.  If it finds one, it doesn't really matter which\nrepository it found it in, of course, since it would be the same commit\nregardless of repository.  
The first searched repository will be the user's\n  <a href=\"%(configuration.URL)s/config?highlight=defaultRepository\">default repository</a>.\n\nThe SHA-1 sum specified can also be abbreviated--a prefix instead of the\nfull 40 characters--or be specified in any form supported by the\n  <code>git rev-parse</code>\ncommand, such as the name of a branch, or a SHA-1 sum followed by a caret (^).\nIn this case, no search for an appropriate repository is done--only the user's\ndefault repository is considered unless a repository is specified explicitly, in\nwhich case only that repository is considered.\n\nA longer URL for displaying a single commit is\n\n  %(configuration.URL)s/showcommit?sha1=&lt;SHA-1>\n\nbut this is rarely meaningful to use.  This form of URL needs a full 40\ncharacter SHA-1 sum--it supports neither abbreviated SHA-1 sums nor other\nways of specifying commits.\n\n\nMultiple commits\n----------------\nTo display a diff of multiple commits, or between two commits, specify the two\ncommits separated by two full stops/periods (..) similar to the syntax used by\nthe\n  <code>git diff</code>\ncommand.  Like when displaying a single commit, the commits can be specified\nusing any format supported by the\n  <code>git rev-parse</code>\ncommand, and again the repository used is always the user's default repository\nunless one is explicitly specified.\n\nExamples:\n\n  %(configuration.URL)s/&lt;SHA-1>..&lt;SHA-1>\n\n  %(configuration.URL)s/master..&lt;branch name>\n\nThe longer form of URL can be used to display multiple commits as well, and\nsupports two different, but largely redundant, ways to specify the range of\ncommits:\n\n  %(configuration.URL)s/showcommit?from=&lt;SHA-1>&amp;to=&lt;SHA-1>\n\n  %(configuration.URL)s/showcommit?first=&lt;SHA-1>&amp;last=&lt;SHA-1>\n\nThe from/to form is exactly equivalent to the A..B form supported by the shorter\nURL, except it requires full 40 character SHA-1 sums.  
The first/last form is\ndifferent in that it includes the changes made in the first commit in the diff,\nand is thus equivalent to the A^..B short form.  These longer forms have few\nadvantages over the shorter forms in practice, and might as well be avoided\nentirely.\n\n\nSpecifying repository\n---------------------\nIn some cases, it may be necessary to explicitly specify the repository to use.\nThis is typically the case when the URL used relies on using\n  <code>git rev-parse</code>\nto interpret the arguments, and the result depends on the repository in which\nthe command is run, and the user's default repository is not the right one.\n\nThe repository to use is specified in two ways depending on whether a short URL\nor the longer variant is used:\n\n  %(configuration.URL)s/&lt;repository>/&lt;SHA-1>\n\n  %(configuration.URL)s/showcommit?sha1=&lt;SHA-1>&amp;repository=&lt;repository>\n\nIn both cases, the repository parameter is the short name of the repository.\nThis is the name used in the drop-down list used to set the user's\n  <a href=\"%(configuration.URL)s/config?highlight=defaultRepository\">default repository</a>.\nIn the case of the longer URL, the repository can also be specified by ID,\ntypically 1-N, in the order the repositories occur in drop-down lists, but this\nis of course less convenient to use.\n\nThese are the available repositories, and their corresponding short names:\n\n[repositories]\n"
  },
  {
    "path": "src/tutorials/requesting.txt",
    "content": "Requesting a Review\n===================\n\nWhat is a Review?\n-----------------\n\nCritic is based around regular git repositories.  A review consists of one or\nmore commits at the tip of a branch in this git repository, annotated by\ncomments written by the reviewers and other participants along the way.  Any\ncode changes involved in a review are always communicated to Critic via the git\nrepository, through\n  <code>git push</code>,\nand are always available in the git repository for anyone to fetch, checkout and\ntest.\n\nNot every branch in Critic's git repositories is associated with a review.\nRegular branches, such as 'master', also exist in the repositories, and can be\npushed to the repositories without any particular side-effects.  Branches that\nare associated with reviews always have names that start with the prefix\n  \"<code class=bold>r/</code>\",\nand no branches in the repositories not associated with reviews can have names\nwith that prefix.  This unambiguously identifies branches that are being\nreviewed.\n\n\nStep 1: Pushing changes to Critic\n---------------------------------\n\nIn order to request a review of the changes made by one or more commits, those\ncommits must first be pushed to one of Critic's git repositories.  Branches are\npushed to Critic's git repositories as they are to any other git repository;\njust add one of Critic's repositories as a remote in your local repository and\nuse 'git push' to push the branch to Critic's repository.  These are the\nrepositories that Critic has right now:\n\n[repositories]\n\nOf course, you need to select a name for your branch.  There's really only one\nconcern that is different from when selecting a branch name when pushing a\nbranch to any other shared repository: if it starts with the\n  \"<code class=bold>r/</code>\"\nprefix the branch is associated with a review.  
So you have two choices:\n\n1 Push to a branch whose name doesn't have the\n    \"<code class=bold>r/</code>\"\n  prefix, in which case no significant further actions are triggered, or\n2 Push to a branch whose name does have the\n    \"<code class=bold>r/</code>\"\n  prefix, in which case a review of only the first (head) commit of the branch\n  is created automatically and immediately, and emails are sent to relevant\n  users.\n\nIf the prospect of creating a review immediately merely by pushing a branch to a\nremote seems frightening, fear not.  The ability to create a review by pushing a\nbranch is controlled by the configuration option\n  <a href=\"config?highlight=review.createViaPush#go\"><code>review.createViaPush</code></a>,\nwhich is disabled by default.  Trying to push a branch whose name has\nthe\n  \"<code class=bold>r/</code>\"\nprefix without enabling the configuration option first causes the push to be\nrejected with a message saying you need to enable the option.\n\n<b>Note:</b> If you pushed to a branch whose name doesn't have the\n  \"<code class=bold>r/</code>\"\nprefix, this branch is only used to create the review.  When the review is\ncreated, a second branch will be created to go with it, and any updates to the\nfirst branch will have no effect on the review!\n\n\nStep 2: Locating Commit(s)\n--------------------------\n\nIf in step 1 you pushed a branch whose name had the\n  \"<code class=bold>r/</code>\"\nprefix, then there is no step 2.  You're done.  If you used any other\nbranch name, read on.\n\nLocate the branch you pushed, or the commit you want reviewed, in Critic's\nrepository viewer.  
A branch can be displayed by loading the URL\n\n  %(configuration.URL)s/log?repository=&lt;short name>&amp;branch=&lt;branch>\n\nand a single commit can be displayed by loading the URL\n\n  %(configuration.URL)s/&lt;SHA-1>\n\nAlso, when you pushed the branch, the reply from the remote, and output from\nyour git client, should have contained a link to the branch and a direct link\nfor creating a review of the whole branch.\n\nOn every page displaying either the branch or a single commit you should see a\nbutton in the top-right corner of the page, with the label \"Create Review\".\nPress that button to proceed to the next step.  Note that pressing the button\ndoes not create a review right away and is not final; you'll still have a chance\nto change your mind.  Also note that there may be a significant delay between\npressing the button and the next page loading.  Do not be worried by this; it is\nsimply Critic preparing all the commits for quicker viewing, which takes some\ntime if there are many commits and/or large modified files.\n\n\nStep 3: Creating the Review\n---------------------------\n\nThe \"Create Review\" button takes you to a page where you will get to select a\nbranch name for the review, write a one-line summary, a longer description\n(which is optional) and inspect what users would be assigned to review the\nchanges, and what additional users would be watching the review.  The page also\nlists all the commits that would be part of the review (initially) with links\nallowing you to inspect the diffs and make sure everything is to your\nsatisfaction.\n\nBranch Name\n-----------\nA branch name needs to be selected for the review.  It must have the\n  \"<code class=bold>r/</code>\"\nprefix, but can otherwise be chosen freely.  
Two simple patterns are\nrecommended, however:\n\n1 For a review of a bug fix, use the branch name\n    \"<code class=bold>r/&lt;bug ID&gt;</code>\".\n  Critic will automatically pre-fill the branch name field with such a branch\n  name if the review contains a single commit and the first line of its commit\n  message indicates that it's a fix for a bug.  (The detection of such an\n  indication is a bit hit-and-miss; it's just a regexp.)\n2 For other reviews, use a branch name with the prefix\n    \"<code class=bold>r/&lt;username&gt;/</code>\".\n\nThese patterns are only recommendations.  Overly generic branch names should be\navoided, of course, but this holds equally true for the same reasons in any\nother shared repository.\n\nDescription\n-----------\nThe longer review description is optional since commits, and thus commit\nmessages, are an integral part of the review in Critic.  For a review of\na single-commit bug fix, any sensible longer description of the review\nwould likely be more or less exactly the same as the commit message of\nthe single commit, and thus redundant.  For a review of a larger body of\nwork consisting of many commits, a description that summarizes the work\nwould of course make sense.\n\n\nStep 4: Submit the Review\n-------------------------\n\nThe final step in the process is to submit the review by pressing the button\nlabelled \"Submit Review\" found in the top-right corner of the \"Create Review\"\npage.  Before that, no trace of the review exists in Critic's database.  
There\nis therefore no need, and no possibility, to \"abort\" the review at this point:\nit doesn't exist yet.\n\nSubmitting the review does a number of things:\n\n1 Creates the review branch in Critic's git repository.\n2 Assigns all the changes in all the commits to all the appropriate reviewers.\n3 Sends emails to all users associated with the review about the review having\n  been created.\n\nAfter submitting the review, you're redirected to the newly created review's\nfront-page, which contains or links to all information relevant to the review.\nAt this point, the most prominent feature of the page will be the text \"No\nprogress\", in big letters.  As reviewers (and others) review the changes, this\ntext will change to a percentage counter describing how much of the changes have\nbeen reviewed so far, and ultimately to the text \"Accepted!\" when all the\nchanges have been reviewed without issues, at which point you'll be able to\nclose the review.\n"
  },
  {
    "path": "src/tutorials/reviewing.txt",
    "content": "Reviewing Changes\n=================\n\nProgress and State\n------------------\n\nCritic keeps track of the progress and state of a review in order to be able to\ndeclare when a set of changes has been accepted, and in order to be able to tell\nindividual users what new changes they need to review.  A review is considered\naccepted when there is nothing that blocks it from being accepted; there is no\nneed for any (or all) reviewers to explicitly signal final acceptance.  Two\nthings block a review from being accepted: changes that haven't been reviewed\nyet, and issues raised by reviewers (or others) while reviewing the changes.  A\nreview is thus considered accepted once all changes have been reviewed, and\neither no issues were raised or all raised issues have been addressed.\n\nThe current progress and/or state of a review is indicated on the review\nfront-page in big letters.  While a review is in progress, the progress is\ndisplayed as a percentage of changed lines that have been reviewed, and a count\nof open issues that need to be addressed.  Once the review is accepted, the\nprogress is displayed as \"Accepted!\".  When the review is in this state, the\nreview owner (or anyone else) can close the review as finished.\n\nAt any point during the review process the review can also be dropped.  To drop\na review, press the \"Drop Review\" button in the top-right corner of the review\nfront-page.  Normally, only the review owner is given the option to drop the\nreview.  This limitation is convenience only: the assumption is that normally\nonly the review owner is ever interested in dropping the review.  If the\nconfiguration option\n  <a href=\"config?highlight=review.dropAnyReview#go\"><code>review.dropAnyReview</code></a>\nis enabled the \"Drop Review\" button is displayed on the front-page of all\nreviews.\n\n\nDisplaying Changes\n------------------\n\nAs a reviewer, the main task in a review is of course to review the\nactual code changes.  
The review front-page provides a range of options\nfor displaying the changes:\n\n* In the list of commits, clicking the summary text of any commit will load a\n  full diff of that commit.  If there are changes in a commit that need your\n  reviewing, the numbers in the columns \"Pending\" and \"Total\" in the table of\n  commits are surrounded by a thick red dotted border.\n* If there are several commits in the review, a range of commits can be\n  \"squashed\" to display the sum of changes in those commits.  To do this, press\n  the left mouse button over the summary text of the first commit in the range,\n  move the mouse pointer to the summary text of the last commit in the range and\n  release the left mouse button.\n* In addition, you can display either all changes that remain for you to review\n  or all changes you have reviewed or should review by following the links\n  labelled \"[pending]\" and \"[reviewable]\" in the top-right corner of the\n  \"Commits\" table.  In this view, changes in modules or files you are not\n  reviewing are skipped, producing a smaller diff, but of course not a complete\n  set of changes.\n\n\nReviewing Changes\n-----------------\n\nWhen a diff is displayed in the context of a review&mdash;whether it's a full\ncommit, a range of commits or a filtered set of changes&mdash;the table of\nchanged files displayed will have an extra column titled \"Reviewed\".  For any\nfile in which there are changes for you to review, this column will contain a\ncheckbox.  This checkbox is used to mark the changes as reviewed.  In addition,\nthe first row in the table will have a checkbox that can be used to check (or\nuncheck) all the checkboxes in the table.\n\nAs an alternative to manually checking checkboxes after reading the diff, the\nwhole set of changes can be \"paged through\" using the SPACE key.  
Repeatedly\npressing the SPACE key will display the changes in the first file, then scroll\ndown one page at a time until the bottom of the page is reached, then hide the\nfile, mark the changes in it as reviewed (checking the checkbox) and display the\nnext file.  Thus, by simply pressing the SPACE key, you can read all the changes\nand check all the checkboxes as you go along.\n\nAll changes made to a review are immediately communicated to the server, but\nrecorded as \"draft changes\" that are not visible to other users.  As soon as any\ndraft changes are stored, the top-right corner of any page related to the review\nwill contain a summary of the changes made, and the buttons \"Submit\" and\n\"Abort\".  Pressing the \"Submit\" button makes the changes visible to other users\n(that is, removes the draft status) and sends emails to all participants of the\nreview about the changes made.\n\nNote: \"Reviewed\" does not mean \"Approved\"\n-----------------------------------------\nMarking changes as reviewed does not in itself represent approval!  It merely\nmeans you reviewed the changes; that you don't expect to be reviewing these same\nchanges much more and that no other user needs to review them.  This is how\nCritic keeps track of what you and other users need to do, and what remains to\ndo before a review is finished.\n\nApproval is signalled implicitly by marking changes as reviewed without\ncomplaining about anything while doing so; there is no explicit approval action.\nYou are of course free to express your approval of the changes in comments, but\nit is not required for Critic to consider a review as accepted.\n\nLeaving changes \"unreviewed\" after reading them because you found flaws in the\ncode and don't wish to approve the code before those flaws are fixed is in\nitself flawed.  The commit you are looking at is fixed and cannot be altered,\nonly an additional commit can fix the flaws in the code.  
So the commit you are\nlooking at must either be \"approved\" in its current form, with raised issues\nthat block the review, or the entire review would have to be dropped.\n\n\nWriting Comments\n----------------\n\nA vital component in the reviewing of changes is of course the ability to\nannotate the code with comments.  In Critic, such code comments are attached to\nspecific lines of code, not to lines of a particular diff.  In practice, this\ndifference is not very significant; often you will barely notice the difference.\nComments are added by selecting a range of lines in the diff; just press the\nleft mouse button over the first line, move the mouse pointer over to the last\nline and release the left mouse button, after which a dialog is displayed in\nwhich you write the comment.\n\n  There's typically no need to select additional context lines surrounding the\n  code you wish to comment on when writing a comment; Critic will add such\n  context lines itself when the comment is displayed.  It is thus better to only\n  select the specific lines that the comment relates to.\n\nIn Critic, there are two types of comments: issues and notes.  Issues are\nsignificant to the progress of the review; any issue raised by a reviewer (or\nother user) blocks the review from being accepted until the issue has been\naddressed or resolved.  Notes, on the other hand, do not, and exist to allow\nusers to add informational comments without affecting the progress of the\nreview.\n\nIt may seem drastic sometimes to call a comment an \"issue\", but think of it like\nthis: an issue comment is something, anything, that needs to be handled somehow\nbefore the review is closed.  An answer, from the review owner, to a question\nmight really be all you're after, but by calling it an issue, Critic will help\nboth you and the review owner to not forget about it before going ahead.  
If you\ncall it a note instead, feeling that \"issue\" is too harsh, Critic will not care\nwhether the comment receives any further attention from anyone.\n\nIf a comment is added with the wrong type&mdash;an issue that ought to be just a\nnote or a note that ought to be an issue&mdash;the type of the comment can be\naltered after it's been added, using the buttons labelled \"Convert to Note\" and\n\"Convert to Issue\".  Converting an issue into a note may cause the review to\nbecome accepted, since it is quite similar to explicitly resolving the issue.\n\n\nHandling Issues\n---------------\nSince open issues block the review from being accepted and closed, they\nneed to be handled.  There are two basic ways to handle an open issue:\n\n1 Explicitly mark the issue as resolved using the \"Resolve&nbsp;Issue\" button\n  displayed along with the comment.\n2 Push additional commits to the review that change the commented lines, which\n  causes Critic to automatically mark the comment as \"Addressed\".  This is the\n  preferred choice when the comment asked for the code to be changed, since it\n  makes it easy for both the reviewer and review owner to verify that all\n  requested changes have been made, and also spares someone the trouble of\n  manually marking issues as resolved.\n\nAnyone is allowed to explicitly resolve an open issue, including the review\nowner.  This may seem like an opportunity to \"cheat\" and approve your own\nchanges, and in practice, that is what it is.  But the reason is simple: Critic\nis here to facilitate reviews, not prevent cheating or enforce rules.\n\nWhen a comment is marked as addressed automatically, there's of course the\npossibility that the change didn't actually address the issue, either because it\nwas a completely unrelated change that just happened to intersect the comment,\nor because it wasn't what the reviewer had in mind.  
It may seem easy for issues\nto get lost because of this, but in practice this ought not be a problem, since\nthe change that caused the comment to be marked as addressed still needs to be\nreviewed as any other change, which provides the reviewer with ample opportunity\nto verify that addressed issues were truly addressed.\n\nIf an issue is incorrectly marked as addressed, it can be reopened.  To do this,\npress the \"Reopen Issue\" button displayed along with the comment.  A dialog will\nbe displayed asking you to select the range of lines in the new version of the\ncode where the issue still exists.  When done, the issue will be open again, and\nthe new range of lines will be the lines that need to be changed for the comment\nto be marked as addressed again.\n\nDraft Comments\n--------------\nAll actions involving comments, writing, editing, resolving and reopening, are\nimmediately communicated to the server and stored in the database as draft\nchanges.  Once stored on the server, you can navigate to a different page in\nyour browser, or reload the page, or crash the browser, without risk of losing\nany data.  You will be able to submit the changes, making them visible to all\nusers, on any page related to the review.  If no \"Submit\" button is displayed,\nyou probably just need to reload the page for it to appear.\n\nAs a rule of thumb: if there's a text input on the screen, any editing you've\ndone in it would be lost if you, for instance, closed the window.  As soon as\nthe text input is removed from the screen, whatever was in the text input is\nstored on the server (unless you used a \"Cancel\" button, of course.)  If the\noperation to store information on the server fails, an error dialog is displayed\nand the dialog containing the text input stays open.\n"
  },
  {
    "path": "src/tutorials/search.txt",
    "content": "Review Quick Search\n===================\n\nAvailability\n------------\n\nThe review quick search feature is available on every Critic page, by pressing\nthe <code>F</code> key.  This opens up a dialog where one enters a query string,\nand then searches by pressing the <code>ENTER</code> key (or clicking the\n\"Search\" button.)\n\nQuery Syntax\n------------\n\nThe query string is split into search terms, at white-space characters.  A term\ncontaining white-space characters can be achieved using quotes (either single or\ndouble.)  A search term can also be qualified by a <code>keyword:</code> prefix,\nwhere the supported keywords are:\n\n? <code>repository</code> (or <code>repo</code> or <code>r</code>)\n= Filter by repository.\n\n? <code>summary</code>\n= Filter by searching for sub-string in the review's summary.\n\n? <code>description</code>\n= Filter by searching for sub-string in the review's description.\n\n? <code>text</code>\n= Filter by searching for sub-string in the review's summary and/or description.\n\n? <code>branch</code> (or <code>b</code>)\n= Filter by matching the review branch name.\n\n? <code>path</code> (or <code>p</code>)\n= Filter by matching the path of files touched by the review.\n\n? <code>user</code> (or <code>u</code>)\n= Filter by user associated with the review.\n\n? <code>owner</code> (or <code>o</code>)\n= Filter by user that owns the review.\n\n? <code>reviewer</code>\n= Filter by user that is assigned to review changes in the review.\n\n? 
<code>state</code> (or <code>s</code>)\n= Filter by review state: <code>open</code> (any open review),\n  <code>pending</code> (open and not accepted reviews), <code>accepted</code>\n  (open and accepted reviews), <code>closed</code> or <code>dropped</code>.\n\nFilter Value Syntax\n-------------------\nWhen matching against the review's summary or description, the search term's\nvalue is interpreted as a simple glob if it contains either a <code>*</code> or\na <code>?</code> character, in which case it's matched against the whole summary\nor description.  Otherwise, it's interpreted as a plain sub-string and is\nsearched for in the summary or description.  In other words, the search term\n\"summary:foo\" is the same as the term \"summary:*foo*\", but the term\n\"summary:foo*\" only matches reviews whose summaries start with the sub-string\n\"foo\".\n\nWhen matching against path or branch names, the search term's value is\ninterpreted as a pattern similar to how paths are interpreted by Critic's filter\nmechanism: <code>**</code> matches zero or more path segments\n(<code>foo/</code>, <code>foo/bar/</code>, et c.), <code>*</code> matches zero\nor more characters except the path separator (<code>/</code>) and <code>?</code>\nmatches exactly one character.  A path is only interpreted as absolute if it has\na leading <code>/</code>.\n\nThe other search term types require values that are valid repository names, user\nnames or review states, respectively.\n\nUnqualified Search Terms\n------------------------\nA search term that is not qualified by <code>keyword:</code> is interpreted as\nfollows:\n\nReview summaries and descriptions will always be searched. If the search term\ndoesn't contain any white-space characters, review branch names are also\nsearched. If the search term looks like a file path, the search term is also\nmatched against files touched by the review. 
The term is considered to look\nlike a path if it does not contain any white-space characters, and either\ncontains a path separator or ends with a file name extension.\n\n
Examples\n--------\n\nExample query strings:\n\n
? <code>example</code>\n= Finds reviews whose summary, description or branch name contains the\n  sub-string \"example\".\n\n
? <code>search example</code>\n= Finds reviews whose summary or description contains the sub-string \"search\"\n  and whose summary or description contains the sub-string \"example\".\n\n
? <code>\"search example\"</code>\n= Finds reviews whose summary or description contains the sub-string \"search\n  example\".\n\n
? <code>\"search*example*\"</code>\n= Finds reviews whose summary or description starts with the sub-string \"search\"\n  and contains the sub-string \"example\".\n\n
? <code>summary:\"search example\" state:open</code>\n= Finds open (accepted or not) reviews whose summary contains the sub-string\n  \"search example\".\n\n
? <code>search_example.py</code>\n= Finds reviews that touch any file named <code>search_example.py</code>.\n\n
? <code>owner:alice reviewer:bob</code>\n= Finds reviews owned by the user <code>alice</code> where the user\n  <code>bob</code> is assigned to review some or all changes.\n"
  },
  {
    "path": "src/urlutils.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 the Critic contributors, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport json\nimport requests\n\nclass Response():\n    def __init__(self, response):\n        self.response = response\n\n    def __getattr__(self, name):\n        return getattr(self.response, name)\n\n    def json(self):\n        if hasattr(self.response, \"json\"):\n            try:\n                if callable(self.response.json):\n                    return self.response.json()\n                else:\n                    return self.response.json\n            except Exception:\n                return None\n        else:\n            try:\n                return json.loads(self.response.content)\n            except ValueError:\n                return None\n\ndef get(*args, **kwargs):\n    return Response(requests.get(*args, **kwargs))\n\ndef post(*args, **kwargs):\n    return Response(requests.post(*args, **kwargs))\n"
  },
  {
    "path": "src/wsgi.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ntry:\n    import maintenance.configtest\nexcept ImportError:\n    import traceback\n    import sys\n\n    exc_info = sys.exc_info()\n\n    def application(environ, start_response):\n        start_response(\"500 Internal Server Error\",\n                       [(\"Content-Type\", \"text/plain\")])\n        header = \"Failed to import 'maintenance.configtest' module\"\n        return ([\"%s\\n%s\\n\\n\" % (header, \"=\" * len(header))] +\n                traceback.format_exception(*exc_info))\nelse:\n    errors, warnings = maintenance.configtest.testConfiguration()\n\n    if errors:\n        def application(environ, start_response):\n            start_response(\"500 Internal Server Error\",\n                           [(\"Content-Type\", \"text/plain\")])\n\n            header = \"Invalid system configuration\"\n            result = \"%s\\n%s\\n\\n\" % (header, \"=\" * len(header))\n            for error in errors:\n                result += str(error) + \"\\n\\n\"\n            for warning in warnings:\n                result += str(warning) + \"\\n\\n\"\n\n            return [result]\n    else:\n        try:\n            import configuration\n\n            if configuration.debug.COVERAGE_DIR:\n                import coverage\n                def import_critic():\n                    import critic\n     
           coverage.call(\"wsgi\", import_critic)\n\n            import critic\n        except ImportError:\n            import traceback\n            import sys\n\n            exc_info = sys.exc_info()\n\n            def application(environ, start_response):\n                start_response(\"500 Internal Server Error\",\n                               [(\"Content-Type\", \"text/plain\")])\n                header = \"Failed to import 'critic' module\"\n                return ([\"%s\\n%s\\n\\n\" % (header, \"=\" * len(header))] +\n                        traceback.format_exception(*exc_info))\n        else:\n            def application(environ, start_response):\n                return critic.main(environ, start_response)\n"
  },
  {
    "path": "src/wsgistartup.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\ntry:\n    import os\n    import os.path\n    import atexit\n    import time\n    import errno\n\n    # Preload critic.py to reduce initial page load delay.\n    import configuration\n    if configuration.debug.COVERAGE_DIR:\n        import coverage\n        def import_critic():\n            import critic\n        coverage.call(\"wsgi\", import_critic)\n    else:\n        import critic\n\n    pidfile_path = os.path.join(configuration.paths.WSGI_PIDFILE_DIR, str(os.getpid()))\n\n    def deletePidFile():\n        try: os.unlink(pidfile_path)\n        except: pass\n\n    try: os.makedirs(os.path.dirname(pidfile_path))\n    except OSError as error:\n        if error.errno == errno.EEXIST: pass\n        else: raise\n\n    open(pidfile_path, \"w\").write(str(time.time()))\n\n    atexit.register(deletePidFile)\nexcept: pass\n"
  },
  {
    "path": "testing/USAGE.md",
    "content": "Critic Testing Framework\n========================\n\nThe Critic testing framework installs Critic in a VirtualBox instance, and then\nruns tests against it.  Many assumptions are made about the setup of the\nVirtualBox instance, the OS running in it, and the system running VirtualBox and\nthe testing framework.\n\nIn this manual, the system that runs VirtualBox and the testing framework is\ncalled the \"host\" and the system running in VirtualBox is called the \"guest\".\n\n\nHost Setup\n----------\n\nRequired software:\n\n* Python 2.7\n* Git 1.8.5 or later (needs Git bugfix 37cb1dd671e5e22cee363f98637a5a58f16be054)\n* VirtualBox\n* Requests (Debian/Ubuntu package: python-requests)\n* BeautifulSoup 3.x (Debian/Ubuntu package: python-beautifulsoup)\n\nThe host system is assumed to have a clone of the Critic repository, in which\nthe testing framework is executed.  A temporary bare clone of this repository\nwill be created and exported using \"git daemon\" as part of testing.  By default,\nthis \"git daemon\" process listens on TCP port 9418, which will fail if another\n\"git daemon\" process already runs on the host system.  If this is a problem, a\ncustom port can be specified using the --git-daemon-port command-line argument.\n\nThe user that runs the testing framework must have passwordless SSH access to\nthe guest system, and if a different user (name) should be used on the guest\nsystem, this needs to be configured in .ssh/config.\n\n\nVirtualBox Setup\n----------------\n\nThe VirtualBox instance's SSH and HTTP services must be accessible (via network)\nfrom the host system.  Its hostname must be given as command-line argument to\nthe testing framework, and custom ports for SSH and HTTP can also be given as\ncommand-line arguments, if necessary.\n\nIf the VirtualBox instance is configured to use NAT, it is typically not\ndirectly reachable from the host system.  
In this case, ports on the host system\ncan be forwarded to the VirtualBox instance by VirtualBox, and \"localhost\" can\nbe used as the hostname.  The host system ports that are forwarded to the\nVirtualBox instance can be given to the testing framework as command-line\narguments.\n\n
Finally, the VirtualBox instance must have a snapshot named \"clean\" (or named\nsomething else if overridden using the --vm-snapshot argument.)  This snapshot is\nrestored when testing starts.  If the snapshot is taken with the machine powered\nup and ready, testing will be slightly faster.  Critic should not have been\ninstalled on the guest system at this point.  The software packages that\nCritic's installation script installs if missing (Apache, PostgreSQL et c.) may\nbe installed before the snapshot is taken; this reduces the time it takes to run\ntests, but of course means the software installation part of the installation\nscript is not fully tested.  For complete testing, two snapshots can be taken,\none with the additional software installed and one without, and tests can be run\nonce with each snapshot specified using the --vm-snapshot argument.\n\n
Important note: When taking a \"live\" snapshot of an instance, supplying the\n\"--pause\" argument to the \"VBoxManage snapshot\" operation may be required to\navoid triggering bugs in VirtualBox that corrupt the instance.  Also note that\nVirtualBox supports having multiple snapshots of the same instance with the same\nname; make sure there's only one snapshot named \"clean\".\n\n\n
Guest Setup\n-----------\n\nRequired software:\n\n* SSH server\n* Python 2.7\n* Git\n* Sudo\n\n
The guest system user that the testing framework logs in as over SSH must be\nallowed to run \"sudo\" without entering a password.\n
(This is typically not the default in any system, unless the user is\n\"root\", so /etc/sudoers usually needs to be edited to achieve this.)\n\n
The hostname \"host\" on the guest system must resolve to the host system.  This\ncan for instance be accomplished by editing the guest system's /etc/hosts file.\n\n
Critic will be installed on the guest system from the directory $HOME/critic,\nwhich must not exist.  (IOW, the VirtualBox instance's \"clean\" snapshot must be\ntaken at a time when it doesn't exist.)\n\n\n
Running Tests\n-------------\n\nTests are run in the clone of Critic's repository on the host system.  From the\nroot of that repository, run\n\n    $ python -m testing.main ARGS\n\n
One argument is required,\n\n    --vm-identifier=NAME|UUID\n\nwhich specifies which VirtualBox instance to run the tests in.\n\n
Unless its hostname is the same as the VM instance name specified by the\n--vm-identifier argument, the argument\n\n    --vm-hostname=HOSTNAME\n\nmust be used to specify how to address the VirtualBox instance from the host\nsystem.\n\n
Note: this hostname only needs to work on the host system, and need not be the\nhostname that the guest OS has been configured with.\n\n
Also note: if both the host and guest OSes use Avahi, the VirtualBox instance\nmight be accessible using the name \"<hostname>.local\" where <hostname> is the\nhostname that the guest OS has been configured with.\n\n
The argument\n\n    --vm-snapshot=NAME|UUID\n\ncan be used to select a snapshot to restore when starting the VirtualBox\ninstance.\n
A snapshot is always restored; if this argument is not provided, a\nsnapshot named \"clean\" is restored.\n\n
The arguments\n\n    --vm-ssh-port=PORT\n    --vm-http-port=PORT\n\ncan be used to tweak how the VirtualBox instance's SSH and HTTP services are\nreached from the host system.\n\n
The argument\n\n    --git-daemon-port=PORT\n\ncan be used to have the \"git daemon\" process that is automatically started to\nexport the Critic repository listen on a different port (by default it listens\non port 9418.)\n\n
The arguments\n\n    --commit=SHA1|REF\n    --upgrade-from=SHA1|REF\n\ncan be used to control which commit to test.  The --commit argument defaults to\nthe commit that is checked out in the non-bare repository on the host system.\nSince this commit's version of the testing framework and the tests is what will\nbe running, it rarely makes any sense to specify any other commit.  Doing so\nmight not work as intended because of incompatibilities between the testing\nframework in the checked out commit and the installed version of Critic.\n\n
The --upgrade-from argument is more useful.  When given, instead of installing\nthe tested commit directly, the commit specified by the --upgrade-from argument\nis installed, and then the system is upgraded to the tested commit.  A typical\nuse-case for this is to test the changes on a topic branch by installing the\ncommit on 'master' from which the topic branch was branched off, upgrading to\nthe tip of the topic branch, and then running the tests.\n\n
Note: The tested commit, as well as the commit to upgrade from, if given, must\nnot be earlier than the integration of the testing framework, since the\ninstall.py and upgrade.py scripts were extended as part of the testing framework\nimplementation.\n\n
The arguments\n\n    --debug\n    --quiet\n\ncan be used to control the amount of output produced while running tests.  With\n--debug, various rather noisy and not terribly useful debugging output is added.\nWith --quiet, only warnings and errors are output, not basic progress messages.\n(With --quiet, a successful test run would produce no output at all.)\n\n
Finally, to run only selected tests, or groups of tests, the paths of these can\nbe provided as additional command-line arguments.  These paths should be\nrelative to the testing/tests/ directory.\n\n
Code Coverage Measurement\n-------------------------\n\nThe testing framework also has support for measuring code coverage during\ntesting.  To enable this mode, the testing framework needs to be started with\nthe argument\n\n    --coverage\n\n
in addition to any other arguments needed, as described above.  This argument\ncauses the testing framework's normal output to go to the stderr stream\ninstead of the stdout stream, and code coverage data to be written to the\nstdout stream when testing has finished.\n\n
The code coverage data is output in the form of a JSON object structure:\n\n    { \"contexts\": [ <context1>, <context2>, ... ],\n      <module path>: { <context1>: [ <line1>, <line2>, ... ],\n                       <context2>: [ <line1>, <line2>, ... ],\n                       ... },\n      ... }\n\n
The <contextN> values are strings identifying the context in which the covered\ncode was called, such as \"wsgi\" for code called via the web-frontend and\n\"changeset\", \"highlight\", et c. for code called via background services.  For\neach (covered) source code file, coverage is then reported as an array of\ncovered lines per context.  The line numbers in the array are zero-based.\n\n\n
Test Structure\n--------------\n\nThe actual tests are Python scripts in sub-directories of the testing/tests/\ndirectory.  All file and directory names under testing/tests/ should, by\nconvention, begin with three digits, followed by a '-', followed by a short\nidentifier of the test or test group.\n
Files and directories are sorted\naccording to the three-digit number in their name, and processed in that order.\nA directory is processed by processing all files and directories under it,\nrecursively.  A file whose name ends with \".py\" is processed by executing it\n(using execfile()).  All other files are ignored.\n\n
There should be no files directly in the testing/tests/ directory.  The\nimmediate sub-directories of testing/tests/ are \"top-level test groups\" and are\nsignificant in that each one starts with a clean, restarted VirtualBox instance.\nThere should be a test (typically the first one) in each top-level test group\nthat calls \"instance.install()\" to install Critic in the VirtualBox instance,\nand one test (possibly also the first one) that calls \"instance.upgrade()\" to\nupgrade to the tested commit.  (The \"instance.upgrade()\" call is a no-op unless\nthe testing framework was started with the --upgrade-from argument.)\n\n
The organization of tests into test groups is mostly free, but there is one\ndetail worth noting: the directory tree layout implicitly defines dependencies\nbetween tests, as follows: a test B depends on a test A (IOW, test A must run\nsuccessfully in order for test B to be runnable) if test A runs before B (due to\nthe basic sorting described above) and is either in the top-level group or is in\nan ancestor group of test B.\n\n
For instance, given the tests\n\n    001-main/001-testA.py\n    001-main/002-groupB/001-testB1.py\n    001-main/002-groupB/002-testB2.py\n    001-main/003-testC.py\n    001-main/004-groupD/001-testD.py\n\n
the test 001-testA.py is a dependency of all other tests, and 003-testC.py is a\ndependency of the test 004-groupD/001-testD.py, but the tests under 002-groupB/\nare not dependencies of the tests 003-testC.py or 001-testD.py, despite normally\nexecuting before them, and the test 001-testB1.py is not a dependency of the\ntest 002-testB2.py.\n"
  },
  {
    "path": "testing/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport re\nimport subprocess\n\nclass Error(Exception):\n    pass\n\nclass InstanceError(Error):\n    \"\"\"Error raised when VM instance is in unexpected/unknown state.\"\"\"\n    pass\n\nclass TestFailure(Error):\n    \"\"\"Error raised for \"expected\" test failures.\"\"\"\n    pass\n\nclass CommandError(InstanceError):\n    def __init__(self, argv, stdout, stderr=None):\n        self.argv = argv\n        self.command = \" \".join(argv)\n        self.stdout = stdout\n        self.stderr = stderr\n\nclass CriticctlError(TestFailure):\n    \"\"\"Error raised for failed criticctl usage.\"\"\"\n    def __init__(self, command, stdout, stderr=None):\n        super(CriticctlError, self).__init__(\n            \"CriticctlError: %s\\nOutput:\\n%s\" % (command, stderr or stdout))\n        self.command = command\n        self.stdout = stdout\n        self.stderr = stderr\n\nclass NotSupported(Error):\n    \"\"\"Error raised when a test is unsupported.\"\"\"\n    pass\n\nclass User(object):\n    RE_DEFINITION = re.compile('var user = new User\\\\(([^,]+), ([^,]+),')\n\n    def __init__(self, user_id, name):\n        self.id = user_id\n        self.name = name\n\n    def __eq__(self, other):\n        if isinstance(other, User):\n            return self.id == other.id and self.name == other.name\n     
   return False\n\n    def __repr__(self):\n        if self.id is None:\n            return \"<anonymous user>\"\n        # Format against __dict__, since User itself does not implement the\n        # mapping protocol required for %-formatting with named keys.\n        return \"<user '%(name)s' (%(id)d)>\" % self.__dict__\n\n
    @staticmethod\n    def from_script(script):\n        match = User.RE_DEFINITION.match(script)\n        if match:\n            if match.groups() == (\"null\", \"null\"):\n                return User.anonymous()\n            return User(int(match.group(1)), eval(match.group(2)))\n\n
    @staticmethod\n    def anonymous():\n        return User(None, None)\n\n
class Instance(object):\n    flags_on = []\n    flags_off = []\n\n    # The VirtualBox instance sets this depending on arguments. Other modes\n    # don't support it, so default to False.\n    test_extensions = False\n\n    # This is used to keep track of which commit is currently running.  This is\n    # really only relevant for VM instances when upgrading from an older commit,\n    # so only testing.virtualbox.Instance actually sets this.\n    current_commit = None\n\n
    def __init__(self):\n        self.resetusers()\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        return False\n\n
    def resetusers(self):\n        self.__users = []\n        self.__user_map = {}\n\n
    def registeruser(self, name):\n        user_id = len(self.__users) + 1\n        user = User(user_id, name)\n        self.__users.append(user)\n        self.__user_map[user_id] = user\n        self.__user_map[name] = user\n\n
    def user(self, key):\n        return self.__user_map[key]\n\n    def userid(self, name):\n        return self.user(name).id\n\n
    def filter_service_log(self, service_name, level=\"warning\"):\n        data = self.filter_service_logs(level, [service_name])\n        if data is None:\n            return []\n        return data.get(service_name)\n\n
    def check_service_logs(self, level=\"warning\"):\n        data = self.filter_service_logs(level, [\"branchtracker\",\n                                              
  \"changeset\",\n                                                \"githook\",\n                                                \"highlight\",\n                                                \"maildelivery\",\n                                                \"maintenance\",\n                                                \"servicemanager\",\n                                                \"watchdog\"])\n        if data is None:\n            return\n        for service_name, entries in data.items():\n            lines = \"\\n\".join(entries)\n            logger.error(\n                \"%s: service log contains unexpected entries:\\n  %s\"\n                % (service_name, \"\\n  \".join(lines.splitlines())))\n\n    def executeProcess(self, args, log_stdout=True, log_stderr=True, **kwargs):\n        try:\n            process = subprocess.Popen(\n                args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, **kwargs)\n        except OSError as error:\n            raise CommandError(args, None, str(error))\n        stdout, stderr = process.communicate()\n        if stdout.strip() and log_stdout:\n            logger.log(STDOUT, stdout.rstrip(\"\\n\"))\n        if stderr.strip() and log_stderr:\n            logger.log(STDERR, stderr.rstrip(\"\\n\"))\n        if process.returncode != 0:\n            raise CommandError(args, stdout, stderr)\n        return stdout\n\n    def translateUnittestPath(self, module):\n        path = module.split(\".\")\n        if path[0] == \"api\":\n            # API unittests are under api/impl/.\n            path.insert(1, \"impl\")\n        path = os.path.join(*path)\n        if os.path.isdir(os.path.join(\"src\", path)):\n            path = os.path.join(path, \"unittest.py\")\n        else:\n            path += \"_unittest.py\"\n        return path\n\n    def unittest(self, module, tests, args=None):\n        path = self.translateUnittestPath(module)\n        if not args:\n            args = []\n        for test in tests:\n      
      logger.info(\"Running unit test: %s (%s)\" % (module, test))\n            try:\n                output = self.run_unittest([path] + args + [test])\n                lines = output.strip().splitlines()\n                expected = test + \": ok\"\n                matching = filter(lambda line: line == expected, lines)\n                if len(lines) == 0:\n                    logger.warning(\"No unit test output: %s (%s)\")\n                elif len(matching) == 0:\n                    logger.warning(\"No unit test confirmation (but some output): %s (%s)\"\n                                   % (module, test))\n                elif len(matching) > 1:\n                    logger.warning(\"Multiple unit test confirmations: %s (%s)\"\n                                   % (module, test))\n                if lines and (lines[-1] != expected):\n                    logger.warning(\"Unit test's last line of output isn't unit test confirmation: %s (%s)\"\n                                   % (module, test))\n                if len(lines) > 0:\n                    [logger.info(line) for line in lines[:-1]]\n            except CommandError as error:\n                output = \"\\n  \".join(error.stderr.splitlines())\n                logger.error(\n                    \"Unit tests failed: %s: %s\\nCommand: %s\\nOutput:\\n  %s\"\n                    % (module, test, error.command, output))\n\nimport local\nimport virtualbox\nimport frontend\nimport expect\nimport repository\nimport mailbox\nimport findtests\nimport utils\nimport quickstart\n\nlogger = None\n\nSTREAM = None\nSTDOUT = None\nSTDERR = None\n\ndef configureLogging(arguments=None, wrap=None):\n    import logging\n    import sys\n    global logger, STREAM, STDOUT, STDERR\n    if not logger:\n        # Essentially same as DEBUG, used when logging the output from commands\n        # run in the guest system.\n        STDOUT = logging.DEBUG + 1\n        STDERR = logging.DEBUG + 2\n        logging.addLevelName(STDOUT, 
\"STDOUT\")\n        logging.addLevelName(STDERR, \"STDERR\")\n        if arguments and getattr(arguments, \"coverage\", False):\n            STREAM = sys.stderr\n        else:\n            STREAM = sys.stdout\n        logging.basicConfig(\n            format=\"%(asctime)-15s | %(levelname)-7s | %(message)s\",\n            stream=STREAM)\n        logger = logging.getLogger(\"critic\")\n        level = logging.INFO\n        if arguments:\n            if getattr(arguments, \"debug\", False):\n                level = logging.DEBUG\n            elif getattr(arguments, \"quiet\", False):\n                level = logging.WARNING\n        logger.setLevel(level)\n        if wrap:\n            logger = wrap(logger)\n    return logger\n\ndef pause(prompt=\"Press ENTER to continue: \"):\n    print >>STREAM\n    try:\n        print >>STREAM, prompt,\n        raw_input()\n    except KeyboardInterrupt:\n        print >>STREAM\n        print >>STREAM\n        raise\n    print >>STREAM\n\nclass Context(object):\n    def __init__(self, start, finish):\n        self.start = start\n        self.finish = finish\n    def __enter__(self):\n        self.start()\n        return self\n    def __exit__(self, *args):\n        self.finish()\n        return False\n\ndef exists_at(commit, path):\n    lstree = subprocess.check_output([\"git\", \"ls-tree\", commit, path])\n    return bool(lstree.strip())\n\ndef has_flag(commit, flag):\n    if flag == \"minimum-password-hash-time\":\n        try:\n            subprocess.check_call(\n                [\"git\", \"grep\", \"--quiet\", \"-e\", \"--minimum-password-hash-time\",\n                 commit, \"--\", \"installation/config.py\"])\n        except subprocess.CalledProcessError:\n            return False\n        else:\n            return True\n    else:\n        return exists_at(commit, \"testing/flags/%s.flag\" % flag)\n"
  },
  {
    "path": "testing/__main__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport main\nmain.main()\n"
  },
  {
    "path": "testing/expect.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n
import re\nimport traceback\n\nimport testing\n\n
def extract_text(source):\n    result = u\"\"\n    if source:\n        if isinstance(source, list):\n            for element in source:\n                result += extract_text(element)\n        elif isinstance(source, basestring):\n            result += source\n        elif getattr(source, \"string\", None):\n            result += source.string\n        elif getattr(source, \"contents\", None):\n            result += extract_text(source.contents)\n        else:\n            result += \"[%r]\" % source\n    return result\n\n
def deunicode(v):\n    if type(v) == unicode: return v.encode(\"utf-8\")\n    elif type(v) == list: return map(deunicode, v)\n    elif type(v) == dict: return dict([(deunicode(a), deunicode(b)) for a, b in v.items()])\n    else: return v\n\n
class FailedCheck(testing.TestFailure):\n    def __init__(self, expected, actual, location=None, message=None):\n        if message is None:\n            message = \"check failed\"\n        if location is not None:\n            message += \":\\n At %s:%d\" % location[0]\n            for filename, linenr in location[1:]:\n                message += \",\\n   called from %s:%d\" % (filename, linenr)\n        super(FailedCheck, self).__init__(\n            \"%s:\\n  Expected: %r,\\n  Actual:   %r\"\n            % 
(message, expected, deunicode(actual)))\n        self.expected = expected\n        self.actual = actual\n\n    @staticmethod\n    def current_location():\n        location = []\n        for filename, linenr, _, _ in reversed(traceback.extract_stack()):\n            if filename.startswith(\"testing/tests/\"):\n                location.append((filename[len(\"testing/tests/\"):], linenr))\n            elif location:\n                break\n        else:\n            location = None\n        return location\n\ndef simple_equal(expected, actual):\n    return expected == actual\n\ndef equal(expected, actual, equal=simple_equal, message=None):\n    if not equal(expected, actual):\n        location = FailedCheck.current_location()\n        raise FailedCheck(expected, actual, location=location, message=message)\n\ndef true(actual, message):\n    if not (actual is True):\n        location = FailedCheck.current_location()\n        raise FailedCheck(True, actual, location=location, message=message)\n\ndef false(actual, message):\n    if not (actual is False):\n        location = FailedCheck.current_location()\n        raise FailedCheck(False, actual, location=location, message=message)\n\ndef none(actual, message):\n    if not (actual is None):\n        location = FailedCheck.current_location()\n        raise FailedCheck(None, actual, location=location, message=message)\n\n# For backwards compatibility...\ncheck = equal\n\ndef with_class(*names):\n    def check(value):\n        if value is None:\n            return False\n        tokens = set(value.split())\n        for name in names:\n            if name not in tokens:\n                return False\n        return True\n    return { \"class\": check }\n\ndef find_paleyellow(document, index):\n    \"\"\"Find index:th \".paleyellow\" in the document.\"\"\"\n    tables = document.findAll(attrs=with_class(\"paleyellow\"))\n    if index >= len(tables):\n        raise FailedCheck(\"<paleyellow: index=%d>\" % index,\n                
          \"<no paleyellow: count=%d>\" % len(tables))\n    return tables[index]\n\ndef document_title(expected):\n    \"\"\"Return <title> checker.\"\"\"\n    return lambda document: check(expected, document.title.string)\n\ndef paleyellow_title(index, expected):\n    \"\"\"Return index:th \".paleyellow\" title checker.\"\"\"\n    def checker(document):\n        table = find_paleyellow(document, index)\n        actual = \"<no title found>\"\n        h1 = table.find(\"h1\")\n        if h1 and h1.contents:\n            actual = h1.contents[0]\n        return check(expected, actual)\n    return checker\n\ndef message(expected_title, expected_body, title_equal=simple_equal,\n            body_equal=simple_equal):\n    \"\"\"Return <div class=\"message\"> title checker.\"\"\"\n    def checker(document):\n        message = document.find(\n            \"div\", attrs={ \"class\": lambda value: \"message\" in value.split() })\n        actual_title = None\n        actual_body = None\n        if message:\n            title = message.find(\"h1\")\n            actual_title = extract_text(title)\n            if expected_body is not None:\n                body = message.find(\"p\")\n                actual_body = extract_text(body)\n        if not actual_title:\n            actual_title = \"<no message title found>\"\n        check(expected_title, actual_title, equal=title_equal,\n              message=\"title check failed\")\n        if expected_body is not None:\n            if not actual_body:\n                actual_body = \"<no message body found>\"\n            check(expected_body, actual_body, equal=body_equal,\n                  message=\"body check failed\")\n    return checker\n\ndef message_title(expected_title):\n    return message(expected_title, None)\n\ndef no_message():\n    \"\"\"Return negative <div class=\"message\"> checker.\"\"\"\n    def checker(document):\n        message = document.find(\n            \"div\", attrs={ \"class\": lambda value: \"message\" in 
value.split() })\n        if message:\n            actual = \"<message: %s>\" % message.find(\"h1\").contents[0]\n        else:\n            actual = \"<no message found>\"\n        return check(\"<no message found>\", actual)\n    return checker\n\ndef pageheader_links(*scopes):\n    scopes = set(scopes)\n    expected = []\n    for label, scope in [(\"Home\", \"authenticated\"),\n                         (\"Dashboard\", None),\n                         (\"Branches\", None),\n                         (\"Search\", None),\n                         (\"Services\", \"administrator\"),\n                         (\"Repositories\", \"administrator\"),\n                         (\"Extensions(?: \\\\(\\\\d+\\\\))?\", \"extensions\"),\n                         (\"Config\", None),\n                         (\"Tutorial\", None),\n                         (\"News(?: \\\\(\\\\d+\\\\))?\", None),\n                         (\"Sign in\", \"anonymous\"),\n                         (\"Sign out\", \"authenticated\"),\n                         (\"Back to Review\", \"review\")]:\n        if scope is None or scope in scopes:\n            expected.append(label)\n    def checker(document):\n        pageheader = document.find(\"table\", attrs={ \"class\": \"pageheader\" })\n        actual = []\n        for link in pageheader.find(\"ul\").findAll(\"a\"):\n            actual.append(link.string)\n        return check(\",\".join(expected), \",\".join(actual), equal=re.match)\n    return checker\n\ndef script_user(expected):\n    def checker(document):\n        for script in document.findAll(\"script\"):\n            if script.string:\n                actual = testing.User.from_script(script.string)\n                if actual:\n                    testing.expect.equal(expected, actual)\n                    return\n        raise FailedCheck(expected, \"<no user found>\")\n    return checker\n\ndef script_anonymous_user():\n    return script_user(testing.User.anonymous())\n\ndef script_no_user():\n  
  def checker(document):\n        for script in document.findAll(\"script\"):\n            if script.string:\n                actual = testing.User.from_script(script.string)\n                if actual:\n                    raise FailedCheck(\"<no user found>\", actual)\n    return checker\n"
  },
  {
    "path": "testing/findtests.py",
    "content": "import os\nimport re\nimport fnmatch\n\nimport testing\n\nTESTS = None\nTESTS_BY_FILENAME = {}\n\ndef automaticDependencies(filename):\n    this_dirname = os.path.dirname(filename)\n    for test in TESTS:\n        other_dirname = os.path.dirname(test.filename)\n        if this_dirname.startswith(other_dirname):\n            if os.path.sep not in other_dirname \\\n                    or len(other_dirname) < len(this_dirname):\n                yield test\n\nRE_DEPENDENCY = re.compile(r\"#\\s+@dependency\\s+([^\\s]+)\")\nRE_FLAG = re.compile(r\"#\\s+@flag\\s+([-\\w]+)\")\nRE_IGNORE = re.compile(r\"(?:\\s*#.*)?\\s*$\")\n\nclass Test(object):\n    def __init__(self, filename):\n        self.filename = filename\n        self.groups = []\n        dirname = filename\n        while True:\n            dirname, basename = os.path.split(dirname)\n            if not dirname:\n                break\n            self.groups.insert(0, dirname)\n        self.dependencies = set()\n        self.flags = set()\n\n        has_dependency_declarations = []\n\n        def process_file(path):\n            path = os.path.join(\"testing\", \"tests\", path)\n            if not os.path.isfile(path):\n                return\n            with open(path) as source_file:\n                for index, line in enumerate(source_file):\n                    match = RE_DEPENDENCY.match(line)\n                    if match:\n                        has_dependency_declarations.append(True)\n                        dependency = match.group(1)\n                        if dependency == \"none\":\n                            pass\n                        elif dependency not in TESTS_BY_FILENAME:\n                            testing.logger.error(\n                                \"%s:%d: invalid depdency: %s\"\n                                % (filename, index + 1, dependency))\n                        else:\n                            self.dependencies.add(TESTS_BY_FILENAME[dependency])\n        
                continue\n                    match = RE_FLAG.match(line)\n                    if match:\n                        self.flags.add(match.group(1))\n                        continue\n                    match = RE_IGNORE.match(line)\n                    if not match:\n                        break\n\n        process_file(filename)\n\n        dirname = filename\n        while True:\n            dirname = os.path.dirname(dirname)\n            if not dirname:\n                break\n            process_file(os.path.join(dirname, \"__init__.py\"))\n\n        if not has_dependency_declarations:\n            self.dependencies.update(automaticDependencies(filename))\n\n        TESTS.append(self)\n        TESTS_BY_FILENAME[self.filename] = self\n\n    def __str__(self):\n        return self.filename\n    def __hash__(self):\n        return hash(self.filename)\n    def __eq__(self, other):\n        return self.filename == str(other)\n\n    def __repr__(self):\n        return \"Test(%r): %r\" % (self.filename, sorted([test.filename for test in self.dependencies]))\n\ndef findTests():\n    global TESTS\n\n    RE_TEST_FILENAME = re.compile(r\"/\\d\\d\\d-[^/]*\\.py$\")\n    RE_IGNORE_FILENAME = re.compile(r\"(?:/__init__.py|~)$\")\n\n    TESTS = []\n\n    def traverse(dirname):\n        for filename in sorted(os.listdir(dirname)):\n            filename = os.path.join(dirname, filename)\n            if os.path.isdir(filename):\n                traverse(filename)\n            elif RE_TEST_FILENAME.search(filename):\n                Test(os.path.relpath(filename, \"testing/tests\"))\n            elif not RE_IGNORE_FILENAME.search(filename):\n                testing.logger.warning(\n                    \"%s: unexpected non-test file under testing/tests/\"\n                    % filename)\n\n    traverse(\"testing/tests\")\n\ndef filterPatterns(patterns):\n    RE_LEADING_TESTS = re.compile(\"^(?:testing/)?tests(?:/|$)\")\n\n    patterns = [RE_LEADING_TESTS.sub(\"\", 
pattern) for pattern in patterns]\n    patterns = [pattern.rstrip(\"/\") for pattern in patterns]\n    patterns = filter(None, patterns)\n\n    return patterns\n\ndef selectTests(patterns, strict, flags_on=set(), flags_off=set()):\n    if TESTS is None:\n        findTests()\n\n    patterns = filterPatterns(patterns)\n\n    if not patterns and not flags_on and not flags_off:\n        return TESTS, set()\n\n    selected = set()\n    dependencies = set()\n\n    def select(test, is_dependency=False):\n        if test in selected:\n            # Test already selected.\n            return\n        selected.add(test.filename)\n        if strict:\n            # Don't select dependencies when strict=True.\n            return\n        if is_dependency:\n            dependencies.add(test.filename)\n        for dependency in test.dependencies:\n            select(dependency, True)\n\n    for test in TESTS:\n        if flags_on - test.flags:\n            continue\n        if flags_off & test.flags:\n            continue\n\n        if patterns:\n            for pattern in patterns:\n                filename = test.filename\n                while filename:\n                    if fnmatch.fnmatch(filename, pattern):\n                        select(test)\n                        break\n                    if strict:\n                        break\n                    filename = os.path.dirname(filename)\n                if test in selected:\n                    break\n        else:\n            select(test)\n\n    return [test for test in TESTS if test in selected], dependencies\n"
  },
  {
    "path": "testing/flags/addrepository-has-mirror-parameter.flag",
    "content": "The /addrepository has a parameter named 'mirror' (instead of 'remote').\n"
  },
  {
    "path": "testing/flags/fixed-batch-preview.flag",
    "content": "The /showbatch page does not crash in preview mode.\n"
  },
  {
    "path": "testing/flags/is-testing.flag",
    "content": "The installation (and upgrade) scripts support --is-testing.\n"
  },
  {
    "path": "testing/flags/pwd-independence.flag",
    "content": "The installation (and upgrade) scripts are independent of $PWD."
  },
  {
    "path": "testing/flags/reliable-admin-newswriter.flag",
    "content": "The admininistrator user is given the 'newswriter' role on installation.\n"
  },
  {
    "path": "testing/flags/reliable-git-emails.flag",
    "content": "Newly created users have their primary email address set as their (only) Git\nemail address too.\n"
  },
  {
    "path": "testing/flags/system-recipients.flag",
    "content": "The installation script supports --system-recipient.\n"
  },
  {
    "path": "testing/flags/web-server-integration.flag",
    "content": "The installation script supports --web-server-integration.\n"
  },
  {
    "path": "testing/frontend.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport json\nimport contextlib\nimport urllib\n\ntry:\n    import requests\n    import BeautifulSoup\nexcept ImportError:\n    # testing/main.py detects and abort if either of these are missing, so just\n    # ignore errors here.\n    pass\n\nimport testing\n\nclass Error(testing.TestFailure):\n    def __init__(self, url, message):\n        super(Error, self).__init__(\"page '%s' failed: %s\" % (url, message))\n        self.url = url\n\nclass HTTPError(Error):\n    def __init__(self, url, expected, actual, body=None):\n        message = \"HTTP status differs: expected=%r, actual=%r\" % (expected, actual)\n        if body:\n            message += \"\\n\" + body\n        super(HTTPError, self).__init__(url, message)\n        self.expected = expected\n        self.actual = actual\n\nclass PageError(Error):\n    def __init__(self, url, key, expected, actual):\n        super(PageError, self).__init__(\n            url, \"%s differs: expected=%r, actual=%r\" % (key, expected, actual))\n        self.key = key\n        self.expected = expected\n        self.actual = actual\n\nclass OperationError(Error):\n    def __init__(self, url, message=None, key=None, expected=None, actual=None):\n        if message is None:\n            message = \"\"\n        if key:\n            message += \"%s differs: expected=%r, 
actual=%r\" % (key, expected, actual)\n        super(OperationError, self).__init__(url, message)\n        self.key = key\n        self.expected = expected\n        self.actual = actual\n\nclass SessionBase(object):\n    def apply(self, kwargs):\n        pass\n\n    def process_response(self, response):\n        if \"sid\" in response.cookies:\n            raise Error(response.url, \"unexpected session cookie set\")\n\nclass NoSession(SessionBase):\n    pass\n\nclass CookieSession(SessionBase):\n    def __init__(self, sid=None):\n        self.sid = sid\n\n    def apply(self, kwargs):\n        if self.sid is not None:\n            headers = kwargs.setdefault(\"headers\", {})\n            headers[\"Cookie\"] = \"sid=%s; has_sid=1\" % self.sid\n\n    def process_response(self, response):\n        for name, value in response.cookies.items():\n            if name == \"sid\":\n                self.sid = value\n            elif name == \"has_sid\" and value == \"0\":\n                # This means we've signed out. 
The response would also have\n                # deleted the \"sid\" cookie, but unfortunately we can't really\n                # get that information from the response.\n                self.sid = None\n\nclass HTTPAuthSession(SessionBase):\n    def __init__(self, username, password):\n        self.username = username\n        self.password = password\n\n    def apply(self, kwargs):\n        kwargs[\"auth\"] = (self.username, self.password)\n\nclass Frontend(object):\n    def __init__(self, hostname, http_port=8080):\n        self.hostname = hostname\n        self.http_port = http_port\n        self.sessions = [NoSession()]\n        self.instance = None\n\n    @property\n    def current_session(self):\n        return self.sessions[-1]\n\n    def prefix(self, username=None):\n        if username:\n            username += \"@\"\n        else:\n            username = \"\"\n        return \"http://%s%s:%d\" % (username, self.hostname, self.http_port)\n\n    def page(self, url, params={}, expect={},\n             expected_content_type=\"text/html\",\n             expected_http_status=200,\n             disable_redirects=False,\n             post=None, put=None, delete=False):\n        full_url = \"%s/%s\" % (self.prefix(), url)\n\n        log_url = full_url\n        if params:\n            query = urllib.urlencode(sorted(params.items()))\n            log_url = \"%s?%s\" % (log_url, query)\n\n        testing.logger.debug(\"Fetching page: %s ...\" % log_url)\n\n        kwargs = {}\n\n        self.current_session.apply(kwargs)\n\n        if post is not None:\n            kwargs[\"data\"] = post\n            method = \"POST\"\n        elif put is not None:\n            kwargs[\"data\"] = put\n            method = \"PUT\"\n        elif delete:\n            method = \"DELETE\"\n        else:\n            method = \"GET\"\n\n        response = requests.request(method,\n                                    full_url,\n                                    params=params,\n           
                         allow_redirects=not disable_redirects,\n                                    **kwargs)\n\n        self.current_session.process_response(response)\n\n        def text(response):\n            if hasattr(response, \"text\"):\n                if callable(response.text):\n                    return response.text()\n                else:\n                    return response.text\n            else:\n                return response.content\n\n        if isinstance(expected_http_status, int):\n            expected_http_status = [expected_http_status]\n\n        try:\n            if response.status_code not in expected_http_status:\n                if response.headers[\"content-type\"].startswith(\"text/plain\"):\n                    body = text(response)\n                else:\n                    body = None\n                raise HTTPError(url, expected_http_status, response.status_code, body)\n        except testing.TestFailure as error:\n            testing.logger.error(\"Page '%s': %s\" % (url, error.message))\n            raise testing.TestFailure\n\n        if response.status_code >= 400 and 200 in expected_http_status:\n            # The caller expected a successful load or an error.  Signal errors\n            # by returning None.\n            return None\n\n        if response.status_code >= 300 and response.status_code < 400 \\\n                and disable_redirects:\n            # Redirection, and the caller disabled following it.  
The caller is\n            # interested in the redirect itself, so return the whole response.\n            return response\n\n        testing.logger.debug(\"Fetched page: %s\" % log_url)\n\n        document = text(response)\n\n        content_type, _, _ = response.headers[\"content-type\"].partition(\";\")\n\n        if isinstance(expected_content_type, str):\n            expected_content_type = (expected_content_type,)\n\n        if response.status_code == 200:\n            if content_type not in expected_content_type:\n                testing.logger.error(\n                    \"Page '%s': wrong content type: %s\" % (url, content_type))\n                raise testing.TestFailure\n\n        if content_type == \"text/html\":\n            document = BeautifulSoup.BeautifulSoup(document)\n\n            div_fatal = document.find(\"div\", attrs={ \"class\": \"fatal\" })\n            if div_fatal:\n                message = div_fatal.find(\"pre\")\n                testing.logger.error(\n                    \"Page '%s': crash during incremental page generation:\\n%s\"\n                    % (url, message.string if message else \"<no message found>\"))\n                raise testing.TestFailure\n\n        if expect:\n            testing.logger.debug(\"Checking page: %s ...\" % log_url)\n\n            failed_checks = False\n\n            for key, check in expect.items():\n                try:\n                    check(document)\n                except testing.expect.FailedCheck as failed_check:\n                    print text(response)\n                    testing.logger.error(\"Page '%s', test '%s': %s\"\n                                         % (url, key, failed_check.message))\n                    failed_checks = True\n                except Exception as error:\n                    raise Error(url, \"'%s' checker failed: %s\" % (key, str(error)))\n\n            if failed_checks:\n                raise testing.TestFailure\n\n            testing.logger.debug(\"Checked 
page: %s ...\" % log_url)\n\n        return document\n\n    def operation(self, url, data, expect={}):\n        full_url = \"%s/%s\" % (self.prefix(), url)\n\n        testing.logger.debug(\"Executing operation: %s ...\" % full_url)\n\n        kwargs = {}\n\n        self.current_session.apply(kwargs)\n\n        if not isinstance(data, basestring):\n            data = json.dumps(data)\n            kwargs.setdefault(\"headers\", {})[\"Content-Type\"] = \"text/json\"\n\n        response = requests.post(full_url,\n                                 data=data,\n                                 **kwargs)\n\n        try:\n            if response.status_code != 200:\n                raise HTTPError(url, 200, response.status_code)\n\n            if expect is None:\n                result = response.content\n            elif hasattr(response, \"json\"):\n                if callable(response.json):\n                    try:\n                        result = response.json()\n                    except:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n                else:\n                    result = response.json\n                    if result is None:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n            else:\n                try:\n                    result = json.loads(response.content)\n                except ValueError:\n                    raise OperationError(url, message=\"malformed response (not JSON)\")\n        except testing.TestFailure as error:\n            testing.logger.error(\"Operation '%s': %s\" % (url, error.message))\n            raise testing.TestFailure\n\n        self.current_session.process_response(response)\n\n        testing.logger.debug(\"Executed operation: %s\" % full_url)\n\n        if expect is not None:\n            testing.logger.debug(\"Checking operation: %s\" % full_url)\n\n            # Check result[\"status\"] first; if it doesn't have the 
expected value,\n            # it's likely all other expected keys are simply missing from the\n            # result, and thus produce rather meaningless errors.\n            expected = expect.get(\"status\", \"ok\")\n            actual = result.get(\"status\")\n            if actual != expected:\n                if actual == \"error\":\n                    extra = \"\\nError:\\n  %s\" % \"\\n  \".join(result.get(\"error\").splitlines())\n                elif actual == \"failure\":\n                    extra = \" (code=%r)\" % result.get(\"code\")\n                else:\n                    extra = \"\"\n                testing.logger.error(\n                    \"Operation '%s', key 'status': check failed: \"\n                    \"expected=%r, actual=%r%s\"\n                    % (url, expected, actual, extra))\n                raise testing.TestFailure\n\n            failed_checks = False\n\n            # Then check any other expected keys.\n            for key, expected in expect.items():\n                if key != \"status\":\n                    actual = result.get(key)\n                    if callable(expected):\n                        checked = expected(actual)\n                        if checked:\n                            expected, actual = checked\n                        else:\n                            continue\n                    if expected != actual:\n                        testing.logger.error(\n                            \"Operation '%s', key '%s': check failed: \"\n                            \"expected=%r, actual=%r\"\n                            % (url, key, expected, actual))\n                        failed_checks = True\n\n            if failed_checks:\n                raise testing.TestFailure\n\n            testing.logger.debug(\"Checked operation: %s\" % full_url)\n\n        return result\n\n    def json(self, path, expect=None, params={}, expected_http_status=200,\n             post=None, put=None, delete=False):\n        url = 
\"api/v1/\" + path\n        full_url = \"http://%s:%d/%s\" % (self.hostname, self.http_port, url)\n\n        log_url = full_url\n        if params:\n            query = urllib.urlencode(sorted(params.items()))\n            log_url = \"%s?%s\" % (log_url, query)\n\n        kwargs = { \"params\": params,\n                   \"headers\": { \"Accept\": \"application/vnd.api+json\" } }\n        method = \"GET\"\n\n        self.current_session.apply(kwargs)\n\n        if post is not None:\n            method = \"POST\"\n            kwargs[\"data\"] = json.dumps(post)\n        elif put is not None:\n            method = \"PUT\"\n            kwargs[\"data\"] = json.dumps(put)\n        elif delete:\n            method = \"DELETE\"\n\n        testing.logger.debug(\"Accessing JSON API: %s %s ...\"\n                             % (method, log_url))\n\n        response = requests.request(method, full_url, **kwargs)\n\n        testing.logger.debug(\"Accessed JSON API: %s %s ...\"\n                             % (method, log_url))\n\n        self.current_session.process_response(response)\n\n        def response_json():\n            if hasattr(response, \"json\"):\n                if callable(response.json):\n                    try:\n                        result = response.json()\n                    except:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n                else:\n                    result = response.json\n                    if result is None:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n            else:\n                try:\n                    result = json.loads(response.content)\n                except ValueError:\n                    raise OperationError(url, message=\"malformed response (not JSON)\")\n            return result\n\n        if isinstance(expected_http_status, int):\n            expected_http_status = [expected_http_status]\n\n        try:\n     
       if response.status_code not in expected_http_status:\n                if response.status_code in (400, 404):\n                    try:\n                        error = response_json()[\"error\"]\n                    except OperationError:\n                        testing.logger.exception(\"Unexpected response\")\n                    except KeyError:\n                        testing.logger.error(\"Malformed JSON error response\")\n                    else:\n                        testing.logger.error(\n                            \"JSON error:\\n  Title: %s\\n  Message: %s\"\n                            % (error[\"title\"], error[\"message\"]))\n                raise HTTPError(url, expected_http_status, response.status_code)\n\n            if response.status_code == 204:\n                # No content.\n                return None\n\n            if hasattr(response, \"json\"):\n                if callable(response.json):\n                    try:\n                        result = response.json()\n                    except:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n                else:\n                    result = response.json\n                    if result is None:\n                        raise OperationError(url, message=\"malformed response (not JSON)\")\n            else:\n                try:\n                    result = json.loads(response.content)\n                except ValueError:\n                    raise OperationError(url, message=\"malformed response (not JSON)\")\n        except testing.TestFailure as error:\n            testing.logger.error(\"JSON '%s': %s\" % (path, error.message))\n            raise testing.TestFailure\n\n        def deunicode(value):\n            if isinstance(value, list):\n                return [deunicode(v) for v in value]\n            elif isinstance(value, dict):\n                return { deunicode(k): deunicode(v) for k, v in value.items() }\n            elif 
isinstance(value, unicode):\n                return value.encode(\"utf-8\")\n            return value\n\n        result = deunicode(result)\n\n        if expect is None:\n            return result\n\n        testing.logger.debug(\"Checking JSON: %s\" % log_url)\n\n        errors = []\n\n        def describe(value):\n            if isinstance(value, dict) or value is dict:\n                return \"object\"\n            if isinstance(value, list) or value is list:\n                return \"array\"\n            if isinstance(value, set):\n                return \"one of: %s\" % \",\".join(sorted(value))\n            if isinstance(value, type):\n                return { int: \"integer\", float: \"float\", str: \"string\" }[value]\n            if isinstance(value, (str, int, float)):\n                return repr(value)\n            if value is None:\n                return \"null\"\n            return \"unexpected\"\n\n        def check_object(path, expected, actual):\n            if not isinstance(actual, dict):\n                errors.append(\"%s: value is %s, expected object\"\n                              % (path, describe(actual)))\n                return\n            if expected is dict:\n                return\n            expected_keys = set(expected.keys())\n            actual_keys = set(actual.keys())\n            if \"*\" in expected_keys:\n                expected_keys.remove(\"*\")\n            elif actual_keys - expected_keys:\n                errors.append(\"%s: unexpected keys: %r\"\n                              % (path, tuple(actual_keys - expected_keys)))\n            if expected_keys - actual_keys:\n                errors.append(\"%s: missing keys: %r\"\n                              % (path, tuple(expected_keys - actual_keys)))\n            for key in sorted(expected_keys & actual_keys):\n                check(\"%s/%s\" % (path, key), expected[key], actual[key])\n\n        def check_array(path, expected, actual):\n
            if not isinstance(actual, list):\n                errors.append(\"%s: value is %s, expected array\"\n                              % (path, describe(actual)))\n                return\n            if expected is list:\n                return\n            if len(actual) != len(expected):\n                errors.append(\"%s: wrong array length: got %s, expected %s\"\n                              % (path, len(actual), len(expected)))\n                return\n            for index, (expected, actual) in enumerate(zip(expected, actual)):\n                check(\"%s[%d]\" % (path, index), expected, actual)\n\n        def check_set(path, expected, actual):\n            if not isinstance(actual, str):\n                errors.append(\"%s: value is %s, expected string\"\n                              % (path, describe(actual)))\n                return\n            if actual not in expected:\n                errors.append(\"%s: value is %s, expected %s\"\n                              % (path, describe(actual), describe(expected)))\n\n        def check_null(path, actual):\n            if actual is not None:\n                errors.append(\"%s: value is %s, expected null\"\n                              % (path, describe(actual)))\n\n        def check_value(path, expected, actual):\n            if isinstance(actual, (dict, list)):\n                errors.append(\"%s: value is %s, expected %s\"\n                              % (path, describe(actual), describe(expected)))\n            if isinstance(expected, type):\n                if not isinstance(actual, expected):\n                    errors.append(\"%s: wrong value: got %r, expected %r\"\n                                  % (path, actual, describe(expected)))\n            elif actual != expected:\n                errors.append(\"%s: wrong value: got %r, expected %r\"\n                              % (path, actual, expected))\n\n        def check(path, expected, actual):\n            errors_before = len(errors)\n            if callable(expected) and not isinstance(expected, 
type):\n                errors.extend(expected(path, actual, check) or ())\n            elif isinstance(expected, dict) or expected is dict:\n                check_object(path, expected, actual)\n            elif isinstance(expected, list) or expected is list:\n                check_array(path, expected, actual)\n            elif isinstance(expected, set):\n                check_set(path, expected, actual)\n            elif expected is None:\n                check_null(path, actual)\n            else:\n                check_value(path, expected, actual)\n            return errors_before == len(errors)\n\n        check(path, expect, result)\n\n        if errors:\n            testing.logger.error(\"Wrong JSON received for %s:\\n  %s\"\n                                 % (path, \"\\n  \".join(errors)))\n            testing.logger.error(\"Received JSON: %r\" % result)\n\n        testing.logger.debug(\"Checked JSON: %s\" % log_url)\n\n        return result\n\n    @contextlib.contextmanager\n    def cookie_session(self, signout):\n        if self.current_session.sid is None:\n            testing.expect.check(\"<signed in>\", \"<no session cookie received>\")\n        testing.logger.debug(\"Starting cookie session\")\n        try:\n            yield\n        finally:\n            # Sign out unless we seem to have signed out already. 
Some tests may\n            # want to do the signout explicitly, which is fine.\n            if self.current_session.sid is not None:\n                try:\n                    signout()\n                except testing.TestFailure as failure:\n                    if failure.message:\n                        testing.logger.error(failure.message)\n                except Exception:\n                    testing.logger.exception(\"Failed to sign out!\")\n\n                if self.current_session.sid is not None:\n                    testing.expect.check(\"<signed out>\",\n                                         \"<session cookie not removed>\")\n\n            # Dropping the cookie effectively signs out even if the \"endsession\"\n            # operation failed.\n            self.sessions.pop()\n\n            testing.logger.debug(\"Ended cookie session\")\n\n    @contextlib.contextmanager\n    def no_session(self):\n        self.sessions.append(NoSession())\n        try:\n            yield\n        finally:\n            self.sessions.pop()\n\n    def collect_session_cookie(self):\n        self.sessions.append(CookieSession())\n\n    def validatelogin(self, username, password, expect_failure=False):\n        data = { \"fields\": { \"username\": username,\n                             \"password\": password }}\n\n        # Check if the current commit predates the user authentication\n        # restructuring that added the \"fields\" wrapper.\n        if self.instance.current_commit:\n            if not testing.exists_at(\n                    self.instance.current_commit, \"src/auth/database.py\"):\n                data = data[\"fields\"]\n\n        if expect_failure:\n            expect = { \"message\": expect_failure }\n        else:\n            expect = { \"message\": None }\n\n        self.operation(\n            \"validatelogin\",\n            data=data,\n            expect=expect)\n\n    @contextlib.contextmanager\n    def signin(self, username=\"admin\", 
password=\"testing\", use_httpauth=False,\n               use_json_api=False, access_token=None):\n        if access_token:\n            username = access_token[\"part1\"]\n            password = access_token[\"part2\"]\n            use_httpauth = True\n        if use_httpauth:\n            self.sessions.append(HTTPAuthSession(username, password))\n            try:\n                yield\n            finally:\n                self.sessions.pop()\n        else:\n            with self.no_session():\n                self.collect_session_cookie()\n                if use_json_api:\n                    self.json(\n                        \"sessions\",\n                        post={\n                            \"username\": username,\n                            \"password\": password\n                        },\n                        expect={\n                            \"user\": self.instance.userid(username),\n                            \"type\": \"normal\",\n                            \"*\": \"*\"\n                        })\n                    def signout():\n                        self.json(\n                            \"sessions/current\",\n                            delete=True,\n                            expected_http_status=204)\n                else:\n                    self.validatelogin(username, password)\n                    def signout():\n                        self.operation(\"endsession\", data={})\n                with self.cookie_session(signout):\n                    yield\n\n    def run_basic_tests(self):\n        # The /tutorials page is essentially static content and doesn't require\n        # a signed in user, so a good test-case for checking if the site is up\n        # and accessible at all.\n        self.page(\"tutorial\", expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                                       \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\") })\n\n        # The 
/validatelogin operation is a) necessary for most meaningful\n        # additional testing, and b) a simple enough operation to test.\n        with self.signin():\n            # Load /home to determine whether /validatelogin successfully signed in\n            # (and that we stored the session id cookie correctly.)\n            self.page(\"home\", expect={ \"document_title\": testing.expect.document_title(u\"Testing Administrator's Home\"),\n                                       \"content_title\": testing.expect.paleyellow_title(0, u\"Testing Administrator's Home\") })\n"
  },
  {
    "path": "testing/input/SystemExtension/MANIFEST",
    "content": "Author = \"Jens Lindstr\\u00f6m\"\nDescription = \"Extension used to test system extension support.\"\n\n[Page check]\nDescription = \"Simple page to check that the extension is installed and working.\"\nScript = check.js\nFunction = check\n"
  },
  {
    "path": "testing/input/SystemExtension/check.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction check() {\n  writeln(200);\n  writeln(\"Content-Type: text/json\");\n  writeln();\n  writeln(JSON.stringify({ \"status\": \"ok\" }));\n}\n"
  },
  {
    "path": "testing/input/SystemExtension/resources/HelloWorld.txt",
    "content": "Hello world!\n"
  },
  {
    "path": "testing/input/TestExtension/MANIFEST",
    "content": "Author = \"Jens Lindstr\\u00f6m\"\nDescription = \"Extension used to test extension support.\"\n\n[Page echo]\nDescription = \"Echoes function argument list in JSON form.\"\nScript = echo.js\nFunction = echo\n\n[Page evaluate]\nDescription = \"Evaluates expression and returns result.\"\nScript = evaluate.js\nFunction = evaluate\n\n[Page nothandled]\nDescription = \"Page that is not handled.\"\nScript = nothandled.js\nFunction = nothandled\n\n[Page empty]\nDescription = \"Empty page.\"\nScript = empty.js\nFunction = empty\n\n[Page version]\nDescription = \"Outputs extension version.\"\nScript = version.js\nFunction = version\n\n[Page restrictions]\nDescription = \"Checks various restrictions.\"\nScript = restrictions.js\nFunction = restrictions\n\n[Page error.compilation]\nDescription = \"A script that doesn't compile.\"\nScript = error.compilation.js\nFunction = irrelevant\n\n[Page error.runtime]\nDescription = \"A function that throws an exception.\"\nScript = error.runtime.js\nFunction = test\n\n[Page Review.list]\nDescription = \"Testing critic.Review.list().\"\nScript = Review.list.js\nFunction = test\n\n[Page MailTransaction]\nDescription = \"Testing MailTransaction.\"\nScript = MailTransaction.js\nFunction = test\n\n[Inject home]\nDescription = \"Basic injection testing.\"\nScript = inject.js\nFunction = inject\n\n[Inject home]\nDescription = \"Custom injection testing.\"\nScript = inject.js\nFunction = injectCustom\n\n[Inject critic/*]\nDescription = \"Inject path handling test.\"\nScript = inject.js\nFunction = showcommitShort\n\n[Inject showcommit]\nDescription = \"Inject path handling test.\"\nScript = inject.js\nFunction = showcommitLong\n\n[ProcessCommits]\nDescription = \"Basic commits processing testing.\"\nScript = processcommits.js\nFunction = processcommits\n\n[FilterHook echo]\nDescription = \"Filter hook that echoes its arguments via a mail.\"\nDataDescription = \"Some random data.\"\nScript = filterhook.js\nFunction = 
filterhook\n"
  },
  {
    "path": "testing/input/TestExtension/MailTransaction.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction test() {\n  var data = JSON.parse(read());\n  var transaction = new critic.MailTransaction;\n  var message = null;\n\n  try {\n    data.mails.forEach(\n      function (data) {\n        transaction.add(data);\n      });\n    transaction.finish();\n  } catch (error) {\n    message = error.message;\n  }\n\n  writeln(200);\n  writeln(\"Content-Type: text/json\");\n  writeln();\n  writeln(JSON.stringify({ status: \"ok\", message: message }));\n}\n"
  },
  {
    "path": "testing/input/TestExtension/Review.list.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction test() {\n  var passed = [];\n  var failed = [];\n\n  function correct(test, fn) {\n    try {\n      var reviews = fn();\n      passed.push({ test: test, result: format(\"%d reviews\", reviews.length) });\n    } catch (error) {\n      failed.push({ test: test, message: error.message });\n    }\n  }\n\n  function incorrect(test, fn, expected) {\n    try {\n      fn();\n      failed.push({ test: test, message: \"no exception thrown\" });\n    } catch (error) {\n      if (error.message != expected)\n        failed.push({ test: test, message: format(\n          \"wrong error: expected=%r, actual=%r\",\n          expected, error.message) });\n      else\n        passed.push({ test: test, result: \"call failed as expected\" });\n    }\n  }\n\n  /* For now we're only testing that the various calls produce correct database\n     queries in so far that the database doesn't outright reject them. */\n\n  correct(\"no filtering\", function () {\n    return critic.Review.list();\n  });\n\n  correct(\"filter by repository (instance)\", function () {\n    return critic.Review.list({ repository: new critic.Repository(\"critic\") });\n  });\n  correct(\"filter by repository (id)\", function () {\n    return critic.Review.list({ repository: 1 });\n  });\n  correct(\"filter by repository (name)\", function () {\n    return critic.Review.list({ repository: \"critic\" });\n  });\n\n  correct(\"filter by state (open)\", function () {\n    return critic.Review.list({ state: \"open\" });\n  });\n\n  correct(\"filter by state (closed)\", function () {\n    return critic.Review.list({ state: \"closed\" });\n  });\n\n  correct(\"filter by state (dropped)\", function () {\n    return critic.Review.list({ state: \"dropped\" });\n  });\n\n  correct(\"filter by owner (instance)\", function () {\n    return critic.Review.list({ owner: new critic.User(\"alice\") });\n  });\n  correct(\"filter by owner 
(id)\", function () {\n    return critic.Review.list({ owner: 1 });\n  });\n  correct(\"filter by owner (name)\", function () {\n    return critic.Review.list({ owner: \"alice\" });\n  });\n\n  /* Only check 'id' and 'name' variants from now on; the 'instance' variant\n     is really just an alternative way to specify the id. */\n\n  correct(\"filter by repository (id) and state\", function () {\n    return critic.Review.list({ repository: 1,\n                                state: \"open\" });\n  });\n  correct(\"filter by repository (name) and state\", function () {\n    return critic.Review.list({ repository: \"critic\",\n                                state: \"open\" });\n  });\n\n  correct(\"filter by repository (id) and owner (id)\", function () {\n    return critic.Review.list({ repository: 1,\n                                owner: 1 });\n  });\n  correct(\"filter by repository (name) and owner (id)\", function () {\n    return critic.Review.list({ repository: \"critic\",\n                                owner: 1 });\n  });\n  correct(\"filter by repository (id) and owner (name)\", function () {\n    return critic.Review.list({ repository: 1,\n                                owner: \"alice\" });\n  });\n  correct(\"filter by repository (name) and owner (name)\", function () {\n    return critic.Review.list({ repository: \"critic\",\n                                owner: \"alice\" });\n  });\n\n  correct(\"filter by state and owner (id)\", function () {\n    return critic.Review.list({ state: \"open\",\n                                owner: 1 });\n  });\n  correct(\"filter by state and owner (name)\", function () {\n    return critic.Review.list({ state: \"open\",\n                                owner: \"alice\" });\n  });\n\n  incorrect(\"filter by bogus state\", function () {\n    return critic.Review.list({ state: \"bogus\" });\n  }, \"invalid argument: data.state=\\\"bogus\\\" not valid\");\n\n  writeln(200);\n  writeln(\"Content-Type: 
text/json\");\n  writeln();\n  writeln(JSON.stringify({ status: \"ok\", passed: passed, failed: failed }));\n}\n"
  },
  {
    "path": "testing/input/TestExtension/echo.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction echo() {\n  writeln(200);\n  writeln(\"Content-Type: text/json\");\n  writeln();\n  writeln(JSON.stringify({ \"status\": \"ok\",\n                           \"arguments\": [].slice.call(arguments, 0, 3),\n                           \"headers\": arguments[3],\n                           \"stdin\": read() }));\n}\n"
  },
  {
    "path": "testing/input/TestExtension/empty.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction empty() {\n  writeln(200);\n  writeln(\"Content-Type: text/plain\");\n  writeln();\n}\n"
  },
  {
    "path": "testing/input/TestExtension/error.compilation.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\n/* Strict mode disallows duplicated parameter names. */\nfunction wrong(x, x) {\n}\n\nfunction irrelevant() {\n  writeln(200);\n  writeln(\"Content-Type: text/plain\");\n  writeln();\n}\n"
  },
  {
    "path": "testing/input/TestExtension/error.runtime.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction test() {\n  writeln(200);\n  writeln(\"Content-Type: text/plain\");\n  writeln();\n  writeln(new critic.User({ name: \"nosuchuser\" }).fullname);\n}\n"
  },
  {
    "path": "testing/input/TestExtension/evaluate.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction evaluate() {\n  var data = JSON.parse(read());\n\n  writeln(200);\n  writeln(\"Content-Type: text/json\");\n  writeln();\n\n  try {\n    var source = \"(function () { \" + data[\"source\"] + \" })\";\n    var fn = eval(source);\n    writeln(JSON.stringify({ \"status\": \"ok\",\n                             \"result\": fn() }));\n  } catch (error) {\n    writeln(JSON.stringify({ \"status\": \"error\",\n                             \"source\": source,\n                             \"error\": String(error) }));\n  }\n}\n"
  },
  {
    "path": "testing/input/TestExtension/filterhook.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction filterhook(data, review, user, commits, files) {\n  files.forEach(\n    function (file) {\n      if (file.path == \"015-filterhook/include/explode\")\n        throw Error(\"Boom!\");\n    });\n\n  var transaction = new critic.MailTransaction;\n\n  transaction.add({\n    to: critic.User.current,\n    subject: \"filterhook.js::filterhook()\",\n    review: review,\n    body: format(\"data: %r\\n\" +\n                 \"review.id: %d\\n\" +\n                 \"user.name: %s\\n\" +\n                 \"commits: %r\\n\" +\n                 \"files: %r\\n\",\n                 data,\n                 review.id,\n                 user.name,\n                 commits.map(function (commit) { return commit.message; }),\n                 files.map(function (file) { return file.path; }).sort())\n  });\n\n  transaction.finish();\n}\n"
  },
  {
    "path": "testing/input/TestExtension/inject.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction inject() {\n  writeln(\"script %r\", format(\"data:text/javascript,var injected=%r;\",\n                              [].slice.call(arguments)));\n}\n\nfunction showcommitShort() {\n  writeln(\"script %r\", format(\"data:text/javascript,var showcommitShort=%r;\",\n                              [].slice.call(arguments)));\n}\n\nfunction showcommitLong() {\n  writeln(\"script %r\", format(\"data:text/javascript,var showcommitLong=%r;\",\n                              [].slice.call(arguments)));\n}\n\nfunction injectCustom(path, query) {\n  if (query && query.params.expr) {\n    writeln(\"script %r\", format(\"data:text/javascript,var injectedCustom=%r;\",\n                                eval(query.params.expr)));\n  }\n}\n"
  },
  {
    "path": "testing/input/TestExtension/nothandled.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction nothandled() {\n}\n"
  },
  {
    "path": "testing/input/TestExtension/processcommits.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction processcommits(review, changeset, commitset) {\n  writeln(\"r/%d\", review.id);\n  writeln(\"%s..%s\", changeset.parent.sha1.substring(0, 8), changeset.child.sha1.substring(0, 8));\n  writeln(\"%s\", commitset.map(function (commit) { return commit.sha1.substring(0, 8); }));\n}\n"
  },
  {
    "path": "testing/input/TestExtension/resources/hello.world.js",
    "content": "window.onload = function () {\n  alert(\"Hello world!\");\n};\n"
  },
  {
    "path": "testing/input/TestExtension/resources/helloworld.css",
    "content": "h1 { color: lime }\n"
  },
  {
    "path": "testing/input/TestExtension/resources/helloworld.html",
    "content": "<!DOCTYPE html>\n<link rel=stylesheet href=helloworld.css>\n<script src=helloworld.js></script>\n<h1>Hello world!</h1>\n"
  },
  {
    "path": "testing/input/TestExtension/restrictions.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction checkDatabaseConnection() {\n  try {\n    new PostgreSQL.Connection({ dbname: \"critic\",\n                                user: \"critic\" });\n    return \"no error\";\n  } catch (error) {\n    return error.message;\n  }\n}\n\nfunction restrictions() {\n  writeln(200);\n  writeln(\"Content-Type: text/json\");\n  writeln();\n  writeln(\"%r\", { status: \"ok\",\n                  database_connection: checkDatabaseConnection() });\n}\n"
  },
  {
    "path": "testing/input/TestExtension/version.js",
    "content": "/* -*- mode: js; indent-tabs-mode: nil -*- */\n\n\"use strict\";\n\nfunction version() {\n  writeln(200);\n  writeln(\"Content-Type: text/plain\");\n  writeln();\n  write(IO.File.read(\"version.txt\"));\n}\n"
  },
  {
    "path": "testing/input/customization/githook.py",
    "content": "import sys\nimport json\n\nclass Reject(Exception):\n    pass\n\ndef update(repository_path, ref_name, old_value, new_value):\n    data = json.dumps({ \"repository_path\": repository_path,\n                        \"ref_name\": ref_name,\n                        \"old_value\": old_value,\n                        \"new_value\": new_value })\n\n    if ref_name == \"refs/heads/reject-create\" and old_value is None:\n        raise Reject(\"REJECT:\" + data)\n    elif ref_name == \"refs/heads/reject-delete\" and new_value is None:\n        raise Reject(\"REJECT:\" + data)\n    elif ref_name == \"refs/heads/reject-update\" \\\n            and not (old_value is None or new_value is None):\n        raise Reject(\"REJECT:\" + data)\n    else:\n        sys.stdout.write(\"ACCEPT:\" + data + \"\\n\")\n"
  },
  {
    "path": "testing/input/customization/linktypes.py",
    "content": "import linkify\n\nclass IssueLink(linkify.LinkType):\n    def __init__(self):\n        super(IssueLink, self).__init__(\"#[0-9]+\")\n    def linkify(self, word, context):\n        return \"https://issuetracker.example.com/showIssue?id=\" + word[1:]\n\nIssueLink()\n"
  },
  {
    "path": "testing/input/empty.txt",
    "content": ""
  },
  {
    "path": "testing/input/service_log_filter.py",
"content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport logging\nimport re\nimport json\n\ndef level_value(level):\n    return getattr(logging, level.upper())\n\nfilter_level = level_value(sys.argv[1])\nlogfile_paths = sys.argv[2:]\n\ndef include_entry(entry_level):\n    return filter_level <= level_value(entry_level)\n\nHEADER = r\"\\d{4}-\\d\\d-\\d\\d \\d\\d:\\d\\d:\\d\\d,\\d\\d\\d -  *\"\n\nRE_ENTRY = re.compile(\n    \"{header}(([A-Z]+) - .*?)(?=$|\\n{header}[A-Z]+ - )\".format(header=HEADER),\n    re.DOTALL)\n\ndata = {}\n\nfor logfile_path in logfile_paths:\n    with open(logfile_path) as logfile:\n        log = logfile.read()\n\n    if os.path.isfile(logfile_path + \".skip\"):\n        with open(logfile_path + \".skip\") as logfile_skip:\n            skip = int(logfile_skip.read())\n    else:\n        skip = 0\n\n    entries = []\n\n    # Guard against an empty log: keep the previous skip count if the loop\n    # below finds no entries at all.\n    index = skip - 1\n\n    for index, match in enumerate(RE_ENTRY.finditer(log)):\n        if index < skip:\n            continue\n        entry, entry_level = match.groups()\n        if include_entry(entry_level):\n            entries.append(entry)\n\n    if entries:\n        data[logfile_path] = entries\n\n    skip = index + 1\n\n    with open(logfile_path + \".skip\", \"w\") as logfile_skip:\n        logfile_skip.write(str(skip))\n\nif data:\n    json.dump(data, sys.stdout)\n    sys.exit(0)\n\nsys.exit(1)\n"
  },
  {
    "path": "testing/input/service_synchronization_helper.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport time\n\npidfile_path, signal, timeout = sys.argv[1:]\n\nwith open(pidfile_path) as pidfile:\n    pid = int(pidfile.read().strip())\n\nwith open(pidfile_path + \".busy\", \"w\"):\n    pass\n\nos.kill(pid, int(signal))\n\ndeadline = time.time() + int(timeout)\n\nwhile os.path.isfile(pidfile_path + \".busy\"):\n    if time.time() > deadline:\n        sys.exit(1)\n    time.sleep(0.01)\n"
  },
  {
    "path": "testing/input/syntaxhighlight/example.cpp",
    "content": "/* This is a comment\n   that spans multiple\n   lines. */\n\n/* This one does not. */\n\n// Neither does this. /* Or this. */\n\n#if !defined(FOO)\n#  define FOO BAR // Comment\n#  define FOO \\\n  BAR \\\n  FIE\n#endif\n\nint main(int argc, char** argv) {\n  double x = float(5.5) + int(3);\n  char* s = \"this is a string\";\n  char c = 'c'; // <= that's a character\n  return x != 10;\n}\n"
  },
  {
    "path": "testing/local.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2014 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\n\nimport testing\n\nclass Instance(testing.Instance):\n    flags_on = [\"local\"]\n\n    def has_flag(self, flag):\n        return testing.has_flag(\"HEAD\", flag)\n\n    def run_unittest(self, args):\n        PYTHONPATH = os.path.join(os.getcwd(), \"src\")\n        argv = [sys.executable, \"-u\", \"-m\", \"run_unittest\"] + args\n        return self.executeProcess(argv, cwd=\"src\", log_stderr=False,\n                                   env={ \"PYTHONPATH\": PYTHONPATH })\n\n    def filter_service_logs(self, level, service_names):\n        # We have no services.\n        pass\n"
  },
  {
    "path": "testing/mailbox.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport socket\nimport threading\nimport time\nimport re\nimport email\nimport base64\n\nimport testing\n\nclass MissingMail(testing.TestFailure):\n    def __init__(self, criteria):\n        super(MissingMail, self).__init__(\n            \"No mail matching %r received\" % criteria)\n        self.criteria = criteria\n\nclass User(object):\n    def __init__(self, name, address):\n        self.name = name\n        self.address = address\n\nclass Mail(object):\n    def __init__(self, return_path):\n        self.return_path = return_path\n        self.recipient = None\n        self.headers = {}\n        self.lines = []\n\n    def header(self, name, default=None):\n        if name.lower() in self.headers:\n            return self.headers[name.lower()][0][\"value\"]\n        else:\n            return default\n\n    def all_headers(self):\n        for header_name in sorted(self.headers.keys()):\n            for header in self.headers[header_name]:\n                yield (header[\"name\"], header[\"value\"])\n\n    def __str__(self):\n        return \"%s\\n\\n%s\" % (\"\\n\".join((\"%s: %s\" % header)\n                                       for header in self.all_headers()),\n                             \"\\n\".join(self.lines))\n\nclass EOF(Exception):\n    pass\n\nclass Quit(Exception):\n    pass\n\nclass 
Error(Exception):\n    pass\n\nclass ParseError(Error):\n    def __init__(self, line):\n        super(ParseError, self).__init__(\"line=%r\" % line)\n        self.line = line\n\nclass Client(threading.Thread):\n    def __init__(self, mailbox, client, debug_mails):\n        super(Client, self).__init__()\n        self.mailbox = mailbox\n        self.credentials = mailbox.credentials\n        self.client = client\n        self.client.settimeout(None)\n        self.debug_mails = debug_mails\n        self.buffered = \"\"\n        self.start()\n\n    def sendline(self, string):\n        self.client.sendall(\"%s\\r\\n\" % string)\n\n    def recvline(self):\n        while \"\\r\\n\" not in self.buffered:\n            data = self.client.recv(4096)\n            if not data:\n                raise EOF\n            self.buffered += data\n        line, self.buffered = self.buffered.split(\"\\r\\n\", 1)\n        return line\n\n    def expectline(self, pattern):\n        line = self.recvline()\n        match = re.match(pattern, line, re.IGNORECASE)\n        if not match:\n            raise ParseError(line)\n        return match.groups()\n\n    def handshake(self):\n        self.sendline(\"220 critic.example.org I'm the Critic Testing Framework\")\n\n        line = self.recvline()\n        if re.match(r\"helo\\s+(\\S+)$\", line, re.IGNORECASE):\n            if self.credentials:\n                raise Error\n            self.sendline(\"250 critic.example.org\")\n        elif re.match(r\"ehlo\\s+(\\S+)$\", line, re.IGNORECASE):\n            if self.credentials:\n                self.sendline(\"250-critic.example.org\")\n                self.sendline(\"250 AUTH LOGIN\")\n\n                line = self.recvline()\n                match = re.match(r\"auth\\s+login(?:\\s+(.+))?$\",\n                                 line, re.IGNORECASE)\n                if not match:\n                    raise ParseError(line)\n\n                (username_b64,) = match.groups()\n\n                if not 
username_b64:\n                    self.sendline(\"334 %s\" % base64.b64encode(\"Username:\"))\n                    username_b64 = self.recvline()\n\n                self.sendline(\"334 %s\" % base64.b64encode(\"Password:\"))\n                password_b64 = self.recvline()\n\n                try:\n                    username = base64.b64decode(username_b64)\n                except TypeError:\n                    raise Error(\"Invalid base64: %r\" % username_b64)\n\n                try:\n                    password = base64.b64decode(password_b64)\n                except TypeError:\n                    raise Error(\"Invalid base64: %r\" % password_b64)\n\n                if username != self.credentials[\"username\"] \\\n                        or password != self.credentials[\"password\"]:\n                    raise Error(\"Wrong credentials: %r / %r\" % (username, password))\n\n                self.sendline(\"235 Welcome, %s!\" % username)\n\n                testing.logger.debug(\"Mailbox: Client authenticated.\")\n            else:\n                self.sendline(\"250 critic.example.org\")\n        else:\n            raise Error\n\n    def receive(self):\n        try:\n            (return_path,) = self.expectline(r\"mail\\s+from:<([^>]+)>(?:\\s+size=\\d+)?$\")\n        except ParseError as error:\n            if error.line.lower() == \"quit\":\n                self.sendline(\"221 critic.example.org Bye, bye\")\n                raise Quit\n            raise\n\n        self.sendline(\"250 OK\")\n\n        mail = Mail(return_path)\n\n        # For simplicity we only support a single recipient.  Critic (currently)\n        # never sends mails with multiple recipients.  
(It often sends identical\n        # mails to multiple recipients, but on the SMTP level, they are multiple\n        # single-recipient mails.)\n        (mail.recipient,) = self.expectline(r\"rcpt\\s+to:<([^>]+)>$\")\n\n        testing.logger.debug(\"Mailbox: Mail to <%s>.\" % mail.recipient)\n\n        self.sendline(\"250 OK\")\n        self.expectline(\"data\")\n        self.sendline(\"354 Right\")\n\n        message_source = \"\"\n\n        while True:\n            line = self.recvline()\n            if line == \".\":\n                break\n            message_source += line + \"\\r\\n\"\n\n        message = email.message_from_string(message_source)\n\n        for name in message.keys():\n            headers = mail.headers.setdefault(name.lower(), [])\n            for value in message.get_all(name):\n                value = re.sub(\"\\r\\n[ \\t]+\", \" \", value)\n                headers.append({ \"name\": name, \"value\": value })\n\n        mail.lines = message.get_payload(decode=True).splitlines()\n\n        testing.logger.debug(\"Received mail to: <%s> \\\"%s\\\"\"\n                             % (mail.recipient, mail.header(\"Subject\")))\n\n        if self.debug_mails:\n            source = \"--------------------------------------------------\\n\"\n            for name, value in message.items():\n                source += \"%s: %s\\n\" % (name, value)\n            source += \"\\n\"\n            for line in mail.lines:\n                source += line + \"\\n\"\n            source += \"--------------------------------------------------\"\n            testing.logger.debug(source)\n\n        self.mailbox.add(mail)\n        self.sendline(\"250 OK\")\n\n    def run(self):\n        try:\n            testing.logger.debug(\"Mailbox: Client connected.\")\n            self.handshake()\n            testing.logger.debug(\"Mailbox: Client ready.\")\n            while True:\n                self.receive()\n        except Error as error:\n            
testing.logger.error(\"Mailbox: Client error: %s\" % error.message)\n        except Quit:\n            testing.logger.debug(\"Mailbox: Client quit.\")\n        except EOF:\n            testing.logger.debug(\"Mailbox: Client disconnected prematurely.\")\n        except Exception:\n            testing.logger.exception(\"Mailbox: Client error!\")\n        self.close()\n\n    def close(self):\n        try:\n            self.client.close()\n        except socket.error:\n            pass\n\nclass Listener(threading.Thread):\n    def __init__(self, mailbox, debug_mails):\n        super(Listener, self).__init__()\n        self.daemon = True\n        self.mailbox = mailbox\n        self.debug_mails = debug_mails\n        self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n        self.socket.settimeout(0.1)\n        self.socket.bind((\"\", 0))\n        self.socket.listen(1)\n        self.stopped = False\n        self.start()\n\n    def run(self):\n        while not self.stopped:\n            try:\n                client, _ = self.socket.accept()\n            except socket.timeout:\n                pass\n            else:\n                Client(self.mailbox, client, self.debug_mails)\n\n    def stop(self):\n        self.stopped = True\n\nclass Mailbox(object):\n    def __init__(self, instance, credentials=None, debug_mails=False):\n        self.instance = instance\n        self.credentials = credentials\n        self.queued = []\n        self.errors = []\n        self.condition = threading.Condition()\n        self.listener = Listener(self, debug_mails)\n\n    def add(self, mail):\n        with self.condition:\n            self.queued.append(mail)\n            self.condition.notify()\n\n    def pop(self, accept=None):\n        def is_accepted(mail):\n            if accept is None:\n                return True\n            if callable(accept):\n                return accept(mail)\n            for fn in accept:\n                if not fn(mail):\n                  
  return False\n            return True\n\n        def find_mail():\n            with self.condition:\n                for mail in self.queued:\n                    if is_accepted(mail):\n                        self.queued.remove(mail)\n                        return mail\n\n        mail = find_mail()\n        if mail:\n            return mail\n\n        if accept is not None and self.instance:\n            # Wait until the instance's mail delivery service is idle, which\n            # means it has delivered all pending mail.  After that, the mail\n            # should be here, or it never will be.\n            self.instance.synchronize_service(\"maildelivery\")\n\n            mail = find_mail()\n            if mail:\n                return mail\n\n        raise MissingMail(accept)\n\n    def reset(self):\n        with self.condition:\n            self.queued = []\n\n    def pop_error(self):\n        with self.condition:\n            return self.errors.pop(0)\n\n    def stop(self):\n        self.listener.stop()\n\n    def check_empty(self):\n        try:\n            while True:\n                unexpected = self.pop()\n                testing.logger.error(\"Unexpected mail to <%s>:\\n%s\"\n                                     % (unexpected.recipient, unexpected))\n        except MissingMail:\n            pass\n\n    @property\n    def port(self):\n        return self.listener.socket.getsockname()[1]\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        self.stop()\n        return False\n\nclass WithSubject(object):\n    def __init__(self, value):\n        self.regexp = re.compile(value)\n    def __call__(self, mail):\n        return self.regexp.match(mail.header(\"Subject\")) is not None\n    def __repr__(self):\n        return \"subject=%r\" % self.regexp.pattern\n\nclass ToRecipient(object):\n    def __init__(self, address):\n        self.address = address\n    def __call__(self, mail):\n        return mail.recipient == 
self.address\n    def __repr__(self):\n        return \"recipient=<%s>\" % self.address\n"
  },
  {
    "path": "testing/main.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport argparse\nimport logging\nimport re\nimport subprocess\nimport traceback\nimport time\nimport datetime\n\nimport testing\n\nclass Counters:\n    def __init__(self):\n        self.tests_run = 0\n        self.tests_failed = 0\n        self.errors_logged = 0\n        self.warnings_logged = 0\n\ncounters = Counters()\nlogger = None\n\nclass TestingAborted(Exception):\n    pass\n\ndef run():\n    global logger\n\n    parser = argparse.ArgumentParser(description=\"Critic testing framework\")\n\n    parser.add_argument(\"--debug\", action=\"store_true\",\n                        help=\"Enable DEBUG level logging\")\n    parser.add_argument(\"--debug-mails\", action=\"store_true\",\n                        help=\"Log every mail sent by the tested system\")\n    parser.add_argument(\"--quiet\", action=\"store_true\",\n                        help=\"Disable INFO level logging\")\n\n    parser.add_argument(\"--quickstart\", action=\"store_true\",\n                        help=\"Test against a quick-start instance\")\n    parser.add_argument(\"--coverage\", action=\"store_true\",\n                        help=\"Enable coverage measurement mode\")\n    parser.add_argument(\"--commit\",\n                        help=\"Commit (symbolic ref or SHA-1) to test [default=HEAD]\")\n    
parser.add_argument(\"--upgrade-from\",\n                        help=\"Commit (symbolic ref or SHA-1) to install first and upgrade from\")\n    parser.add_argument(\"--strict-fs-permissions\", action=\"store_true\",\n                        help=\"Set strict file-system permissions in guest OS\")\n    parser.add_argument(\"--test-extensions\", action=\"store_true\",\n                        help=\"Test extensions\")\n\n    parser.add_argument(\"--local\", action=\"store_true\",\n                        help=\"Run local standalone tests only\")\n\n    parser.add_argument(\"--vbox-host\", default=\"host\",\n                        help=\"Host that's running VirtualBox [default=host]\")\n    parser.add_argument(\"--vm-identifier\",\n                        help=\"VirtualBox instance name or UUID\")\n    parser.add_argument(\"--vm-hostname\",\n                        help=\"VirtualBox instance hostname [default=VM_IDENTIFIER]\")\n    parser.add_argument(\"--vm-snapshot\", default=\"clean\",\n                        help=\"VirtualBox snapshot (name or UUID) to restore [default=clean]\")\n    parser.add_argument(\"--vm-ssh-port\", type=int, default=22,\n                        help=\"VirtualBox instance SSH port [default=22]\")\n    parser.add_argument(\"--vm-http-port\", type=int, default=80,\n                        help=\"VirtualBox instance HTTP port [default=80]\")\n    parser.add_argument(\"--vm-web-server\", choices=(\"apache\", \"nginx+uwsgi\", \"uwsgi\"),\n                        help=\"Web server to tell Critic to install and configure\")\n\n    parser.add_argument(\"--git-daemon-port\", type=int,\n                        help=\"Port to tell 'git daemon' to bind to\")\n    parser.add_argument(\"--cache-dir\", default=\"testing/cache\",\n                        help=\"Directory where cache files are stored\")\n\n    parser.add_argument(\"--upgrade-after\",\n                        help=\"Upgrade after specified test\")\n    
parser.add_argument(\"--pause-before\", action=\"append\",\n                        help=\"Pause testing before specified test(s)\")\n    parser.add_argument(\"--pause-after\", action=\"append\",\n                        help=\"Pause testing after specified test(s)\")\n    parser.add_argument(\"--pause-on-failure\", action=\"store_true\",\n                        help=\"Pause testing after each failed test\")\n    parser.add_argument(\"--pause-upgrade-loop\", action=\"store_true\",\n                        help=\"Support upgrading the tested system while paused\")\n    parser.add_argument(\"--pause-upgrade-retry\", action=\"store_true\",\n                        help=(\"Support upgrading the tested system while paused \"\n                              \"after a failed test, and retrying the failed test\"))\n    parser.add_argument(\"--pause-upgrade-hook\", action=\"append\",\n                        help=\"Command to run (locally) before upgrading\")\n\n    parser.add_argument(\"test\", nargs=\"*\",\n                        help=\"Specific tests to run [default=all]\")\n\n    arguments = parser.parse_args()\n\n    class CountingLogger(object):\n        def __init__(self, real, counters):\n            self.real = real\n            self.counters = counters\n\n        def log(self, level, message):\n            if level == logging.ERROR:\n                self.counters.errors_logged += 1\n            elif level == logging.WARNING:\n                self.counters.warnings_logged += 1\n            for line in message.splitlines() or [\"\"]:\n                self.real.log(level, line)\n        def debug(self, message):\n            self.log(logging.DEBUG, message)\n        def info(self, message):\n            self.log(logging.INFO, message)\n        def warning(self, message):\n            self.log(logging.WARNING, message)\n        def error(self, message):\n            self.log(logging.ERROR, message)\n        def exception(self, message):\n            
self.log(logging.ERROR, message + \"\\n\" + traceback.format_exc())\n\n    logger = testing.configureLogging(\n        arguments, wrap=lambda logger: CountingLogger(logger, counters))\n    logger.info(\"\"\"\\\nCritic Testing Framework\n========================\n\n\"\"\")\n\n    key_arguments = [arguments.local,\n                     arguments.quickstart,\n                     arguments.vm_identifier]\n\n    if len(filter(None, key_arguments)) != 1:\n        logger.error(\"Must specify exactly one of --local, --quickstart and \"\n                     \"--vm-identifier!\")\n        return\n\n    if arguments.local or arguments.quickstart:\n        incompatible_arguments = []\n\n        # This is not a complete list; just those that are most significantly\n        # incompatible or irrelevant with --local/--quickstart.\n        if arguments.commit:\n            incompatible_arguments.append(\"--commit\")\n        if arguments.upgrade_from:\n            incompatible_arguments.append(\"--upgrade-from\")\n        if arguments.coverage:\n            incompatible_arguments.append(\"--coverage\")\n        if arguments.strict_fs_permissions:\n            incompatible_arguments.append(\"--strict-fs-permissions\")\n        if arguments.test_extensions:\n            incompatible_arguments.append(\"--test-extensions\")\n        if arguments.vm_identifier:\n            incompatible_arguments.append(\"--vm-identifier\")\n\n        if incompatible_arguments:\n            logger.error(\"These arguments can't be combined with \"\n                         \"--local/--quickstart:\\n  \" +\n                         \"\\n  \".join(incompatible_arguments))\n            return\n\n    import_errors = False\n\n    try:\n        import requests\n    except ImportError:\n        logger.error(\"Failed to import 'requests'!\")\n        import_errors = True\n\n    try:\n        import BeautifulSoup\n    except ImportError:\n        logger.error(\"Failed to import 'BeautifulSoup'!\")\n        
import_errors = True\n\n    git_version = subprocess.check_output([\"git\", \"--version\"]).strip()\n    m = re.search(\"(\\d+)\\.(\\d+)\\.(\\d+)(?:[^\\d]+|$)\", git_version)\n    if not m:\n        logger.warning(\"Failed to parse host-side git version number: '%s'\" % git_version)\n    else:\n        version_tuple = tuple(map(int, m.groups()))\n        if version_tuple >= (1, 8, 5):\n            logger.debug(\"Using Git version %s on host.\" % git_version)\n        else:\n            logger.error(\"Git version on host machine must be version 1.8.5 or above (detected version %s).\" % git_version)\n            logger.error(\"Earlier Git versions crashed with SIGBUS causing test suite flakiness.\")\n            import_errors = True\n\n    if import_errors:\n        logger.error(\"Required software missing; see testing/USAGE.md for details.\")\n        return\n\n    if arguments.test_extensions:\n        # Check that the v8-jsshell submodule is checked out if extension\n        # testing was requested.\n        output = subprocess.check_output([\"git\", \"submodule\", \"status\",\n                                          \"installation/externals/v8-jsshell\"])\n        if output.startswith(\"-\"):\n            logger.error(\"\"\"\\\nThe v8-jsshell submodule must be checked out for extension testing.  
Please run\n  git submodule update --init installation/externals/v8-jsshell\nfirst or run this script without --test-extensions.\"\"\")\n            return\n\n    if arguments.vm_identifier:\n        # Note: we are not ignoring typical temporary editor files such as the\n        # \".#<name>\" files created by Emacs when a buffer has unsaved changes.\n        # This is because unsaved changes in an editor are probably also\n        # something you don't want to test with.\n\n        locally_modified_paths = []\n\n        status_output = subprocess.check_output(\n            [\"git\", \"status\", \"--porcelain\"])\n\n        for line in status_output.splitlines():\n            locally_modified_paths.extend(line[3:].split(\" -> \"))\n\n        tests_modified = []\n        input_modified = []\n        other_modified = []\n\n        for path in locally_modified_paths:\n            if path.startswith(\"testing/input/\"):\n                input_modified.append(path)\n            elif path.startswith(\"testing/\"):\n                tests_modified.append(path)\n            else:\n                other_modified.append(path)\n\n        if input_modified:\n            logger.error(\"Test input files locally modified:\\n  \" +\n                         \"\\n  \".join(input_modified))\n        if other_modified:\n            logger.error(\"Critic files locally modified:\\n  \" +\n                         \"\\n  \".join(other_modified))\n        if input_modified or other_modified:\n            logger.error(\"Please commit or stash local modifications before \"\n                         \"running tests.\")\n            return\n\n        if tests_modified:\n            logger.warning(\"Running tests using locally modified files:\\n  \" +\n                           \"\\n  \".join(tests_modified))\n\n    tested_commit = subprocess.check_output(\n        [\"git\", \"rev-parse\", \"--verify\", arguments.commit or \"HEAD\"]).strip()\n\n    if arguments.upgrade_from:\n        
install_commit = subprocess.check_output(\n            [\"git\", \"rev-parse\", \"--verify\", arguments.upgrade_from]).strip()\n        upgrade_commit = tested_commit\n    else:\n        install_commit = tested_commit\n        upgrade_commit = None\n\n    install_commit_description = subprocess.check_output(\n        [\"git\", \"log\", \"--oneline\", \"-1\", install_commit]).strip()\n\n    if upgrade_commit:\n        upgrade_commit_description = subprocess.check_output(\n            [\"git\", \"log\", \"--oneline\", \"-1\", upgrade_commit]).strip()\n    else:\n        upgrade_commit_description = None\n\n    flags_on = set()\n    flags_off = set()\n\n    try:\n        if arguments.local:\n            frontend = None\n            instance = testing.local.Instance()\n        else:\n            frontend = testing.frontend.Frontend(\n                hostname=arguments.vm_hostname or arguments.vm_identifier,\n                http_port=arguments.vm_http_port)\n\n            if arguments.quickstart:\n                instance = testing.quickstart.Instance(\n                    frontend=frontend)\n            else:\n                instance = testing.virtualbox.Instance(\n                    arguments,\n                    install_commit=(install_commit, install_commit_description),\n                    upgrade_commit=(upgrade_commit, upgrade_commit_description),\n                    frontend=frontend)\n\n            frontend.instance = instance\n    except testing.Error as error:\n        logger.error(error.message)\n        return\n\n    if not arguments.test_extensions:\n        flags_off.add(\"extensions\")\n\n    flags_on.update(instance.flags_on)\n    flags_off.update(instance.flags_off)\n\n    tests, dependencies = testing.findtests.selectTests(\n        arguments.test, strict=False, flags_on=flags_on, flags_off=flags_off)\n\n    if not tests:\n        logger.error(\"No tests selected!\")\n        return\n\n    if arguments.upgrade_after:\n        upgrade_after = 
testing.findtests.filterPatterns([arguments.upgrade_after])\n        upgrade_after_tests, _ = testing.findtests.selectTests(upgrade_after,\n                                                               strict=True)\n        upgrade_after_tests = set(upgrade_after_tests)\n        upgrade_after_groups = set(upgrade_after)\n\n        def maybe_upgrade_after(test):\n            def do_upgrade(what):\n                logger.info(\"Upgrading after: %s\" % what)\n                instance.upgrade(is_after_test=True)\n\n            if test in upgrade_after_tests:\n                do_upgrade(test)\n            else:\n                for group in test.groups:\n                    if group in upgrade_after_groups \\\n                            and test == all_groups[group][-1]:\n                        do_upgrade(group)\n                        break\n    else:\n        def maybe_upgrade_after(test):\n            pass\n\n    def pause(failed_test=None):\n        if arguments.pause_upgrade_loop \\\n                or (failed_test and arguments.pause_upgrade_retry):\n            print \"Testing paused.\"\n\n            while True:\n                if failed_test and arguments.pause_upgrade_retry:\n                    testing.pause(\"Press ENTER to upgrade (to HEAD) and \"\n                                  \"retry %s, CTRL-c to stop: \"\n                                  % os.path.basename(failed_test))\n                else:\n                    testing.pause(\"Press ENTER to upgrade (to HEAD), \"\n                                  \"CTRL-c to stop: \")\n\n                if arguments.pause_upgrade_hook:\n                    for command in arguments.pause_upgrade_hook:\n                        subprocess.check_call(command, shell=True)\n\n                if arguments.quickstart:\n                    instance.restart()\n                elif not arguments.local:\n                    repository.push(\"HEAD\")\n\n                    instance.execute([\"git\", \"fetch\", 
\"origin\", \"master\"], cwd=\"critic\")\n                    instance.upgrade_commit = \"FETCH_HEAD\"\n                    instance.upgrade()\n\n                if failed_test and arguments.pause_upgrade_retry:\n                    return \"retry\"\n        else:\n            testing.pause(\"Testing paused.  Press ENTER to continue: \")\n\n    if arguments.pause_before:\n        pause_before = testing.findtests.filterPatterns(arguments.pause_before)\n        pause_before_tests, _ = testing.findtests.selectTests(pause_before,\n                                                              strict=True)\n        pause_before_tests = set(pause_before_tests)\n        pause_before_groups = set(pause_before)\n\n        def maybe_pause_before(test):\n            def do_pause(what):\n                logger.info(\"Pausing before: %s\" % what)\n                pause()\n\n            if test in pause_before_tests:\n                do_pause(test)\n            else:\n                for group in test.groups:\n                    if group in pause_before_groups \\\n                            and test == all_groups[group][0]:\n                        do_pause(group)\n                        break\n    else:\n        def maybe_pause_before(test):\n            pass\n\n    if arguments.pause_after:\n        pause_after = testing.findtests.filterPatterns(arguments.pause_after)\n        pause_after_tests, _ = testing.findtests.selectTests(pause_after,\n                                                             strict=True)\n        pause_after_tests = set(pause_after_tests)\n        pause_after_groups = set(pause_after)\n\n        def maybe_pause_after(test):\n            def do_pause(what):\n                logger.info(\"Pausing after: %s\" % what)\n                pause()\n\n            if test in pause_after_tests:\n                do_pause(test)\n            else:\n                for group in test.groups:\n                    if group in pause_after_groups \\\n                  
          and test == all_groups[group][-1]:\n                        do_pause(group)\n                        break\n    else:\n        def maybe_pause_after(test):\n            pass\n\n    root_groups = {}\n    all_groups = {}\n\n    for test in tests:\n        for group in test.groups:\n            all_groups.setdefault(group, []).append(test)\n        root_groups.setdefault(test.groups[0], []).append(test)\n\n    failed_tests = set()\n\n    def run_test(test, scope):\n        prefix = \"testing/tests\"\n        def run_file(filename):\n            try:\n                execfile(os.path.join(prefix, filename), scope)\n            except testing.Error:\n                raise\n            except Exception as error:\n                logger.exception(\"Unexpected exception!\")\n                raise testing.TestFailure\n        path = \"\"\n        for component in test.filename.split(\"/\")[:-1]:\n            path = os.path.join(path, component)\n            init_filename = os.path.join(path, \"__init__.py\")\n            if os.path.isfile(os.path.join(prefix, init_filename)):\n                logger.debug(\"Including: %s\" % init_filename)\n                run_file(init_filename)\n        run_file(test.filename)\n\n    def run_group(group_name, tests):\n        scope = { \"testing\": testing,\n                  \"logger\": logger,\n                  \"instance\": instance }\n\n        if not arguments.local:\n            scope.update({ \"frontend\": frontend,\n                           \"repository\": repository,\n                           \"mailbox\": mailbox })\n\n        try:\n            for test in tests:\n                if test.dependencies & failed_tests:\n                    logger.info(\"Skipping %s (failed dependency)\" % test)\n                    continue\n\n                maybe_pause_before(test)\n\n                if test in dependencies:\n                    logger.info(\"Running: %s (dependency)\" % test)\n                else:\n                
    logger.info(\"Running: %s\" % test)\n\n                counters.tests_run += 1\n\n                while True:\n                    try:\n                        errors_before = counters.errors_logged\n                        run_test(test, scope.copy())\n                        if mailbox:\n                            mailbox.check_empty()\n                        instance.check_service_logs()\n                        if errors_before < counters.errors_logged:\n                            raise testing.TestFailure\n                    except testing.Error as error:\n                        counters.tests_failed += 1\n\n                        failed_tests.add(test)\n\n                        if not isinstance(error, testing.TestFailure):\n                            raise\n\n                        if error.message:\n                            logger.error(error.message)\n\n                        if mailbox:\n                            try:\n                                while True:\n                                    mail = mailbox.pop(\n                                        accept=testing.mailbox.ToRecipient(\n                                            \"system@example.org\"))\n                                    logger.error(\"System message: %s\\n  %s\"\n                                                 % (mail.header(\"Subject\"),\n                                                    \"\\n  \".join(mail.lines)))\n                            except testing.mailbox.MissingMail:\n                                pass\n\n                        instance.check_service_logs()\n\n                        if arguments.pause_on_failure \\\n                                or arguments.pause_upgrade_retry:\n                            if pause(test.filename) == \"retry\":\n                                # Re-run test due to --pause-upgrade-retry.\n                                continue\n                    except testing.NotSupported as not_supported:\n       
                 failed_tests.add(test)\n                        logger.info(\"Test not supported: %s\"\n                                    % not_supported.message)\n                    else:\n                        maybe_upgrade_after(test)\n                        maybe_pause_after(test)\n                    break\n        except KeyboardInterrupt:\n            raise TestingAborted\n        except testing.Error as error:\n            if error.message:\n                logger.exception(error.message)\n            if arguments.pause_on_failure:\n                pause()\n            return False\n        except Exception:\n            logger.exception(\"Unexpected exception!\")\n            if arguments.pause_on_failure:\n                pause()\n            return False\n        else:\n            return True\n\n    for group_name in sorted(root_groups.keys()):\n        if arguments.local:\n            repository = None\n            mailbox = None\n\n            if not run_group(group_name, all_groups[group_name]):\n                return False\n        else:\n            repository = testing.repository.Repository(\n                \"localhost\" if arguments.quickstart else arguments.vbox_host,\n                arguments.git_daemon_port,\n                tested_commit,\n                instance)\n            mailbox = testing.mailbox.Mailbox(instance,\n                                              { \"username\": \"smtp_username\",\n                                                \"password\": \"SmTp_PaSsWoRd\" },\n                                              arguments.debug_mails)\n\n            with repository:\n                with mailbox:\n                    if not repository.export():\n                        return False\n\n                    with instance:\n                        instance.mailbox = mailbox\n\n                        testing.utils.instance = instance\n                        testing.utils.frontend = frontend\n\n                        
if not run_group(group_name, all_groups[group_name]):\n                            return False\n\n                        instance.finish()\n\n                    mailbox.instance = None\n                    mailbox.check_empty()\n\n    return True\n\ndef main():\n    start_time = time.time()\n\n    try:\n        run_failed = not run()\n\n        if run_failed:\n            logger.error(\"Tests did not run as expected.\")\n\n        time_taken = str(datetime.timedelta(seconds=round(time.time() - start_time)))\n\n        logger.info(\"\"\"\nTest summary\n============\nTests run:       %9d\nTests failed:    %9d\nErrors logged:   %9d\nWarnings logged: %9d\nTime taken:      %9s\n\"\"\" % (counters.tests_run,\n       counters.tests_failed,\n       counters.errors_logged,\n       counters.warnings_logged,\n       time_taken))\n\n        if run_failed or counters.tests_failed or counters.errors_logged:\n            sys.exit(1)\n    except TestingAborted:\n        logger.error(\"Testing aborted.\")\n        sys.exit(1)\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "testing/password-invalid",
    "content": "#!/bin/sh\necho -n invalid\n"
  },
  {
    "path": "testing/password-testing",
    "content": "#!/bin/sh\necho -n testing\n"
  },
  {
    "path": "testing/quickstart.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport subprocess\nimport signal\nimport threading\nimport time\nimport json\n\nimport testing\n\nclass RepositoryURL(object):\n    def __init__(self, path, name):\n        self.path = path\n        self.name = name\n\nclass Instance(testing.Instance):\n    flags_off = [\"full\", \"postgresql\", \"extensions\", \"upgrade\", \"uninstall\"]\n\n    install_commit = \"HEAD\"\n    tested_commit = \"HEAD\"\n\n    def __init__(self, frontend):\n        super(Instance, self).__init__()\n        self.frontend = frontend\n        self.mailbox = None\n        self.process = None\n        self.hostname = \"localhost\"\n        self.registeruser(\"admin\")\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        if self.process:\n            self.stop()\n        return False\n\n    @property\n    def etc_dir(self):\n        return os.path.join(self.state_dir, \"etc\")\n\n    def start(self):\n        pass\n\n    def stop(self):\n        testing.logger.debug(\"Stopping ...\")\n\n        self.process.send_signal(signal.SIGINT)\n        self.process.wait()\n        self.process = None\n\n        testing.logger.debug(\"Stopped\")\n\n    def execute(self, *args, **kwargs):\n        raise testing.NotSupported(\"quick-started instance doesn't support 
execute()\")\n\n    def criticctl(self, argv):\n        for index, arg in enumerate(argv):\n            if arg[0] == \"'\" == arg[-1]:\n                argv[index] = arg[1:-1]\n\n        argv = [os.path.join(self.state_dir, \"bin\", \"criticctl\")] + argv\n\n        testing.logger.debug(\"Running: %s\" % \" \".join(argv))\n\n        process = subprocess.Popen(\n            argv,\n            stdout=subprocess.PIPE,\n            stderr=subprocess.PIPE)\n\n        stdout, stderr = process.communicate()\n\n        for line in stdout.splitlines():\n            testing.logger.log(testing.STDOUT, line)\n        for line in stderr.splitlines():\n            testing.logger.log(testing.STDERR, line)\n\n        if process.returncode == 0:\n            return stdout\n        else:\n            raise testing.CriticctlError(\" \".join(argv), stdout, stderr)\n\n    def adduser(self, name, email=None, fullname=None, password=None):\n        if email is None:\n            email = \"%s@example.org\" % name\n        if fullname is None:\n            fullname = \"%s von Testing\" % name.capitalize()\n        if password is None:\n            password = \"testing\"\n\n        self.criticctl([\"adduser\",\n                        \"--name\", name,\n                        \"--email\", email,\n                        \"--fullname\", fullname,\n                        \"--password\", password])\n\n        self.registeruser(name)\n\n    def has_flag(self, flag):\n        return testing.has_flag(\"HEAD\", flag)\n\n    def repository_path(self, repository=\"critic\"):\n        return os.path.join(self.state_dir, \"git/%s.git\" % repository)\n\n    def repository_url(self, name=None, repository=\"critic\"):\n        path = self.repository_path(repository)\n        if name is None:\n            return path\n        return RepositoryURL(path, name)\n\n    def install(self, repository, override_arguments={}, other_cwd=False,\n                quick=False, interactive=False):\n        argv = 
[sys.executable, \"-u\", \"quickstart.py\",\n                \"--testing\",\n                \"--admin-username\", \"admin\",\n                \"--admin-fullname\", \"Testing Administrator\",\n                \"--admin-email\", \"admin@example.org\",\n                \"--admin-password\", \"testing\",\n                \"--system-recipient\", \"system@example.org\",\n                \"--http-port\", \"0\", # Use a random port.\n                \"--smtp-port\", str(self.mailbox.port),\n                \"--smtp-username\", self.mailbox.credentials[\"username\"],\n                \"--smtp-password\", self.mailbox.credentials[\"password\"]]\n\n        testing.logger.debug(\"Running: %s\" % \" \".join(argv))\n\n        self.process = subprocess.Popen(\n            argv, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n\n        def consume(stream, loglevel):\n            try:\n                while True:\n                    line = stream.readline()\n                    if not line:\n                        break\n                    testing.logger.log(loglevel, line.rstrip())\n            except IOError:\n                pass\n\n        stderr_thread = threading.Thread(\n            target=consume, args=(self.process.stderr, testing.STDERR))\n        stderr_thread.daemon = True\n        stderr_thread.start()\n\n        line = self.process.stdout.readline().strip()\n        key, _, value = line.partition(\"=\")\n\n        if key != \"STATE\":\n            raise testing.InstanceError(\"Unexpected output: %r\" % line)\n\n        self.state_dir = value\n\n        testing.logger.debug(\"State directory: %s\" % value)\n\n        line = self.process.stdout.readline().strip()\n        key, _, value = line.partition(\"=\")\n\n        if key != \"HTTP\":\n            raise testing.InstanceError(\"Unexpected output: %r\" % line)\n\n        hostname, _, port = value.partition(\":\")\n\n        self.frontend.hostname = hostname\n        self.frontend.http_port = int(port)\n\n        
testing.logger.debug(\"HTTP address: %s:%s\" % (hostname, port))\n\n        line = self.process.stdout.readline().strip()\n\n        if line != \"STARTED\":\n            raise testing.InstanceError(\"Unexpected output: %r\" % line)\n\n        # Add some regular users.\n        for name in (\"alice\", \"bob\", \"dave\", \"erin\"):\n            self.adduser(name)\n\n        self.adduser(\"howard\")\n        self.criticctl([\"addrole\",\n                        \"--name\", \"howard\",\n                        \"--role\", \"newswriter\"])\n\n        try:\n            self.frontend.run_basic_tests()\n            self.mailbox.check_empty()\n        except testing.TestFailure as error:\n            if error.message:\n                testing.logger.error(\"Basic test: %s\" % error.message)\n\n            # If basic tests fail, there's no reason to further test this\n            # instance; it seems to be properly broken.\n            raise testing.InstanceError\n\n        testing.logger.info(\"Quick-started Critic in %s (%d)\"\n                            % (self.state_dir, self.frontend.http_port))\n\n    def check_upgrade(self):\n        raise testing.NotSupported(\"quick-started instance can't be upgraded\")\n\n    def upgrade(self, override_arguments={}, other_cwd=False, quick=False,\n                interactive=False):\n        pass\n\n    def check_extend(self, repository, pre_upgrade=False):\n        raise testing.NotSupported(\"quick-started instance doesn't support extensions\")\n\n    def extend(self, repository):\n        self.check_extend(repository)\n\n    def uninstall(self):\n        raise testing.NotSupported(\"quick-started instance can't be uninstalled\")\n\n    def finish(self):\n        pass\n\n    def run_unittest(self, args):\n        PYTHONPATH = \":\".join([os.path.join(self.state_dir, \"etc/main\"),\n                               os.path.join(os.getcwd(), \"src\"),\n                               os.getcwd()])\n        argv = [sys.executable, 
\"-u\", \"-m\", \"run_unittest\"] + args\n        return self.executeProcess(argv, cwd=\"src\", log_stderr=False,\n                                   env={ \"PYTHONPATH\": PYTHONPATH })\n\n    def gc(self, repository):\n        self.executeProcess(\n            [\"git\", \"gc\", \"--prune=now\"],\n            cwd=os.path.join(self.state_dir, \"git\", repository))\n\n    def synchronize_service(self, service_name, force_maintenance=False, timeout=30):\n        helper = \"testing/input/service_synchronization_helper.py\"\n        testing.logger.debug(\"Synchronizing service: %s\" % service_name)\n        pidfile_path = os.path.join(\n            self.state_dir, \"run/main\", service_name + \".pid\")\n        if force_maintenance:\n            signum = signal.SIGUSR2\n        else:\n            signum = signal.SIGUSR1\n        before = time.time()\n        self.executeProcess(\n            [\"python\", helper, pidfile_path, str(signum), str(timeout)])\n        after = time.time()\n        testing.logger.debug(\"Synchronized service: %s in %.2f seconds\"\n                             % (service_name, after - before))\n\n    def filter_service_logs(self, level, service_names):\n        helper = \"testing/input/service_log_filter.py\"\n        logfile_paths = {\n            os.path.join(\n                self.state_dir, \"log/main\", service_name + \".log\"): service_name\n            for service_name in service_names }\n        try:\n            data = json.loads(self.executeProcess(\n                [\"python\", helper, level] + logfile_paths.keys(),\n                log_stdout=False))\n            return { logfile_paths[logfile_path]: entries\n                     for logfile_path, entries in sorted(data.items()) }\n        except testing.CommandError:\n            return None\n\n    def restart(self):\n        self.process.send_signal(signal.SIGUSR1)\n\n        line = self.process.stdout.readline().strip()\n\n        if line != \"RESTARTED\":\n            raise 
testing.InstanceError(\"Unexpected output: %r\" % line)\n"
  },
  {
    "path": "testing/repository.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport time\nimport tempfile\nimport shutil\nimport subprocess\nimport re\n\nimport testing\n\nclass GitCommandError(testing.TestFailure):\n    def __init__(self, command, output):\n        super(GitCommandError, self).__init__(\n            \"GitCommandError: %s\\nOutput:\\n  %s\"\n            % (command, \"\\n  \".join(output.strip().splitlines())))\n        self.command = command\n        self.output = output\n\ndef _git(args, **kwargs):\n    argv = [\"git\"] + args\n    if \"cwd\" in kwargs:\n        cwd = \" (in %s)\" % kwargs[\"cwd\"]\n    else:\n        cwd = \"\"\n    testing.logger.debug(\"Running: %s%s\" % (\" \".join(argv), cwd))\n    env = os.environ.copy()\n    for name, value in kwargs.get(\"env\", {}).items():\n        if value is None:\n            if name in env:\n                del env[name]\n        else:\n            env[name] = value\n    env.setdefault(\"GIT_AUTHOR_NAME\", \"Critic Tester\")\n    env.setdefault(\"GIT_COMMITTER_NAME\", \"Critic Tester\")\n    env.setdefault(\"GIT_AUTHOR_EMAIL\", \"tester@example.org\")\n    env.setdefault(\"GIT_COMMITTER_EMAIL\", \"tester@example.org\")\n    kwargs[\"env\"] = env\n    try:\n        return subprocess.check_output(\n            argv, stdin=open(\"/dev/null\"), stderr=subprocess.STDOUT, **kwargs)\n    except 
subprocess.CalledProcessError as error:\n        raise GitCommandError(\" \".join(argv), error.output)\n\ndef submodule_sha1(repository_path, parent_sha1, submodule_path):\n    try:\n        lstree = _git([\"ls-tree\", parent_sha1, submodule_path],\n                      cwd=repository_path)\n    except GitCommandError:\n        # Sub-module doesn't exist?  Will probably fail later, but doesn't need\n        # to fail here.\n        return None\n    mode, object_type, sha1, path = lstree.strip().split(None, 3)\n    if object_type != \"commit\":\n        # Odd.  The repository doesn't look at all like we expect.\n        return None\n    return sha1\n\nclass Repository(object):\n    def __init__(self, host, port, tested_commit, instance):\n        self.host = host\n        self.port = port\n        self.instance = instance\n        self.base_path = tempfile.mkdtemp()\n        self.path = os.path.join(self.base_path, \"critic.git\")\n        self.work = os.path.join(self.base_path, \"work\")\n\n        if port:\n            self.url = \"git://%s:%d/critic.git\" % (host, port)\n        else:\n            self.url = \"git://%s/critic.git\" % host\n\n        testing.logger.debug(\"Creating temporary repositories in: %s\" % self.base_path)\n\n        _git([\"clone\", \"--bare\", os.getcwd(), \"critic.git\"],\n             cwd=self.base_path)\n\n        _git([\"config\", \"receive.denyDeletes\", \"false\"],\n             cwd=self.path)\n        _git([\"config\", \"receive.denyNonFastforwards\", \"false\"],\n             cwd=self.path)\n\n        self.push(tested_commit)\n\n        self.v8_jsshell_path = None\n        self.v8_path = None\n        self.v8_url = None\n\n        if instance.test_extensions:\n            if os.path.exists(\"installation/externals/v8-jsshell/.git\"):\n                v8_jsshell_path = os.path.join(os.getcwd(), \"installation/externals/v8-jsshell\")\n                _git([\"clone\", \"--bare\", v8_jsshell_path, \"v8-jsshell.git\"],\n             
        cwd=self.base_path)\n                self.v8_jsshell_path = os.path.join(self.base_path, \"v8-jsshell.git\")\n                v8_jsshell_sha1 = submodule_sha1(os.getcwd(), tested_commit,\n                                                 \"installation/externals/v8-jsshell\")\n                if v8_jsshell_sha1:\n                    _git([\"push\", \"--quiet\", \"--force\", self.v8_jsshell_path,\n                          v8_jsshell_sha1 + \":refs/heads/master\"],\n                         cwd=v8_jsshell_path)\n            else:\n                v8_jsshell_sha1 = None\n\n            if os.path.exists(\"installation/externals/v8-jsshell/v8/.git\"):\n                v8_path = os.path.join(os.getcwd(), \"installation/externals/v8-jsshell/v8\")\n                _git([\"clone\", \"--bare\", v8_path, \"v8/v8.git\"],\n                     cwd=self.base_path)\n                self.v8_path = os.path.join(self.base_path, \"v8/v8.git\")\n                if port:\n                    self.v8_url = \"git://%s:%d/v8/v8.git\" % (host, port)\n                else:\n                    self.v8_url = \"git://%s/v8/v8.git\" % host\n                if v8_jsshell_sha1:\n                    v8_sha1 = submodule_sha1(\"installation/externals/v8-jsshell\",\n                                             v8_jsshell_sha1, \"v8\")\n                    if v8_sha1:\n                        _git([\"push\", \"--quiet\", \"--force\", self.v8_path,\n                              v8_sha1 + \":refs/heads/master\"],\n                             cwd=v8_path)\n\n    def push(self, commit):\n        _git([\"push\", \"--quiet\", \"--force\", self.path,\n              \"%s:refs/heads/master\" % commit])\n\n    def export(self):\n        argv = [\"git\", \"daemon\", \"--reuseaddr\", \"--export-all\",\n                \"--base-path=%s\" % self.base_path]\n        if self.port:\n            argv.append(\"--port=%d\" % self.port)\n        argv.append(self.path)\n        if self.v8_jsshell_path:\n         
   argv.append(self.v8_jsshell_path)\n        if self.v8_path:\n            argv.append(self.v8_path)\n\n        self.daemon = subprocess.Popen(argv)\n\n        time.sleep(1)\n\n        pid, status = os.waitpid(self.daemon.pid, os.WNOHANG)\n        if pid != 0:\n            self.daemon = None\n            testing.logger.error(\"Failed to export repository!\")\n            return False\n\n        testing.logger.debug(\"Exported repository: %s\" % self.path)\n        if self.v8_jsshell_path:\n            testing.logger.debug(\"Exported repository: %s\" % self.v8_jsshell_path)\n        if self.v8_path:\n            testing.logger.debug(\"Exported repository: %s\" % self.v8_path)\n        return True\n\n    def run(self, args, cwd=None, env=None):\n        if cwd is None:\n            cwd = self.path\n        if env is None:\n            env = {}\n        if isinstance(self.instance, testing.quickstart.Instance):\n            for index, arg in enumerate(args[1:]):\n                if isinstance(arg, testing.quickstart.RepositoryURL):\n                    args[index + 1] = arg.path\n                    env[\"REMOTE_USER\"] = arg.name\n        return _git(args, cwd=cwd, env=env)\n\n    def workcopy(self, name=\"critic\", empty=False):\n        master = self\n\n        class Workcopy(testing.Context):\n            def __init__(self, path, start, finish):\n                super(Workcopy, self).__init__(start, finish)\n                self.path = path\n\n            def run(self, args, **kwargs):\n                if kwargs:\n                    env = {}\n                    for name in kwargs.keys():\n                        if name.lower() != name == name.upper():\n                            env[name] = kwargs[name]\n                            del kwargs[name]\n                else:\n                    env = None\n                return master.run(args, cwd=self.path, env=env, **kwargs)\n\n        path = os.path.join(self.work, name)\n\n        if 
os.path.exists(path):\n            raise testing.InstanceError(\n                \"Can't create work copy; path already exists!\")\n\n        def start():\n            if not os.path.isdir(self.work):\n                os.mkdir(self.work)\n            if not empty:\n                _git([\"clone\", self.path, name], cwd=self.work)\n            else:\n                os.mkdir(path)\n                _git([\"init\"], cwd=path)\n\n        def finish():\n            shutil.rmtree(path)\n\n        return Workcopy(path, start, finish)\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        try:\n            if self.daemon:\n                self.daemon.terminate()\n                self.daemon.wait()\n        except:\n            testing.logger.exception(\"Repository clean-up failed!\")\n\n        try:\n            shutil.rmtree(self.base_path)\n        except:\n            testing.logger.exception(\"Repository clean-up failed!\")\n\n        return False\n"
  },
  {
    "path": "testing/tests/001-main/000-install.py",
    "content": "# Start instance and install (and upgrade, optionally) Critic with the default\n# arguments.\ninstance.start()\ninstance.install(repository)\ninstance.upgrade()\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/001-dashboard.py",
    "content": "frontend.page(\"dashboard\", expect={ \"document_title\": testing.expect.document_title(u\"Dashboard\"),\n                                    \"message_title\": testing.expect.message_title(u\"No reviews!\"),\n                                    \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                                    \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/002-branches.py",
    "content": "frontend.page(\n    \"branches\", expect={ \"document_title\": testing.expect.document_title(u\"Branches\"),\n                         \"content_title\": testing.expect.paleyellow_title(0, u\"Branches\"),\n                         \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                         \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/003-search.py",
    "content": "frontend.page(\n    \"search\",\n    expect={ \"document_title\": testing.expect.document_title(u\"Review Search\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"Review Search\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n             \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/004-config.py",
    "content": "frontend.page(\n    \"config\",\n    expect={ \"document_title\": testing.expect.document_title(u\"User preferences\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"User preferences\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n             \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/005-tutorial.py",
    "content": "frontend.page(\"tutorial\",\n              expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"request\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Requesting a Review\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Requesting a Review\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"review\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Reviewing Changes\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Reviewing Changes\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"filters\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Filters\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Filters\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"archival\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Review branch archival\"),\n                       \"content_title\": 
testing.expect.paleyellow_title(0, u\"Review branch archival\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"viewer\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Repository Viewer\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Repository Viewer\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"reconfigure\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Reconfiguring Critic\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Reconfiguring Critic\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"rebase\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Rebasing a Review\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Rebasing a Review\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"administration\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"System Administration\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"System Administration\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n         
              \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"customization\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"System Customization\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"System Customization\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"search\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Review Quick Search\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Review Quick Search\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\n# Unknown items are ignored and the main Tutorials page is returned instead.\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"nonexisting\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                       \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/006-news.py",
    "content": "with_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\nfrontend.page(\n    \"news\",\n    expect={ \"document_title\": testing.expect.document_title(u\"News\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"News\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n             \"script_user\": testing.expect.script_no_user() })\n\n# Load all news items to make sure they are syntactically correct.\n#\n# There may not be any, and we can't easily test that the right set of news\n# items are listed, since this depends on whether we upgraded and from what.\n# But this testing is still somewhat meaningful.\n\ndocument = frontend.page(\"news\", params={ \"display\": \"all\" })\nitems = document.findAll(attrs=with_class(\"item\"))\n\nfor item in items:\n    item_id = item[\"critic-item-id\"]\n    item_title = extract_text(item.find(attrs=with_class(\"title\")))\n\n    frontend.page(\n        \"news\",\n        params={ \"item\": item_id },\n        expect={ \"document_title\": testing.expect.document_title(item_title),\n                 \"content_title\": testing.expect.paleyellow_title(0, item_title),\n                 \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                 \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/007-home.py",
    "content": "# Note: /home redirects to /login for anonymous users.\nfrontend.page(\"home\", expect={ \"document_title\": testing.expect.document_title(u\"Sign in\"),\n                               \"content_title\": testing.expect.paleyellow_title(0, u\"Sign in\"),\n                               \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                               \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/008-repositories.py",
    "content": "frontend.page(\"repositories\", expect={ \"document_title\": testing.expect.document_title(u\"Repositories\"),\n                                       \"content_title\": testing.expect.paleyellow_title(0, u\"Repositories\"),\n                                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                                       \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/009-services.py",
    "content": "frontend.page(\"services\", expect={ \"document_title\": testing.expect.document_title(u\"Services\"),\n                                   \"content_title\": testing.expect.paleyellow_title(0, u\"Services\"),\n                                   \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                                   \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/010-createreview.py",
    "content": "# Note: /createreview redirects to /login for anonymous users.\nfrontend.page(\"createreview\", expect={ \"document_title\": testing.expect.document_title(u\"Sign in\"),\n                                       \"content_title\": testing.expect.paleyellow_title(0, u\"Sign in\"),\n                                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                                       \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/011-manageextensions.py",
    "content": "# Only available if extension support has been enabled.  (The body of the\n# message is quite long, and particularly interesting to check here.)\nexpected_message = testing.expect.message(\"Extension support not enabled\", None)\n\nfrontend.page(\n    \"manageextensions\",\n    expect={ \"message\": expected_message })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/012-statistics.py",
    "content": "# The /statistics page has no <title>, and has a pale yellow table that doesn't\n# have the 'paleyellow' class, and which has five (!) different main headings.\n# Its generation should be fixed, but for now, just skip testing the common page\n# elements that it's missing.\nfrontend.page(\"statistics\", expect={ \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\"),\n                                     \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/013-static-resource.py",
    "content": "def check_user_constructor(document):\n    expected = \"<User constructor found>\"\n    if \"\\nfunction User(\" in document:\n        actual = expected\n    else:\n        actual = \"<no User constructor found>\"\n    testing.expect.check(expected, actual)\n\ndef check_jquery_foundation(document):\n    expected = \"<jQuery header found>\"\n    if \"jQuery Foundation, Inc\" in document:\n        actual = expected\n    else:\n        actual = \"<no jQuery header found>\"\n    testing.expect.check(expected, actual)\n\n# Test a basic regular file.\nfrontend.page(\n    \"static-resource/basic.js\",\n    expected_content_type=(\"application/javascript\", \"text/javascript\"),\n    expect={ \"user_constructor\": check_user_constructor })\n\n# Test jquery.js, which is a symlink to the current version.\nfrontend.page(\n    \"static-resource/third-party/jquery.js\",\n    expected_content_type=(\"application/javascript\", \"text/javascript\"),\n    expect={ \"jquery_foundation\": check_jquery_foundation })\n\n# Test a non-existing file.\nfrontend.page(\n    \"static-resource/does-not-exist.js\",\n    expected_http_status=404)\n\n# Test that directory listing is not enabled.\nfrontend.page(\n    \"static-resource/\",\n    expected_http_status=403)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/100-preferences/001-commit.diff.rulerColumn.py",
    "content": "# Check existence of preference commit.diff.rulerColumn, added by\n#\n#   http://critic-review.org/r/57\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"commit.diff.rulerColumn:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_input(document):\n    input = document.find(\"input\", attrs={ \"name\": \"commit.diff.rulerColumn\" })\n\n    if not input:\n        testing.expect.check(\"<preference input>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"number\", input[\"type\"])\n    testing.expect.check(\"0\", input[\"value\"])\n    testing.expect.check(\"0\", input[\"critic-default\"])\n\nfrontend.page(\"config\",\n              expect={ \"preference_heading\": check_heading,\n                       \"preference_input\": check_input })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/100-preferences/002-review.defaultOptOut.py",
    "content": "# Check existence of preference review.defaultOptOut, added by\n#\n#   http://critic-review.org/r/40\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"review.defaultOptOut:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_input(document):\n    input = document.find(\"input\", attrs={ \"name\": \"review.defaultOptOut\" })\n\n    if not input:\n        testing.expect.check(\"<preference input>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"checkbox\", input[\"type\"])\n    testing.expect.check(False, input.has_key(\"checked\"))\n    testing.expect.check(\"false\", input[\"critic-default\"])\n\nfrontend.page(\"config\",\n              expect={ \"preference_heading\": check_heading,\n                       \"preference_input\": check_input })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/100-preferences/003-timezone.py",
    "content": "# Check existence of preference review.defaultOptOut, added by\n#\n#   http://critic-review.org/r/40\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"timezone:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_select(document):\n    select = document.find(\"select\", attrs={ \"name\": \"timezone\" })\n\n    if not select:\n        testing.expect.check(\"<preference select>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check('\"Universal/UTC\"', select[\"critic-default\"])\n\n    option = select.find(\"option\", attrs={ \"selected\": \"selected\" })\n\n    if not option:\n        testing.expect.check(\"<pre-selected option>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"Universal/UTC\", option[\"value\"])\n    testing.expect.check(\"UTC (UTC / UTC+00:00)\", option.string)\n\nfrontend.page(\"config\",\n              expect={ \"preference_heading\": check_heading,\n                       \"preference_input\": check_select })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/001-anonymous/100-preferences/__init__.py",
    "content": "# Tests in this directory don't depend on anything but having installed Critic.\n# @dependency 001-main/000-install.py\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/001-dashboard.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"dashboard\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Dashboard\"),\n            \"message_title\": testing.expect.message_title(u\"No reviews!\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/002-branches.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"branches\", expect={ \"document_title\": testing.expect.document_title(u\"Branches\"),\n                             \"content_title\": testing.expect.paleyellow_title(0, u\"Branches\"),\n                             \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                                 \"administrator\"),\n                             \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/003-search.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"search\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Review Search\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"Review Search\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\")) })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/004-config.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"config\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"User preferences\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"User preferences\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n\n    frontend.page(\n        \"config\",\n        params={ \"defaults\": \"yes\" },\n        expect={\n            \"document_title\": testing.expect.document_title(u\"User preferences\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"User preferences\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/005-tutorial.py",
    "content": "with frontend.signin():\n    frontend.page(\"tutorial\",\n                  expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"request\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Requesting a Review\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Requesting a Review\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"review\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Reviewing Changes\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Reviewing Changes\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"filters\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Filters\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Filters\"),\n            
               \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"archival\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Review branch archival\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Review branch archival\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"viewer\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Repository Viewer\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Repository Viewer\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"reconfigure\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Reconfiguring Critic\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Reconfiguring Critic\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": 
testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"rebase\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Rebasing a Review\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Rebasing a Review\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"administration\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"System Administration\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"System Administration\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"customization\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"System Customization\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"System Customization\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"search\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Review Quick Search\"),\n         
                  \"content_title\": testing.expect.paleyellow_title(0, u\"Review Quick Search\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n\n    # Unknown items are ignored and the main Tutorials page is returned instead.\n    frontend.page(\"tutorial\",\n                  params={ \"item\": \"nonexisting\" },\n                  expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                           \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\"),\n                           \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                               \"administrator\"),\n                           \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/006-news.py",
    "content": "with_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\nwith frontend.signin():\n    frontend.page(\n        \"news\",\n        expect={ \"document_title\": testing.expect.document_title(u\"News\"),\n                 \"content_title\": testing.expect.paleyellow_title(0, u\"News\"),\n                 \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                     \"administrator\"),\n                 \"script_user\": testing.expect.script_no_user() })\n\n    # Load all news items to make sure they are syntactically correct.\n    #\n    # There may not be any, and we can't easily test that the right\n    # set of news items are listed, since this depends on whether we\n    # upgraded and from what.  But this testing is still somewhat\n    # meaningful.\n\n    document = frontend.page(\"news\", params={ \"display\": \"all\" })\n    items = document.findAll(attrs=with_class(\"item\"))\n\n    for item in items:\n        item_id = item[\"critic-item-id\"]\n        item_title = extract_text(item.find(attrs=with_class(\"title\")))\n\n        frontend.page(\n            \"news\",\n            params={ \"item\": item_id },\n            expect={ \"document_title\": testing.expect.document_title(item_title),\n                     \"content_title\": testing.expect.paleyellow_title(0, item_title),\n                     \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                         \"administrator\"),\n                     \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/007-home.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"home\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Testing Administrator's Home\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"Testing Administrator's Home\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n\nwith frontend.signin(\"bob\"):\n    frontend.operation(\n        \"changepassword\",\n        data={ \"current_pw\": \"testing\",\n               \"new_pw\": \"gnitset\" })\n\nfrontend.operation(\n    \"validatelogin\",\n    data={ \"fields\": { \"username\": \"bob\",\n                       \"password\": \"testing\" }},\n    expect={ \"message\": \"Wrong password\" })\n\nwith frontend.signin(\"bob\", \"gnitset\"):\n    pass\n\nwith frontend.signin():\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"bob\",\n               \"new_pw\": \"testing\" })\n\nwith frontend.signin(\"bob\"):\n    pass\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/008-repositories.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"repositories\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Repositories\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"Repositories\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/009-services.py",
    "content": "import time\n\nservices = {}\n\nwith_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\ndef check_services(services, restarted=frozenset()):\n    if isinstance(restarted, basestring):\n        restarted = frozenset([restarted])\n\n    def checker(document):\n        expected = set(services.keys())\n\n        for service_tr in document.findAll(\"tr\", attrs=with_class(\"service\")):\n            td_name = service_tr.find(\"td\", attrs=with_class(\"name\"))\n            td_pid = service_tr.find(\"td\", attrs=with_class(\"pid\"))\n            td_rss = service_tr.find(\"td\", attrs=with_class(\"rss\"))\n\n            name = str(extract_text(td_name))\n            pid = extract_text(td_pid)\n            rss = extract_text(td_rss)\n\n            # These only start up fully when extensions are enabled.\n            if name.startswith(\"extension\"):\n                continue\n\n            try:\n                pid = int(pid)\n            except ValueError:\n                if pid == \"(not running)\":\n                    testing.logger.error(\"Service %r is not running!\" % name)\n                else:\n                    testing.logger.error(\n                        \"Service %r has unexpected PID value: %r\" % (name, pid))\n            else:\n                if rss == \"N/A\":\n                    testing.logger.error(\"Service %r is not running \"\n                                         \"(and the PID value is stale...)!\"\n                                         % name)\n\n            if name in restarted:\n                if pid == services[name]:\n                    testing.logger.error(\n                        \"Service %r not restarted as expected!\" % name)\n            elif name in services:\n                testing.expect.check(services[name], pid,\n                                     message=\"service unexpectedly restarted\")\n\n            if name in expected:\n                expected.remove(name)\n\n  
          services[name] = pid\n\n        if expected:\n            testing.logger.error(\"Service(s) have gone missing: %r\"\n                                 % \", \".join(expected))\n\n    return checker\n\nwith frontend.signin():\n    services = {}\n\n    frontend.page(\n        \"services\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Services\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"Services\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\")),\n            \"services\": check_services(services)\n        })\n\n    all_services = set([\"manager\"])\n\n    for service_name in services.keys():\n        if service_name not in (\"manager\", \"extensiontasks\") \\\n                and not service_name.startswith(\"wsgi:\"):\n            all_services.add(service_name)\n\n            frontend.operation(\n                \"restartservice\",\n                data={ \"service_name\": service_name })\n\n            frontend.page(\n                \"services\",\n                expect={ \"services\": check_services(services, service_name) })\n\n    # Need to give the last service(s) restarted some time to actually start up;\n    # otherwise they might receive their TERM signal before they register a\n    # signal handler.\n    time.sleep(0.5)\n\n    frontend.operation(\n        \"restartservice\",\n        data={ \"service_name\": \"manager\" })\n\n    # Need to give the service manager some time to actually restart.  
Or rather\n    # time to stop; once it has stopped, the /services page has code that waits\n    # (up to 10 seconds) for it to start up again, should it not be up and\n    # running already.\n    time.sleep(0.5)\n\n    frontend.page(\n        \"services\",\n        expect={ \"services\": check_services(services, all_services) })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/010-createreview.py",
    "content": "with frontend.signin():\n    frontend.page(\n        \"createreview\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Create Review\"),\n            \"content_title\": testing.expect.paleyellow_title(0, u\"Create Review\"),\n            \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                \"administrator\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"admin\"))\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/011-manageextensions.py",
    "content": "# Only available if extension support has been enabled.  (The body of the\n# message is quite long, and not particularly interesting to check here.)\nexpected_message = testing.expect.message(\"Extension support not enabled\", None)\n\nwith frontend.signin():\n    frontend.page(\n        \"manageextensions\",\n        expect={ \"message\": expected_message })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/012-statistics.py",
    "content": "# The /statistics page has no <title>, and has a pale yellow table that doesn't\n# have the 'paleyellow' class, and which has five (!) different main headings.\n# Its generation should be fixed, but for now, just skip testing the common page\n# elements that it's missing.\nwith frontend.signin():\n    frontend.page(\"statistics\", expect={ \"pageheader_links\": testing.expect.pageheader_links(\"authenticated\",\n                                                                                             \"administrator\"),\n                                         \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/100-preferences/001-commit.diff.rulerColumn.py",
    "content": "# Check existence of preference commit.diff.rulerColumn, added by\n#\n#   http://critic-review.org/r/57\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"commit.diff.rulerColumn:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_input(document):\n    input = document.find(\"input\", attrs={ \"name\": \"commit.diff.rulerColumn\" })\n\n    if not input:\n        testing.expect.check(\"<preference input>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"number\", input[\"type\"])\n    testing.expect.check(\"0\", input[\"value\"])\n    testing.expect.check(\"0\", input[\"critic-default\"])\n\nwith frontend.signin():\n    frontend.page(\"config\",\n                  expect={ \"preference_heading\": check_heading,\n                           \"preference_input\": check_input })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/100-preferences/002-review.defaultOptOut.py",
    "content": "# Check existence of preference review.defaultOptOut, added by\n#\n#   http://critic-review.org/r/40\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"review.defaultOptOut:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_input(document):\n    input = document.find(\"input\", attrs={ \"name\": \"review.defaultOptOut\" })\n\n    if not input:\n        testing.expect.check(\"<preference input>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"checkbox\", input[\"type\"])\n    testing.expect.check(False, input.has_key(\"checked\"))\n    testing.expect.check(\"false\", input[\"critic-default\"])\n\nwith frontend.signin():\n    frontend.page(\"config\",\n                  expect={ \"preference_heading\": check_heading,\n                           \"preference_input\": check_input })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/100-preferences/003-timezone.py",
    "content": "# Check existence of preference timezone.\n\ndef check_heading(document):\n    headings = document.findAll(\"td\", attrs={ \"class\": \"heading\" })\n\n    for heading in headings:\n        if heading.find(text=\"timezone:\"):\n            return\n\n    testing.expect.check(\"<preference heading>\",\n                         \"<expected content not found>\")\n\ndef check_select(document):\n    select = document.find(\"select\", attrs={ \"name\": \"timezone\" })\n\n    if not select:\n        testing.expect.check(\"<preference select>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check('\"Universal/UTC\"', select[\"critic-default\"])\n\n    option = select.find(\"option\", attrs={ \"selected\": \"selected\" })\n\n    if not option:\n        testing.expect.check(\"<pre-selected option>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(\"Universal/UTC\", option[\"value\"])\n    testing.expect.check(\"UTC (UTC / UTC+00:00)\", option.string)\n\nwith frontend.signin():\n    frontend.page(\"config\",\n                  expect={ \"preference_heading\": check_heading,\n                           \"preference_input\": check_select })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/002-authenticated/100-preferences/__init__.py",
    "content": "# Tests in this directory don't depend on anything but having installed Critic.\n# @dependency 001-main/000-install.py\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/001-basic.py",
    "content": "def expect_success(argv, expected_output_lines=[]):\n    try:\n        output = instance.criticctl(argv)\n    except testing.CriticctlError as error:\n        logger.error(\"'criticctl %s': correct criticctl usage failed:\\n%s\"\n                     % (\" \".join(argv), error.stdout))\n        return []\n    else:\n        output_lines = set(map(str.strip, output.splitlines()))\n        for line in expected_output_lines:\n            if line.strip() not in output_lines:\n                logger.error(\"'%s': Expected output line not found:\\n  %r\"\n                             % (\" \".join(argv), line))\n        return output_lines\n\ndef expect_failure(argv, expected_output_lines=[]):\n    try:\n        instance.criticctl(argv)\n    except testing.CriticctlError as error:\n        output_lines = set(map(str.strip, error.stderr.splitlines()))\n        for line in expected_output_lines:\n            if line.strip() not in output_lines:\n                logger.error(\"'%s': Expected output line not found:\\n  %r\"\n                             % (\" \".join(argv), line))\n        return output_lines\n    else:\n        logger.error(\"'criticctl %s': incorrect criticctl usage did not fail\"\n                     % \" \".join(argv))\n        return []\n\ntry:\n    instance.execute([\"criticctl\"])\nexcept testing.NotSupported:\n    # When testing with --quickstart, we can run criticctl, but we're\n    # not running it as root, so this particular check is irrelevant.\n    pass\nexcept testing.virtualbox.GuestCommandError as error:\n    output_lines = set(map(str.strip, error.stderr.splitlines()))\n    if \"ERROR: Failed to set UID = critic. 
Run as root?\" not in output_lines:\n        logger.error(\"Running 'criticctl' as non-root failed with unexpected \"\n                     \"error message\")\nelse:\n    logger.error(\"Running 'criticctl' as non-root did not fail\")\n\n# Test -h/--help argument.\nusage_lines = expect_success([],\n                             [\"Critic administration interface\",\n                              \"Available commands are:\"])\nexpect_success([\"-h\"],\n               usage_lines)\nexpect_success([\"--help\"],\n               usage_lines)\n\n# Test --etc-dir/-e argument.\nexpect_success([\"--etc-dir\", instance.etc_dir],\n               usage_lines)\nexpect_success([\"--etc-dir=\" + instance.etc_dir],\n               usage_lines)\nexpect_success([\"-e\", instance.etc_dir],\n               usage_lines)\nexpect_success([\"-e\" + instance.etc_dir],\n               usage_lines)\nlines = expect_failure([\"--etc-dir\", \"/etc/wrong\"],\n                       [\"ERROR: Directory is inaccessible: /etc/wrong\"])\nexpect_failure([\"-e\", \"/etc/wrong\"],\n               lines)\nlines = expect_failure([\"--etc-dir\"],\n                       [\"criticctl: error: argument --etc-dir/-e: \"\n                        \"expected one argument\"])\nexpect_failure([\"-e\"],\n               lines)\n\n# Test --identity/-i argument.\nexpect_success([\"--identity\", \"main\"],\n               usage_lines)\nexpect_success([\"--identity=main\"],\n               usage_lines)\nexpect_success([\"-i\", \"main\"],\n               usage_lines)\nexpect_success([\"-imain\"],\n               usage_lines)\nlines = expect_failure([\"--identity\", \"wrong\"],\n                       [\"ERROR: Invalid identity: wrong\"])\nexpect_failure([\"-i\", \"wrong\"],\n               lines)\nlines = expect_failure([\"--identity\"],\n                       [\"criticctl: error: argument --identity/-i: \"\n                        \"expected one argument\"])\nexpect_failure([\"-i\"],\n               lines)\n\n# Test 
unknown arguments.\nexpect_failure([\"-x\"],\n               [\"criticctl: error: unrecognized arguments: -x\"])\nexpect_failure([\"--xxx\"],\n               [\"criticctl: error: unrecognized arguments: --xxx\"])\n\n# Test unknown command.\nlines = expect_failure([\"foo\"],\n                       [\"ERROR: Invalid command: foo\"])\nexpect_failure([\"-e\", instance.etc_dir, \"foo\"],\n               lines)\nexpect_failure([\"-e\" + instance.etc_dir, \"foo\"],\n               lines)\nexpect_failure([\"-i\", \"main\", \"foo\"],\n               lines)\nexpect_failure([\"-imain\", \"foo\"],\n               lines)\nexpect_failure([\"-e\", instance.etc_dir, \"-i\", \"main\", \"foo\"],\n               lines)\nexpect_failure([\"-e\" + instance.etc_dir, \"-imain\", \"foo\"],\n               lines)\nexpect_failure([\"-i\", \"main\", \"-e\", instance.etc_dir, \"foo\"],\n               lines)\nexpect_failure([\"-imain\", \"-e\" + instance.etc_dir, \"foo\"],\n               lines)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/002-adduser-deluser.py",
    "content": "# Scenario: Try to add a user 'alice' (already exists).\ntry:\n    instance.criticctl(\n        [\"adduser\",\n         \"--name\", \"alice\",\n         \"--email\", \"alice@example.org\",\n         \"--fullname\", \"'Alice von Testing'\",\n         \"--password\", \"testing\"])\nexcept testing.CriticctlError as error:\n    if \"alice: user exists\" not in error.stderr.splitlines():\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stdout)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail\")\n\n# Scenario: Try to delete the user 'nosuchuser' (no such user).\ntry:\n    instance.criticctl(\n        [\"deluser\",\n         \"--name\", \"nosuchuser\"])\nexcept testing.CriticctlError as error:\n    if \"nosuchuser: no such user\" not in error.stderr.splitlines():\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stdout)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail\")\n\n# Scenario: Add a user 'extra' and then delete the user again.\ntry:\n    instance.criticctl(\n        [\"adduser\",\n         \"--name\", \"extra\",\n         \"--email\", \"extra@example.org\",\n         \"--fullname\", \"'Extra von Testing'\",\n         \"--password\", \"testing\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    instance.registeruser(\"extra\")\n\n    try:\n        instance.criticctl(\n            [\"deluser\",\n             \"--name\", \"extra\"])\n    except testing.CriticctlError as error:\n        logger.error(\"correct criticctl usage failed:\\n%s\"\n                     % error.stdout)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/003-addrole-delrole.py",
    "content": "ROLES = [\"administrator\", \"developer\", \"newswriter\", \"repositories\"]\n\n# Scenario: Try to add a role that 'admin' already has.\ntry:\n    output = instance.criticctl(\n        [\"addrole\",\n         \"--name\", \"admin\",\n         \"--role\", \"administrator\"])\n    expected_output = \"admin: user already has role 'administrator'\"\n    if expected_output not in output.splitlines():\n        logger.error(\"Expected output not found: %r\\n%s\"\n                     % (expected_output, output))\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\n\n# Scenario: Try to delete a role 'alice' doesn't have.\ntry:\n    output = instance.criticctl(\n        [\"delrole\",\n         \"--name\", \"alice\",\n         \"--role\", \"administrator\"])\n    expected_output = \"alice: user doesn't have role 'administrator'\"\n    if expected_output not in output.splitlines():\n        logger.error(\"Expected output not found: %r\\n%s\"\n                     % (expected_output, output))\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\n\n# Scenario: Try to add a role to a non-existing user.\ntry:\n    instance.criticctl(\n        [\"addrole\",\n         \"--name\", \"nosuchuser\",\n         \"--role\", \"administrator\"])\nexcept testing.CriticctlError as error:\n    if \"nosuchuser: no such user\" not in error.stderr.splitlines():\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stdout)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail: \"\n                 \"addrole, non-existing user\")\n\n# Scenario: Try to delete a role from a non-existing user.\ntry:\n    instance.criticctl(\n        [\"delrole\",\n         \"--name\", \"nosuchuser\",\n         \"--role\", \"administrator\"])\nexcept testing.CriticctlError as 
error:\n    if \"nosuchuser: no such user\" not in error.stderr.splitlines():\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stderr)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail: \"\n                 \"delrole, non-existing user\")\n\n# Scenario: Try to add an invalid role.\ntry:\n    instance.criticctl(\n        [\"addrole\",\n         \"--name\", \"alice\",\n         \"--role\", \"joker\"])\nexcept testing.CriticctlError as error:\n    if \"invalid choice: 'joker'\" not in error.stderr:\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stderr)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail: \"\n                 \"addrole, invalid role\")\n\n# Scenario: Try to delete an invalid role.\ntry:\n    instance.criticctl(\n        [\"delrole\",\n         \"--name\", \"alice\",\n         \"--role\", \"joker\"])\nexcept testing.CriticctlError as error:\n    if \"invalid choice: 'joker'\" not in error.stderr:\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stderr)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail: \"\n                 \"delrole, invalid role\")\n\n# Scenario: Add and then delete each role.\ndef test_role(role):\n    try:\n        instance.criticctl(\n            [\"addrole\",\n             \"--name\", \"alice\",\n             \"--role\", role])\n    except testing.CriticctlError as error:\n        logger.error(\"correct criticctl usage failed:\\n%s\"\n                     % error.stdout)\n    else:\n        try:\n            instance.criticctl(\n                [\"delrole\",\n                 \"--name\", \"alice\",\n                 \"--role\", role])\n        except testing.CriticctlError as error:\n            logger.error(\"correct criticctl usage failed:\\n%s\"\n                         % error.stdout)\n\nfor role in 
ROLES:\n    test_role(role)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/004-listusers.py",
    "content": "# Scenario: Invalid format.\ntry:\n    instance.criticctl(\n        [\"listusers\", \"--format\", \"oranges\"])\nexcept testing.CriticctlError as error:\n    if \"invalid choice: 'oranges'\" not in error.stderr:\n        logger.error(\"criticctl failed with unexpected error message:\\n%s\"\n                     % error.stderr)\nelse:\n    logger.error(\"incorrect criticctl usage did not fail\")\n\nexpected = \"\"\"\\\n  id |    name    |              email             |            fullname            | status\n-----+------------+--------------------------------+--------------------------------+--------\n   1 |      admin |              admin@example.org | Testing Administrator          | current\n   2 |      alice |              alice@example.org | Alice von Testing              | current\n   3 |        bob |                bob@example.org | Bob von Testing                | current\n   4 |       dave |               dave@example.org | Dave von Testing               | current\n   5 |       erin |               erin@example.org | Erin von Testing               | current\n   6 |     howard |             howard@example.org | Howard von Testing             | current\n   7 |      extra |              extra@example.org | Extra von Testing              | retired\n\n\"\"\"\n\n# Scenario: Default / human readable format.\ntry:\n    output = instance.criticctl([\"listusers\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\ntry:\n    output = instance.criticctl([\"listusers\", \"-f\", \"table\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\ntry:\n    output = instance.criticctl([\"listusers\", \"--format\", \"table\"])\nexcept testing.CriticctlError as error:\n    
logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\nexpected = \"\"\"\\\n# id, name, email, fullname, status\n[\n (1, 'admin', 'admin@example.org', 'Testing Administrator', 'current'),\n (2, 'alice', 'alice@example.org', 'Alice von Testing', 'current'),\n (3, 'bob', 'bob@example.org', 'Bob von Testing', 'current'),\n (4, 'dave', 'dave@example.org', 'Dave von Testing', 'current'),\n (5, 'erin', 'erin@example.org', 'Erin von Testing', 'current'),\n (6, 'howard', 'howard@example.org', 'Howard von Testing', 'current'),\n (7, 'extra', 'extra@example.org', 'Extra von Testing', 'retired'),\n]\n\"\"\"\n\n# Scenario: Tuples format.\ntry:\n    output = instance.criticctl([\"listusers\", \"-f\", \"tuples\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\ntry:\n    output = instance.criticctl([\"listusers\", \"--format\", \"tuples\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\nexpected = \"\"\"\\\n[\n {'id': 1, 'name': 'admin', 'email': 'admin@example.org', 'fullname': 'Testing Administrator', 'status': 'current'},\n {'id': 2, 'name': 'alice', 'email': 'alice@example.org', 'fullname': 'Alice von Testing', 'status': 'current'},\n {'id': 3, 'name': 'bob', 'email': 'bob@example.org', 'fullname': 'Bob von Testing', 'status': 'current'},\n {'id': 4, 'name': 'dave', 'email': 'dave@example.org', 'fullname': 'Dave von Testing', 'status': 'current'},\n {'id': 5, 'name': 'erin', 'email': 'erin@example.org', 'fullname': 'Erin von Testing', 'status': 'current'},\n {'id': 6, 'name': 'howard', 'email': 'howard@example.org', 'fullname': 'Howard von Testing', 'status': 'current'},\n {'id': 7, 'name': 'extra', 'email': 
'extra@example.org', 'fullname': 'Extra von Testing', 'status': 'retired'},\n]\n\"\"\"\n\n# Scenario: Dicts format.\ntry:\n    output = instance.criticctl([\"listusers\", \"-f\", \"dicts\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n\ntry:\n    output = instance.criticctl([\"listusers\", \"--format\", \"dicts\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(expected, output)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/005-configtest.py",
    "content": "try:\n    output = instance.criticctl([\"configtest\"])\nexcept testing.CriticctlError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    testing.expect.check(\"System configuration valid.\\n\", output)\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/003-criticctl/006-restart.py",
    "content": "# Need a VM (full installation) to restart Critic.\n# @flag full\n\ntry:\n    output = instance.execute([\"sudo\", \"criticctl\", \"restart\"])\nexcept testing.virtualbox.GuestCommandError as error:\n    logger.error(\"correct criticctl usage failed:\\n%s\"\n                 % error.stdout)\nelse:\n    # Expected output is system dependent, so don't check.\n    pass\n\n# Check that all services are responding.  As a bonus, this also tests that the\n# synchronization mechanism is working for all of them.\ninstance.synchronize_service(\"highlight\")\ninstance.synchronize_service(\"changeset\")\ninstance.synchronize_service(\"githook\")\ninstance.synchronize_service(\"branchtracker\")\ninstance.synchronize_service(\"maildelivery\")\ninstance.synchronize_service(\"watchdog\")\ninstance.synchronize_service(\"maintenance\")\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/001-newswriter.py",
    "content": "import re\n\ndef unread_news_count(document):\n    pageheader = document.find(\"table\", attrs={ \"class\": \"pageheader\" })\n    for link in pageheader.find(\"ul\").findAll(\"a\"):\n        m = re.match(\"News \\((\\d+)\\)\", link.string)\n        if m:\n            return int(m.group(1))\n    return 0\n\nNEWSTEXT = \"I'm as mad as hell, and I'm not going to take this anymore.\"\n\nwith frontend.signin(\"alice\"):\n    dashboard = frontend.page(\"dashboard\", expect={ \"document_title\": testing.expect.document_title(u\"Dashboard\") })\n    initial_unread = unread_news_count(dashboard)\n\nwith frontend.signin(\"howard\"):\n    response = frontend.operation(\n        \"addnewsitem\",\n        data={ \"text\": \"I'm as mad as hell\" })\n    newsitem_id = response[\"item_id\"]\n\n    frontend.operation(\n        \"editnewsitem\",\n        data={ \"item_id\": newsitem_id,\n               \"text\": NEWSTEXT })\n\nwith frontend.signin(\"alice\"):\n    dashboard = frontend.page(\"dashboard\", expect={ \"document_title\": testing.expect.document_title(u\"Dashboard\") })\n    testing.expect.check(initial_unread + 1, unread_news_count(dashboard))\n\n    newsitem = frontend.page(\"news\", params={ \"item\": newsitem_id })\n    newstext = newsitem.find(\"td\", attrs={ \"class\": \"text\" })\n    testing.expect.check(NEWSTEXT, testing.expect.extract_text(newstext).strip())\n\n    dashboard = frontend.page(\"dashboard\", expect={ \"document_title\": testing.expect.document_title(u\"Dashboard\") })\n    testing.expect.check(initial_unread, unread_news_count(dashboard))\n\n    frontend.operation(\n        \"addnewsitem\",\n        data={ \"text\": \"Quid quid latine dictum sit, altum viditur.\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"notallowed\" })\n\n    frontend.operation(\n        \"editnewsitem\",\n        data={ \"item_id\": newsitem_id,\n               \"text\": \"It's all hat, no cattle.\" },\n        expect={ 
\"status\": \"failure\",\n                 \"code\": \"notallowed\" })\n\nwith frontend.signin(\"bob\"):\n    # Howard's news item should still be unread by bob.\n    dashboard = frontend.page(\"dashboard\", expect={ \"document_title\": testing.expect.document_title(u\"Dashboard\") })\n    testing.expect.check(initial_unread + 1, unread_news_count(dashboard))\n\n# Anonymous users should not be able to add or edit news items.\nfrontend.operation(\n    \"addnewsitem\",\n    data={ \"text\": \"If you have a lifetime warranty on something, it is also a hammer.\" },\n    expect={ \"status\": \"failure\",\n             \"code\": \"mustlogin\" })\n\nfrontend.operation(\n    \"editnewsitem\",\n    data={ \"item_id\": newsitem_id,\n           \"text\": \"The only completely consistent people are dead.\" },\n    expect={ \"status\": \"failure\",\n             \"code\": \"mustlogin\" })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/002-email.py",
    "content": "import re\n\nwith_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\ndef extract_addresses(document):\n    addresses = []\n    for address in document.findAll(attrs=with_class(\"address\")):\n        email_id = int(address[\"data-email-id\"])\n        selected = \"selected\" in address[\"class\"].split()\n        value = address.find(attrs=with_class(\"value\")).string\n        if address.find(attrs=with_class(\"verified\")):\n            verified = \"verified\"\n        elif address.find(attrs=with_class(\"unverified\")):\n            verified = \"unverified\"\n        else:\n            verified = None\n        addresses.append((email_id, selected, value, verified))\n    return addresses\n\ndef emails(expected):\n    def check(document):\n        actual = [(selected, value, verified)\n                  for _, selected, value, verified\n                  in extract_addresses(document)]\n        testing.expect.check(expected, actual)\n    return check\n\ndef no_emails(document):\n    row = document.find(\"tr\", attrs=with_class(\"email\"))\n    testing.expect.check(\n        \"No email address\",\n        extract_text(row.find(\"td\", attrs=with_class(\"value\")).find(\"i\")))\n\nALICE_ID = 2\nALICE_AT_EXAMPLE = \"alice@example.org\"\nALICE_AT_WONDERLAND = \"alice@wonderland.net\"\nRE_ALICE_AT_WONDERLAND = r\"alice@wonderland\\.net\"\n\nwith frontend.signin(\"alice\"):\n    # Check initial state.\n\n    frontend.page(\n        \"home\",\n        expect={ \"addresses\": emails([(True, ALICE_AT_EXAMPLE, None)]) })\n\n    # Add another, initially unverified, email address.\n\n    frontend.operation(\n        \"addemailaddress\",\n        data={ \"subject_id\": ALICE_ID,\n               \"email\": \"alice@wonderland.net\" })\n\n    document = frontend.page(\n        \"home\",\n        expect={ \"addresses\": emails(\n                [(True, ALICE_AT_EXAMPLE, None),\n                 (False, ALICE_AT_WONDERLAND, 
\"unverified\")]) })\n\n    addresses = extract_addresses(document)\n    alice_at_example_id = addresses[0][0]\n    alice_at_wonderland_id = addresses[1][0]\n\n    # Check that we got a verification mail.\n\n    subject = r\"\\[Critic\\] Please verify your email: \" + RE_ALICE_AT_WONDERLAND\n\n    verification_mail = mailbox.pop(\n        accept=[testing.mailbox.ToRecipient(ALICE_AT_WONDERLAND),\n                testing.mailbox.WithSubject(subject)])\n\n    # Extract the verification link from the verification mail.\n\n    for line in verification_mail.lines:\n        match = re.match(\n            r\"\\s+http://[^/]+/verifyemail\\?email=([^&]+)&token=([^&]+)\", line)\n        if match:\n            email, token = match.groups()\n            testing.expect.check(ALICE_AT_WONDERLAND, email)\n            break\n    else:\n        testing.expect.check(\n            \"<verification link in verification mail>\",\n            \"<expected content not found>\")\n\n    # Request another verification mail.\n\n    frontend.operation(\n        \"requestverificationemail\",\n        data={ \"email_id\": alice_at_wonderland_id })\n\n    mailbox.pop(accept=[testing.mailbox.ToRecipient(ALICE_AT_WONDERLAND),\n                        testing.mailbox.WithSubject(subject)])\n\n    # Verify the new email address.\n\n    response = frontend.page(\n        \"verifyemail\",\n        params={ \"email\": ALICE_AT_WONDERLAND,\n                 \"token\": token },\n        disable_redirects=True,\n        expected_http_status=307)\n\n    testing.expect.check(\n        \"/home?email_verified=%d\" % alice_at_wonderland_id,\n        response.headers[\"Location\"])\n\n    # Check that it's now displayed as verified.\n\n    frontend.page(\n        \"home\",\n        params={ \"email_verified\": str(alice_at_wonderland_id) },\n        expect={ \"addresses\": emails(\n                [(True, ALICE_AT_EXAMPLE, None),\n                 (False, ALICE_AT_WONDERLAND, \"verified\")]) })\n\n    # Make the 
new address the selected one.\n\n    frontend.operation(\n        \"selectemailaddress\",\n        data={ \"email_id\": alice_at_wonderland_id })\n\n    frontend.page(\n        \"home\",\n        expect={ \"addresses\": emails(\n                [(False, ALICE_AT_EXAMPLE, None),\n                 (True, ALICE_AT_WONDERLAND, \"verified\")]) })\n\n    # Try to delete the now selected address.\n\n    frontend.operation(\n        \"deleteemailaddress\",\n        data={ \"email_id\": alice_at_wonderland_id },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"notallowed\" })\n\n    frontend.page(\n        \"home\",\n        expect={ \"addresses\": emails(\n                [(False, ALICE_AT_EXAMPLE, None),\n                 (True, ALICE_AT_WONDERLAND, \"verified\")]) })\n\n    # Delete the other address instead.\n\n    frontend.operation(\n        \"deleteemailaddress\",\n        data={ \"email_id\": alice_at_example_id })\n\n    frontend.page(\n        \"home\",\n        expect={ \"addresses\": emails(\n                [(True, ALICE_AT_WONDERLAND, \"verified\")]) })\n\n    # Now delete the single, selected address.\n\n    frontend.operation(\n        \"deleteemailaddress\",\n        data={ \"email_id\": alice_at_wonderland_id })\n\n    frontend.page(\n        \"home\",\n        expect={ \"addresses\": no_emails })\n\nwith frontend.signin():\n    # Re-add Alice's original address as the system administrator.\n\n    frontend.operation(\n        \"addemailaddress\",\n        data={ \"subject_id\": ALICE_ID,\n               \"email\": ALICE_AT_EXAMPLE })\n\n    frontend.page(\n        \"home\",\n        params={ \"user\": \"alice\" },\n        expect={ \"addresses\": emails([(True, ALICE_AT_EXAMPLE, None)]) })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/003-oauth.py",
    "content": "import re\nimport urllib\nimport urlparse\n\ndef externalauthURL(name):\n    return \"externalauth/%s?%s\" % (name, urllib.urlencode({ \"target\": \"/\" }))\n\ndef signout():\n    frontend.operation(\"endsession\", data={})\n\nwith_class = testing.expect.with_class\n\ndef isprefix(expected, actual):\n    return actual.startswith(expected)\ndef issuffix(expected, actual):\n    return actual.endswith(expected)\n\ndef expect_system_mail(subject):\n    system_mail = mailbox.pop(testing.mailbox.ToRecipient(\"system@example.org\"))\n    testing.expect.check(subject, system_mail.headers[\"subject\"][0][\"value\"])\n\ndef start_externalauth(name):\n    response = frontend.page(\n        externalauthURL(name),\n        disable_redirects=True,\n        expected_http_status=302)\n\n    redirect_url = response.headers[\"Location\"]\n\n    testing.expect.check(\"https://example.com/authorize?\",\n                         redirect_url,\n                         equal=isprefix)\n\n    parsed_url = urlparse.urlparse(redirect_url)\n    parsed_query = urlparse.parse_qs(parsed_url.query)\n    state = parsed_query.get(\"state\", [\"no state received\"])[0]\n\n    if state == \"no state received\":\n        testing.expect.check(\"<state parameter in authorize URI query>\",\n                             \"<no state parameter: %r>\" % parsed_url.query)\n\n    return state\n\ndef finish_externalauth(name, state):\n    response = frontend.page(\n        \"oauth/\" + name,\n        params={ \"state\": state,\n                 \"code\": \"correct\" },\n        disable_redirects=True,\n        expected_http_status=302)\n\n    return response.headers[\"Location\"]\n\n# Check that all the expected links to external providers are present\n# on the \"Sign in\" page.\n\nNAMES = [\"alice\", \"carol\", \"felix\", \"gina\"]\n\ndef oauth_links(document):\n    providers = document.findAll(\"div\", attrs=with_class(\"provider\"))\n    names = set(NAMES)\n    expected_text = \"Sign in 
using your \"\n    for provider in providers:\n        testing.expect.check(expected_text, provider.contents[0])\n        expected_text = \"or \"\n\n        link = provider.find(\"a\")\n        words = link.string.split()\n        testing.expect.check(2, len(words))\n        testing.expect.check(\"account\", words[-1], equal=issuffix)\n\n        name = words[0].lower()\n        if name in names:\n            testing.expect.check(\"/\" + externalauthURL(name), link[\"href\"])\n            names.remove(name)\n        else:\n            testing.logger.error(\"Unexpected provider: %r\" % name)\n\n    if names:\n        testing.expect.check(\"<link to providers: %r>\" % names,\n                             \"<no links found>\")\n\nfrontend.page(\n    \"login\",\n    expect={ \"oauth links\": oauth_links })\n\n#\n# Try to sign in using the 'alice' provider, then connect alice's\n# account manually, and try again.  Make some mistakes along the way.\n#\n\nstate = start_externalauth(\"alice\")\n\n# Try with the wrong state.\nfrontend.page(\n    \"oauth/alice\",\n    params={ \"state\": \"not the right state\",\n             \"code\": \"irrelevant\" },\n    expect={ \"message\": testing.expect.message(\"Authentication failed\",\n                                               \"Invalid OAuth state\",\n                                               body_equal=re.search) })\nexpect_system_mail(\n    \"wsgi: InvalidRequest: Invalid OAuth state: not the right state\")\n\n# Try with the wrong code (the right code is always \"correct\".)\nfrontend.page(\n    \"oauth/alice\",\n    params={ \"state\": state,\n             \"code\": \"incorrect\" },\n    expect={ \"message\": testing.expect.message(\"Authentication failed\",\n                                               \"Incorrect code\",\n                                               body_equal=re.search) })\nexpect_system_mail(\"wsgi: Failure: Incorrect code\")\n\nredirect_url = finish_externalauth(\"alice\", 
state)\nmessage_check = testing.expect.message(\"User registration not enabled\", None)\n\nfrontend.page(\n    redirect_url,\n    expect={ \"message\": message_check })\n\n# Connect the account manually.\ninstance.criticctl([\"connect\",\n                    \"--name\", \"alice\",\n                    \"--provider\", \"alice\",\n                    \"--account\", \"account-alice\"])\n\n# Sign in for real now.\nstate = start_externalauth(\"alice\")\n\nfrontend.collect_session_cookie()\n\nredirect_url = finish_externalauth(\"alice\", state)\ntesting.expect.check(\"/\", redirect_url)\n\nwith frontend.cookie_session(signout):\n    document_title_check = testing.expect.document_title(\n        \"Alice von Testing's Home\")\n\n    frontend.page(\n        \"home\",\n        expect={ \"document title\": document_title_check })\n\n#\n# Create user 'carol' by signing in using the 'carol' provider.\n#\n\nstate = start_externalauth(\"carol\")\nredirect_url = finish_externalauth(\"carol\", state)\n\ntesting.expect.check(\"/createuser?\",\n                     redirect_url,\n                     equal=isprefix)\n\nparsed_url = urlparse.urlparse(redirect_url)\nparsed_query = urlparse.parse_qs(parsed_url.query)\n\ntesting.expect.check([\"carol\"], parsed_query.get(\"provider\"))\ntesting.expect.check([\"account-carol\"], parsed_query.get(\"account\"))\ntesting.expect.check(1, len(parsed_query.get(\"token\")))\ntesting.expect.check([\"/\"], parsed_query.get(\"target\"))\ntesting.expect.check([\"carol\"], parsed_query.get(\"username\"))\ntesting.expect.check([\"carol@example.org\"], parsed_query.get(\"email\"))\ntesting.expect.check([\"Carol von Testing\"], parsed_query.get(\"fullname\"))\n\ntoken = parsed_query.get(\"token\")[0]\n\n# Try with wrong account name.\nfrontend.operation(\n    \"registeruser\",\n    data={ \"username\": \"carol\",\n           \"fullname\": \"Carol von Testing\",\n           \"email\": \"carol@example.org\",\n           \"external\": { \"provider\": 
\"carol\",\n                         \"account\": \"wrong-carol\",\n                         \"token\": token }},\n    expect={ \"message\": \"Invalid external authentication state.\" })\n\n# Try with wrong token.\nfrontend.operation(\n    \"registeruser\",\n    data={ \"username\": \"carol\",\n           \"fullname\": \"Carol von Testing\",\n           \"email\": \"carol@example.org\",\n           \"external\": { \"provider\": \"carol\",\n                         \"account\": \"account-carol\",\n                         \"token\": \"wrong token\" }},\n    expect={ \"message\": \"Invalid external authentication state.\" })\n\nfrontend.collect_session_cookie()\n\n# Use right account and token.  This should leave us signed in as carol.\nfrontend.operation(\n    \"registeruser\",\n    data={ \"username\": \"carol\",\n           \"fullname\": \"Carol von Testing\",\n           \"email\": \"carol@example.org\",\n           \"external\": { \"provider\": \"carol\",\n                         \"account\": \"account-carol\",\n                         \"token\": token }})\n\ninstance.registeruser(\"carol\")\n\nwith frontend.cookie_session(signout):\n    # Check that the email address isn't unverified.\n    def email_not_unverified(document):\n        address = document.find(attrs=with_class(\"address\"))\n        if address.find(attrs=with_class(\"unverified\")):\n            testing.expect.check(\"<carol's email is not unverified>\",\n                                 \"<carol's email is unverified>\")\n\n    document_title_check = testing.expect.document_title(\n        \"Carol von Testing's Home\")\n\n    frontend.page(\n        \"home\",\n        expect={ \"document title\": document_title_check,\n                 \"email not unverified\": email_not_unverified })\n\n    expect_system_mail(\"wsgi[registeruser]: User 'carol' registered\")\n\n#\n# Create user 'felix' by signing in using the 'felix' provider, which\n# has 'bypass_createuser' set, so this will be 
quick.\n#\n\nstate = start_externalauth(\"felix\")\n\nfrontend.collect_session_cookie()\n\nredirect_url = finish_externalauth(\"felix\", state)\n\ninstance.registeruser(\"felix\")\n\nwith frontend.cookie_session(signout):\n    document_title_check = testing.expect.document_title(\n        \"Felix von Testing's Home\")\n\n    frontend.page(\n        \"home\",\n        expect={ \"document title\": document_title_check,\n                 \"email not unverified\": email_not_unverified })\n\n    expect_system_mail(\"wsgi[oauth/felix]: User 'felix' registered\")\n\n#\n# Create user 'gina' by signing in using the 'gina' provider, which\n# has 'verify_email_addresses' set.\n#\n\nstate = start_externalauth(\"gina\")\nredirect_url = finish_externalauth(\"gina\", state)\n\ntesting.expect.check(\"/createuser?\",\n                     redirect_url,\n                     equal=isprefix)\n\nparsed_url = urlparse.urlparse(redirect_url)\nparsed_query = urlparse.parse_qs(parsed_url.query)\ntoken = parsed_query.get(\"token\")[0]\n\nfrontend.collect_session_cookie()\n\n# Use right account and token.  
This should leave us signed in as gina.\nfrontend.operation(\n    \"registeruser\",\n    data={ \"username\": \"gina\",\n           \"fullname\": \"Gina von Testing\",\n           \"email\": \"gina@example.org\",\n           \"external\": { \"provider\": \"gina\",\n                         \"account\": \"account-gina\",\n                         \"token\": token }})\n\ninstance.registeruser(\"gina\")\n\nwith frontend.cookie_session(signout):\n    # Check that the email address is unverified.\n    def email_unverified(document):\n        address = document.find(attrs=with_class(\"address\"))\n        if not address.find(attrs=with_class(\"unverified\")):\n            testing.expect.check(\"<gina's email unverified>\",\n                                 \"<gina's email is not unverified>\")\n\n    document_title_check = testing.expect.document_title(\n        \"Gina von Testing's Home\")\n\n    frontend.page(\n        \"home\",\n        expect={ \"document title\": document_title_check,\n                 \"email unverified\": email_unverified })\n\n    expect_system_mail(\"wsgi[registeruser]: User 'gina' registered\")\n\n    subject = r\"\\[Critic\\] Please verify your email: gina@example\\.org\"\n\n    mailbox.pop(accept=[testing.mailbox.ToRecipient(\"gina@example.org\"),\n                        testing.mailbox.WithSubject(subject)])\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/004-password.py",
    "content": "# Create user 'iris' with no password.\n\ninstance.criticctl([\"adduser\",\n                    \"--name\", \"iris\",\n                    \"--email\", \"iris@example.org\",\n                    \"--fullname\", \"'Iris von Testing'\",\n                    \"--no-password\"])\n\ninstance.registeruser(\"iris\")\n\nwith_class = testing.expect.with_class\n\ndef check_password_ui(expected_value, expected_action):\n    def check(document):\n        row = document.find(\"tr\", attrs=with_class(\"password\"))\n        cell = row.find(\"td\", attrs=with_class(\"value\"))\n        button = cell.find(\"button\")\n\n        testing.expect.check(expected_value, cell.contents[0])\n        testing.expect.check(\n            expected_action, button.string if button else \"(no action)\")\n    return check\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"home\",\n        params={ \"user\": \"iris\" },\n        expect={ \"password UI\": check_password_ui(\"not set\", \"(no action)\") })\n\nwith frontend.signin():\n    frontend.page(\n        \"home\",\n        params={ \"user\": \"iris\",\n                 \"readonly\": \"no\" },\n        expect={ \"password UI\": check_password_ui(\"not set\", \"Set password\") })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"new_pw\": \"testing\" })\n\n    frontend.page(\n        \"home\",\n        params={ \"user\": \"iris\",\n                 \"readonly\": \"no\" },\n        expect={ \"password UI\": check_password_ui(\"****\", \"Set password\") })\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"home\",\n        params={ \"user\": \"iris\" },\n        expect={ \"password UI\": check_password_ui(\"****\", \"(no action)\") })\n\nwith frontend.signin(\"iris\"):\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\n    frontend.operation(\n        
\"changepassword\",\n        data={ \"subject\": \"alice\",\n               \"new_pw\": \"custom\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"notallowed\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"new_pw\": \"custom\" },\n        expect={ \"status\": \"failure\",\n                 \"message\": \"The provided current password is not correct.\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"current_pw\": \"wrong\",\n               \"new_pw\": \"custom\" },\n        expect={ \"status\": \"failure\",\n                 \"message\": \"The provided current password is not correct.\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"current_pw\": \"testing\",\n               \"new_pw\": \"custom\" })\n\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\nwith frontend.signin(\"iris\", \"custom\"):\n    instance.criticctl([\"passwd\",\n                        \"--name\", \"iris\", \"--no-password\"])\n\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"not set\", \"Set password\") })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"current_pw\": \"wrong\",\n               \"new_pw\": \"testing\" },\n        expect={ \"status\": \"failure\",\n                 \"message\": \"The provided current password is not correct.\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"subject\": \"iris\",\n               \"new_pw\": \"testing\" })\n\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\ninstance.criticctl([\"passwd\",\n                    \"--name\", 
\"iris\", \"--password\", \"other\"])\n\nwith frontend.signin(\"iris\", \"other\"):\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\n# Try changing admin's password too.\n\nwith frontend.signin():\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"new_pw\": \"custom\" },\n        expect={ \"status\": \"failure\",\n                 \"message\": \"The provided current password is not correct.\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"current_pw\": \"wrong\",\n               \"new_pw\": \"custom\" },\n        expect={ \"status\": \"failure\",\n                 \"message\": \"The provided current password is not correct.\" })\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"current_pw\": \"testing\",\n               \"new_pw\": \"custom\" })\n\n    frontend.page(\n        \"home\",\n        expect={ \"password UI\": check_password_ui(\"****\", \"Change password\") })\n\n    # Better change it back again, or we'd break lots of following tests...\n\n    frontend.operation(\n        \"changepassword\",\n        data={ \"current_pw\": \"custom\",\n               \"new_pw\": \"testing\" })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/005-accesstoken.py",
    "content": "# Sign in and create an access token.\nwith frontend.signin(\"alice\"):\n    check_user(alice)\n\n    access_token = frontend.json(\n        \"users/%d/accesstokens\" % alice.id,\n        post={ \"title\": \"005-accesstoken\" },\n        expect={\n            \"id\": int,\n            \"access_type\": \"user\",\n            \"user\": alice.id,\n            \"title\": \"005-accesstoken\",\n            \"part1\": str,\n            \"part2\": str,\n            \"profile\": {\n                \"http\": { \"rule\": \"allow\",\n                          \"exceptions\": [] },\n                \"repositories\": { \"rule\": \"allow\",\n                                  \"exceptions\": [] },\n                \"extensions\": { \"rule\": \"allow\",\n                                \"exceptions\": [] }\n            }\n        })\n\n    token_id = access_token[\"id\"]\n    username = access_token[\"part1\"]\n    password = access_token[\"part2\"]\n\n    # Get the access token and its components.\n    frontend.json(\n        \"accesstokens/%d\" % token_id,\n        expect=access_token)\n    frontend.json(\n        \"users/me/accesstokens/%d\" % token_id,\n        expect=access_token)\n    frontend.json(\n        \"accesstokens/%d/profile\" % token_id,\n        expect={\n            \"profile\": access_token[\"profile\"]\n        })\n    frontend.json(\n        \"accesstokens/%d/profile/http\" % token_id,\n        expect={\n            \"profile/http\": access_token[\"profile\"][\"http\"]\n        })\n    frontend.json(\n        \"accesstokens/%d/profile/repositories\" % token_id,\n        expect={\n            \"profile/repositories\": access_token[\"profile\"][\"repositories\"]\n        })\n    frontend.json(\n        \"accesstokens/%d/profile/extensions\" % token_id,\n        expect={\n            \"profile/extensions\": access_token[\"profile\"][\"extensions\"]\n        })\n\ncheck_user(anonymous)\n\n# Check that Alice can't authenticate using the token via the 
regular login\n# page.\nfrontend.validatelogin(username, password, expect_failure=\"Invalid username\")\n\ncheck_user(anonymous)\n\n# Check that Alice can authenticate using the token and HTTP authentication.\nwith frontend.signin(access_token=access_token):\n    check_user(alice, \"accesstoken\")\n\ncheck_user(anonymous)\n\n# Check that Bob can't access Alice's access tokens.\nwith frontend.signin(\"bob\"):\n    frontend.json(\n        \"users/%d/accesstokens\" % alice.id,\n        expected_http_status=403)\n    frontend.json(\n        \"users/%d/accesstokens/%d\" % (alice.id, token_id),\n        expected_http_status=403)\n    frontend.json(\n        \"accesstokens\",\n        expected_http_status=403)\n    frontend.json(\n        \"accesstokens/%d\" % token_id,\n        expected_http_status=403)\n\n# Check that an administrator can access Alice's access tokens.\nwith frontend.signin():\n    frontend.json(\n        \"users/%d/accesstokens\" % alice.id,\n        expect={\n            \"accesstokens\": [access_token]\n        })\n    frontend.json(\n        \"users/%d/accesstokens/%d\" % (alice.id, token_id),\n        expect=access_token)\n    frontend.json(\n        \"accesstokens\",\n        expect={\n            \"accesstokens\": [access_token]\n        })\n    frontend.json(\n        \"accesstokens/%d\" % token_id,\n        expect=access_token)\n\ncheck_user(anonymous)\n\n# Sign in and delete the access token.\nwith frontend.signin(\"alice\"):\n    check_user(alice)\n\n    frontend.json(\n        \"users/%d/accesstokens/%d\" % (alice.id, token_id),\n        delete=True,\n        expected_http_status=204)\n\ncheck_user(anonymous)\n\n# Check that Alice can no longer authenticate using the token and HTTP\n# authentication.\nwith frontend.signin(access_token=access_token):\n    # Using invalid HTTP authentication should trigger a 401 Unauthorized (and\n    # not lead to anonymous access.)\n    frontend.page(\n        \"tutorial\",\n        
expected_http_status=401)\n\ncheck_user(anonymous)\n\n# Sign in as admin and create an access token for anonymous access.\nwith frontend.signin():\n    check_user(admin)\n\n    access_token = frontend.json(\n        \"accesstokens\",\n        post={\n            \"access_type\": \"anonymous\",\n            \"title\": \"005-accesstoken (anonymous)\"\n        },\n        expect={\n            \"id\": int,\n            \"access_type\": \"anonymous\",\n            \"user\": None,\n            \"title\": \"005-accesstoken (anonymous)\",\n            \"part1\": str,\n            \"part2\": str,\n            \"profile\": {\n                \"http\": { \"rule\": \"allow\",\n                          \"exceptions\": [] },\n                \"repositories\": { \"rule\": \"allow\",\n                                  \"exceptions\": [] },\n                \"extensions\": { \"rule\": \"allow\",\n                                \"exceptions\": [] }\n            }\n        })\n\n    token_id = access_token[\"id\"]\n\ncheck_user(anonymous)\n\n# Check that we can authenticate using the token and HTTP authentication, and\n# that we're then anonymous.\n#\n# This is somewhat silly; we were anonymous before, so it's difficult to know if\n# authentication succeeded or not.  
This kind of access token is mostly useful\n# in a system that doesn't otherwise allow anonymous access.\nwith frontend.signin(access_token=access_token):\n    check_user(anonymous, \"accesstoken\")\n\ncheck_user(anonymous)\n\n# Sign in and delete the access token.\nwith frontend.signin():\n    check_user(admin)\n\n    access_token = frontend.json(\n        \"accesstokens/%d\" % token_id,\n        delete=True,\n        expected_http_status=204)\n\n# Check that Alice (not an administrator) can't create an anonymous token.\nwith frontend.signin(\"alice\"):\n    check_user(alice)\n\n    access_token = frontend.json(\n        \"accesstokens\",\n        post={\n            \"access_type\": \"anonymous\"\n        },\n        expected_http_status=403,\n        expect={\n            \"error\": {\n                \"title\": \"Permission denied\",\n                \"message\": \"Must be an administrator\"\n            }\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/006-accesscontrol-http.py",
    "content": "# Create an access token, and restrict it to not allow loading /home.\nwith frontend.signin(\"alice\"):\n    access_token = frontend.json(\n        \"users/me/accesstokens\",\n        post={ \"title\": \"token #1 for 006-accesscontrol-http.py\" },\n        expect={\n            \"profile\": {\n                \"http\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                },\n                \"repositories\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                },\n                \"extensions\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                }\n            },\n            # use lenient checking\n            \"*\": \"*\"\n        })\n\n    frontend.json(\n        \"users/me/accesstokens/%d/profile/http\" % access_token[\"id\"],\n        put={\n            \"exceptions\": [\n                {\n                    \"path_pattern\": \"home\"\n                }\n            ]\n        },\n        expect={\n            \"profile/http\": {\n                \"rule\": \"allow\",\n                \"exceptions\": [\n                    {\n                        \"id\": int,\n                        \"request_method\": None,\n                        \"path_pattern\": \"home\"\n                    }\n                ]\n            }\n        })\n\n    # Just to make sure: check that Alice can (still) access /home when\n    # authenticating normally.\n    frontend.page(\"home\")\n\nwith frontend.signin(access_token=access_token):\n    # /home should now return \"403 Forbidden\".\n    frontend.page(\n        \"home\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: GET /home\")\n        })\n\n    # A POST request should also return \"403 Forbidden\" (even though it wouldn't\n    # have 
worked anyway.)\n    frontend.page(\n        \"home\",\n        post=\"\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: POST /home\")\n        })\n\n    # /dashboard should still work, of course.\n    frontend.page(\n        \"dashboard\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Dashboard\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"alice\"))\n        })\n\n# Update the access token to deny all requests except \"GET /home\" instead.\nwith frontend.signin(\"alice\"):\n    result = frontend.json(\n        \"users/me/accesstokens/%d/profile\" % access_token[\"id\"],\n        put={\n            \"http\": {\n                \"rule\": \"deny\",\n                \"exceptions\": [\n                    {\n                        \"request_method\": \"GET\",\n                        \"path_pattern\": \"home\"\n                    }\n                ]\n            }\n        },\n        expect={\n            \"profile\": {\n                \"http\": {\n                    \"rule\": \"deny\",\n                    \"exceptions\": [\n                        {\n                            \"id\": int,\n                            \"request_method\": \"GET\",\n                            \"path_pattern\": \"home\"\n                        }\n                    ]\n                },\n                \"repositories\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                },\n                \"extensions\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                }\n            }\n        })\n\n    home_exception = result[\"profile\"][\"http\"][\"exceptions\"][0]\n\nwith frontend.signin(access_token=access_token):\n    # /home should now be allowed.\n    frontend.page(\n        
\"home\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Alice von Testing's Home\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"alice\"))\n        })\n\n    # A POST request should still return \"403 Forbidden\" though.\n    frontend.page(\n        \"home\",\n        post=\"\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: POST /home\")\n        })\n\n    # /dashboard should no longer work.\n    frontend.page(\n        \"dashboard\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: GET /dashboard\")\n        })\n\n# Update the access token to also allow access to \"GET /dashboard\".\nwith frontend.signin(\"alice\"):\n    frontend.json(\n        \"users/me/accesstokens/%d/profile/http/exceptions\" % access_token[\"id\"],\n        post={\n            \"request_method\": \"GET\",\n            \"path_pattern\": \"dashboard\"\n        },\n        expect={\n            \"profile/http/exceptions\": [\n                {\n                    \"id\": int,\n                    \"request_method\": \"GET\",\n                    \"path_pattern\": \"home\"\n                },\n                {\n                    \"id\": int,\n                    \"request_method\": \"GET\",\n                    \"path_pattern\": \"dashboard\"\n                }\n            ]\n        })\n\nwith frontend.signin(access_token=access_token):\n    # /dashboard should now work again.\n    frontend.page(\n        \"dashboard\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Dashboard\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"alice\"))\n        })\n\n# Update the access token by deleting the /home 
exception.\nwith frontend.signin(\"alice\"):\n    frontend.json(\n        (\"users/me/accesstokens/%d/profile/http/exceptions/%d\"\n         % (access_token[\"id\"], home_exception[\"id\"])),\n        delete=True,\n        expect={\n            \"profile\": {\n                \"http\": {\n                    \"rule\": \"deny\",\n                    \"exceptions\": [\n                        {\n                            \"id\": int,\n                            \"request_method\": \"GET\",\n                            \"path_pattern\": \"dashboard\"\n                        }\n                    ]\n                },\n                \"repositories\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                },\n                \"extensions\": {\n                    \"rule\": \"allow\",\n                    \"exceptions\": []\n                }\n            },\n            # use lenient checking\n            \"*\": \"*\"\n        })\n\nwith frontend.signin(access_token=access_token):\n    # /home should now return \"403 Forbidden\" again.\n    frontend.page(\n        \"home\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: GET /home\")\n        })\n\n    # /dashboard should still work.\n    frontend.page(\n        \"dashboard\",\n        expect={\n            \"document_title\": testing.expect.document_title(u\"Dashboard\"),\n            \"script_user\": testing.expect.script_user(instance.user(\"alice\"))\n        })\n\n# Create an access token for anonymous access, and restrict it to not allow\n# loading /branches.\nwith frontend.signin():\n    access_token = frontend.json(\n        \"accesstokens\",\n        post={\n            \"title\": \"token #2 for 006-accesscontrol-http.py\",\n            \"access_type\": \"anonymous\",\n            \"profile\": {\n                \"http\": {\n    
                \"rule\": \"allow\",\n                    \"exceptions\": [{\n                        \"path_pattern\": \"branches\"\n                    }]\n                }\n            },\n        })\n\nwith frontend.signin(access_token=access_token):\n    check_user(anonymous, \"accesstoken\")\n\n    # /branches should now return \"403 Forbidden\".\n    frontend.page(\n        \"branches\",\n        expected_http_status=403,\n        expect={\n            \"message_title\": testing.expect.message(\n                u\"Access denied\",\n                u\"Access denied: GET /branches\")\n        })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/007-json-session.py",
    "content": "frontend.json(\n    \"sessions/current\",\n    expect={\n        \"user\": None,\n        \"type\": None,\n        \"fields\": [{\n            \"identifier\": \"username\",\n            \"label\": \"Username:\",\n            \"hidden\": False,\n            \"description\": None\n        }, {\n            \"identifier\": \"password\",\n            \"label\": \"Password:\",\n            \"hidden\": True,\n            \"description\": None\n        }]\n    })\n\n# Sign in as alice.\nwith frontend.signin(\"alice\", use_json_api=True):\n    check_user(alice)\n\n    # Sign out prematurely, just to make sure the signout actually works as\n    # expected.\n    #\n    # Exiting the frontend.signin() scope will do the same, but will cause the\n    # session cookie to be \"deleted\" regardless of what the server does, so\n    # could hide signout failures.\n    frontend.json(\n        \"sessions/current\",\n        delete=True,\n        expected_http_status=204)\n\n    check_user(anonymous)\n\nwith testing.utils.access_token(\"alice\", profile={}) as access_token:\n    with frontend.signin(access_token=access_token):\n        check_user(alice, \"accesstoken\")\n\nfrontend.json(\n    \"sessions\",\n    expected_http_status=400,\n    expect={\n        \"error\": {\n            \"title\": \"Invalid API request\",\n            \"message\": \"Resource requires an argument: v1/sessions\"\n        }\n    })\n\nfrontend.json(\n    \"sessions/invalid\",\n    expected_http_status=400,\n    expect={\n        \"error\": {\n            \"title\": \"Invalid API request\",\n            \"message\": 'Resource argument must be \"current\"'\n        }\n    })\n\nfrontend.json(\n    \"sessions\",\n    post={\n        \"username\": \"alice\",\n        \"password\": \"wrong\"\n    },\n    expected_http_status=403,\n    expect={\n        \"error\": {\n            \"title\": \"Session error\",\n            \"message\": \"Wrong password\"\n        }\n    })\n\nfrontend.json(\n    
\"sessions\",\n    post={\n        \"username\": \"bobsicle\",\n        \"password\": \"testing\"\n    },\n    expected_http_status=403,\n    expect={\n        \"error\": {\n            \"title\": \"Session error\",\n            \"message\": \"Invalid username\"\n        }\n    })\n"
  },
  {
    "path": "testing/tests/001-main/001-empty/004-mixed/__init__.py",
    "content": "def check_user(user, session_type=False):\n    frontend.page(\n        \"dashboard\",\n        expect={\n            \"script_user\": testing.expect.script_user(user)\n        })\n\n    if session_type is False:\n        if user.id is None:\n            # Anonymous user.\n            session_type = None\n        else:\n            session_type = \"normal\"\n\n    frontend.json(\n        \"sessions/current\",\n        params={ \"fields\": \"user,type\" },\n        expect={\n            \"user\": user.id,\n            \"type\": session_type\n        })\n\nanonymous = testing.User.anonymous()\nalice = instance.user(\"alice\")\nadmin = instance.user(\"admin\")\n\ncheck_user(anonymous)\n"
  },
  {
    "path": "testing/tests/001-main/002-createrepository.py",
    "content": "import time\n\ndef check_repository(document):\n    rows = document.findAll(\"tr\", attrs=testing.expect.with_class(\"repository\"))\n    testing.expect.check(1, len(rows))\n\n    def check_cell(row, class_name, expected_string, inline_element_type=None):\n        cells = row.findAll(\"td\", attrs=testing.expect.with_class(class_name))\n        testing.expect.check(1, len(cells))\n        if inline_element_type:\n            testing.expect.check(1, len(cells[0].findAll(inline_element_type)))\n            string = cells[0].findAll(\"i\")[0].string\n        else:\n            string = cells[0].string\n        if string is None:\n            string = \"\"\n        testing.expect.check(expected_string, string)\n\n    check_cell(rows[0], \"name\", \"critic\")\n    check_cell(rows[0], \"location\", \"http://%s/critic.git\" % instance.hostname)\n    check_cell(rows[0], \"upstream\", \"&nbsp;\")\n\n    rows = document.findAll(\"tr\", attrs=testing.expect.with_class(\"details\"))\n    testing.expect.check(1, len(rows))\n\n    tables = rows[0].findAll(\"table\", attrs=testing.expect.with_class(\"trackedbranches\"))\n    testing.expect.check(1, len(tables))\n\n    # Would like to use 'tables[0].findAll()' here, but BeautifulSoup apparently\n    # doesn't parse nested tables correctly, so these rows aren't actually part\n    # of the 'trackedbranches' table according to it.\n    rows = document.findAll(\"tr\", attrs=testing.expect.with_class(\"branch\"))\n    testing.expect.check(2, len(rows))\n\n    check_cell(rows[0], \"localname\", \"Tags\", inline_element_type=\"i\")\n    check_cell(rows[0], \"remote\", repository.url)\n    check_cell(rows[0], \"remotename\", \"N/A\", inline_element_type=\"i\")\n    check_cell(rows[0], \"enabled\", \"Yes\")\n    check_cell(rows[0], \"users\", \"\")\n\n    check_cell(rows[1], \"localname\", \"master\")\n    check_cell(rows[1], \"remote\", repository.url)\n    check_cell(rows[1], \"remotename\", \"master\")\n    
check_cell(rows[1], \"enabled\", \"Yes\")\n    check_cell(rows[1], \"users\", \"\")\n\nwith frontend.signin():\n    # Check that this URL isn't handled already.  We're using it later to detect\n    # that the repository has been created and the tracked branch fetched, and\n    # if it's already handled for some reason, that check won't be reliable.\n    frontend.page(\"critic/master\", expected_http_status=404)\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"critic\",\n                              \"path\": \"critic\",\n                              \"mirror\": { \"remote_url\": repository.url,\n                                          \"remote_branch\": \"master\",\n                                          \"local_branch\": \"master\" } })\n\n    instance.synchronize_service(\"branchtracker\")\n\n    with repository.workcopy(empty=True) as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        try:\n            work.run(\n                [\"ls-remote\", \"--exit-code\", REMOTE_URL, \"refs/heads/master\"])\n        except testing.repository.GitCommandError:\n            logger.error(\"Repository main branch ('refs/heads/master') \"\n                         \"not fetched as expected.\")\n            raise testing.TestFailure\n\n    # Check that /repositories still loads correctly now that there's a\n    # repository in the system.\n    frontend.page(\n        \"repositories\",\n        expect={ \"document_title\": testing.expect.document_title(u\"Repositories\"),\n                 \"content_title\": testing.expect.paleyellow_title(0, u\"Repositories\"),\n                 \"repository\": check_repository })\n\n    # Add another repository. 
This time, without a tracking branch, but we'll\n    # actually push the same branch (IOW our current branch of critic.git) to\n    # it, simply because we don't really have another available with anything\n    # useful in it.\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"other\",\n                              \"path\": \"other\" })\n\n    repository.run(\n        [\"push\", instance.repository_url(\"alice\", repository=\"other\"),\n         \"HEAD:refs/heads/master\"])\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"a\" * 65,\n                              \"path\": \"validpath2\" },\n                       expect={ \"status\": \"failure\",\n                                \"code\": \"paramtoolong:data.name\" })\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"\",\n                              \"path\": \"validpath1\" },\n                       expect={ \"status\": \"failure\",\n                                \"code\": \"paramtooshort:data.name\" })\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"a/b\",\n                              \"path\": \"validpath3\" },\n                       expect={ \"status\": \"failure\",\n                                \"code\": \"paramcontainsillegalchar:data.name\",\n                                \"message\": \"invalid input: short name may not contain the character '/'\" })\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"critic.git\",\n                              \"path\": \"validpath3\" },\n                       expect={ \"status\": \"failure\",\n                                \"code\": \"badsuffix_name\" })\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"r\",\n                              \"path\": \"validpath\" },\n                       expect={ \"status\": \"failure\",\n           
                     \"code\": \"invalid_name\" })\n\n    frontend.operation(\"addrepository\",\n                       data={ \"name\": \"validname\",\n                              \"path\": \"\" },\n                       expect={ \"status\": \"failure\",\n                                \"code\": \"paramtooshort:data.path\" })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/001-rulerColumn.py",
    "content": "import re\n\n# This is an arbitrary (and fairly small) commit on master:\nCOMMIT = \"927e2ba833cb0c9ce588b5f59c42bbb246e3e20c\"\n\ndef check_rulerColumn(document):\n    for script in document.findAll(\"script\"):\n        # Ignore external scripts.\n        if script.has_key(\"src\"):\n            continue\n\n        if re.match(r\"var\\s+rulerColumn\\s*=\\s*0;\", script.string):\n            break\n    else:\n        testing.expect.check(\"<rulerColumn script>\",\n                             \"<expected content not found>\")\n\nfrontend.page(\"critic/%s\" % COMMIT,\n              expect={ \"rulerColumn_script\": check_rulerColumn })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/002-emptyfile.py",
    "content": "# This is the commit that adds testing/input/empty.txt.\nCOMMIT = \"47c6cea51af517107c403d96810fce946825aacc\"\n\ndef check_description(document):\n    actual = \"<expected content not found>\"\n\n    for row in document.findAll(\"tr\"):\n        cells = row.findAll(\"td\")\n        if len(cells) >= 2 \\\n                and cells[0].has_key(\"class\") \\\n                and cells[0][\"class\"] == \"path\" \\\n                and cells[0].a \\\n                and cells[0].a.string \\\n                and cells[0].a.string.endswith(\"/empty.txt\") \\\n                and cells[1].i \\\n                and cells[1].i.string:\n            actual = cells[1].i.string\n            break\n\n    testing.expect.check(u\"empty\", actual)\n\nwith frontend.signin():\n    frontend.page(\"showcommit\",\n                  params={ \"repository\": \"critic\",\n                           \"sha1\": COMMIT },\n                  expect={ \"description\": check_description })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/003-binaryfile.py",
    "content": "# This is the commit that adds testing/input/binary.\nCOMMIT = \"47c6cea51af517107c403d96810fce946825aacc\"\n\ndef check_description(document):\n    actual = \"<expected content not found>\"\n\n    for row in document.findAll(\"tr\"):\n        cells = row.findAll(\"td\")\n        if len(cells) >= 2 \\\n                and cells[0].has_key(\"class\") \\\n                and cells[0][\"class\"] == \"path\" \\\n                and cells[0].a \\\n                and cells[0].a.string \\\n                and cells[0].a.string.endswith(\"/binary\") \\\n                and cells[1].i \\\n                and cells[1].i.string:\n            actual = cells[1].i.string\n            break\n\n    testing.expect.check(u\"binary\", actual)\n\nwith frontend.signin():\n    frontend.page(\"showcommit\",\n                  params={ \"repository\": \"critic\",\n                           \"sha1\": COMMIT },\n                  expect={ \"description\": check_description })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/004-createreview.py",
    "content": "# Scenario: Alice creates a review of a single commit with review filters that\n# make Bob a reviewer and Dave a watcher, and then pushes a second commit to\n# that review.\n#\n# Checks: Mostly that this doesn't fail completely, and that the expected mails\n# appear to be sent.\n\nimport re\n\n# Random commit on master:\nCOMMIT_SHA1 = \"f771149aba230c4712c9cb9c6af4ccfea2b7967d\"\nCOMMIT_SUMMARY = \"Minor /dashboard query optimizations\"\n\n# The next commit on master:\nFOLLOWUP_SHA1 = \"e0892183f38932cec0d33408bdfebb290a13f8f3\"\n\ndef check_summary_input(document):\n    input = document.find(\"input\", attrs={ \"id\": \"summary\" })\n\n    if not input:\n        testing.expect.check(\"<review summary input>\",\n                             \"<expected content not found>\")\n\n    testing.expect.check(COMMIT_SUMMARY, input[\"value\"])\n\nwith frontend.signin(\"alice\"):\n    # Loading /createreview first is not really necessary, but might as well try\n    # that as well.\n\n    document = frontend.page(\n        \"createreview\",\n        expect={ \"document_title\": testing.expect.document_title(u\"Create Review\") })\n\n    document = frontend.page(\n        \"createreview\",\n        params={ \"repository\": \"critic\",\n                 \"commits\": COMMIT_SHA1 },\n        expect={ \"document_title\": testing.expect.document_title(u\"Create Review\"),\n                 \"summary_input\": check_summary_input })\n\n    scripts = document.findAll(\"script\")\n\n    for script in scripts:\n        if script.has_key(\"src\"):\n            continue\n        match = re.search(\n            r\"^\\s*var review_data\\s*=\\s*\\{\\s*commit_ids:\\s*\\[\\s*(\\d+)\\s*\\]\",\n            script.string, re.MULTILINE)\n        if match:\n            commit_id = int(match.group(1))\n            break\n    else:\n        testing.expect.check(\"<data script>\",\n                             \"<expected content not found>\")\n\n    result = frontend.operation(\n      
  \"submitreview\",\n        data={ \"repository_id\": 1,\n               \"commit_ids\": [commit_id],\n               \"branch\": \"r/004-createreview\",\n               \"summary\": COMMIT_SUMMARY,\n               \"applyfilters\": True,\n               \"applyparentfilters\": True,\n               \"reviewfilters\": [{ \"username\": \"bob\",\n                                   \"type\": \"reviewer\",\n                                   \"path\": \"/\" },\n                                 { \"username\": \"dave\",\n                                   \"type\": \"watcher\",\n                                   \"path\": \"/\" },\n                                 { \"username\": \"erin\",\n                                   \"type\": \"watcher\",\n                                   \"path\": \"/\" }],\n               \"recipientfilters\": { \"mode\": \"opt-out\" }},\n        expect={ \"review_id\": 1 })\n\n    def to(name):\n        return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\n    def check_initial(mail):\n        testing.expect.check(\"New Review: %s\" % COMMIT_SUMMARY,\n                             mail.header(\"Subject\"))\n        line = \"Commit: %s\" % COMMIT_SHA1\n        if line not in to_alice.lines:\n            testing.expect.check(\"<%r line>\" % line,\n                                 \"<expected content not found>\")\n\n    to_alice = mailbox.pop(accept=to(\"alice\"))\n    check_initial(to_alice)\n    testing.expect.check(\"owner\",\n                         to_alice.header(\"OperaCritic-Association\"))\n\n    to_bob = mailbox.pop(accept=to(\"bob\"))\n    check_initial(to_bob)\n    testing.expect.check(\"reviewer\",\n                         to_bob.header(\"OperaCritic-Association\"))\n\n    to_dave = mailbox.pop(accept=to(\"dave\"))\n    check_initial(to_dave)\n    testing.expect.check(\"watcher\",\n                         to_dave.header(\"OperaCritic-Association\"))\n\n    to_erin = mailbox.pop(accept=to(\"erin\"))\n    
check_initial(to_erin)\n    testing.expect.check(\"watcher\",\n                         to_erin.header(\"OperaCritic-Association\"))\n\n    mailbox.check_empty()\n\n    with repository.workcopy() as work:\n        work.run([\"checkout\", \"-q\", \"-b\", \"r/004-createreview\", COMMIT_SHA1])\n        work.run([\"cherry-pick\", FOLLOWUP_SHA1])\n\n        followup_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n        SETTINGS = { \"email.subjectLine.updatedReview.commitsPushed\": \"\" }\n\n        with testing.utils.settings(\"erin\", SETTINGS):\n            work.run(\n                [\"push\", \"-q\", instance.repository_url(\"alice\"),\n                 \"HEAD:refs/heads/r/004-createreview\"])\n\n    def check_followup(mail):\n        testing.expect.check(\"Updated Review: %s\" % COMMIT_SUMMARY,\n                             mail.header(\"Subject\"))\n        line = \"Commit: %s\" % followup_sha1\n        if line not in mail.lines:\n            testing.expect.check(\"<%r line>\" % line,\n                                 \"<expected content not found>\")\n\n    to_alice = mailbox.pop(accept=to(\"alice\"))\n    check_followup(to_alice)\n\n    to_bob = mailbox.pop(accept=to(\"bob\"))\n    check_followup(to_bob)\n\n    to_dave = mailbox.pop(accept=to(\"dave\"))\n    check_followup(to_dave)\n\n    # Note: Erin is a watcher too, but because of the empty subject line\n    # preference set above, she shouldn't receive this email.\n\n    mailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/004-first-review-created/001-addreviewfilters-bogus.py",
    "content": "INVALID_USER_ID = 0\n\nwith frontend.signin(\"alice\"):\n    frontend.operation(\n        \"addreviewfilters\",\n        data={ \"review_id\": 1,\n               \"filters\": [{ \"type\": \"watcher\",\n                             \"user_ids\": [INVALID_USER_ID],\n                             \"paths\": [\"/\"] }] },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invaliduserid\" })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/004-first-review-created/002-review-archival.py",
    "content": "with repository.workcopy() as work, frontend.signin(\"alice\"):\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def assert_branch_state(archived):\n        # Check that the branch is or isn't flagged as archived on the review\n        # front-page.\n        document = frontend.page(\"r/1\")\n        basic = testing.expect.find_paleyellow(document, 0)\n        branch = basic.find(attrs=testing.expect.with_class(\"branch\"))\n        actual = \"archived\" in branch[\"class\"].split()\n        testing.expect.check(archived, actual)\n        branch_name = testing.expect.extract_text(branch)\n\n        # Also check that the branch's ref exists or doesn't exist in the\n        # repository.\n        expected = \"ref is missing\" if archived else \"ref is present\"\n        try:\n            work.run([\"ls-remote\", \"--exit-code\", REMOTE_URL,\n                      \"refs/heads/\" + branch_name])\n            actual = \"ref is present\"\n        except testing.repository.GitCommandError:\n            actual = \"ref is missing\"\n        testing.expect.check(expected, actual)\n\n    # Check that the branch isn't already archived (no reason whatsoever it\n    # should be.)\n    assert_branch_state(archived=False)\n\n    # Check that the operations fail as expected on an open review whose branch\n    # is not already archived.\n    frontend.operation(\n        \"archivebranch\",\n        data={ \"review_id\": 1 },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidstate\" })\n    frontend.operation(\n        \"resurrectbranch\",\n        data={ \"review_id\": 1 },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidstate\" })\n\n    # Drop the review, archive the branch, and check that it became archived.\n    frontend.operation(\n        \"dropreview\",\n        data={ \"review_id\": 1 })\n    frontend.operation(\n        \"archivebranch\",\n        data={ \"review_id\": 1 })\n    
assert_branch_state(archived=True)\n\n    # Check that this operation now fails.\n    frontend.operation(\n        \"archivebranch\",\n        data={ \"review_id\": 1 },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidstate\" })\n\n    # Resurrect the branch and check that it becomes not archived again.\n    frontend.operation(\n        \"resurrectbranch\",\n        data={ \"review_id\": 1 })\n    assert_branch_state(archived=False)\n\n    # Schedule an archival in -1 days (i.e. ASAP), force maintenance to run, and\n    # check that the branch was archived.\n    frontend.operation(\n        \"schedulebrancharchival\",\n        data={ \"review_id\": 1,\n               \"delay\": -1 })\n    assert_branch_state(archived=False)\n    instance.synchronize_service(\"maintenance\", force_maintenance=True)\n    assert_branch_state(archived=True)\n\n    # Resurrect branch again.\n    frontend.operation(\n        \"resurrectbranch\",\n        data={ \"review_id\": 1 })\n    assert_branch_state(archived=False)\n\n    # Schedule an archival in 1 day (i.e. not now), force maintenance to run,\n    # and check that the branch wasn't archived.\n    frontend.operation(\n        \"schedulebrancharchival\",\n        data={ \"review_id\": 1,\n               \"delay\": 1 })\n    instance.synchronize_service(\"maintenance\", force_maintenance=True)\n    assert_branch_state(archived=False)\n\n    # Schedule another archival in -1 days (i.e. 
ASAP), then reopen the review,\n    # force maintenance to run, and check that the branch wasn't archived.\n    frontend.operation(\n        \"schedulebrancharchival\",\n        data={ \"review_id\": 1,\n               \"delay\": -1 })\n    frontend.operation(\n        \"reopenreview\",\n        data={ \"review_id\": 1 })\n    instance.synchronize_service(\"maintenance\", force_maintenance=True)\n    assert_branch_state(archived=False)\n\n    # Drop the review, archive the branch, and check that it became archived,\n    # then reopen the review, and check that the branch was resurrected\n    # automatically.\n    frontend.operation(\n        \"dropreview\",\n        data={ \"review_id\": 1 })\n    frontend.operation(\n        \"archivebranch\",\n        data={ \"review_id\": 1 })\n    assert_branch_state(archived=True)\n    frontend.operation(\n        \"reopenreview\",\n        data={ \"review_id\": 1 })\n    assert_branch_state(archived=False)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/004-first-review-created/__init__.py",
    "content": "# @dependency 001-main/003-self/004-createreview.py\n"
  },
  {
    "path": "testing/tests/001-main/003-self/005-checkbranch.py",
    "content": "# Random commit on master:\nCOMMIT_SHA1 = \"bc661163b11234e85ec7b0efe1195cce473f234a\"\n\ndocument_title = testing.expect.document_title(\"Check branch review status\")\ncontent_title = testing.expect.paleyellow_title(0, \"Check branch review status\")\n\n# First load /checkbranch without parameters; this just returns a form.\nfrontend.page(\n    url=\"checkbranch\",\n    expect={ \"document_title\": document_title,\n             \"content_title\": content_title })\n\n# Create some branches.  The commits on them are not really that relevant, but\n# they should not be on master.  We generate some such commits simply by\n# reverting some commits that are on master.\n#\n# One branch is pushed to Critic's repository, but also to \"origin\" where it\n# has one additional commit.\n#\n# Another branch is not pushed to Critic's repository, only to \"origin\".\nwith repository.workcopy() as work:\n    work.run([\"checkout\", \"-q\", \"-b\", \"005-checkbranch\", COMMIT_SHA1])\n\n    first_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n    second_sha1 = work.run([\"rev-parse\", \"HEAD^\"]).strip()\n    third_sha1 = work.run([\"rev-parse\", \"HEAD^^\"]).strip()\n\n    work.run([\"revert\", \"--no-edit\", first_sha1])\n    work.run([\"push\", \"-q\", instance.repository_url(\"alice\"),\n              \"HEAD:refs/heads/005-checkbranch-1\"])\n\n    work.run([\"revert\", \"--no-edit\", second_sha1])\n    work.run([\"push\", \"-q\", \"origin\", \"HEAD:refs/heads/005-checkbranch-1\"])\n\n    work.run([\"revert\", \"--no-edit\", third_sha1])\n    work.run([\"push\", \"-q\", \"origin\", \"HEAD:refs/heads/005-checkbranch-2\"])\n\ndocument_title = testing.expect.document_title(\"Branch review status: 005-checkbranch-1\")\ncontent_title = testing.expect.paleyellow_title(0, \"Unmerged Commits (1)\")\n\n# Load /checkbranch with fetch=no checking the first branch.\nfrontend.page(\n    url=\"checkbranch\",\n    params={ \"repository\": \"critic\",\n             
\"commit\": \"005-checkbranch-1\",\n             \"upstream\": \"master\" },\n    expect={ \"document_title\": document_title,\n             \"content_title\": content_title })\n\ncontent_title = testing.expect.paleyellow_title(0, \"Unmerged Commits (2)\")\n\n# Load /checkbranch with fetch=yes checking the first branch.\nfrontend.page(\n    url=\"checkbranch\",\n    params={ \"repository\": \"critic\",\n             \"commit\": \"005-checkbranch-1\",\n             \"fetch\": \"yes\",\n             \"upstream\": \"master\" },\n    expect={ \"document_title\": document_title,\n             \"content_title\": content_title })\n\nmessage_title = testing.expect.message_title(\n    \"Unable to interpret '005-checkbranch-2' as a commit reference.\")\n\n# Load /checkbranch with fetch=no checking the second branch.  This essentially\n# fails, since we didn't push this branch to Critic's repository.\nfrontend.page(\n    url=\"checkbranch\",\n    params={ \"repository\": \"critic\",\n             \"commit\": \"005-checkbranch-2\",\n             \"upstream\": \"master\" },\n    expect={ \"message_title\": message_title })\n\ndocument_title = testing.expect.document_title(\"Branch review status: 005-checkbranch-2\")\ncontent_title = testing.expect.paleyellow_title(0, \"Unmerged Commits (3)\")\n\n# Load /checkbranch with fetch=yes checking the second branch.\nfrontend.page(\n    url=\"checkbranch\",\n    params={ \"repository\": \"critic\",\n             \"commit\": \"005-checkbranch-2\",\n             \"fetch\": \"yes\",\n             \"upstream\": \"master\" },\n    expect={ \"document_title\": document_title,\n             \"content_title\": content_title })\n\ncontent_title = testing.expect.paleyellow_title(0, \"Unmerged Commits (1)\")\n\n# Load /checkbranch checking the second branch, using the first branch as the\n# upstream instead of master.\nfrontend.page(\n    url=\"checkbranch\",\n    params={ \"repository\": \"critic\",\n             \"commit\": 
\"005-checkbranch-2\",\n             \"upstream\": \"005-checkbranch-1\" },\n    expect={ \"document_title\": document_title,\n             \"content_title\": content_title })\n\n# Load /checkbranchtext checking the first branch.\ndocument = frontend.page(\n    url=\"checkbranchtext\",\n    expected_content_type=\"text/plain\",\n    params={ \"repository\": \"critic\",\n             \"commit\": \"005-checkbranch-1\",\n             \"upstream\": \"master\" })\n\ntesting.expect.check('Revert \"Fix typo s/orderIndeces/orderIndices/\": REVIEW STATUS UNKNOWN!\\n' +\n    'Revert \"Delete unused commented out code\": REVIEW STATUS UNKNOWN!\\n', str(document))\n"
  },
  {
    "path": "testing/tests/001-main/003-self/006-showreview-reviewfilter.py",
    "content": "# Scenario: Alice creates a review of a single commit with a review filter with\n# empty path.  Loading the review front-page after that should not crash, but\n# did due to a problem introduced by the filter system rewrite.\n\nimport re\n\n# Random commit on master:\nCOMMIT_SHA1 = \"f771149aba230c4712c9cb9c6af4ccfea2b7967d\"\nREVIEW_SUMMARY = \"006-showreview-reviewfilter.py\"\n\nwith frontend.signin(\"alice\"):\n    # Loading /createreview is not really necessary, but might as well try that\n    # as well.\n    document = frontend.page(\n        \"createreview\",\n        params={ \"repository\": \"critic\",\n                 \"commits\": COMMIT_SHA1 })\n\n    scripts = document.findAll(\"script\")\n\n    for script in scripts:\n        if script.has_key(\"src\"):\n            continue\n        match = re.search(\n            r\"^\\s*var review_data\\s*=\\s*\\{\\s*commit_ids:\\s*\\[\\s*(\\d+)\\s*\\]\",\n            script.string, re.MULTILINE)\n        if match:\n            commit_id = int(match.group(1))\n            break\n    else:\n        testing.expect.check(\"<data script>\",\n                             \"<expected content not found>\")\n\n    result = frontend.operation(\n        \"submitreview\",\n        data={ \"repository\": 1,\n               \"commit_ids\": [commit_id],\n               \"branch\": \"r/006-showreview-reviewfilter\",\n               \"summary\": REVIEW_SUMMARY,\n               \"applyfilters\": True,\n               \"applyparentfilters\": True,\n               \"reviewfilters\": [{ \"username\": \"bob\",\n                                   \"type\": \"reviewer\",\n                                   \"path\": \"\" },\n                                 { \"username\": \"dave\",\n                                   \"type\": \"watcher\",\n                                   \"path\": \"\" },\n                                 { \"username\": \"erin\",\n                                   \"type\": \"watcher\",\n             
                      \"path\": \"\" } ],\n               \"recipientfilters\": { \"mode\": \"opt-out\",\n                                     \"excluded\": [\"erin\"] }})\n\n    def to(name):\n        return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\n    mailbox.pop(accept=to(\"alice\"))\n    mailbox.pop(accept=to(\"bob\"))\n    mailbox.pop(accept=to(\"dave\"))\n\n    mailbox.check_empty()\n\n    review_id = result[\"review_id\"]\n    document_title = \"r/%d (No progress) - %s - Opera Critic\" % (review_id, REVIEW_SUMMARY)\n\nwith frontend.signin(\"admin\"):\n    frontend.page(\n        \"r/%d\" % review_id,\n        expect={ \"document_title\": testing.expect.document_title(document_title) })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/007-http-backend.py",
    "content": "import os\nimport subprocess\nimport tempfile\nimport shutil\n\nenviron = os.environ.copy()\n\ndef git(args, cwd=None):\n    argv = [\"git\"]\n    argv.extend(args)\n\n    logger.debug(\"Running: %s\" % \" \".join(argv))\n\n    output = subprocess.check_output(\n        argv, cwd=cwd, env=environ, stderr=subprocess.STDOUT)\n\n    if output.strip():\n        logger.debug(\"Output:\\n%s\" % output.rstrip())\n\nwork_dir = tempfile.mkdtemp()\n\ntry:\n    # Set invalid password so that authentication (if required) fails.\n    environ[\"GIT_ASKPASS\"] = os.path.abspath(\"testing/password-invalid\")\n\n    # This should not require a password.\n    try:\n        git([\"clone\", \"--quiet\", \"--branch\", \"master\",\n             \"%s/critic.git\" % frontend.prefix(\"alice\")],\n            cwd=work_dir)\n    except subprocess.CalledProcessError as error:\n        logger.error(\"'git clone' failed: %s\\n%s\"\n                     % (str(error), error.output.rstrip()))\n\n    # This should require a password.\n    try:\n        git([\"push\", \"--quiet\", \"origin\", \"HEAD:007-http-backend-1\"],\n            cwd=os.path.join(work_dir, \"critic\"))\n        logger.error(\"Unauthenticated push (apparently) accepted!\")\n    except subprocess.CalledProcessError:\n        pass\n\n    # Set valid password so that authentication succeeds.\n    environ[\"GIT_ASKPASS\"] = os.path.abspath(\"testing/password-testing\")\n\n    # This should require a password.\n    try:\n        git([\"push\", \"--quiet\", \"origin\", \"HEAD:007-http-backend-2\"],\n            cwd=os.path.join(work_dir, \"critic\"))\n    except subprocess.CalledProcessError as error:\n        logger.error(\"'git push' failed: %s\\n%s\"\n                     % (str(error), error.output.rstrip()))\nfinally:\n    shutil.rmtree(work_dir)\n\n# Same thing again, only with a repository URL without \".git\" suffix.\n\nwork_dir = tempfile.mkdtemp()\n\ntry:\n    # Set invalid password so that authentication 
(if required) fails.\n    environ[\"GIT_ASKPASS\"] = os.path.abspath(\"testing/password-invalid\")\n\n    # This should not require a password.\n    try:\n        git([\"clone\", \"--quiet\", \"--branch\", \"master\",\n             \"%s/critic\" % frontend.prefix(\"alice\")],\n            cwd=work_dir)\n    except subprocess.CalledProcessError as error:\n        logger.error(\"'git clone' failed: %s\\n%s\"\n                     % (str(error), error.output.rstrip()))\n\n    # This should require a password.\n    try:\n        git([\"push\", \"--quiet\", \"origin\", \"HEAD:007-http-backend-3\"],\n            cwd=os.path.join(work_dir, \"critic\"))\n        logger.error(\"Unauthenticated push (apparently) accepted!\")\n    except subprocess.CalledProcessError:\n        pass\n\n    # Set valid password so that authentication succeeds.\n    environ[\"GIT_ASKPASS\"] = os.path.abspath(\"testing/password-testing\")\n\n    # This should require a password.\n    try:\n        git([\"push\", \"--quiet\", \"origin\", \"HEAD:007-http-backend-4\"],\n            cwd=os.path.join(work_dir, \"critic\"))\n    except subprocess.CalledProcessError as error:\n        logger.error(\"'git push' failed: %s\\n%s\"\n                     % (str(error), error.output.rstrip()))\nfinally:\n    shutil.rmtree(work_dir)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/008-initial-commit-diff.py",
    "content": "import os\nimport re\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nFILENAME = \"008-root-commit-pending.txt\"\nSUMMARY = \"Added %s\" % FILENAME\n\nreview_id = None\ncommits = {}\nfirst_commit = None\nsecond_commit = None\nthird_commit = None\n\nSETTINGS = { \"review.createViaPush\": True }\n\nwith testing.utils.settings(\"alice\", SETTINGS), frontend.signin(\"alice\"):\n    with repository.workcopy(empty=True) as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        def commit(fixup_message=None):\n            if fixup_message:\n                full_message = \"fixup! %s\\n\\n%s\" % (SUMMARY, fixup_message)\n                message = fixup_message\n            else:\n                full_message = message = SUMMARY\n            work.run([\"add\", FILENAME])\n            work.run([\"commit\", \"-m\", full_message],\n                     GIT_AUTHOR_NAME=\"Alice von Testing\",\n                     GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                     GIT_COMMITTER_NAME=\"Alice von Testing\",\n                     GIT_COMMITTER_EMAIL=\"alice@example.org\")\n            sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n            commits[sha1] = message\n            return sha1\n\n        def push():\n            work.run([\"push\", \"-q\", REMOTE_URL,\n                      \"HEAD:refs/heads/r/008-root-commit-pending\"])\n\n        with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n            print >>text_file, \"First line.\"\n\n        first_commit = commit()\n\n        push()\n\n        to_alice = mailbox.pop(accept=to(\"alice\"))\n        testing.expect.check(\"New Review: %s\" % SUMMARY,\n                             to_alice.header(\"Subject\"))\n\n        for line in to_alice.lines:\n            match = re.search(\n                r\"\\bhttp://[^/]+/r/(\\d+)\\b\", line)\n            if 
match:\n                review_id = int(match.group(1))\n                break\n        else:\n            testing.expect.check(\"<review URL in mail>\",\n                                 \"<expected content not found>\")\n\n        with open(os.path.join(work.path, FILENAME), \"a\") as text_file:\n            print >>text_file, \"Second line.\"\n\n        second_commit = commit(\"Added second line\")\n\n        with open(os.path.join(work.path, FILENAME), \"a\") as text_file:\n            print >>text_file, \"Third line.\"\n\n        third_commit = commit(\"Added third line\")\n\n        push()\n\n        to_alice = mailbox.pop(accept=to(\"alice\"))\n        testing.expect.check(\"Updated Review: %s\" % SUMMARY,\n                             to_alice.header(\"Subject\"))\n\n    frontend.operation(\n        \"addreviewfilters\",\n        data={ \"review_id\": review_id,\n               \"filters\": [{ \"type\": \"reviewer\",\n                             \"user_names\": [\"bob\"],\n                             \"paths\": [\"/\"] }] })\n\n    mailbox.pop(accept=(to(\"bob\"), about(r\"New\\(ish\\) Review: %s\" % SUMMARY)))\n    mailbox.pop(accept=(to(\"bob\"), about(\"Updated Review: %s\" % SUMMARY)))\n\ndef check_squashed_history(sha1s):\n    def check(document):\n        table = document.find(\"table\", attrs=testing.expect.with_class(\"log\"))\n\n        if not table:\n            testing.expect.check(\"<table class='log'>\",\n                                 \"<expected content not found>\")\n\n        links = table.findAll(\"a\", attrs=testing.expect.with_class(\"commit\"))\n\n        for link in links:\n            testing.expect.check(\n                \"%s?review=%d\" % (sha1s[-1][:8], review_id), link[\"href\"])\n            del sha1s[-1]\n\n        if sha1s:\n            logger.error(\n                \"Commits missing from 'Squashed history':\\n  %s\"\n                % (\"\\n  \".join(commits[sha1] for sha1 in sha1s)))\n\n    return 
check\n\nfrontend.page(\"r/%d\" % review_id)\nfrontend.page(\"showcommit?sha1=%s&review=%d\" % (first_commit, review_id))\n\nfrontend.page(\n    (\"showcommit?first=%s&last=%s&review=%d\"\n     % (first_commit, second_commit, review_id)),\n    expect={ \"squashed_history\": check_squashed_history([first_commit,\n                                                         second_commit]) })\n\nfrontend.page(\n    (\"showcommit?first=%s&last=%s&review=%d\"\n     % (second_commit, third_commit, review_id)),\n    expect={ \"squashed_history\": check_squashed_history([second_commit,\n                                                         third_commit]) })\n\nfrontend.page(\n    (\"showcommit?first=%s&last=%s&review=%d\"\n     % (first_commit, third_commit, review_id)),\n    expect={ \"squashed_history\": check_squashed_history([first_commit,\n                                                         second_commit,\n                                                         third_commit]) })\n\ndef check_path(document):\n    table = document.find(\"table\", attrs=testing.expect.with_class(\"filter\"))\n\n    if not table:\n        testing.expect.check(\"<table class='filter'>\",\n                             \"<expected content not found>\")\n\n    for cell in table.findAll(\"td\", attrs=testing.expect.with_class(\"path\")):\n        if cell.string and cell.string == FILENAME:\n            break\n    else:\n        testing.expect.check(\"<td class='path'>%s</td>\" % FILENAME,\n                             \"<expected content not found>\")\n\nfrontend.page(\n    \"filterchanges?review=%d\" % review_id,\n    expect={ \"path\": check_path })\n\nfrontend.page(\n    (\"filterchanges?first=%s&last=%s&review=%d\"\n     % (first_commit, second_commit, review_id)),\n    expect={ \"path\": check_path })\n\nfrontend.page(\n    (\"filterchanges?first=%s&last=%s&review=%d\"\n     % (second_commit, third_commit, review_id)),\n    expect={ \"path\": check_path })\n\nfrontend.page(\n    
(\"filterchanges?first=%s&last=%s&review=%d\"\n     % (first_commit, third_commit, review_id)),\n    expect={ \"path\": check_path })\n\nwith frontend.signin(\"bob\"):\n    frontend.page(\n        \"showcommit?review=%d&filter=pending\" % review_id,\n        expect={ \"squashed_history\": check_squashed_history([first_commit,\n                                                             second_commit,\n                                                             third_commit]) })\n\n    frontend.page(\n        \"showcommit?review=%d&filter=reviewable\" % review_id,\n        expect={ \"squashed_history\": check_squashed_history([first_commit,\n                                                             second_commit,\n                                                             third_commit]) })\n\n    frontend.page(\n        \"showcommit?review=%d&filter=relevant\" % review_id,\n        expect={ \"squashed_history\": check_squashed_history([first_commit,\n                                                             second_commit,\n                                                             third_commit]) })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/009-fetchremotebranch.py",
    "content": "import os\n\nTESTNAME = \"009-fetchremotebranch\"\nFILENAME = \"%s.txt\" % TESTNAME\n\nwith repository.workcopy() as work:\n    upstream_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"branch\", \"%s/upstream\" % TESTNAME])\n    work.run([\"checkout\", \"-b\", \"%s/branch\" % TESTNAME])\n\n    with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n        print >>text_file, \"This is a text file.\"\n\n    work.run([\"add\", FILENAME])\n    work.run([\"commit\", \"-m\", \"Add %s\" % FILENAME],\n             GIT_AUTHOR_NAME=\"Alice von Testing\",\n             GIT_AUTHOR_EMAIL=\"alice@example.org\",\n             GIT_COMMITTER_NAME=\"Alice von Testing\",\n             GIT_COMMITTER_EMAIL=\"alice@example.org\")\n\n    head_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"push\", \"origin\", \"%s/upstream\" % TESTNAME])\n    work.run([\"push\", \"origin\", \"%s/branch\" % TESTNAME])\n\nwith frontend.signin(\"alice\"):\n    result = frontend.operation(\n        \"fetchremotebranch\",\n        data={ \"repository_name\": \"critic\",\n               \"remote\": repository.url,\n               \"branch\": \"refs/heads/%s/branch\" % TESTNAME,\n               \"upstream\": \"refs/heads/%s/upstream\" % TESTNAME },\n        expect={ \"head_sha1\": head_sha1,\n                 \"upstream_sha1\": upstream_sha1 })\n\n    commit_ids = result[\"commit_ids\"]\n\n    def check_commit_ids(value):\n        if set(value) != set(commit_ids):\n            return repr(sorted(value)), repr(sorted(commit_ids))\n\n    frontend.operation(\n        \"fetchremotebranch\",\n        data={ \"repository_name\": \"critic\",\n               \"remote\": repository.url,\n               \"branch\": \"%s/branch\" % TESTNAME,\n               \"upstream\": \"refs/heads/%s/upstream\" % TESTNAME },\n        expect={ \"head_sha1\": head_sha1,\n                 \"upstream_sha1\": upstream_sha1,\n                 \"commit_ids\": 
check_commit_ids })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/010-linkification.py",
    "content": "import os\n\nTESTNAME = \"010-linkification\"\nFILENAME = \"%s.txt\" % TESTNAME\n\nLONG_SHA1_1 = \"ca89553db7a2ba22fef70535a65beedf33c97216\"\nSHORT_SHA1_1 = LONG_SHA1_1[:8]\n\nLONG_SHA1_2 = \"132dbfb7c2ac0f4333fb483a70f1e8cce0333d11\"\nSHORT_SHA1_2 = LONG_SHA1_2[:8]\n\nMESSAGE = \"\"\"\\\nAdd %(FILENAME)s\n\nThe rest of this commit message contains various things that should be\nturned into links by the automatic linkification.\n\nA plain HTTP URL: http://critic-review.org/tutorials.\n\nA \"wrapped\" URL: <URL:mailto:jl@critic-review.org>.\n\nA full SHA-1: %(LONG_SHA1_1)s.\n\nA shortened SHA-1: %(SHORT_SHA1_1)s.\n\nA diff (full SHA-1s): %(LONG_SHA1_2)s..%(LONG_SHA1_1)s.\n\nA diff (shortened SHA-1s): %(SHORT_SHA1_2)s..%(SHORT_SHA1_1)s, should work too.\n\nA review link: r/123 (it doesn't matter if the review exists or not.)\n\nNo review link: harrharr/1337\n\nNo SHA-1: g%(SHORT_SHA1_1)s\n\nAlso no SHA-1: %(SHORT_SHA1_1)sg\n\"\"\" % { \"FILENAME\": FILENAME,\n        \"LONG_SHA1_1\": LONG_SHA1_1,\n        \"SHORT_SHA1_1\": SHORT_SHA1_1,\n        \"LONG_SHA1_2\": LONG_SHA1_2,\n        \"SHORT_SHA1_2\": SHORT_SHA1_2 }\n\nwith repository.workcopy() as work:\n    work.run([\"checkout\", \"-b\", TESTNAME])\n\n    with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n        print >>text_file, \"This line is not significant.\"\n\n    work.run([\"add\", FILENAME])\n    work.run([\"commit\", \"-m\", MESSAGE])\n    work.run([\"push\", instance.repository_url(\"alice\"), \"HEAD\"])\n\nLINKS = { \"A plain HTTP URL\": (\"http://critic-review.org/tutorials\",\n                               \"http://critic-review.org/tutorials\" ),\n          'A \"wrapped\" URL': (\"mailto:jl@critic-review.org\",\n                              \"&lt;URL:mailto:jl@critic-review.org&gt;\"),\n          \"A full SHA-1\": (\"/critic/%s\" % LONG_SHA1_1, LONG_SHA1_1),\n          \"A shortened SHA-1\": (\"/critic/%s\" % LONG_SHA1_1, SHORT_SHA1_1),\n          \"A diff (full 
SHA-1s)\": (\"/critic/%s..%s\" % (LONG_SHA1_2, LONG_SHA1_1),\n                                   \"%s..%s\" % (LONG_SHA1_2, LONG_SHA1_1)),\n          \"A diff (shortened SHA-1s)\": (\"/critic/%s..%s\" % (LONG_SHA1_2, LONG_SHA1_1),\n                                        \"%s..%s\" % (SHORT_SHA1_2, SHORT_SHA1_1)),\n          \"A review link\": (\"/r/123\", \"r/123\") }\n\ndef check_link(label, expected_href, expected_string):\n    def check(document):\n        line_attrs = testing.expect.with_class(\"line\", \"commit-msg\")\n        for line in document.findAll(\"td\", attrs=line_attrs):\n            if not isinstance(line.contents[0], basestring):\n                continue\n            if not line.contents[0].startswith(label + \": \"):\n                continue\n            if len(line.contents) < 2:\n                continue\n            link = line.contents[1]\n            try:\n                if link.name != \"a\":\n                    continue\n            except AttributeError:\n                continue\n            break\n        else:\n            testing.expect.check(\"line: '%s: <a ...>...</a>'\" % label,\n                                 \"<expected content not found>\")\n\n        testing.expect.check(expected_href, link[\"href\"])\n        testing.expect.check(expected_string, link.string)\n\n    return check\n\ndef check_nonlink(text):\n    def check(document):\n        line_attrs = testing.expect.with_class(\"line\", \"commit-msg\")\n        for line in document.findAll(\"td\", attrs=line_attrs):\n            if line.string == text:\n                break\n        else:\n            testing.expect.check(\"line: %r\" % text,\n                                 \"<expected content not found>\")\n\n    return check\n\nexpect = dict((label, check_link(label, href, string))\n              for label, (href, string) in LINKS.items())\n\nexpect[\"No review link\"] = check_nonlink(\"No review link: harrharr/1337\")\nexpect[\"No SHA-1\"] = check_nonlink(\"No 
SHA-1: g%s\" % SHORT_SHA1_1)\nexpect[\"Also no SHA-1\"] = check_nonlink(\"Also no SHA-1: %sg\" % SHORT_SHA1_1)\n\nfrontend.page(\n    \"critic/%s\" % TESTNAME,\n    expect=expect)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/011-linkification-custom.py",
    "content": "# Need a VM (full installation) to do customizations.\n# @flag full\n\nimport os\n\nTESTNAME = \"011-linkification-custom\"\nFILENAME = \"%s.txt\" % TESTNAME\n\nMESSAGE = \"\"\"\\\nAdd %(FILENAME)s\n\nThe rest of this commit message contains some \"issue links\".\n\nAt end of line: #1001\nFollowed by text: #1002 is fixed!\nFollowed by a period: #1003.\nFollowed by a comma: #1004, huh?\nWithin parentheses: (#1005)\n\nThat's all, folks!\n\"\"\" % { \"FILENAME\": FILENAME }\n\nwith repository.workcopy() as work:\n    work.run([\"checkout\", \"-b\", TESTNAME])\n\n    with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n        print >>text_file, \"This line is not significant.\"\n\n    work.run([\"add\", FILENAME])\n    work.run([\"commit\", \"-m\", MESSAGE])\n    work.run([\"push\", instance.repository_url(\"alice\"), \"HEAD\"])\n\ninstance.execute(\n    [\"sudo\", \"mkdir\", \"-p\", \"/etc/critic/main/customization\",\n     \"&&\",\n     \"sudo\", \"touch\", \"/etc/critic/main/customization/__init__.py\",\n     \"&&\",\n     \"sudo\", \"cp\", \"critic/testing/input/customization/linktypes.py\",\n     \"/etc/critic/main/customization\",\n     \"&&\",\n     \"sudo\", \"chown\", \"-R\", \"critic.critic\", \"/etc/critic/main/customization\"])\n\ninstance.restart()\n\ndef issue(number):\n    return (\"https://issuetracker.example.com/showIssue?id=%d\" % number,\n            \"#%d\" % number)\n\nLINKS = { \"At end of line\": issue(1001),\n          \"Followed by text\": issue(1002),\n          \"Followed by a period\": issue(1003),\n          \"Followed by a comma\": issue(1004),\n          \"Within parentheses\": issue(1005) }\n\ndef check_link(label, expected_href, expected_string):\n    def check(document):\n        line_attrs = testing.expect.with_class(\"line\", \"commit-msg\")\n        for line in document.findAll(\"td\", attrs=line_attrs):\n            if not isinstance(line.contents[0], basestring):\n                continue\n
            if not line.contents[0].startswith(label + \": \"):\n                continue\n            if len(line.contents) < 2:\n                continue\n            link = line.contents[1]\n            try:\n                if link.name != \"a\":\n                    continue\n            except AttributeError:\n                continue\n            break\n        else:\n            testing.expect.check(\"line: '%s: <a ...>...</a>'\" % label,\n                                 \"<expected content not found>\")\n\n        testing.expect.check(expected_href, link[\"href\"])\n        testing.expect.check(expected_string, link.string)\n\n    return check\n\nfrontend.page(\n    \"critic/%s\" % TESTNAME,\n    expect=dict((label, check_link(label, href, string))\n                for label, (href, string) in LINKS.items()))\n"
  },
  {
    "path": "testing/tests/001-main/003-self/012-createreview-recipients.py",
    "content": "# Scenario: Alice creates an opt-in review and includes \"bob\" as a recipient.\n\nimport re\n\n# Random commit on master:\nCOMMIT_SHA1 = \"f771149aba230c4712c9cb9c6af4ccfea2b7967d\"\nCOMMIT_SUMMARY = \"Minor /dashboard query optimizations\"\n\nwith frontend.signin(\"alice\"):\n    # Load /createreview to get commit_id.\n    document = frontend.page(\n        \"createreview\",\n        params={ \"repository\": \"critic\", \"commits\": COMMIT_SHA1 })\n\n    scripts = document.findAll(\"script\")\n\n    for script in scripts:\n        if script.has_key(\"src\"):\n            continue\n        match = re.search(\n            r\"^\\s*var review_data\\s*=\\s*\\{\\s*commit_ids:\\s*\\[\\s*(\\d+)\\s*\\]\",\n            script.string, re.MULTILINE)\n        if match:\n            commit_id = int(match.group(1))\n            break\n    else:\n        testing.expect.check(\"<data script>\",\n                             \"<expected content not found>\")\n\n    result = frontend.operation(\n        \"submitreview\",\n        data={ \"repository_name\": \"critic\",\n               \"commit_ids\": [commit_id],\n               \"branch\": \"r/012-createreview-recipients\",\n               \"summary\": COMMIT_SUMMARY,\n               \"applyfilters\": True,\n               \"applyparentfilters\": True,\n               \"reviewfilters\": [{ \"username\": \"bob\",\n                                   \"type\": \"reviewer\",\n                                   \"path\": \"/\" },\n                                 { \"username\": \"dave\",\n                                   \"type\": \"watcher\",\n                                   \"path\": \"/\" }],\n               \"recipientfilters\": { \"mode\": \"opt-in\",\n                                     \"included\": [\"bob\"] }})\n\n    def to(name):\n        return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\n    mailbox.pop(accept=to(\"alice\"))\n    mailbox.pop(accept=to(\"bob\"))\n    mailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/012-replayrebase.py",
    "content": "import os\nimport re\nimport shutil\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\n# The commit we'll be \"reviewing.\"\nCOMMIT_SHA1 = \"aca57d0899e5193232dbbea726d94a838a4274ed\"\n# The original parent of that commit.\nPARENT_SHA1 = \"ca89553db7a2ba22fef70535a65beedf33c97216\"\n# An ancestor of the original parent, onto which we'll be rebasing the reviewed\n# commit.\nTARGET_SHA1 = \"132dbfb7c2ac0f4333fb483a70f1e8cce0333d11\"\n\n# The subject of the reviewed commit.\nSUMMARY = \"Use temporary clones for relaying instead of temporary remotes\"\n\nSETTINGS = { \"review.createViaPush\": True,\n             \"email.subjectLine.updatedReview.reviewRebased\":\n                 \"Rebased Review: %(summary)s\" }\n\nwith testing.utils.settings(\"alice\", SETTINGS), frontend.signin(\"alice\"):\n    with repository.workcopy() as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        work.run([\"checkout\", \"-b\", \"r/012-replayrebase\", PARENT_SHA1])\n        work.run([\"cherry-pick\", COMMIT_SHA1],\n                 GIT_COMMITTER_NAME=\"Alice von Testing\",\n                 GIT_COMMITTER_EMAIL=\"alice@example.org\")\n\n        output = work.run([\"push\", REMOTE_URL, \"HEAD\"])\n        next_is_review_url = False\n\n        for line in output.splitlines():\n            if not line.startswith(\"remote:\"):\n                continue\n            line = line[len(\"remote:\"):].split(\"\\x1b\", 1)[0].strip()\n            if line == \"Submitted review:\":\n                next_is_review_url = True\n            elif next_is_review_url:\n                review_id = int(re.search(r\"/r/(\\d+)$\", line).group(1))\n                break\n        else:\n            testing.expect.check(\"<review URL in git hook output>\",\n                                 \"<expected content not found>\")\n\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"New Review: %s\" % SUMMARY)])\n\n        frontend.operation(\n            \"preparerebase\",\n            data={ \"review_id\": review_id,\n                   \"new_upstream\": TARGET_SHA1 })\n\n        work.run([\"rebase\", \"--onto\", TARGET_SHA1, PARENT_SHA1])\n\n        # Create some new files as part of the rebase.  This serves two\n        # purposes:\n        #\n        # 1) Generate some \"changes introduced by rebase\" for the rebase replay\n        #    mechanism to deal with, and\n        #\n        # 2) make sure the push creates a new pack file (a certain amount of new\n        #    objects are required to cause this) so that \"git receive-pack\"\n        #    creates a pack-*.keep file, which creates trouble for \"git clone\".\n        source_path = os.path.join(work.path, \"testing\")\n        for index in range(10):\n            destination_path = os.path.join(work.path, \"testing%d\" % index)\n            shutil.copytree(source_path, destination_path)\n            for path, _, filenames in os.walk(destination_path):\n                for filename in filenames:\n                    with open(os.path.join(path, filename), \"a\") as copied:\n                        copied.write(\"%d\\n\" % index)\n            work.run([\"add\", \"testing%d\" % index])\n        work.run([\"commit\", \"--amend\", \"--reuse-message=HEAD\"])\n\n        work.run([\"push\", \"--force\", REMOTE_URL, \"HEAD\"])\n\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"Updated Review: %s\" % SUMMARY)])\n\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"Rebased Review: %s\" % SUMMARY)])\n"
  },
  {
    "path": "testing/tests/001-main/003-self/014-non-ascii-filenames.py",
    "content": "# coding=utf-8\n\n# Scenario: Alice creates a review for a commit where a file that contains\n#           non-ascii chars has been added. Critic should not crash.\n\nimport os\n\nTC_NAME_PREFIX = \"014-non-ascii-filename\"\nTC_NAME_UTF8 = (u\"%s-åäö\\x01\\ufffd\" % TC_NAME_PREFIX).encode(\"utf-8\")\nTC_NAME_ESCAPED = u\"%s-åäö\\\\x01\\\\ufffd\" % TC_NAME_PREFIX\n\ndef check_filename(class_name):\n    def check(document):\n        cells = document.findAll(\"td\", attrs=testing.expect.with_class(class_name))\n\n        for cell in cells:\n            anchor = cell.find(\"a\")\n            if not anchor:\n                continue\n            if anchor.string.startswith(TC_NAME_PREFIX):\n                testing.expect.check(TC_NAME_ESCAPED, anchor.string)\n                break\n        else:\n            testing.expect.check(\"<td class=%s><a>%s\" % (class_name, TC_NAME_ESCAPED),\n                                 \"<expected content not found>\")\n\n    return check\n\nwith frontend.signin(\"alice\"):\n    with repository.workcopy(empty=True) as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        def commit():\n            work.run([\"add\", TC_NAME_UTF8])\n            work.run([\"commit\", \"-m\", TC_NAME_UTF8],\n                     GIT_AUTHOR_NAME=\"Alice von Testing\",\n                     GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                     GIT_COMMITTER_NAME=\"Alice von Testing\",\n                     GIT_COMMITTER_EMAIL=\"alice@example.org\")\n            return work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n        def push():\n            work.run([\"push\", \"-q\", REMOTE_URL,\n                      \"HEAD:refs/heads/\" + TC_NAME_PREFIX])\n\n        with open(os.path.join(work.path, TC_NAME_UTF8), \"w\") as text_file:\n            print >>text_file, \"Content of file \" + TC_NAME_UTF8\n\n        sha1 = commit()\n        push()\n\n    frontend.page(\n        \"showcommit\",\n        params={ \"repository\": \"critic\",\n                 \"sha1\": sha1 },\n        expect={ \"filename\": check_filename(\"path\") })\n\n    frontend.page(\n        \"showtree\",\n        params={ \"repository\": \"critic\",\n                 \"sha1\": sha1,\n                 \"path\": \"/\" },\n        expect={ \"filename\": check_filename(\"name\") })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/015-non-ascii-line-diff.py",
    "content": "import os\nimport json\n\nimport BeautifulSoup\n\nTESTNAME = \"015-non-ascii-line-diff\"\n\nwith repository.workcopy() as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def commit(encoding, content, index):\n        filename = \"%s.%s.txt\" % (TESTNAME, encoding)\n\n        with open(os.path.join(work.path, filename), \"w\") as text_file:\n            text_file.write(content.encode(encoding))\n\n        work.run([\"add\", filename])\n        work.run([\"commit\", \"-m\", \"%s (%s #%d)\" % (TESTNAME, encoding, index)],\n                 GIT_AUTHOR_NAME=\"Alice von Testing\",\n                 GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                 GIT_COMMITTER_NAME=\"Alice von Testing\",\n                 GIT_COMMITTER_EMAIL=\"alice@example.org\")\n\n        return work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"checkout\", \"-b\", TESTNAME])\n\n    utf8_from_sha1 = commit(\"utf-8\", u\"Non-ascii: \\xf6\\n\", 1)\n    utf8_to_sha1 = commit(\"utf-8\", u\"Non-ascii: \\xf7\\n\", 2)\n\n    latin1_from_sha1 = commit(\"latin-1\", u\"Non-ascii: \\xf6\\n\", 1)\n    latin1_to_sha1 = commit(\"latin-1\", u\"Non-ascii: \\xf7\\n\", 2)\n\n    work.run([\"push\", REMOTE_URL, \"HEAD\"])\n\ndef check_line_diff(document):\n    tbody_lines = document.findAll(\"tbody\", attrs={ \"class\": \"lines\" })\n\n    testing.expect.check(1, len(tbody_lines))\n    testing.expect.check(1, len(tbody_lines[0].contents))\n\n    comment = tbody_lines[0].contents[0]\n\n    testing.expect.check(BeautifulSoup.Comment, comment.__class__)\n\n    try:\n        data = json.loads(comment)\n    except ValueError:\n        testing.expect.check(\"<valid JSON>\", repr(str(comment)))\n\n    testing.expect.check(5, len(data))\n\n    file_id, sides, old_offset, new_offset, lines = data\n\n    testing.expect.check(2, sides)\n    testing.expect.check(1, old_offset)\n    testing.expect.check(1, new_offset)\n    testing.expect.check(1, len(lines))\n    testing.expect.check(3, len(lines[0]))\n\n    line_type, old_line, new_line = lines[0]\n\n    # See diff/__init__.py, class Line\n    MODIFIED = 3\n\n    testing.expect.check(MODIFIED, line_type)\n    testing.expect.check(u\"Non-ascii: <ir>\\xf6</i>\", old_line)\n    testing.expect.check(u\"Non-ascii: <ir>\\xf7</i>\", new_line)\n\nfrontend.page(\n    \"showcommit\",\n    params={ \"repository\": \"critic\",\n             \"from\": utf8_from_sha1,\n             \"to\": utf8_to_sha1 },\n    expect={ \"utf8_line_diff\": check_line_diff })\n\nfrontend.page(\n    \"showcommit\",\n    params={ \"repository\": \"critic\",\n             \"from\": latin1_from_sha1,\n             \"to\": latin1_to_sha1 },\n    expect={ \"latin1_line_diff\": check_line_diff })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/016-showcommit-ranges.py",
    "content": "# Scenario: Alice opens the BRANCHES page, switches to the master branch in the\n# Critic-inside-Critic repository and selects a range of two adjacent non-merge\n# commits and verifies that there is no error. She then selects a range that\n# starts with a merge commit and makes sure that the appropriate error message\n# is shown.\n\nwith frontend.signin(\"alice\"):\n    # Two adjacent non-merge commits.\n    document = frontend.page(\n        \"showcommit\",\n        params={ \"first\": \"016f2149c334ff7dabac98700e74a7e9500e702e\",\n                 \"last\": \"007b4b53a2a8e9561f5143eff27300ea693ca621\" },\n        expect={ \"document_title\": testing.expect.document_title(u\"fa686f55..007b4b53\"),\n                 \"content_title\": testing.expect.paleyellow_title(0, u\"Squashed History\") })\n\n    # 57a886e is a merge commit.\n    document = frontend.page(\n        \"showcommit\",\n        params={ \"first\": \"57a886e6352b229991c81e7ba43244ace7e02d76\",\n                 \"last\": \"b2b78ca013b49c73231bee11674bcdb3edf6d3f2\" },\n        expect={ \"message\": testing.expect.message_title(u\"Invalid parameters; 'first' can not be a merge commit.\") })\n\n    mailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/017-showcommit-merge-replay.py",
    "content": "# Scenario: Alice opens the BRANCHES page, switches to the master branch in the\n# Critic-inside-Critic repository and clicks on the merge commit 8ebec44a.\n# Finally, even though the merge is empty she clicks the link \"display conflict\n# resolution changes\" near the top of the page. After that she also views\n# 030afecd which had some actual conflicts.\n\ndocument_title = testing.expect.document_title(u\"Merge pull request #30 from rchl/exception-fixes (8ebec44a)\")\nwith frontend.signin(\"alice\"):\n    document = frontend.page(\n        \"showcommit\",\n        params={ \"sha1\": \"8ebec44af03197c9679f08afc2b19606c839db99\",\n                 \"conflicts\": \"yes\" },\n        expect={ \"document_title\": document_title })\n\ndocument_title = testing.expect.document_title(u\"Merge remote-tracking branch 'github/master' into r/molsson/showcommit_sends_no_data (030afecd)\")\nfrontend.page(\n    url=\"showcommit\",\n    params={ \"sha1\": \"030afecdfb40235af03faa52a2a193c7d8199b66\",\n             \"conflicts\": \"yes\" },\n    expect={ \"document_title\": document_title })\n\nmailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/018-detect-moves-no-moved-code.py",
    "content": "# Scenario: Bob is viewing a commit that doesn't contain any chunks that Critic\n# detects as \"moved code\". Bob is not sure though, so he hits 'm', selects the\n# appropriate filenames and clicks SEARCH. Critic should not crash.\n\nCOMMIT_WITH_NO_MOVES = 'cc1c1a25'\n\nwith frontend.signin(\"bob\"):\n    document = frontend.page(\n        \"critic/%s\" % COMMIT_WITH_NO_MOVES,\n        params={ \"moves\": \"yes\" },\n        expect={ \"message\": testing.expect.message_title(u\"No moved code found!\") })\n\n    mailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/019-showtree-showfile-bogus.py",
    "content": "# Scenario: /showtree or /showfile loaded, with or without a repository\n# specifier, with a SHA-1 that is or is not present in that/any repository, or a\n# path that is or is not valid.\n\nVALID_SHA1 = \"378a00935735431d5408dc8acbca77e6887f91c6\"\nINVALID_SHA1 = \"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\"\n\nVALID_TREE_PATH = \"src/page\"\nINVALID_TREE_PATH = \"src/horse\"\n\nVALID_FILE_PATH = \"src/page/showtree.py\"\nINVALID_FILE_PATH = \"src/page/showpuppy.py\"\n\nmissing_sha1 = testing.expect.message(\n    expected_title=\"SHA-1 not found\",\n    expected_body=\"Couldn't find commit %s in any repository.\" % INVALID_SHA1)\n\nmissing_tree = testing.expect.message(\n    expected_title=\"Directory does not exist\",\n    expected_body=(\"There is no directory named /%s in the commit %s.\"\n                   % (INVALID_TREE_PATH, VALID_SHA1[:8])))\n\nmissing_file = testing.expect.message(\n    expected_title=\"File does not exist\",\n    expected_body=(\"There is no file named /%s in the commit %s.\"\n                   % (INVALID_FILE_PATH, VALID_SHA1[:8])))\n\ninvalid_file = testing.expect.message(\n    expected_title=\"Invalid path parameter\",\n    expected_body=\"The path must be non-empty and must not end with a /.\")\n\nfrontend.page(\n    \"showtree\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": VALID_TREE_PATH },\n    expect={ \"message\": testing.expect.no_message() })\n\nfrontend.page(\n    \"showtree\",\n    params={ \"sha1\": INVALID_SHA1,\n             \"path\": VALID_TREE_PATH },\n    expect={ \"message\": missing_sha1 })\n\nfrontend.page(\n    \"showtree\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": INVALID_TREE_PATH },\n    expect={ \"message\": missing_tree })\n\nfrontend.page(\n    \"showfile\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": VALID_FILE_PATH },\n    expect={ \"message\": testing.expect.no_message() })\n\nfrontend.page(\n    \"showfile\",\n    params={ \"sha1\": INVALID_SHA1,\n             \"path\": VALID_FILE_PATH },\n    expect={ \"message\": missing_sha1 })\n\nfrontend.page(\n    \"showfile\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": INVALID_FILE_PATH },\n    expect={ \"message\": missing_file })\n\nfrontend.page(\n    \"showfile\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": \"\" },\n    expect={ \"message\": invalid_file })\n\nfrontend.page(\n    \"showfile\",\n    params={ \"sha1\": VALID_SHA1,\n             \"path\": VALID_FILE_PATH + \"/\" },\n    expect={ \"message\": invalid_file })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/020-fixup-review-via-push.py",
    "content": "import os\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nFILENAME = \"020-fixup-review-via-push.txt\"\nSETTINGS = { \"review.createViaPush\": True }\n\nwith testing.utils.settings(\"alice\", SETTINGS), frontend.signin(\"alice\"):\n    with repository.workcopy() as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n            print >>text_file, \"Some content.\"\n\n        work.run([\"add\", FILENAME])\n        work.run([\"commit\", \"-m\", \"\"\"\\\nfixup! Commit reference\n\nRelevant summary\n\"\"\"],\n                 GIT_AUTHOR_NAME=\"Alice von Testing\",\n                 GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                 GIT_COMMITTER_NAME=\"Alice von Testing\",\n                 GIT_COMMITTER_EMAIL=\"alice@example.org\")\n        work.run([\"push\", \"-q\", REMOTE_URL,\n                  \"HEAD:refs/heads/r/020-fixup-review-via-push\"])\n\n        mailbox.pop(accept=[to(\"alice\"), about(\"New Review: Relevant summary\")])\n"
  },
  {
    "path": "testing/tests/001-main/003-self/020-reviewrebase.py",
    "content": "import os\n\nFILENAME = \"020-reviewrebase.txt\"\nFILENAME_BASE = \"020-reviewrebase.base.txt\"\n\nNONSENSE = \"\"\"\\\nLorem ipsum dolor sit amet, consectetur adipiscing\nelit. Donec ut enim sit amet purus ultricies\nlobortis. Pellentesque nisi arcu, convallis sed purus sed,\nsemper ultrices velit. Ut egestas lorem tortor, vitae\nlacinia lorem consectetur nec. Integer tempor ornare ipsum\nat viverra. Curabitur nec orci mollis, lacinia sapien eget,\nultricies ipsum. Curabitur a libero tortor. Curabitur\nvolutpat lacinia erat, ac suscipit enim dignissim nec.\"\"\"\n\ndef lines(*args):\n    return \"\\n\".join((line.upper() if index in args else line)\n                     for index, line in enumerate(NONSENSE.splitlines()))\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nSETTINGS = { \"review.createViaPush\": True,\n             \"email.subjectLine.updatedReview.reviewRebased\":\n                 \"Rebased Review: %(summary)s\" }\n\nwork = repository.workcopy()\nsettings = testing.utils.settings(\"alice\", SETTINGS)\nsignin = frontend.signin(\"alice\")\nreviews = []\n\nwith work, settings, signin:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def write(*args, **kwargs):\n        \"\"\"Write the file<tm>, optionally with lines upper-cased.\"\"\"\n        filename = kwargs.get(\"filename\", FILENAME)\n        with open(os.path.join(work.path, filename), \"w\") as target:\n            print >>target, lines(*args)\n\n    def commit(message_or_ref=\"HEAD\", generate=None, *args, **kwargs):\n        \"\"\"If called with two or more arguments, create a commit and return,\n           otherwise return commit referenced by first argument.\"\"\"\n        if generate is not None:\n            generate(*args, **kwargs)\n            work.run([\"add\", kwargs.get(\"filename\", FILENAME)])\n            work.run([\"commit\", \"-m\", message_or_ref])\n            message_or_ref = \"HEAD\"\n\n        oneline = work.run([\"log\", \"--no-abbrev\", \"--pretty=oneline\", \"-1\",\n                            message_or_ref])\n        sha1, summary = oneline.strip().split(\" \", 1)\n\n        return { \"sha1\": sha1, \"summary\": summary }\n\n    def expectmail(title):\n        review = reviews[-1]\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"%s: %s\" % (title, review[\"summary\"]))])\n\n    def expecthead(expected):\n        \"\"\"Check that the review branch in Critic's repository is where we want\n           it to be.\"\"\"\n        review = reviews[-1]\n        actual = work.run([\"ls-remote\", REMOTE_URL, review[\"branch\"]]).split()[0]\n        testing.expect.check(expected[\"sha1\"], actual)\n\n    def createreview(commits):\n        \"\"\"Create a review of the specified commits.\"\"\"\n        index = len(reviews) + 1\n        branch = \"020-reviewrebase/%d\" % index\n        summary = \"020-reviewrebase, test %d\" % index\n        work.run([\"push\", REMOTE_URL, \"%s:refs/heads/%s\" % (commits[-1][\"sha1\"],\n                                                            branch)])\n        result = frontend.operation(\n            \"submitreview\",\n            data={ \"repository\": \"critic\",\n                   \"commit_sha1s\": [commit[\"sha1\"] for commit in commits],\n                   \"branch\": \"r/\" + branch,\n                   \"frombranch\": branch,\n                   \"summary\": summary })\n        reviews.append({ \"id\": result[\"review_id\"],\n                         \"branch\": \"r/\" + branch,\n                         \"summary\": summary })\n        expectmail(\"New Review\")\n\n    def push(new_head, force=False):\n        \"\"\"Push specified commit to the review branch in Critic's repository,\n           optionally forced.\"\"\"\n        review = reviews[-1]\n        args = [\"push\"]\n        if force:\n            args.append(\"-f\")\n        args.extend([REMOTE_URL, \"%s:refs/heads/%s\" % (new_head[\"sha1\"],\n                                                       review[\"branch\"])])\n        work.run(args)\n        expecthead(new_head)\n\n    def moverebase(new_upstream, new_head):\n        \"\"\"Perform a move rebase.\"\"\"\n        review = reviews[-1]\n        work.run([\"reset\", \"--hard\", new_head[\"sha1\"]])\n        frontend.operation(\n            \"preparerebase\",\n            data={ \"review_id\": review[\"id\"],\n                   \"new_upstream\": new_upstream[\"sha1\"] })\n        push(new_head, force=True)\n        expectmail(\"Rebased Review\")\n\n    def historyrewrite(new_head):\n        \"\"\"Perform a history rewrite rebase.\"\"\"\n        review = reviews[-1]\n        work.run([\"reset\", \"--hard\", new_head[\"sha1\"]])\n        frontend.operation(\n            \"preparerebase\",\n            data={ \"review_id\": review[\"id\"] })\n        push(new_head, force=True)\n        expectmail(\"Rebased Review\")\n\n    def expectlog(expected):\n        \"\"\"Fetch the review front-page and check that the commit log contains\n           the expected lines.\n\n           Also fetch a /showcommit page whose 'Squashed History' log lists\n           everything in the review and check that it contains the same lines\n           too.\"\"\"\n\n        expected = [(item if isinstance(item, str) else item[\"summary\"])\n                    for item in expected]\n\n        def checklog(document):\n            with_class = testing.expect.with_class\n            actual = []\n            for tr in document.findAll(\"tr\"):\n                if not tr.has_key(\"class\"):\n                    continue\n                classes = tr[\"class\"].split()\n                if \"commit\" in classes:\n                    td = tr.find(\"td\", attrs=with_class(\"summary\"))\n                    a = td.find(\"a\", attrs=with_class(\"commit\"))\n                    actual.append(a.string)\n                elif \"rebase\" in classes:\n                    td = tr.find(\"td\")\n                    if td.contents[0].startswith(\"Branch rebased\"):\n                        a = td.find(\"a\")\n                        sha1 = a[\"href\"].split(\"/\")[-1]\n                        actual.append(\"rebased onto \" + sha1)\n                    elif td.contents[0].startswith(\"History rewritten\"):\n                        actual.append(\"history rewritten\")\n            testing.expect.check(expected, actual)\n\n        review = reviews[-1]\n\n        frontend.page(\n            \"r/%d\" % review[\"id\"],\n            expect={ \"log\": checklog })\n        frontend.page(\n            \"showcommit\",\n            params={ \"review\": review[\"id\"],\n                     \"filter\": \"files\",\n                     \"file\": FILENAME },\n            expect={ \"log\": checklog })\n\n    def revertrebase():\n        \"\"\"Revert the most recent rebase.\"\"\"\n        review = reviews[-1]\n        document = frontend.page(\"r/%d\" % review[\"id\"])\n        for a in document.findAll(\"a\"):\n            if a.string == \"[revert]\":\n                def revertRebase(rebase_id):\n                    return rebase_id\n                rebase_id = eval(a[\"href\"].split(\":\", 1)[1])\n                break\n        else:\n            logger.error(\"No [revert] link found!\")\n        frontend.operation(\n            \"revertrebase\",\n            data={ \"review_id\": review[\"id\"],\n                   \"rebase_id\": rebase_id })\n\n    start_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    # TEST #1: Create a review with three commits, then history rewrite so that\n    # the branch points to the first commit (i.e. remove the second and third\n    # commit.)  Then push another pair of commits, and history rewrite back to\n    # the first commit again.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-1\", start_sha1])\n    commits = [commit(\"Test #1, commit 1\", write),\n               commit(\"Test #1, commit 2\", write, 4, 5),\n               commit(\"Test #1, commit 3\", write)]\n    createreview(commits)\n    historyrewrite(commits[0])\n    expectlog([\"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    commits.extend([commit(\"Test #1, commit 4\", write, 3, 4, 5),\n                    commit(\"Test #1, commit 5\", write, 4, 5),\n                    commit(\"Test #1, commit 6\", write)])\n    push(commits[-1])\n    expectmail(\"Updated Review\")\n    expectlog([commits[5],\n               commits[4],\n               commits[3],\n               \"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    historyrewrite(commits[0])\n    expectlog([\"history rewritten\",\n               commits[5],\n               commits[4],\n               commits[3],\n               \"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    revertrebase()\n    expectlog([commits[5],\n               commits[4],\n               commits[3],\n               \"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n\n    # Random extra check for crash fixed in http://critic-review.org/r/207:\n    # Use the [partial] filter to look at the two last commits in the review.\n    # We're only interested in checking that the page loads successfully.\n    frontend.page(\n        \"showcommit\",\n        params={ \"from\": commits[3][\"sha1\"],\n                 \"to\": commits[5][\"sha1\"],\n                 \"review\": reviews[-1][\"id\"],\n                 \"filter\": \"files\",\n                 \"file\": FILENAME })\n\n    # TEST #2: First, set up two different commits that we'll be basing our\n    # review branch on.  Then create a review with three commits, move rebase\n    # it (ff), rewrite the history, and move rebase it (non-ff) again.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-2-base\", start_sha1])\n    base_commits = [commit(\"Test #2 base, commit 1\", write),\n                    commit(\"Test #2 base, commit 2\", write, 0)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-2-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-2\", base_commits[0][\"sha1\"]])\n    commits = [commit(\"Test #2, commit 1\", write, 5),\n               commit(\"Test #2, commit 2\", write, 5, 6),\n               commit(\"Test #2, commit 3\", write, 5, 6, 7)]\n    createreview(commits)\n    work.run([\"reset\", \"--hard\", base_commits[1][\"sha1\"]])\n    commits.extend([commit(\"Test #2, commit 4\", write, 0, 5),\n                    commit(\"Test #2, commit 5\", write, 0, 5, 6),\n                    commit(\"Test #2, commit 6\", write, 0, 5, 6, 7)])\n    moverebase(base_commits[1], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[1][\"sha1\"],\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[1][\"sha1\"]])\n    commits.append(commit(\"Test #2, commit 7\", write, 0, 5, 6, 7))\n    historyrewrite(commits[-1])\n    expectlog([\"history rewritten\",\n               \"rebased onto \" + base_commits[1][\"sha1\"],\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[0][\"sha1\"]])\n    commits.append(commit(\"Test #2, commit 8\", write, 5, 6, 7))\n    moverebase(base_commits[0], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[0][\"sha1\"],\n               \"history rewritten\",\n               \"rebased onto \" + base_commits[1][\"sha1\"],\n               commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #3: Like test #2, but the base commits have changes that trigger\n    # \"conflicts\" and thus equivalent merge commits.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-3-base\", start_sha1])\n    base_commits = [commit(\"Test #3 base, commit 1\", write),\n                    commit(\"Test #3 base, commit 2\", write, 2)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-3-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-3\", base_commits[0][\"sha1\"]])\n    commits = [commit(\"Test #3, commit 1\", write, 5),\n               commit(\"Test #3, commit 2\", write, 5, 6),\n               commit(\"Test #3, commit 3\", write, 5, 6, 7)]\n    createreview(commits)\n    work.run([\"reset\", \"--hard\", base_commits[1][\"sha1\"]])\n    commits.extend([commit(\"Test #3, commit 4\", write, 2, 5),\n                    commit(\"Test #3, commit 5\", write, 2, 5, 6),\n                    commit(\"Test #3, commit 6\", write, 2, 5, 6, 7)])\n    moverebase(base_commits[1], commits[-1])\n    expectmail(\"Updated Review\")\n    expectlog([\"rebased onto \" + base_commits[1][\"sha1\"],\n               \"Merge commit '%s' into %s\" % (base_commits[1][\"sha1\"],\n                                              reviews[-1][\"branch\"]),\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[1][\"sha1\"]])\n    commits.append(commit(\"Test #3, commit 7\", write, 2, 5, 6, 7))\n    historyrewrite(commits[-1])\n    expectlog([\"history rewritten\",\n               \"rebased onto \" + base_commits[1][\"sha1\"],\n               \"Merge commit '%s' into %s\" % (base_commits[1][\"sha1\"],\n                                              reviews[-1][\"branch\"]),\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[0][\"sha1\"]])\n    commits.append(commit(\"Test #3, commit 8\", write, 5, 6, 7))\n    moverebase(base_commits[0], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[0][\"sha1\"],\n               \"history rewritten\",\n               \"rebased onto \" + base_commits[1][\"sha1\"],\n               \"Merge commit '%s' into %s\" % (base_commits[1][\"sha1\"],\n                                              reviews[-1][\"branch\"]),\n               commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #4: Create a review with three commits based on master~2, then merge\n    # master~1 into the review, and then rebase the review onto master.\n\n    work.run([\"fetch\", REMOTE_URL, \"refs/heads/master\"])\n    base_commits = [commit(\"FETCH_HEAD~2\"),\n                    commit(\"FETCH_HEAD~1\"),\n                    commit(\"FETCH_HEAD\")]\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-4-1\",\n              base_commits[0][\"sha1\"]])\n    commits = [commit(\"Test #4, commit 1\", write, 7),\n               commit(\"Test #4, commit 2\", write, 6, 7),\n               commit(\"Test #4, commit 3\", write, 5, 6, 7)]\n    createreview(commits)\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-4-2\",\n              base_commits[1][\"sha1\"]])\n    work.run([\"merge\", \"020-reviewrebase-4-1\"])\n    commits.append(commit())\n    push(commits[-1])\n    expectmail(\"Updated Review\")\n    work.run([\"reset\", \"--hard\", base_commits[2][\"sha1\"]])\n    commits.extend([commit(\"Test #4, commit 4\", write, 7),\n                    commit(\"Test #4, commit 5\", write, 6, 7),\n                    commit(\"Test #4, commit 6\", write, 5, 6, 7)])\n    moverebase(base_commits[2], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[2][\"sha1\"],\n               commits[3],\n               commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #5: First, set up two different commits that we'll be basing our\n    # review branch on.  Then create a review with three commits, then history\n    # rewrite so that the branch points to the first commit (i.e. remove the\n    # second and third commit.)  Then non-ff move-rebase the review.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-5-base\", start_sha1])\n    base_commits = [commit(\"Test #5 base, commit 1\", write, filename=FILENAME_BASE),\n                    commit(\"Test #5 base, commit 2\", write, 0, filename=FILENAME_BASE)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-5-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-5\", base_commits[1][\"sha1\"]])\n    commits = [commit(\"Test #5, commit 1\", write, 4),\n               commit(\"Test #5, commit 2\", write, 4, 5),\n               commit(\"Test #5, commit 3\", write, 4)]\n    createreview(commits)\n    historyrewrite(commits[0])\n    expectlog([\"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[0][\"sha1\"]])\n    commits.append(commit(\"Test #5, commit 4\", write, 4))\n    moverebase(base_commits[0], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[0][\"sha1\"],\n               \"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #6: Like test #5, but we revert the rebases afterwards.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-6-base\", start_sha1])\n    base_commits = [commit(\"Test #6 base, commit 1\", write, filename=FILENAME_BASE),\n                    commit(\"Test #6 base, commit 2\", write, 0, filename=FILENAME_BASE)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-6-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-6\", base_commits[1][\"sha1\"]])\n    commits = [commit(\"Test #6, commit 1\", write, 4),\n               commit(\"Test #6, commit 2\", write, 4, 5),\n               commit(\"Test #6, commit 
3\", write, 4)]\n    createreview(commits)\n    historyrewrite(commits[0])\n    expectlog([\"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    work.run([\"reset\", \"--hard\", base_commits[0][\"sha1\"]])\n    commits.append(commit(\"Test #6, commit 4\", write, 4))\n    moverebase(base_commits[0], commits[-1])\n    expectlog([\"rebased onto \" + base_commits[0][\"sha1\"],\n               \"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    revertrebase()\n    expectlog([\"history rewritten\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    revertrebase()\n    expectlog([commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #7: Test reverting a ff-move rebase with conflicts.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-7-base\", start_sha1])\n    base_commits = [commit(\"Test #7 base, commit 1\", write),\n                    commit(\"Test #7 base, commit 2\", write, 2)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-7-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-7\", base_commits[0][\"sha1\"]])\n    commits = [commit(\"Test #7, commit 1\", write, 5),\n               commit(\"Test #7, commit 2\", write, 5, 6),\n               commit(\"Test #7, commit 3\", write, 5, 6, 7)]\n    createreview(commits)\n    work.run([\"reset\", \"--hard\", base_commits[1][\"sha1\"]])\n    commits.extend([commit(\"Test #7, commit 4\", write, 2, 5),\n                    commit(\"Test #7, commit 5\", write, 2, 5, 6),\n                    commit(\"Test #7, commit 6\", write, 2, 5, 6, 7)])\n    moverebase(base_commits[1], commits[-1])\n    expectmail(\"Updated Review\")\n    expectlog([\"rebased onto \" + base_commits[1][\"sha1\"],\n               \"Merge commit '%s' into %s\" % (base_commits[1][\"sha1\"],\n                                              
reviews[-1][\"branch\"]),\n               commits[2],\n               commits[1],\n               commits[0]])\n    revertrebase()\n    expectlog([commits[2],\n               commits[1],\n               commits[0]])\n\n    # TEST #8: Test reverting a non-ff-move rebase with conflicts.\n\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-8-base\", start_sha1])\n    base_commits = [commit(\"Test #8 base, commit 1\", write),\n                    commit(\"Test #8 base, commit 2\", write, 2)]\n    work.run([\"push\", REMOTE_URL, \"020-reviewrebase-8-base\"])\n    work.run([\"checkout\", \"-b\", \"020-reviewrebase-8\", base_commits[1][\"sha1\"]])\n    commits = [commit(\"Test #8, commit 1\", write, 2, 3),\n               commit(\"Test #8, commit 2\", write, 2, 3, 6),\n               commit(\"Test #8, commit 3\", write, 2, 3, 6, 7)]\n    createreview(commits)\n    work.run([\"reset\", \"--hard\", base_commits[0][\"sha1\"]])\n    commits.extend([commit(\"Test #8, commit 4\", write, 3),\n                    commit(\"Test #8, commit 5\", write, 3, 6),\n                    commit(\"Test #8, commit 6\", write, 3, 6, 7)])\n    moverebase(base_commits[0], commits[-1])\n    expectmail(\"Updated Review\")\n    expectlog([\"rebased onto \" + base_commits[0][\"sha1\"],\n               \"Changes introduced by rebase\",\n               commits[2],\n               commits[1],\n               commits[0]])\n    revertrebase()\n    expectlog([commits[2],\n               commits[1],\n               commits[0]])\n"
  },
  {
    "path": "testing/tests/001-main/003-self/021-updatereview-bogus.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.operation(\n        \"updatereview\",\n        data={ \"review_id\": -1,\n               \"new_owners\": [\"alice\"] },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"nosuchreview\" })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/022-removereviewfilter-bogus.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.operation(\n        \"removereviewfilter\",\n        data={ \"filter_id\": -1 },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"nosuchfilter\" })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/024-customizations.githook.py",
    "content": "# Need a VM (full installation) to do customizations.\n# @flag full\n\nimport re\nimport json\n\n# Install the githook customization.\ninstance.execute(\n    [\"sudo\", \"mkdir\", \"-p\", \"/etc/critic/main/customization\",\n     \"&&\",\n     \"sudo\", \"touch\", \"/etc/critic/main/customization/__init__.py\",\n     \"&&\",\n     \"sudo\", \"cp\", \"critic/testing/input/customization/githook.py\",\n     \"/etc/critic/main/customization\",\n     \"&&\",\n     \"sudo\", \"chown\", \"-R\", \"critic.critic\", \"/etc/critic/main/customization\"])\n\n# Note: no need to restart, since the githook background service effectively\n# re-imports the 'index' module, which imports the 'customizations.githook'\n# module.\n\nwith repository.workcopy() as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def lsremote(ref_name):\n        try:\n            output = work.run([\"ls-remote\", \"--exit-code\", REMOTE_URL, ref_name])\n        except testing.repository.GitCommandError:\n            return None\n\n        lines = output.splitlines()\n\n        testing.expect.check(1, len(lines))\n        testing.expect.check(\"[0-9a-f]{40}\\t\" + ref_name, lines[0],\n                             equal=re.match)\n\n        return lines[0][:40]\n\n    def push(new_value, ref_name, expected_result):\n        old_value = lsremote(ref_name)\n\n        if new_value is not None:\n            new_value = work.run([\"rev-parse\", new_value]).strip()\n\n        try:\n            output = work.run([\"push\", \"--quiet\", REMOTE_URL,\n                               \"%s:%s\" % (new_value or \"\", ref_name)])\n            testing.expect.check(\"ACCEPT\", expected_result)\n        except testing.repository.GitCommandError as error:\n            output = error.output\n            testing.expect.check(\"REJECT\", expected_result)\n\n        from_hook = []\n        for line in output.splitlines():\n            line = line.partition(\"\\x1b\")[0]\n            if 
line.startswith(\"remote: \"):\n                from_hook.append(line[len(\"remote: \"):].strip())\n\n        testing.expect.check(\"^%s:\" % expected_result, from_hook[0],\n                             equal=re.match)\n        testing.expect.check({ \"repository_path\": \"/var/git/critic.git\",\n                               \"ref_name\": ref_name,\n                               \"old_value\": old_value,\n                               \"new_value\": new_value },\n                             json.loads(from_hook[0][7:]))\n        if expected_result == \"ACCEPT\":\n            testing.expect.check(new_value, lsremote(ref_name))\n        else:\n            testing.expect.check(old_value, lsremote(ref_name))\n\n    push(\"HEAD\", \"refs/heads/reject-create\", \"REJECT\")\n    push(\"HEAD^\", \"refs/heads/reject-delete\", \"ACCEPT\")\n    push(\"HEAD\", \"refs/heads/reject-delete\", \"ACCEPT\")\n    push(None, \"refs/heads/reject-delete\", \"REJECT\")\n    push(\"HEAD^\", \"refs/heads/reject-update\", \"ACCEPT\")\n    push(\"HEAD\", \"refs/heads/reject-update\", \"REJECT\")\n    push(\"HEAD^\", \"refs/heads/reject-nothing\", \"ACCEPT\")\n    push(\"HEAD\", \"refs/heads/reject-nothing\", \"ACCEPT\")\n    push(None, \"refs/heads/reject-nothing\", \"ACCEPT\")\n\n# Remove the githook customization again.\ninstance.execute(\n    [\"sudo\", \"rm\", \"-f\",\n     \"/etc/critic/main/customization/githook.py\",\n     \"/etc/critic/main/customization/githook.pyc\"])\n\n# And again, no need to restart.\n"
  },
  {
    "path": "testing/tests/001-main/003-self/025-trackedbranch.py",
    "content": "import time\n\nBRANCH_NAME = \"025-trackedbranch\"\n\nwith repository.workcopy() as work, frontend.signin():\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def wait_for_branch(branch_name, value):\n        instance.synchronize_service(\"branchtracker\")\n\n        try:\n            output = work.run([\"ls-remote\", \"--exit-code\", REMOTE_URL,\n                               \"refs/heads/\" + branch_name])\n            if output.startswith(value):\n                return\n        except testing.repository.GitCommandError:\n            logger.error(\"Tracked branch %s not updated as expected.\"\n                         % branch_name)\n            raise testing.TestFailure\n\n    def get_branch_log(branch_id, expected_length):\n        result = frontend.operation(\n            \"trackedbranchlog\",\n            data={ \"branch_id\": branch_id })\n\n        branch_log = result[\"items\"]\n\n        testing.expect.check(expected_length, len(branch_log))\n\n        return branch_log\n\n    def check_log_item(branch_log_item, from_sha1, to_sha1, hook_output,\n                       successful):\n        testing.expect.check(from_sha1, branch_log_item[\"from_sha1\"])\n        testing.expect.check(to_sha1, branch_log_item[\"to_sha1\"])\n        testing.expect.check(hook_output, branch_log_item[\"hook_output\"])\n        testing.expect.check(successful, branch_log_item[\"successful\"])\n\n    work.run([\"push\", \"origin\", \"HEAD:refs/heads/\" + BRANCH_NAME])\n\n    sha1s = { \"HEAD\": work.run([\"rev-parse\", \"HEAD\"]).strip(),\n              \"HEAD^\": work.run([\"rev-parse\", \"HEAD^\"]).strip() }\n\n    result = frontend.operation(\n        \"addtrackedbranch\",\n        data={ \"repository_id\": 1,\n               \"source_location\": repository.url,\n               \"source_name\": BRANCH_NAME,\n               \"target_name\": BRANCH_NAME,\n               \"users\": [\"alice\"],\n               \"forced\": False })\n\n    branch_id = 
result[\"branch_id\"]\n\n    wait_for_branch(BRANCH_NAME, sha1s[\"HEAD\"])\n\n    branch_log = get_branch_log(branch_id, expected_length=1)\n\n    check_log_item(branch_log[0],\n                   from_sha1=\"0\" * 40,\n                   to_sha1=sha1s[\"HEAD\"],\n                   hook_output=\"\",\n                   successful=True)\n\n    work.run([\"push\", \"origin\", \"-f\", \"HEAD^:refs/heads/\" + BRANCH_NAME])\n\n    frontend.operation(\n        \"triggertrackedbranchupdate\",\n        data={ \"branch_id\": branch_id })\n\n    instance.synchronize_service(\"branchtracker\")\n\n    log_entries = instance.filter_service_log(\"branchtracker\", \"error\")\n\n    testing.expect.check(1, len(log_entries))\n    testing.expect.check(\"ERROR - update of branch 025-trackedbranch from \"\n                         \"025-trackedbranch in %s failed\" % repository.url,\n                         log_entries[0].splitlines()[0])\n\n    to_system = testing.mailbox.ToRecipient(\"system@example.org\")\n    system_subject = testing.mailbox.WithSubject(\n        \"branchtracker.log: update of branch %s from %s in %s failed\"\n        % (BRANCH_NAME, BRANCH_NAME, repository.url))\n    mailbox.pop(accept=[to_system, system_subject])\n\n    to_alice = testing.mailbox.ToRecipient(\"alice@example.org\")\n    alice_subject = testing.mailbox.WithSubject(\n        \"%s: update from %s in %s\" % (BRANCH_NAME, BRANCH_NAME, repository.url))\n    mailbox.pop(accept=[to_alice, alice_subject])\n\n    branch_log = get_branch_log(branch_id, expected_length=2)\n\n    check_log_item(branch_log[0],\n                   from_sha1=\"0\" * 40,\n                   to_sha1=sha1s[\"HEAD\"],\n                   hook_output=\"\",\n                   successful=True)\n    check_log_item(branch_log[1],\n                   from_sha1=sha1s[\"HEAD\"],\n                   to_sha1=sha1s[\"HEAD^\"],\n                   hook_output=\"\"\"\\\nRejecting non-fast-forward update of branch.  
To perform the update, you\ncan delete the branch using\n  git push critic :%s\nfirst, and then repeat this push.\n\"\"\" % BRANCH_NAME,\n                   successful=False)\n\n    work.run([\"push\", \"origin\", \"HEAD:refs/heads/%s-forced\" % BRANCH_NAME])\n\n    result = frontend.operation(\n        \"addtrackedbranch\",\n        data={ \"repository_id\": 1,\n               \"source_location\": repository.url,\n               \"source_name\": BRANCH_NAME + \"-forced\",\n               \"target_name\": BRANCH_NAME + \"-forced\",\n               \"users\": [\"alice\"],\n               \"forced\": True })\n\n    branch_id = result[\"branch_id\"]\n\n    wait_for_branch(BRANCH_NAME + \"-forced\", sha1s[\"HEAD\"])\n\n    branch_log = get_branch_log(branch_id, expected_length=1)\n\n    check_log_item(branch_log[0],\n                   from_sha1=\"0\" * 40,\n                   to_sha1=sha1s[\"HEAD\"],\n                   hook_output=\"\",\n                   successful=True)\n\n    work.run([\"push\", \"origin\", \"-f\", \"HEAD^:refs/heads/%s-forced\" % BRANCH_NAME])\n\n    frontend.operation(\n        \"triggertrackedbranchupdate\",\n        data={ \"branch_id\": branch_id })\n\n    wait_for_branch(BRANCH_NAME + \"-forced\", sha1s[\"HEAD^\"])\n\n    branch_log = get_branch_log(branch_id, expected_length=2)\n\n    check_log_item(branch_log[0],\n                   from_sha1=\"0\" * 40,\n                   to_sha1=sha1s[\"HEAD\"],\n                   hook_output=\"\",\n                   successful=True)\n    check_log_item(branch_log[1],\n                   from_sha1=sha1s[\"HEAD\"],\n                   to_sha1=sha1s[\"HEAD^\"],\n                   hook_output=\"\"\"\\\nNon-fast-forward update detected; deleting and recreating branch.\n\"\"\",\n                   successful=True)\n\n    mailbox.check_empty()\n"
  },
  {
    "path": "testing/tests/001-main/003-self/026-searchreview.py",
    "content": "import re\n\nREVIEWS = { \"giraffe\": { \"sha1\": \"5360e5d734e3b990c0dc67496c7a83f94013d01d\",\n                         \"branch\": \"r/026-searchreview/giraffe\",\n                         \"owners\": [\"alice\"],\n                         \"summary\": \"Make sure TH element lives inside a TR element\",\n                         \"paths\": [\"src/page/repositories.py\"] },\n            \"elephant\": { \"sha1\": \"18db724faccfb2f8d04c81309feadf05b48ec9e3\",\n                          \"branch\": \"r/026-searchreview/elephant\",\n                          \"owners\": [\"alice\", \"bob\"],\n                          \"reviewers\": [\"alice\"],\n                          \"watchers\": [\"erin\"],\n                          \"summary\": \"URL escape shortname in repository SELECT\",\n                          \"description\": \"\"\"\\\nBefore this fix, a repository shortname such as \"a&b\" meant that\nthe user would be forwarded to /branches?repository=a&b and Critic\nwould say \"No such repository: a\". Same problem existed for shortname\n\"a#b\". 
Also shortname \"a+b\" hit the error \"No such repository: a b\".\"\"\",\n                          \"paths\": [\"src/resources/branches.js\",\n                                    \"src/resources/config.js\"] },\n            \"tiger\": { \"sha1\": \"95e52c53a4a183c9f0eada7401e1da174353e00e\",\n                       \"branch\": \"r/026-searchreview/cat\",\n                       \"owners\": [\"dave\"],\n                       \"reviewers\": [\"dave\", \"erin\"],\n                       \"summary\": \"Extend testing.tools.upgrade: support custom maintenance and reboot\",\n                       \"paths\": [\"testing/tools/upgrade.py\",\n                                 \"testing/virtualbox.py\"] },\n            \"ashtray\": { \"sha1\": \"94391a18858c05b2619dfea4b58507d08d932bd3\",\n                         \"branch\": \"r/026-searchreview/ashtray\",\n                         \"owners\": [\"dave\", \"alice\"],\n                         \"reviewers\": [\"dave\"],\n                         \"summary\": \"Add support for installing packages in instance\",\n                         \"description\": \"\"\"\\\nExtend the testing.tools.upgrade tool to support installing extra\npackages in the instance and retake the snapshot afterwards.\"\"\",\n                         \"paths\": [\"testing/tools/upgrade.py\"],\n                         \"dropped\": True } }\n\nFAILED = False\nSETTINGS = { \"review.createViaPush\": True }\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nwith repository.workcopy() as work:\n    for review in REVIEWS.values():\n        primary_owner = review[\"owners\"][0]\n\n        with testing.utils.settings(primary_owner, SETTINGS), \\\n                frontend.signin(primary_owner):\n            REMOTE_URL = instance.repository_url(primary_owner)\n\n            output = work.run(\n                [\"push\", REMOTE_URL, 
\"%(sha1)s:refs/heads/%(branch)s\" % review])\n\n            next_is_review_url = False\n\n            for line in output.splitlines():\n                if not line.startswith(\"remote:\"):\n                    continue\n                line = line[len(\"remote:\"):].split(\"\\x1b\", 1)[0].strip()\n                if line == \"Submitted review:\":\n                    next_is_review_url = True\n                elif next_is_review_url:\n                    logger.debug(line)\n                    review[\"id\"] = int(re.search(r\"/r/(\\d+)$\", line).group(1))\n                    break\n            else:\n                testing.expect.check(\"<review URL in git hook output>\",\n                                     \"<expected content not found>\")\n\n            mailbox.pop(\n                accept=[to(primary_owner),\n                        about(\"New Review: %s\" % review[\"summary\"])])\n\n            updatereview_data = {}\n\n            if len(review[\"owners\"]) > 1:\n                updatereview_data[\"new_owners\"] = review[\"owners\"]\n            if \"description\" in review:\n                updatereview_data[\"new_description\"] = review[\"description\"]\n\n            if updatereview_data:\n                updatereview_data[\"review_id\"] = review[\"id\"]\n                frontend.operation(\n                    \"updatereview\",\n                    data=updatereview_data)\n\n            recipients = set()\n\n            if \"reviewers\" in review:\n                frontend.operation(\n                    \"addreviewfilters\",\n                    data={ \"review_id\": review[\"id\"],\n                           \"filters\": [{ \"type\": \"reviewer\",\n                                         \"user_names\": review[\"reviewers\"],\n                                         \"paths\": [\"/\"] }] })\n                recipients.update(review[\"reviewers\"])\n\n            if \"watchers\" in review:\n                frontend.operation(\n                   
 \"addreviewfilters\",\n                    data={ \"review_id\": review[\"id\"],\n                           \"filters\": [{ \"type\": \"watcher\",\n                                         \"user_names\": review[\"watchers\"],\n                                         \"paths\": [\"/\"] }] })\n                recipients.update(review[\"watchers\"])\n\n            for username in recipients:\n                if username not in review[\"owners\"]:\n                    mailbox.pop(accept=[to(username),\n                                        about(r\"^New\\(ish\\) Review:\")])\n                if username != primary_owner:\n                    mailbox.pop(accept=[to(username),\n                                        about(r\"^Updated Review:\")])\n\n            if \"closed\" in review:\n                frontend.operation(\n                    \"closereview\",\n                    data={ \"review_id\": review[\"id\"] })\n            if \"dropped\" in review:\n                frontend.operation(\n                    \"dropreview\",\n                    data={ \"review_id\": review[\"id\"] })\n\ndef search(query, expected):\n    global FAILED\n\n    if isinstance(query, list):\n        for q in query:\n            search(q, expected)\n        return\n\n    try:\n        result = frontend.operation(\n            \"searchreview\",\n            data={ \"query\": query })\n    except testing.TestFailure:\n        # Continue testing instead of aborting.  The error will have\n        # been logged by frontend.operation() already.\n        FAILED = True\n        return\n\n    actual = dict((review[\"id\"], review[\"summary\"])\n                  for review in result[\"reviews\"])\n\n    # Note: We only check that reviews we just created are included (or not) in\n    # the search result.  
We specifically don't check that the search result\n    # doesn't contain other reviews (not created above) since that typically\n    # depends on which other tests have run, on which we don't want to depend.\n\n    for key in expected:\n        expected_review = REVIEWS[key]\n\n        if expected_review[\"id\"] not in actual:\n            logger.error(\"r/<%s>: not found by query %r as expected\"\n                         % (key, query))\n            FAILED = True\n        else:\n            if actual[expected_review[\"id\"]] != expected_review[\"summary\"]:\n                logger.error(\"r/<%s>: wrong summary %r reported\"\n                             % (key, actual[expected_review[\"id\"]]))\n                FAILED = True\n\n    for key in REVIEWS.keys():\n        if key not in expected:\n            if REVIEWS[key][\"id\"] in actual:\n                logger.error(\"r/<%s>: incorrectly found by query %r\"\n                             % (key, query))\n                FAILED = True\n\ndef invalid(query, code, title):\n    global FAILED\n\n    try:\n        frontend.operation(\n            \"searchreview\",\n            data={ \"query\": query },\n            expect={ \"status\": \"failure\",\n                     \"code\": code,\n                     \"title\": title })\n    except testing.TestFailure:\n        # Continue testing instead of aborting.  
The error will have\n        # been logged by frontend.operation() already.\n        FAILED = True\n        return\n\nsearch(query=\"existentialism\",\n       expected=[])\nsearch(query=\"support\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"support for\",\n       expected=[\"ashtray\"])\nsearch(query=\"'support for'\",\n       expected=[\"ashtray\"])\nsearch(query='\"support for\"',\n       expected=[\"ashtray\"])\nsearch(query=\"support owner:dave\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"support owner:alice\",\n       expected=[\"ashtray\"])\nsearch(query=\"support owner:bob\",\n       expected=[])\n\nsearch(query=\"support installing\",\n       expected=[\"ashtray\"])\nsearch(query=\"'support installing'\",\n       expected=[\"ashtray\"])\nsearch(query=\"summary:'support installing'\",\n       expected=[])\nsearch(query=\"description:'support installing'\",\n       expected=[\"ashtray\"])\n\nsearch(query=\"r/026-searchreview/*\",\n       expected=[\"giraffe\", \"elephant\", \"tiger\", \"ashtray\"])\nsearch(query=[\"b:r/026-searchreview/*\", \"branch:r/026-searchreview/*\"],\n       expected=[\"giraffe\", \"elephant\", \"tiger\", \"ashtray\"])\nsearch(query=\"path:r/026-searchreview/*\",\n       expected=[])\nsearch(query=\"r/026-searchreview/elephant\",\n       expected=[\"elephant\"])\nsearch(query=\"r/026-searchreview/* upgrade.py\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"branch:r/026-searchreview/* path:upgrade.py\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=[\"p:upgrade.py\", \"path:upgrade.py\"],\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"branch:upgrade.py\",\n       expected=[])\n\nsearch(query=\"user:alice\",\n       expected=[\"giraffe\", \"elephant\", \"ashtray\"])\nsearch(query=\"owner:alice\",\n       expected=[\"giraffe\", \"elephant\", \"ashtray\"])\nsearch(query=\"reviewer:alice\",\n       expected=[\"elephant\"])\n\nsearch(query=\"user:bob\",\n       
expected=[\"elephant\"])\nsearch(query=\"owner:bob\",\n       expected=[\"elephant\"])\nsearch(query=\"reviewer:bob\",\n       expected=[])\n\nsearch(query=\"user:dave\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"owner:dave\",\n       expected=[\"tiger\", \"ashtray\"])\nsearch(query=\"reviewer:dave\",\n       expected=[\"tiger\", \"ashtray\"])\n\nsearch(query=[\"u:erin\", \"user:erin\"],\n       expected=[\"elephant\", \"tiger\"])\nsearch(query=[\"o:erin\", \"owner:erin\"],\n       expected=[])\nsearch(query=[\"reviewer:erin\"],\n       expected=[\"tiger\"])\n\nsearch(query=\"owner:alice reviewer:bob user:erin\",\n       expected=[])\nsearch(query=\"reviewer:alice owner:bob\",\n       expected=[\"elephant\"])\n\nsearch(query=[\"s:open\", \"state:open\"],\n       expected=[\"giraffe\", \"elephant\", \"tiger\"])\n# It would be nice if we could make one of the reviews accepted, but doing that\n# is a lot of work.  In practice, a \"pending\" search is almost the same as an\n# \"accepted\" search; it's just inverted.  
So we're at least quite close to\n# testing an \"accepted\" search.\nsearch(query=[\"s:pending\", \"state:pending\"],\n       expected=[\"giraffe\", \"elephant\", \"tiger\"])\nsearch(query=[\"s:accepted\", \"state:accepted\"],\n       expected=[])\n# It would be nice if we could close a review too, but again, that depends on\n# making a review accepted, and that's a lot of work.\nsearch(query=[\"s:closed\", \"state:closed\"],\n       expected=[])\nsearch(query=[\"s:dropped\", \"state:dropped\"],\n       expected=[\"ashtray\"])\n\n# A bit boring since there's only one repository.\nsearch(query=[\"r:critic\", \"repo:critic\", \"repository:critic\"],\n       expected=[\"giraffe\", \"elephant\", \"tiger\", \"ashtray\"])\n\ninvalid(query=\"overlord:admin\",\n        code=\"invalidkeyword\",\n        title=\"Invalid keyword: 'overlord'\")\ninvalid(query=\"user:nosuchuser\",\n        code=\"invalidterm\",\n        title=\"No such user: 'nosuchuser'\")\ninvalid(query=\"owner:nosuchuser\",\n        code=\"invalidterm\",\n        title=\"No such user: 'nosuchuser'\")\ninvalid(query=\"reviewer:nosuchuser\",\n        code=\"invalidterm\",\n        title=\"No such user: 'nosuchuser'\")\ninvalid(query=\"state:limbo\",\n        code=\"invalidterm\",\n        title=\"Invalid review state: 'limbo'\")\n\nif FAILED:\n    raise testing.TestFailure\n"
  },
  {
    "path": "testing/tests/001-main/003-self/027-whitespace-filenames.py",
    "content": "import os\n\nBRANCH = \"027-whitespace-filename\"\nFILENAME = \"filename with spaces.txt\"\n\ndef check_filename(class_name):\n    def check(document):\n        cells = document.findAll(\"td\", attrs=testing.expect.with_class(class_name))\n\n        for cell in cells:\n            anchor = cell.find(\"a\")\n            if not anchor:\n                continue\n            testing.expect.check(FILENAME, anchor.string)\n            break\n        else:\n            testing.expect.check(\"<td class=%s><a>%s\" % (class_name, FILENAME),\n                                 \"<expected content not found>\")\n\n    return check\n\nwith frontend.signin(\"alice\"):\n    with repository.workcopy(empty=True) as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        def commit():\n            work.run([\"add\", FILENAME])\n            work.run([\"commit\", \"-m\", FILENAME],\n                     GIT_AUTHOR_NAME=\"Alice von Testing\",\n                     GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                     GIT_COMMITTER_NAME=\"Alice von Testing\",\n                     GIT_COMMITTER_EMAIL=\"alice@example.org\")\n            return work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n        def push():\n            work.run([\"push\", \"-q\", REMOTE_URL,\n                      \"HEAD:refs/heads/\" + BRANCH])\n\n        with open(os.path.join(work.path, FILENAME), \"w\") as text_file:\n            print >>text_file, \"Content of file \" + FILENAME\n\n        sha1 = commit()\n        push()\n\n    frontend.page(\n        \"showcommit\",\n        params={ \"repository\": \"critic\",\n                 \"sha1\": sha1 },\n        expect={ \"filename\": check_filename(\"path\") })\n\n    frontend.page(\n        \"showtree\",\n        params={ \"repository\": \"critic\",\n                 \"sha1\": sha1,\n                 \"path\": \"/\" },\n        expect={ \"filename\": check_filename(\"name\") })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/028-gitemails.py",
    "content": "# Test summary\n# ============\n#\n# Alice, Bob and Dave add a bunch of filters making them reviewers of\n# a directory we add a bunch of different files to.  Alice adds a filter with\n# Erin as delegate.\n#\n# They all also set their Git emails, Alice and Bob sharing common@example.org\n# and Dave having two different addresses.\n#\n# Then we make a bunch of commits with different authors (all involved Git email\n# addresses, plus nobody@example.org.)  Each commit adds one file.\n#\n# We then catch the \"New Review\" and \"Updated Review\" emails sent, and make sure\n# those emails claim that the right set of files is assigned to be reviewed by\n# the expected users.\n\nimport os\nimport re\n\nREMOTE_URL = instance.repository_url(\"alice\")\n\nto_recipient = testing.mailbox.ToRecipient\nwith_subject = testing.mailbox.WithSubject\n\nshowfilters_output = frontend.page(\n    \"showfilters\",\n    params={ \"repository\": \"critic\" },\n    expected_content_type=\"text/plain\")\ntesting.expect.check(\n    \"Path: /\\n\\nNo matching filters found.\\n\",\n    showfilters_output)\n\nshowfilters_output = frontend.page(\n    \"showfilters\",\n    params={ \"repository\": \"critic\",\n             \"path\": \"028-gitemails/\" },\n    expected_content_type=\"text/plain\")\ntesting.expect.check(\n    \"Path: 028-gitemails/\\n\\nNo matching filters found.\\n\",\n    showfilters_output)\n\nwith repository.workcopy() as workcopy:\n    def commit(filename, author):\n        path = os.path.join(workcopy.path, \"028-gitemails\", filename)\n        with open(path, \"w\") as the_file:\n            the_file.write(\"This is '%s' by %s.\\n\" % (filename, author))\n        workcopy.run([\"add\", \"028-gitemails/\" + filename])\n        workcopy.run([\"commit\", \"-m\", \"Edited \" + filename],\n                     GIT_AUTHOR_NAME=\"Anonymous Coward\",\n                     GIT_AUTHOR_EMAIL=author + \"@example.org\",\n                     GIT_COMMITTER_NAME=\"Anonymous 
Coward\",\n                     GIT_COMMITTER_EMAIL=author + \"@example.org\")\n        return workcopy.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    def expect_mail(recipient, expected_files):\n        mail = mailbox.pop(\n            accept=[to_recipient(recipient + \"@example.org\"),\n                    with_subject(\"(New|Updated) Review: Edited cat.txt\")])\n\n        assigned_files = []\n\n        try:\n            marker_index = mail.lines.index(\n                \"These changes were assigned to you:\")\n        except ValueError:\n            pass\n        else:\n            for line in mail.lines[marker_index + 1:]:\n                if not line.strip():\n                    break\n                filename, counts = line.strip().split(None, 1)\n                if assigned_files:\n                    filename = filename.replace(\".../\", \"028-gitemails/\")\n                assigned_files.append((filename, counts))\n\n        for filename, counts in assigned_files:\n            if not expected_files:\n                testing.expect.check(\"<no more assigned files>\", filename)\n            else:\n                testing.expect.check(\n                    \"028-gitemails/\" + expected_files.pop(0), filename)\n                testing.expect.check(\"+1\", counts)\n\n        if expected_files:\n            for filename in expected_files:\n                testing.expect.check(filename, \"<no more assigned files>\")\n\n    with frontend.signin(\"alice\"):\n        frontend.operation(\n            \"addfilter\",\n            data={ \"filter_type\": \"reviewer\",\n                   \"repository_name\": \"critic\",\n                   \"path\": \"028-gitemails/\",\n                   \"delegates\": [\"erin\"] })\n        frontend.operation(\n            \"setgitemails\",\n            data={ \"subject_id\": instance.userid(\"alice\"),\n                   \"value\": [\"alice@example.org\", \"common@example.org\"] })\n\n    with frontend.signin(\"bob\"):\n        
frontend.operation(\n            \"addfilter\",\n            data={ \"filter_type\": \"reviewer\",\n                   \"repository_name\": \"critic\",\n                   \"path\": \"028-gitemails/\",\n                   \"delegates\": [] })\n        frontend.operation(\n            \"setgitemails\",\n            data={ \"subject_name\": \"bob\",\n                   \"value\": [\"bob@example.org\", \"common@example.org\"] })\n\n    with frontend.signin(\"dave\"):\n        frontend.operation(\n            \"addfilter\",\n            data={ \"filter_type\": \"reviewer\",\n                   \"repository_name\": \"critic\",\n                   \"path\": \"028-gitemails/\",\n                   \"delegates\": [] })\n        frontend.operation(\n            \"setgitemails\",\n            data={ \"subject\": instance.userid(\"dave\"),\n                   \"value\": [\"dave@example.org\", \"dave@example.com\"] })\n\n    workcopy.run([\"checkout\", \"-b\", \"r/028-gitemails\"])\n\n    os.mkdir(os.path.join(workcopy.path, \"028-gitemails\"))\n\n    commits = []\n    commits.append(commit(\"cat.txt\", \"alice\"))\n\n    with testing.utils.settings(\"alice\", { \"review.createViaPush\": True }):\n        workcopy.run([\"push\", REMOTE_URL, \"HEAD\"])\n\n    expect_mail(\"alice\", [])\n    expect_mail(\"bob\", [\"cat.txt\"])\n    expect_mail(\"dave\", [\"cat.txt\"])\n    expect_mail(\"erin\", [\"cat.txt\"])\n\n    commits.append(commit(\"dog.txt\", \"bob\"))\n    commits.append(commit(\"mouse.txt\", \"dave\"))\n    commits.append(commit(\"snake.txt\", \"dave\"))\n    commits.append(commit(\"bird.txt\", \"common\"))\n    commits.append(commit(\"fish.txt\", \"nobody\"))\n\n    workcopy.run([\"push\", REMOTE_URL, \"HEAD\"])\n\n    expect_mail(\"alice\", [\"dog.txt\", \"fish.txt\", \"mouse.txt\", \"snake.txt\"])\n    expect_mail(\"bob\", [\"fish.txt\", \"mouse.txt\", \"snake.txt\"])\n    expect_mail(\"dave\", [\"bird.txt\", \"dog.txt\", \"fish.txt\"])\n    expect_mail(\"erin\", 
[\"bird.txt\"])\n\nshowfilters_output = frontend.page(\n    \"showfilters\",\n    params={ \"repository\": \"critic\" },\n    expected_content_type=\"text/plain\")\ntesting.expect.check(\n    \"Path: /\\n\\nNo matching filters found.\\n\",\n    showfilters_output)\n\nshowfilters_output = frontend.page(\n    \"showfilters\",\n    params={ \"repository\": \"critic\",\n             \"path\": \"028-gitemails/\" },\n    expected_content_type=\"text/plain\")\ntesting.expect.check(\n    \"\"\"\\\nPath: 028-gitemails/\n\nReviewers:\n  Alice von Testing <alice@example.org>\n  Bob von Testing <bob@example.org>\n  Dave von Testing <dave@example.org>\n\"\"\", showfilters_output)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/029-log-bogus.py",
    "content": "expected = testing.expect.message(\"'notabranch' doesn't name a branch!\", None)\nfrontend.page(\n    url=\"log\",\n    params={ \"repository\": \"critic\",\n             \"branch\": \"notabranch\" },\n    expect={ \"message\": expected })\n\n\nexpected = testing.expect.message(\"Missing URI Parameter!\",\n                                  \"Expected 'repository' parameter.\")\nfrontend.page(\n    url=\"log\",\n    params={ \"branch\": \"branch_that_does_not_exist\" },\n    expect={ \"message\": expected },\n    expected_http_status=400)\n\n\nexpected = testing.expect.message(\"'nyetvetka' doesn't name a branch!\", None)\nfrontend.page(\n    url=\"log\",\n    params={ \"repository\": \"critic\",\n             \"branch\": \"master\",\n             \"base\": \"nyetvetka\" },\n    expect={ \"message\": expected })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/030-trackingreview.py",
    "content": "TEST_NAME = \"030-trackingreview\"\nBRANCH_NAME = [TEST_NAME + \"-1\",\n               TEST_NAME + \"-2\"]\nUPSTREAM_NAME = [name + \"-upstream\"\n                 for name in BRANCH_NAME]\nSUMMARY = TEST_NAME\n\nORIGINAL_SHA1 = \"37bfd1ee7d301b364d0a8c716e9bca36efd5d139\"\nREVIEWED_SHA1 = []\nUPSTREAM_SHA1 = [\"22afd9377add956e1e8d8dd6efa378fad9237532\",\n                 \"702c1b1a4043d8837e788317698cfc88c5570ff8\"]\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nrepository.run([\"branch\", UPSTREAM_NAME[0], UPSTREAM_SHA1[0]])\nrepository.run([\"branch\", UPSTREAM_NAME[1], UPSTREAM_SHA1[1]])\n\nwith repository.workcopy() as work:\n    work.run([\"checkout\", \"-b\", BRANCH_NAME[0], UPSTREAM_SHA1[0]])\n    work.run([\"cherry-pick\", ORIGINAL_SHA1])\n    work.run([\"push\", \"origin\", \"HEAD\"])\n\n    REVIEWED_SHA1.append(work.run([\"rev-parse\", \"HEAD\"]).strip())\n\n    work.run([\"checkout\", \"-b\", BRANCH_NAME[1], UPSTREAM_SHA1[1]])\n    work.run([\"cherry-pick\", ORIGINAL_SHA1])\n    work.run([\"push\", \"origin\", \"HEAD\"])\n\n    REVIEWED_SHA1.append(work.run([\"rev-parse\", \"HEAD\"]).strip())\n\nwith_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\ndef check_tracking(branch_name, disabled=False):\n    def check(document):\n        class_names = [\"tracking\"]\n        if disabled:\n            class_names.append(\"disabled\")\n        p_tracking = document.find(\"p\", attrs=with_class(*class_names))\n        testing.expect.check(\"tracking\", extract_text(p_tracking))\n        if not disabled:\n            testing.expect.check(\"tracking\", p_tracking[\"class\"])\n\n        code_branch = document.findAll(\"code\", attrs=with_class(\"branch\"))\n        testing.expect.check(2, len(code_branch))\n        testing.expect.check(branch_name, extract_text(code_branch[1]))\n\n        code_repository = 
document.findAll(\"code\", attrs=with_class(\"repository\"))\n        testing.expect.check(2, len(code_repository))\n        testing.expect.check(repository.url, extract_text(code_repository[1]))\n\n    return check\n\nSETTINGS = { \"email.subjectLine.updatedReview.reviewRebased\":\n                 \"Rebased Review: %(summary)s\" }\n\nwith testing.utils.settings(\"alice\", SETTINGS), frontend.signin(\"alice\"):\n    result = frontend.operation(\n        \"fetchremotebranch\",\n        data={\n            \"repository_name\": \"critic\",\n            \"remote\": repository.url,\n            \"branch\": BRANCH_NAME[0],\n            \"upstream\": \"refs/heads/\" + UPSTREAM_NAME[0] },\n        expect={\n            \"head_sha1\": REVIEWED_SHA1[0],\n            \"upstream_sha1\": UPSTREAM_SHA1[0] })\n\n    # Run a GC to make sure the objects fetched by /fetchremotebranch are\n    # referenced and thus usable by the subsequent /submitreview operation.\n    instance.gc(\"critic.git\")\n\n    commit_ids = result[\"commit_ids\"]\n\n    result = frontend.operation(\n        \"submitreview\",\n        data={\n            \"repository\": \"critic\",\n            \"branch\": \"r/\" + TEST_NAME,\n            \"summary\": SUMMARY,\n            \"commit_ids\": commit_ids,\n            \"trackedbranch\": { \"remote\": repository.url,\n                               \"name\": BRANCH_NAME[0] }})\n\n    review_id = result[\"review_id\"]\n    trackedbranch_id = result[\"trackedbranch_id\"]\n\n    mailbox.pop(\n        accept=[to(\"alice\"),\n                about(\"New Review: \" + SUMMARY)])\n\n    # Wait for the immediate fetch of the tracked branch that /submitreview\n    # schedules.\n    instance.synchronize_service(\"branchtracker\")\n\n    # Emulate a review rebase via /rebasetrackingreview.\n    frontend.page(\n        \"r/%d\" % review_id,\n        expect={\n            \"tracking\": check_tracking(BRANCH_NAME[0]) })\n    frontend.page(\n        \"rebasetrackingreview\",\n    
    params={\n            \"review\": review_id })\n    result = frontend.operation(\n        \"fetchremotebranch\",\n        data={\n            \"repository_name\": \"critic\",\n            \"remote\": repository.url,\n            \"branch\": BRANCH_NAME[1],\n            \"upstream\": \"refs/heads/\" + UPSTREAM_NAME[1] },\n        expect={\n            \"head_sha1\": REVIEWED_SHA1[1],\n            \"upstream_sha1\": UPSTREAM_SHA1[1] })\n\n    # Run a GC to make sure the objects fetched by /fetchremotebranch are\n    # referenced and thus usable by the subsequent /rebasetrackingreview\n    # operation.\n    instance.gc(\"critic.git\")\n\n    frontend.page(\n        \"rebasetrackingreview\",\n        params={\n            \"review\": review_id,\n            \"newbranch\": BRANCH_NAME[1],\n            \"upstream\": UPSTREAM_NAME[1],\n            \"newhead\": REVIEWED_SHA1[1],\n            \"newupstream\": UPSTREAM_SHA1[1] })\n    frontend.operation(\n        \"checkconflictsstatus\",\n        data={\n            \"review_id\": review_id,\n            \"new_head_sha1\": REVIEWED_SHA1[1],\n            \"new_upstream_sha1\": UPSTREAM_SHA1[1] },\n        expect={\n            \"has_changes\": False,\n            \"has_conflicts\": False })\n    frontend.operation(\n        \"rebasereview\",\n        data={\n            \"review_id\": review_id,\n            \"new_head_sha1\": REVIEWED_SHA1[1],\n            \"new_upstream_sha1\": UPSTREAM_SHA1[1],\n            \"new_trackedbranch\": BRANCH_NAME[1] })\n    frontend.page(\n        \"r/%d\" % review_id,\n        expect={\n            \"tracking\": check_tracking(BRANCH_NAME[1]) })\n\n    mailbox.pop(\n        accept=[to(\"alice\"),\n                about(\"Rebased Review: \" + SUMMARY)])\n\n    # Disable and enable the tracking.\n    frontend.operation(\n        \"disabletrackedbranch\",\n        data={\n            \"branch_id\": trackedbranch_id })\n    frontend.page(\n        \"r/%d\" % review_id,\n        expect={\n     
       \"tracking\": check_tracking(BRANCH_NAME[1], disabled=True) })\n    frontend.operation(\n        \"enabletrackedbranch\",\n        data={\n            \"branch_id\": trackedbranch_id })\n    frontend.page(\n        \"r/%d\" % review_id,\n        expect={\n            \"tracking\": check_tracking(BRANCH_NAME[1]) })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/031-fetchlines-bom.py",
    "content": "import os\n\nUTF8_BOM = \"\\xEF\\xBB\\xBF\"\n\nwith frontend.signin(\"alice\"), repository.workcopy(empty=True) as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def commit(filename):\n        work.run([\"add\", filename])\n        work.run([\"commit\", \"-m\", \"Add \" + filename],\n                 GIT_AUTHOR_NAME=\"Alice von Testing\",\n                 GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                 GIT_COMMITTER_NAME=\"Alice von Testing\",\n                 GIT_COMMITTER_EMAIL=\"alice@example.org\")\n        sha1 = work.run([\"ls-tree\", \"HEAD\", filename]).strip().split()[2]\n        return sha1\n\n    def push():\n        return work.run([\"push\", \"-q\", REMOTE_URL,\n                         \"HEAD:refs/heads/031-fetchlines\"])\n\n    filename_cc = \"031-fetchlines.cc\"\n    with open(os.path.join(work.path, filename_cc), \"w\") as text_file:\n        print >>text_file, UTF8_BOM\n        print >>text_file, \"\\n\"*42\n        print >>text_file, \"hello world\"\n    file_sha1_cc = commit(filename_cc)\n\n    filename_py = \"031-fetchlines.py\"\n    with open(os.path.join(work.path, filename_py), \"w\") as text_file:\n        print >>text_file, UTF8_BOM\n        print >>text_file, \"\\n\"*42\n        print >>text_file, \"hello world\"\n    file_sha1_py = commit(filename_py)\n\n    push()\n\n    frontend.operation(\n        \"fetchlines\",\n        data={ \"repository_id\": 1,\n               \"path\": filename_cc,\n               \"sha1\": file_sha1_cc,\n               \"ranges\": [{ \"offset\": 1,\n                            \"count\": 40,\n                            \"context\": True }],\n               \"tabify\": False })\n\n    frontend.operation(\n        \"fetchlines\",\n        data={ \"repository_id\": 1,\n               \"path\": filename_py,\n               \"sha1\": file_sha1_py,\n               \"ranges\": [{ \"offset\": 1,\n                            \"count\": 40,\n                            
\"context\": True }],\n               \"tabify\": False })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/032-download.py",
    "content": "# Download the README file in the root directory in the initial released commit.\nexpected_README = \"\"\"\\\nCritic\n======\n\nThis is the code review system, Critic.\n\"\"\"\nactual_README = frontend.page(\n    \"download/README\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"f4c6e5fc09de47f7eb1a623cbc8820f67967d558\" },\n    expected_content_type=\"text/plain\")\ntesting.expect.check(expected_README, actual_README)\n\n# Download the resources/basic.js file in the initial released commit.  We won't\n# bother to check that the content is correct (it's too big to inline in this\n# test,) so just check that the content type is correctly guessed.\nfrontend.page(\n    \"download/resources/basic.js\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"2c7d6f87c11670f3c371cca0580553f01ec94340\" },\n    expected_content_type=\"text/javascript\")\n\n# Download the resources/basic.js file in the initial released commit, this time\n# with an abbreviated SHA-1 sum.\nfrontend.page(\n    \"download/resources/basic.js\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"2c7d6f87c11\" },\n    expected_content_type=\"text/javascript\")\n\n# Attempt to download the README file in the root directory but specify a SHA-1\n# that isn't a blob, but rather the initial released commit's SHA-1.  
This\n# should fail.\nfrontend.page(\n    \"download/README\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"aa15bc746d3340bda912a1cc4759b332b9adff55\" },\n    expected_http_status=404)\n\n# Attempt to download the README file in the root directory but specify a SHA-1\n# that doesn't exist at all in the repository.\nfrontend.page(\n    \"download/README\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"0000000000000000000000000000000000000000\" },\n    expected_http_status=404)\n\n# Attempt to download the README file in the root directory but specify a SHA-1\n# that isn't a valid SHA-1.\nfrontend.page(\n    \"download/README\",\n    params={ \"repository\": \"critic\",\n             \"sha1\": \"0123456789abcdefghijklmnopqrstuvwxzy\" },\n    expected_http_status=404)\n\n# Use a bogus repository parameter.\nfrontend.page(\n    \"download/README\",\n    params={ \"repository\": \"notcritic\",\n             \"sha1\": \"f4c6e5fc09de47f7eb1a623cbc8820f67967d558\" },\n    expected_http_status=404)\n\n# Omit the sha1 parameter.\nfrontend.page(\n    \"download/README\",\n    params={ \"repository\": \"critic\" },\n    expected_http_status=400)\n\n# Omit the repository parameter.\nfrontend.page(\n    \"download/README\",\n    params={ \"sha1\": \"f4c6e5fc09de47f7eb1a623cbc8820f67967d558\" },\n    expected_http_status=400)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/033-propagation-vs-rebase.py",
    "content": "import os\n\nFILENAME = \"033-propagation-vs-rebase.txt\"\nFILENAME_BASE = \"033-propagation-vs-rebase.base.txt\"\n\nNONSENSE = \"\"\"\\\nLorem ipsum dolor sit amet, consectetur adipiscing\nelit. Donec ut enim sit amet purus ultricies\nlobortis. Pellentesque nisi arcu, convallis sed purus sed,\nsemper ultrices velit. Ut egestas lorem tortor, vitae\nlacinia lorem consectetur nec. Integer tempor ornare ipsum\nat viverra. Curabitur nec orci mollis, lacinia sapien eget,\nultricies ipsum. Curabitur a libero tortor. Curabitur\nvolutpat lacinia erat, ac suscipit enim dignissim nec.\"\"\"\n\ndef lines(*args):\n    return \"\\n\".join((line.upper() if index in args else line)\n                     for index, line in enumerate(NONSENSE.splitlines()))\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nSETTINGS = { \"review.createViaPush\": True,\n             \"email.subjectLine.updatedReview.reviewRebased\":\n                 \"Rebased Review: %(summary)s\" }\n\nwork = repository.workcopy()\nsettings = testing.utils.settings(\"alice\", SETTINGS)\nsignin = frontend.signin(\"alice\")\nreviews = []\n\nwith work, settings, signin:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    def write(*args, **kwargs):\n        \"\"\"Write the file<tm>, optionally with lines upper-cased.\"\"\"\n        filename = kwargs.get(\"filename\", FILENAME)\n        with open(os.path.join(work.path, filename), \"w\") as target:\n            print >>target, lines(*args)\n\n    def commit(message, *args, **kwargs):\n        \"\"\"Create a commit and return its SHA-1\"\"\"\n        write(*args, **kwargs)\n        work.run([\"add\", kwargs.get(\"filename\", FILENAME)])\n        work.run([\"commit\", \"-m\", message])\n        return work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    def expectmail(title):\n        review = reviews[-1]\n        
mailbox.pop(accept=[to(\"alice\"),\n                            about(\"%s: %s\" % (title, review[\"summary\"]))])\n\n    def expecthead(expected):\n        \"\"\"Check that the review branch in Critic's repository is where we want\n           it to be.\"\"\"\n        review = reviews[-1]\n        actual = work.run([\"ls-remote\", REMOTE_URL, review[\"branch\"]]).split()[0]\n        testing.expect.check(expected, actual)\n\n    def createreview(commits):\n        \"\"\"Create a review of the specified commits.\"\"\"\n        index = len(reviews) + 1\n        branch = \"033-propagation-vs-rebase/%d\" % index\n        summary = \"033-propagation-vs-rebase, test %d\" % index\n        work.run([\"push\", REMOTE_URL, \"%s:refs/heads/%s\" % (commits[-1],\n                                                            branch)])\n        result = frontend.operation(\n            \"submitreview\",\n            data={ \"repository\": \"critic\",\n                   \"commit_sha1s\": [sha1 for sha1 in commits],\n                   \"branch\": \"r/\" + branch,\n                   \"frombranch\": branch,\n                   \"summary\": summary })\n        reviews.append({ \"id\": result[\"review_id\"],\n                         \"branch\": \"r/\" + branch,\n                         \"summary\": summary })\n        expectmail(\"New Review\")\n\n    def push(new_head, force=False):\n        \"\"\"Push specified commit to the review branch in Critic's repository,\n           optionally forced.\"\"\"\n        review = reviews[-1]\n        args = [\"push\"]\n        if force:\n            args.append(\"-f\")\n        args.extend([REMOTE_URL, \"%s:refs/heads/%s\" % (new_head,\n                                                       review[\"branch\"])])\n        work.run(args)\n        expecthead(new_head)\n\n    def moverebase(new_upstream, new_head):\n        \"\"\"Perform a move rebase.\"\"\"\n        review = reviews[-1]\n        work.run([\"reset\", \"--hard\", new_head])\n        
frontend.operation(\n            \"preparerebase\",\n            data={ \"review_id\": review[\"id\"],\n                   \"new_upstream\": new_upstream })\n        push(new_head, force=True)\n        expectmail(\"Rebased Review\")\n\n    def historyrewrite(new_head):\n        \"\"\"Perform a history rewrite rebase.\"\"\"\n        review = reviews[-1]\n        work.run([\"reset\", \"--hard\", new_head])\n        frontend.operation(\n            \"preparerebase\",\n            data={ \"review_id\": review[\"id\"] })\n        push(new_head, force=True)\n        expectmail(\"Rebased Review\")\n\n    def createcomment(parent_sha1, child_sha1, offset, count, verdict):\n        frontend.operation(\n            \"validatecommentchain\",\n            data={ \"review_id\": reviews[-1][\"id\"],\n                   \"origin\": \"new\",\n                   \"parent_sha1\": parent_sha1,\n                   \"child_sha1\": child_sha1,\n                   \"file_path\": FILENAME,\n                   \"offset\": offset,\n                   \"count\": count },\n            expect={ \"verdict\": verdict })\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": reviews[-1][\"id\"],\n                   \"chain_type\": \"issue\",\n                   \"file_context\": {\n                       \"origin\": \"new\",\n                       \"parent_sha1\": parent_sha1,\n                       \"child_sha1\": child_sha1,\n                       \"file_path\": FILENAME,\n                       \"offset\": offset,\n                       \"count\": count },\n                   \"text\": (\"Issue at lines %d-%d\"\n                            % (offset, offset + count - 1)) })\n\n    start_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    # TEST #1: Create a review with one commit, then do a fast-forward move\n    # rebase, and then create some comments in the diff of the original,\n    # pre-rebase commit.\n\n    work.run([\"checkout\", 
\"-b\", \"033-propagation-vs-rebase-base\", start_sha1])\n    base_commits = [commit(\"Base commit 1\"),\n                    commit(\"Base commit 2\", 0),\n                    commit(\"Base commit 3\", 0, 1)]\n    work.run([\"push\", REMOTE_URL, \"HEAD\"])\n\n    work.run([\"checkout\", \"-b\", \"033-propagation-vs-rebase-1\", base_commits[0]])\n    original_commits = [commit(\"Test #1, commit 1\", 6, 7)]\n    createreview(original_commits)\n    work.run([\"reset\", \"--hard\", base_commits[2]])\n    rebased_commits = [commit(\"Test #1, commit 1 (rebased)\", 0, 1, 6, 7)]\n    moverebase(base_commits[2], rebased_commits[0])\n\n    # Lines modified in the rebase.\n    createcomment(base_commits[0], original_commits[0], 1, 2, \"modified\")\n\n    # Lines not modified at all.\n    createcomment(base_commits[0], original_commits[0], 4, 2, \"transferred\")\n\n    # Lines modified in the review (but not in the rebase).\n    createcomment(base_commits[0], original_commits[0], 7, 2, \"transferred\")\n\n    # TEST #2: Create a review with one commit, then do a non-fast-forward move\n    # rebase, and then create some comments in the diff of the original,\n    # pre-rebase commit.\n\n    work.run([\"checkout\", \"-b\", \"033-propagation-vs-rebase-2\", base_commits[2]])\n    original_commits = [commit(\"Test #2, commit 1\", 0, 1, 6, 7)]\n    createreview(original_commits)\n    work.run([\"reset\", \"--hard\", base_commits[0]])\n    rebased_commits = [commit(\"Test #2, commit 1 (rebased)\", 6, 7)]\n    moverebase(base_commits[0], rebased_commits[0])\n\n    # Lines modified in the rebase.\n    createcomment(base_commits[0], original_commits[0], 1, 2, \"modified\")\n\n    # Lines not modified at all.\n    createcomment(base_commits[0], original_commits[0], 4, 2, \"transferred\")\n\n    # Lines modified in the review (but not in the rebase).\n    createcomment(base_commits[0], original_commits[0], 7, 2, \"transferred\")\n"
  },
  {
    "path": "testing/tests/001-main/003-self/100-reviewing/001-comments.basic.py",
    "content": "import os\nimport re\nimport pprint\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nBASE = \"100-reviewing/\"\nTEST = BASE + \"001-comment.basic\"\nBRANCH = \"r/\" + TEST\nFILENAME = TEST + \".txt\"\nSUMMARY = \"Added \" + FILENAME\n\nNEW_SUBJECT = \"New Review: \" + SUMMARY\nNEWISH_SUBJECT = r\"New\\(ish\\) Review: \" + SUMMARY\nUPDATED_SUBJECT = \"Updated Review: \" + SUMMARY\n\nLINES = [\"First line\",\n         \"Second line\",\n         \"Third line\",\n         \"Fourth line\",\n         \"Fifth line\",\n         \"Sixth line\",\n         \"Seventh line\",\n         \"Eighth line\",\n         \"Ninth line\",\n         \"Tenth line\"]\n\n################################################################################\n#\n# Some utility stuff.\n#\n################################################################################\n\nclass CommentChain(object):\n    def __init__(self, chain_id, chain_type, author, text, lines=None):\n        self.id = chain_id\n        self.type = chain_type\n        self.author = author\n        self.text = text\n        self.lines = lines\n        self.replies = []\n\n    def add_reply(self, author):\n        self.replies.append((author, (\"This is a reply from %s.\"\n                                      % author.capitalize())))\n\ndef findChainInMail(mail, chain_id):\n    chain_type = author = text = lines = trailer = reply_author = None\n    replies = []\n    last_comment_seen = False\n\n    first_line_index = None\n\n    for index, line in enumerate(mail + [None]):\n        if last_comment_seen:\n            if line is None or line:\n                del mail[first_line_index:index]\n                return chain_type, author, text, lines, replies, trailer\n\n        if not line:\n            continue\n\n        match = re.match(\"(?:> )?General (issue|note)\", line)\n        if match:\n            
chain_type = \"general \" + match.group(1)\n            continue\n\n        match = re.match(\"(?:> )?(Issue|Note) in commit\", line)\n        if match:\n            chain_type = \"commit \" + match.group(1).lower()\n            continue\n\n        match = re.match(\"(?:> )?(Issue|Note) in\", line)\n        if match:\n            chain_type = \"file \" + match.group(1).lower()\n            continue\n\n        if chain_type is None:\n            continue\n\n        match = re.match(r\"(?:> )?  http://.*/showcomment\\?chain=(\\d+)\", line)\n        if match:\n            first_line_index = index - 1\n            if int(match.group(1)) != chain_id:\n                chain_type = None\n            continue\n\n        match = re.match(\"(?:> )?([^ ]+) von Testing at\", line)\n        if match:\n            if author is None:\n                author = match.group(1).lower()\n            else:\n                reply_author = match.group(1).lower()\n            continue\n\n        if re.match(r\"(?:> )?-+$\", line):\n            if lines is None and not chain_type.startswith(\"general \"):\n                lines = []\n            continue\n\n        if lines is not None and author is None:\n            if chain_type.startswith(\"file \"):\n                match = re.match(r\"(?:> )?\\s*(\\d+)\\|(.*)$\", line)\n                lines.append((int(match.group(1)), match.group(2)))\n            else:\n                match = re.match(r\"(?:> )?  (.*)$\", line)\n                lines.append((None, match.group(1)))\n            continue\n\n        if line and (line.lower() != line == line.upper() or\n                     re.match(r\"\\(.*\\)\", line)):\n            trailer = line\n            last_comment_seen = True\n            continue\n\n        match = re.match(r\"(> )?  
(.+)$\", line)\n        last_comment_seen = match.group(1) is None\n        if reply_author:\n            replies.append((reply_author, match.group(2)))\n        else:\n            text = match.group(2)\n\n    testing.expect.check(\"<chain %d in mail>\" % chain_id,\n                         \"<expected content not found>\")\n\ndef checkSubmitter(mails, expected_submitter):\n    for mail in mails:\n        for line in mail:\n            match = re.match(\"(.*) von Testing has submitted\", line)\n            if match:\n                testing.expect.check(expected_submitter, match.group(1).lower())\n                return\n\n        testing.expect.check(\"<'$USER has submitted' line in mail>\",\n                             \"<expected content not found>\")\n\ndef checkChain(mails, chain, expected_trailer=None):\n    for mail in mails:\n        (actual_type, actual_author,\n         actual_text, actual_lines,\n         actual_replies, actual_trailer) = findChainInMail(mail, chain.id)\n\n        testing.expect.check(chain.type, actual_type)\n        testing.expect.check(chain.author, actual_author)\n        testing.expect.check(chain.text, actual_text)\n        testing.expect.check(chain.lines, actual_lines)\n        testing.expect.check(chain.replies, actual_replies)\n\ndef checkNoMoreChains(mails):\n    for mail in mails:\n        for index, line in enumerate(mail):\n            if re.match(\"(?:> )?General (issue|note)\", line) \\\n                    or re.match(\"(?:> )?(Issue|Note) in commit\", line) \\\n                    or re.match(\"(?:> )?(Issue|Note) in\", line):\n                testing.logger.error(\n                    \"Unexpected comment chain mentioned in mail:\\n  %s\\n  %s\"\n                    % (mail[index], mail[index + 1]))\n\ndef receiveMails(subject):\n    return [mailbox.pop(accept=[to(whom), about(subject)]).lines[:]\n            for whom in [\"alice\", \"bob\", \"dave\", \"erin\"]]\n\ndef createComment(chain, author):\n    
frontend.operation(\n        \"createcomment\",\n        data={ \"chain_id\": chain.id,\n               \"text\": \"This is a reply from %s.\" % author.capitalize() })\n\ndef resolveCommentChain(chain):\n    frontend.operation(\n        \"resolvecommentchain\",\n        data={ \"chain_id\": chain.id })\n\ndef reopenResolvedCommentChain(chain):\n    frontend.operation(\n        \"reopenresolvedcommentchain\",\n        data={ \"chain_id\": chain.id })\n\ndef morphCommentChain(chain, new_type):\n    frontend.operation(\n        \"morphcommentchain\",\n        data={ \"chain_id\": chain.id,\n               \"new_type\": new_type })\n\ndef submitChanges():\n    if instance.has_flag(\"fixed-batch-preview\"):\n        frontend.page(\n            \"showbatch\",\n            params={ \"review\": str(review_id) })\n\n    result = frontend.operation(\n        \"submitchanges\",\n        data={ \"review_id\": review_id })\n\n    if \"batch_id\" in result:\n        frontend.page(\n            \"showbatch\",\n            params={ \"batch\": result[\"batch_id\"] })\n\nwith repository.workcopy() as work:\n    ############################################################################\n    #\n    # As Alice, create a commit that adds a file, and a review of that commit,\n    # with Bob, Dave and Erin as associated users.\n    #\n    ############################################################################\n\n    parent_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"checkout\", \"-b\", BRANCH, \"--no-track\", \"origin/master\"])\n\n    os.mkdir(os.path.join(work.path, \"100-reviewing\"))\n\n    with open(os.path.join(work.path, FILENAME), \"w\") as review_file:\n        review_file.write(\"\\n\".join(LINES) + \"\\n\")\n\n    work.run([\"add\", FILENAME])\n    work.run([\"commit\", \"-m\", \"\\n\".join([SUMMARY, \"\"] + LINES[:3])])\n\n    child_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    review_id = testing.utils.createReviewViaPush(work, 
\"alice\")\n\n    mailbox.pop(accept=[to(\"alice\"), about(NEW_SUBJECT)])\n\n    with frontend.signin(\"alice\"):\n        frontend.operation(\n            \"addreviewfilters\",\n            data={ \"review_id\": review_id,\n                   \"filters\": [{ \"type\": \"reviewer\",\n                                 \"user_names\": [\"bob\"],\n                                 \"paths\": [BASE] },\n                               { \"type\": \"watcher\",\n                                 \"user_names\": [\"dave\"],\n                                 \"paths\": [\"/\"] },\n                               { \"type\": \"watcher\",\n                                 \"user_names\": [\"erin\"],\n                                 \"paths\": [\"src/\"] }]})\n\n        for whom in [\"bob\", \"dave\", \"erin\"]:\n            mailbox.pop(accept=[to(whom), about(NEWISH_SUBJECT)])\n            mailbox.pop(accept=[to(whom), about(UPDATED_SUBJECT)])\n\n    ############################################################################\n    #\n    # Create one each of the different types of comment chain:\n    #   { general, commit, file } x { issue, note}\n    #\n    # Submit, and check that everyone involved received a mail with each comment\n    # chain included (and correctly rendered.)\n    #\n    ############################################################################\n\n    with frontend.signin(\"alice\"):\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"issue\",\n                   \"text\": \"This is a general issue.\" })\n\n        general_issue = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"general issue\",\n            author=\"alice\",\n            text=\"This is a general issue.\")\n\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   
\"chain_type\": \"note\",\n                   \"text\": \"This is a general note.\" })\n\n        general_note = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"general note\",\n            author=\"alice\",\n            text=\"This is a general note.\")\n\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"issue\",\n                   \"commit_context\": { \"commit\": child_sha1,\n                                       \"offset\": 0,\n                                       \"count\": 3 },\n                   \"text\": \"This is a commit issue.\" })\n\n        commit_issue = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"commit issue\",\n            author=\"alice\",\n            text=\"This is a commit issue.\",\n            lines=[(None, SUMMARY),\n                   (None, \"\"),\n                   (None, \"First line\")])\n\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"commit_context\": { \"commit\": child_sha1,\n                                       \"offset\": 4,\n                                       \"count\": 1 },\n                   \"text\": \"This is a commit note.\" })\n\n        commit_note = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"commit note\",\n            author=\"alice\",\n            text=\"This is a commit note.\",\n            lines=[(None, \"Third line\")])\n\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"issue\",\n                   \"file_context\": { \"origin\": \"new\",\n                                     \"parent\": parent_sha1,\n                                  
   \"child\": child_sha1,\n                                     \"file\": FILENAME,\n                                     \"offset\": 2,\n                                     \"count\": 3 },\n                   \"text\": \"This is a file issue.\" })\n\n        file_issue = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"file issue\",\n            author=\"alice\",\n            text=\"This is a file issue.\",\n            lines=[(2, \"Second line\"),\n                   (3, \"Third line\"),\n                   (4, \"Fourth line\")])\n\n        result = frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"file_context\": { \"origin\": \"new\",\n                                     \"parent\": parent_sha1,\n                                     \"child\": child_sha1,\n                                     \"file\": FILENAME,\n                                     \"offset\": 10,\n                                     \"count\": 1 },\n                   \"text\": \"This is a file note.\" })\n\n        file_note = CommentChain(\n            chain_id=result[\"chain_id\"],\n            chain_type=\"file note\",\n            author=\"alice\",\n            text=\"This is a file note.\",\n            lines=[(10, \"Tenth line\")])\n\n        testing.expect.check(6, result[\"draft_status\"][\"writtenComments\"])\n\n        submitChanges()\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, general_issue)\n    checkChain(mails, general_note)\n    checkChain(mails, commit_issue)\n    checkChain(mails, commit_note)\n    checkChain(mails, file_issue)\n    checkChain(mails, file_note)\n    checkNoMoreChains(mails)\n\n    ############################################################################\n    #\n    # Verify that we have some basic correctness checks on comment 
creation,\n    # such as the commented lines existing.\n    #\n    ############################################################################\n\n    with frontend.signin(\"alice\"):\n        # These don't work since the comment text is empty or contains only\n        # white-space characters.\n        for text in (\"\", \" \", \"\\t\", \"\\n\", \"\\r\"):\n            frontend.operation(\n                \"createcommentchain\",\n                data={ \"review_id\": review_id,\n                       \"chain_type\": \"note\",\n                       \"commit_context\": { \"commit\": parent_sha1,\n                                           \"offset\": 0,\n                                           \"count\": 1 },\n                       \"text\": text },\n                expect={ \"status\": \"failure\",\n                         \"title\": \"Empty comment!\" })\n\n        # These don't work since we're trying to comment lines that don't\n        # exist in the commit message.  (We tried offset=4/count=1 above.)\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"commit_context\": { \"commit\": child_sha1,\n                                       \"offset\": 5,\n                                       \"count\": 1 },\n                   \"text\": \"This won't stick.\" },\n            expect={ \"status\": \"failure\",\n                     \"message\": \"It's not possible to create a comment here.\" })\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"commit_context\": { \"commit\": child_sha1,\n                                       \"offset\": 4,\n                                       \"count\": 2 },\n                   \"text\": \"This won't stick.\" },\n            expect={ \"status\": \"failure\",\n           
          \"message\": \"It's not possible to create a comment here.\" })\n\n        # This doesn't work since we're trying to comment the \"old\" side of the\n        # commit that added the commented file.\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"file_context\": { \"origin\": \"old\",\n                                     \"parent\": parent_sha1,\n                                     \"child\": child_sha1,\n                                     \"file\": FILENAME,\n                                     \"offset\": 3,\n                                     \"count\": 3 },\n                   \"text\": \"This won't stick.\" },\n            expect={ \"status\": \"failure\",\n                     \"message\": \"It's not possible to create a comment here.\" })\n\n        # These don't work, since we're trying to comment lines that don't\n        # exist in the file.  
(We tried offset=10/count=1 above.)\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"file_context\": { \"origin\": \"old\",\n                                     \"parent\": parent_sha1,\n                                     \"child\": child_sha1,\n                                     \"file\": FILENAME,\n                                     \"offset\": 11,\n                                     \"count\": 1 },\n                   \"text\": \"This won't stick.\" },\n            expect={ \"status\": \"failure\",\n                     \"message\": \"It's not possible to create a comment here.\" })\n        frontend.operation(\n            \"createcommentchain\",\n            data={ \"review_id\": review_id,\n                   \"chain_type\": \"note\",\n                   \"file_context\": { \"origin\": \"old\",\n                                     \"parent\": parent_sha1,\n                                     \"child\": child_sha1,\n                                     \"file\": FILENAME,\n                                     \"offset\": 10,\n                                     \"count\": 2 },\n                   \"text\": \"This won't stick.\" },\n            expect={ \"status\": \"failure\",\n                     \"message\": \"It's not possible to create a comment here.\" })\n\n    ############################################################################\n    #\n    # Reply to some of the comment chains, and morph, resolve and reopen them,\n    # as multiple users.\n    #\n    ############################################################################\n\n    # Bob replies to some issues, but before he submits, Dave also replies to\n    # one of them, and submits.  
Checks that Dave's reply appears before Bob's,\n    # even though Bob created his first.\n\n    with frontend.signin(\"bob\"):\n        createComment(general_issue, \"bob\")\n        createComment(commit_issue, \"bob\")\n        createComment(file_issue, \"bob\")\n\n    with frontend.signin(\"dave\"):\n        createComment(general_issue, \"dave\")\n        submitChanges()\n\n        general_issue.add_reply(\"dave\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"dave\")\n    checkChain(mails, general_issue)\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"bob\"):\n        submitChanges()\n\n        general_issue.add_reply(\"bob\")\n        commit_issue.add_reply(\"bob\")\n        file_issue.add_reply(\"bob\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"bob\")\n    checkChain(mails, general_issue)\n    checkChain(mails, commit_issue)\n    checkChain(mails, file_issue)\n    checkNoMoreChains(mails)\n\n    # Erin replies to an issue too.\n\n    with frontend.signin(\"erin\"):\n        createComment(commit_issue, \"erin\")\n        submitChanges()\n\n        commit_issue.add_reply(\"erin\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"erin\")\n    checkChain(mails, commit_issue)\n    checkNoMoreChains(mails)\n\n    # Alice replies to the general note and converts it to an issue (in the same\n    # batch.)\n\n    with frontend.signin(\"alice\"):\n        createComment(general_note, \"alice\")\n        morphCommentChain(general_note, \"issue\")\n        submitChanges()\n\n        general_note.add_reply(\"alice\")\n        general_note.type = \"general issue\"\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, general_note, \"CONVERTED TO ISSUE!\")\n    checkNoMoreChains(mails)\n\n    # Alice converts the general note back to a note, without replying.\n\n    with frontend.signin(\"alice\"):\n        
morphCommentChain(general_note, \"note\")\n        submitChanges()\n\n        general_note.type = \"general note\"\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, general_note, \"CONVERTED TO NOTE!\")\n    checkNoMoreChains(mails)\n\n    # Bob replies to and converts the general note to an issue, but before he\n    # submits, Dave also converts it to an issue and submits.  Checks that Bob's\n    # converting of the issue has no effect, but his reply remains.\n\n    with frontend.signin(\"bob\"):\n        createComment(general_note, \"bob\")\n        morphCommentChain(general_note, \"issue\")\n\n    with frontend.signin(\"dave\"):\n        morphCommentChain(general_note, \"issue\")\n        submitChanges()\n\n        general_note.type = \"general issue\"\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"dave\")\n    checkChain(mails, general_note, \"CONVERTED TO ISSUE!\")\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"bob\"):\n        submitChanges()\n\n        general_note.add_reply(\"bob\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"bob\")\n    checkChain(mails, general_note)\n    checkNoMoreChains(mails)\n\n    # Alice resolves the general issue.\n\n    with frontend.signin(\"alice\"):\n        resolveCommentChain(general_issue)\n        submitChanges()\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, general_issue, \"ISSUE RESOLVED!\")\n    checkNoMoreChains(mails)\n\n    # Erin replies to the now resolved general issue.\n\n    with frontend.signin(\"erin\"):\n        createComment(general_issue, \"erin\")\n        submitChanges()\n\n        general_issue.add_reply(\"erin\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"erin\")\n    checkChain(mails, general_issue, \"(This issue is resolved.)\")\n    checkNoMoreChains(mails)\n\n    # Alice 
replies to and reopens the general issue, but before she submits,\n    # Bob also replies to and reopens it, and submits.  Checks that Alice's\n    # reopening of the issue has no effect, but her reply remains.\n\n    with frontend.signin(\"alice\"):\n        createComment(general_issue, \"alice\")\n        reopenResolvedCommentChain(general_issue)\n\n    with frontend.signin(\"bob\"):\n        createComment(general_issue, \"bob\")\n        reopenResolvedCommentChain(general_issue)\n        submitChanges()\n\n        general_issue.add_reply(\"bob\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"bob\")\n    checkChain(mails, general_issue, \"ISSUE REOPENED!\")\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"alice\"):\n        submitChanges()\n\n        general_issue.add_reply(\"alice\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, general_issue)\n    checkNoMoreChains(mails)\n\n    # Alice replies to and resolves the commit issue; Dave resolves it too,\n    # but before either submits, Bob swoops in and converts the issue to a\n    # note, and submits.  Then Alice submits, which checks (again) that her\n    # resolving of the issue has no effect, but her reply remains.  Then Bob\n    # converts the note back to an issue, and submits.  
Finally, Dave submits, which checks\n    # that his (old) resolving of the issue still remains and takes effect.\n\n    with frontend.signin(\"alice\"):\n        createComment(commit_issue, \"alice\")\n        resolveCommentChain(commit_issue)\n\n    with frontend.signin(\"dave\"):\n        resolveCommentChain(commit_issue)\n\n    with frontend.signin(\"bob\"):\n        morphCommentChain(commit_issue, \"note\")\n        submitChanges()\n\n        commit_issue.type = \"commit note\"\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"bob\")\n    checkChain(mails, commit_issue, \"CONVERTED TO NOTE!\")\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"alice\"):\n        submitChanges()\n\n        commit_issue.add_reply(\"alice\")\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"alice\")\n    checkChain(mails, commit_issue)\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"bob\"):\n        morphCommentChain(commit_issue, \"issue\")\n        submitChanges()\n\n        commit_issue.type = \"commit issue\"\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"bob\")\n    checkChain(mails, commit_issue, \"CONVERTED TO ISSUE!\")\n    checkNoMoreChains(mails)\n\n    with frontend.signin(\"dave\"):\n        submitChanges()\n\n    mails = receiveMails(UPDATED_SUBJECT)\n\n    checkSubmitter(mails, \"dave\")\n    checkChain(mails, commit_issue, \"ISSUE RESOLVED!\")\n    checkNoMoreChains(mails)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/100-reviewing/__init__.py",
    "content": "# @dependency 001-main/002-createrepository.py\n"
  },
  {
    "path": "testing/tests/001-main/003-self/101-keepalives.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\ninstance.unittest(\"gitutils\", [\"keepalives\"])\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/001-users.py",
    "content": "# @dependency 001-main/001-empty/003-criticctl/002-adduser-deluser.py\n# @dependency 001-main/001-empty/004-mixed/003-oauth.py\n# @dependency 001-main/001-empty/004-mixed/004-password.py\n# @dependency 001-main/003-self/028-gitemails.py\n\nfrontend.json(\n    \"users\",\n    expect={ \"users\": [user_json(\"admin\", \"Testing Administrator\"),\n                       user_json(\"alice\"),\n                       user_json(\"bob\"),\n                       user_json(\"dave\"),\n                       user_json(\"erin\"),\n                       user_json(\"howard\"),\n                       user_json(\"extra\", status=\"retired\"),\n                       user_json(\"carol\"),\n                       user_json(\"felix\"),\n                       user_json(\"gina\", no_email=True),\n                       user_json(\"iris\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"status\": \"current\" },\n    expect={ \"users\": [user_json(\"admin\", \"Testing Administrator\"),\n                       user_json(\"alice\"),\n                       user_json(\"bob\"),\n                       user_json(\"dave\"),\n                       user_json(\"erin\"),\n                       user_json(\"howard\"),\n                       user_json(\"carol\"),\n                       user_json(\"felix\"),\n                       user_json(\"gina\", no_email=True),\n                       user_json(\"iris\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"status\": \"retired\" },\n    expect={ \"users\": [user_json(\"extra\", status=\"retired\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"sort\": \"fullname\" },\n    expect={ \"users\": [user_json(\"alice\"),\n                       user_json(\"bob\"),\n                       user_json(\"carol\"),\n                       user_json(\"dave\"),\n                       user_json(\"erin\"),\n                       user_json(\"extra\", status=\"retired\"),\n                       user_json(\"felix\"),\n       
                user_json(\"gina\", no_email=True),\n                       user_json(\"howard\"),\n                       user_json(\"iris\"),\n                       user_json(\"admin\", \"Testing Administrator\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"sort\": \"fullname\",\n             \"count\": \"4\" },\n    expect={ \"users\": [user_json(\"alice\"),\n                       user_json(\"bob\"),\n                       user_json(\"carol\"),\n                       user_json(\"dave\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"sort\": \"fullname\",\n             \"offset\": \"2\",\n             \"count\": \"4\" },\n    expect={ \"users\": [user_json(\"carol\"),\n                       user_json(\"dave\"),\n                       user_json(\"erin\"),\n                       user_json(\"extra\", status=\"retired\")] })\n\nfrontend.json(\n    \"users\",\n    params={ \"sort\": \"fullname\",\n             \"offset\": \"6\" },\n    expect={ \"users\": [user_json(\"felix\"),\n                       user_json(\"gina\", no_email=True),\n                       user_json(\"howard\"),\n                       user_json(\"iris\"),\n                       user_json(\"admin\", \"Testing Administrator\")] })\n\nfrontend.json(\n    \"users/%d\" % instance.userid(\"alice\"),\n    expect=user_json(\"alice\"))\n\nfrontend.json(\n    \"users/%d\" % instance.userid(\"alice\"),\n    params={ \"fields\": \"id\" },\n    expect={ \"id\": instance.userid(\"alice\") })\n\nfrontend.json(\n    \"users\",\n    params={ \"name\": \"alice\" },\n    expect=user_json(\"alice\"))\n\nfrontend.json(\n    \"users/%d/emails\" % instance.userid(\"alice\"),\n    expect={ \"emails\": [{ \"address\": \"alice@example.org\",\n                          \"selected\": True,\n                          \"verified\": None }] })\n\nfrontend.json(\n    \"users/%d/emails/1\" % instance.userid(\"alice\"),\n    expect={ \"address\": \"alice@example.org\",\n             \"selected\": True,\n   
          \"verified\": None })\n\nfilter_json = { \"id\": int,\n                \"type\": \"reviewer\",\n                \"path\": \"028-gitemails/\",\n                \"repository\": 1,\n                \"delegates\": [instance.userid(\"erin\")] }\n\nfrontend.json(\n    \"users/%d/filters\" % instance.userid(\"alice\"),\n    expect={ \"filters\": [filter_json] })\n\nfrontend.json(\n    \"users/%d/filters\" % instance.userid(\"alice\"),\n    params={ \"repository\": \"critic\" },\n    expect={ \"filters\": [filter_json] })\n\nresult = frontend.json(\n    \"users/%d/filters\" % instance.userid(\"alice\"),\n    params={ \"repository\": \"1\" },\n    expect={ \"filters\": [filter_json] })\n\nfrontend.json(\n    \"users/%d/filters\" % instance.userid(\"alice\"),\n    params={ \"include\": \"users,repositories\" },\n    expect={ \"filters\": [{ \"id\": int,\n                           \"type\": \"reviewer\",\n                           \"path\": \"028-gitemails/\",\n                           \"repository\": 1,\n                           \"delegates\": [instance.userid(\"erin\")] }],\n             \"linked\": { \"repositories\": [critic_json],\n                         \"users\": [user_json(\"erin\")] }})\n\nfrontend.json(\n    \"users/%d/filters/%d\" % (instance.userid(\"alice\"),\n                             result[\"filters\"][0][\"id\"]),\n    expect={ \"id\": result[\"filters\"][0][\"id\"],\n             \"type\": \"reviewer\",\n             \"path\": \"028-gitemails/\",\n             \"repository\": 1,\n             \"delegates\": [instance.userid(\"erin\")] })\n\n# Test asking for just the list of delegates.\nfrontend.json(\n    \"users/%d/filters/%d/delegates\" % (instance.userid(\"alice\"),\n                                       result[\"filters\"][0][\"id\"]),\n    expect={ \"delegates\": [instance.userid(\"erin\")] })\n\n# Check that the repository is not linked when we ask for just delegates.\nfrontend.json(\n    \"users/%d/filters/%d/delegates\" % 
(instance.userid(\"alice\"),\n                                       result[\"filters\"][0][\"id\"]),\n    params={ \"include\": \"users,repositories\" },\n    expect={ \"delegates\": [instance.userid(\"erin\")],\n             \"linked\": { \"repositories\": [],\n                         \"users\": [user_json(\"erin\")] }})\n\nfrontend.json(\n    \"users/%d,%d,%d\" % (instance.userid(\"alice\"),\n                        instance.userid(\"bob\"),\n                        instance.userid(\"dave\")),\n    expect={ \"users\": [user_json(\"alice\"),\n                       user_json(\"bob\"),\n                       user_json(\"dave\")] })\n\nfrontend.json(\n    \"users/%d,%d,%d\" % (instance.userid(\"alice\"),\n                        instance.userid(\"bob\"),\n                        instance.userid(\"dave\")),\n    params={ \"fields[users]\": \"name\" },\n    expect={ \"users\": [{ \"name\": \"alice\" },\n                       { \"name\": \"bob\" },\n                       { \"name\": \"dave\" }] })\n\nfrontend.json(\n    \"users/4711\",\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid user id: 4711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"users/alice\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid numeric id: 'alice'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"users\",\n    params={ \"name\": \"nosuchuser\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid user name: 'nosuchuser'\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"users\",\n    params={ \"status\": \"clown\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid user status values: 'clown'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"users\",\n    
params={ \"status\": \"current,clown,president\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid user status values: 'clown', 'president'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"users\",\n    params={ \"sort\": \"age\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid user sort parameter: 'age'\" }},\n    expected_http_status=400)\n\nwith frontend.signin(\"alice\"):\n    frontend.json(\n        \"users/me\",\n        expect=user_json(\"alice\"))\n\n    frontend.json(\n        \"users/%d\" % instance.userid(\"alice\"),\n        put={ \"fullname\": \"Alice has a new name\" },\n        expect=user_json(\"alice\", fullname=\"Alice has a new name\"))\n\n    frontend.json(\n        \"users/%d\" % instance.userid(\"alice\"),\n        expect=user_json(\"alice\", fullname=\"Alice has a new name\"))\n\n    frontend.json(\n        \"users/%d\" % instance.userid(\"alice\"),\n        put={ \"fullname\": user_json(\"alice\")[\"fullname\"] },\n        expect=user_json(\"alice\"))\n\n    frontend.json(\n        \"users/%d\" % instance.userid(\"alice\"),\n        expect=user_json(\"alice\"))\n\nfrontend.json(\n    \"users/me\",\n    expected_http_status=404,\n    expect={\n        \"error\": {\n            \"title\": \"No such resource\",\n            \"message\": \"Resource not found: 'users/me' (not signed in)\"\n        }\n    })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/002-branches.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nstored_branches = []\n\ndef check_branches(path, branches, check):\n    if not check(path, expected=list, actual=branches):\n        return\n    for index, branch in enumerate(branches):\n        if check(\"%s[%d]\" % (path, index),\n                 expected={ \"id\": int,\n                            \"name\": str,\n                            \"repository\": 1,\n                            \"head\": int },\n                 actual=branch):\n            stored_branches.append(branch)\n\nfrontend.json(\n    \"repositories/1/branches\",\n    expect={ \"branches\": check_branches })\n\nfor branch in stored_branches:\n    frontend.json(\n        \"branches/%d\" % branch[\"id\"],\n        expect=branch)\n    frontend.json(\n        \"branches\",\n        params={ \"name\": branch[\"name\"],\n                 \"repository\": branch[\"repository\"] },\n        expect=branch)\n    frontend.json(\n        \"repositories/%d/branches/%d\" % (branch[\"repository\"], branch[\"id\"]),\n        expect=branch)\n    frontend.json(\n        \"repositories/%d/branches\" % branch[\"repository\"],\n        params={ \"name\": branch[\"name\"] },\n        expect=branch)\n\nstored_branch_heads_by_name = {}\nstored_commit_sha1s_by_id = {}\n\ndef store_branches(path, branches, check):\n    stored_branch_heads_by_name.update({ branch[\"name\"]: branch[\"head\"]\n                                         for branch in branches })\n\ndef store_commits(path, commits, check):\n    stored_commit_sha1s_by_id.update({ commit[\"id\"]: commit[\"sha1\"]\n                                       for commit in commits })\n\nfrontend.json(\n    \"repositories/1/branches\",\n    params={ \"include\": \"commits\" },\n    expect={ \"branches\": store_branches,\n             \"linked\": { \"commits\": store_commits }})\n\nfor name, head_id in stored_branch_heads_by_name.items():\n    if head_id not in stored_commit_sha1s_by_id:\n        
logger.error(\"linked head of branch %s (commit id=%d) not included\"\n                     % (name, head_id))\n        continue\n\n    expected_sha1 = repository.run(\n        [\"ls-remote\", instance.repository_url(\"alice\"),\n         \"refs/heads/\" + name]).split()[0]\n    actual_sha1 = stored_commit_sha1s_by_id[head_id]\n\n    testing.expect.check(expected_sha1, actual_sha1)\n\ndef check_commits(path, commits, check):\n    if not check(path, expected=list, actual=commits):\n        return\n    for index, commit in enumerate(commits):\n        check(\"%s[%d]\" % (path, index),\n              expected=generic_commit_json,\n              actual=commit)\n\n\n\nfrontend.json(\n    \"branches/%d/commits\" % stored_branches[0][\"id\"],\n    expect={ \"commits\": check_commits })\n\nfirst10 = frontend.json(\n    \"branches/%d/commits\" % stored_branches[0][\"id\"],\n    params={ \"sort\": \"topological\",\n             \"fields\": \"id\",\n             \"count\": 10 },\n    expect={ \"commits\": list })\n\nfrontend.json(\n    \"branches/%d/commits\" % stored_branches[0][\"id\"],\n    params={ \"sort\": \"date\",\n             \"fields\": \"id\",\n             \"count\": 10 },\n    expect={ \"commits\": list })\n\nfrontend.json(\n    \"branches/%d/commits\" % stored_branches[0][\"id\"],\n    params={ \"sort\": \"topological\",\n             \"fields\": \"id\",\n             \"offset\": 5,\n             \"count\": 5 },\n    expect={ \"commits\": first10[\"commits\"][5:] })\n\nfrontend.json(\n    \"branches/4711\",\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid branch id: 4711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"branches/master\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid numeric id: 'master'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"branches\",\n    params={ \"name\": 
\"nosuchbranch\",\n             \"repository\": \"critic\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid branch name: 'nosuchbranch'\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"branches\",\n    params={ \"name\": \"nosuchbranch\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Named branch access must have repository specified.\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"branches\",\n    params={ \"name\": \"master\",\n             \"repository\": \"4711\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid repository id: 4711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"branches\",\n    params={ \"name\": \"master\",\n             \"repository\": \"nosuchrepository\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid repository name: 'nosuchrepository'\" }},\n    expected_http_status=404)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/003-repositories.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nfrontend.json(\n    \"repositories\",\n    expect={ \"repositories\": [critic_json, other_json] })\n\nfrontend.json(\n    \"repositories/1\",\n    expect=critic_json)\n\nfrontend.json(\n    \"repositories\",\n    params={ \"name\": \"critic\" },\n    expect=critic_json)\n\nfrontend.json(\n    \"repositories/2\",\n    expect=other_json)\n\nfrontend.json(\n    \"repositories\",\n    params={ \"name\": \"other\" },\n    expect=other_json)\n\nfrontend.json(\n    \"repositories/4711\",\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid repository id: 4711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"repositories/critic\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid numeric id: 'critic'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"repositories\",\n    params={ \"name\": \"nosuchrepository\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid repository name: 'nosuchrepository'\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"repositories\",\n    params={ \"filter\": \"interesting\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid repository filter parameter: 'interesting'\" }},\n    expected_http_status=400)\n\n# Test with an access control profile that restricts access to other.git.\nno_other = {\n    \"repositories\": {\n        \"rule\": \"allow\",\n        \"exceptions\": [{\n            \"repository\": \"other\"\n        }]\n    }\n}\n\nwith testing.utils.access_token(\"alice\", no_other) as access_token:\n    with frontend.signin(access_token=access_token):\n        # Check that we can still access critic.git.\n        frontend.json(\n            \"repositories\",\n       
     params={ \"name\": \"critic\" },\n            expect=critic_json)\n\n        # Check that we can't access other.git.\n        frontend.json(\n            \"repositories\",\n            params={ \"name\": \"other\" },\n            expected_http_status=403)\n\n        # Check that we can still list all repositories, but that other.git is\n        # not included.\n        frontend.json(\n            \"repositories\",\n            expect={ \"repositories\": [critic_json] })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/004-review.py",
    "content": "# @dependency 001-main/003-self/020-reviewrebase.py\n# @dependency 001-main/003-self/100-reviewing/001-comments.basic.py\n\n# Fetch the id of a review which contains some comments.\nresult = frontend.operation(\n    \"searchreview\",\n    data={ \"query\": \"branch:r/100-reviewing/001-comment.basic\" })\ntesting.expect.check(1, len(result[\"reviews\"]))\nreview_id = result[\"reviews\"][0][\"id\"]\n\nreview_json = {\n    \"id\": review_id,\n    \"state\": \"open\",\n    \"summary\": \"Added 100-reviewing/001-comment.basic.txt\",\n    \"description\": None,\n    \"repository\": 1,\n    \"branch\": int,\n    \"owners\": [instance.userid(\"alice\")],\n    \"assigned_reviewers\": [instance.userid(\"bob\")],\n    \"active_reviewers\": [],\n    \"progress\": 0,\n    \"progress_per_commit\": list,\n    \"watchers\": [instance.userid(\"dave\"),\n                 instance.userid(\"erin\")],\n    \"partitions\": [{ \"commits\": [int],\n                     \"rebase\": None }],\n    \"issues\": [int, int, int, int],\n    \"notes\": [int, int],\n    \"pending_rebase\": None\n}\n\nfrontend.json(\n    \"reviews/%d\" % review_id,\n    expect=review_json)\n\nfrontend.json(\n    \"reviews/%d\" % review_id,\n    params={ \"include\": \"users,commits\" },\n    expect={ \"id\": review_id,\n             \"state\": \"open\",\n             \"summary\": \"Added 100-reviewing/001-comment.basic.txt\",\n             \"description\": None,\n             \"repository\": 1,\n             \"branch\": int,\n             \"owners\": [instance.userid(\"alice\")],\n             \"assigned_reviewers\": [instance.userid(\"bob\")],\n             \"active_reviewers\": [],\n             \"progress\": 0,\n             \"progress_per_commit\": list,\n             \"watchers\": [instance.userid(\"dave\"),\n                          instance.userid(\"erin\")],\n             \"partitions\": [{ \"commits\": [int],\n                              \"rebase\": None }],\n             \"issues\": 
[int, int, int, int],\n             \"notes\": [int, int],\n             \"pending_rebase\": None,\n             \"linked\": { \"users\": [user_json(\"alice\"),\n                                   user_json(\"bob\"),\n                                   user_json(\"dave\"),\n                                   user_json(\"erin\")],\n                         \"commits\": [generic_commit_json] }})\n\nfrontend.json(\n    \"reviews/%d/commits\" % review_id,\n    expect={ \"commits\": [generic_commit_json] })\n\ndef check_description(path, description, check):\n    if description is not None:\n        check(path, expected=str, actual=description)\n\ndef check_reviews(expected_state=str):\n    def checker(path, reviews, check):\n        if not check(path, expected=list, actual=reviews):\n            return\n        for index, review in enumerate(reviews):\n            check(\"%s[%d]\" % (path, index),\n                  expected={ \"id\": int,\n                             \"state\": expected_state,\n                             \"summary\": str,\n                             \"description\": check_description,\n                             \"repository\": 1,\n                             \"branch\": int,\n                             \"owners\": list,\n                             \"assigned_reviewers\": list,\n                             \"active_reviewers\": list,\n                             \"progress\": 0,\n                             \"progress_per_commit\": list,\n                             \"watchers\": list,\n                             \"partitions\": list,\n                             \"issues\": list,\n                             \"notes\": list,\n                             \"pending_rebase\": None },\n                  actual=review)\n    return checker\n\nall_reviews = frontend.json(\n    \"reviews\",\n    expect={ \"reviews\": check_reviews() })\n\nif not any(review[\"id\"] == review_id for review in all_reviews[\"reviews\"]):\n    
logger.error(\"/api/v1/reviews did not contain r/%d\" % review_id)\n\nfrontend.json(\n    \"reviews\",\n    params={ \"repository\": \"critic\" },\n    expect={ \"reviews\": check_reviews() })\n\nopen_reviews = frontend.json(\n    \"reviews\",\n    params={ \"state\": \"open\" },\n    expect={ \"reviews\": check_reviews(\"open\") })\n\nif not any(review[\"id\"] == review_id for review in open_reviews[\"reviews\"]):\n    logger.error(\"/api/v1/reviews?state=open did not contain r/%d\" % review_id)\n\nclosed_reviews = frontend.json(\n    \"reviews\",\n    params={ \"state\": \"closed\" },\n    expect={ \"reviews\": check_reviews(\"closed\") })\n\nif any(review[\"id\"] == review_id for review in closed_reviews[\"reviews\"]):\n    logger.error(\"/api/v1/reviews?state=closed contained r/%d\" % review_id)\n\ndropped_reviews = frontend.json(\n    \"reviews\",\n    params={ \"state\": \"dropped\" },\n    expect={ \"reviews\": check_reviews(\"dropped\") })\n\nif any(review[\"id\"] == review_id for review in dropped_reviews[\"reviews\"]):\n    logger.error(\"/api/v1/reviews?state=dropped contained r/%d\" % review_id)\n\nfrontend.json(\n    \"reviews/4711\",\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid review id: 4711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"reviews/mypatch\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid numeric id: 'mypatch'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"reviews\",\n    params={ \"repository\": \"nosuchrepository\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid repository name: 'nosuchrepository'\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"reviews\",\n    params={ \"state\": \"rejected\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n       
                 \"message\": \"Invalid review state values: 'rejected'\" }},\n    expected_http_status=400)\n\nno_repository_access = {\n    \"repositories\": {\n        \"rule\": \"deny\",\n        \"exceptions\": []\n    }\n}\n\nwith testing.utils.access_token(\"alice\", no_repository_access) as access_token:\n    with frontend.signin(access_token=access_token):\n        # Check that this review is inaccessible now.\n        frontend.json(\n            \"reviews/%d\" % review_id,\n            expected_http_status=403)\n\n        # Check that we can still list \"all\" reviews successfully.\n        frontend.json(\n            \"reviews\",\n            expect={\n                \"reviews\": []\n            })\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/005-commits.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nSHA1 = \"78d7849db854f3544d7291cce96a0a4fa6d6843d\"\n\ncommit_json = {\n    \"id\": int,\n    \"sha1\": SHA1,\n    \"summary\": \"High-level testing framework\",\n    \"message\": \"\"\"\\\nHigh-level testing framework\n\nFramework for automated installation and \"black-box\" testing of Critic\nrunning in a VirtualBox instance.\n\"\"\",\n    \"parents\": [int],\n    \"author\": {\n        \"name\": \"Jens Lindstrom\",\n        \"email\": \"jl@opera.com\",\n        \"timestamp\": float\n    },\n    \"committer\": {\n        \"name\": \"Jens Lindstrom\",\n        \"email\": \"jl@opera.com\",\n        \"timestamp\": float\n    },\n}\n\nresult = frontend.json(\n    \"commits\",\n    params={ \"sha1\": SHA1,\n             \"repository\": \"critic\" },\n    expect=commit_json)\n\nfrontend.json(\n    \"commits/%d\" % result[\"id\"],\n    params={ \"repository\": \"critic\" },\n    expect=commit_json)\n\nresult = frontend.json(\n    \"repositories/1/commits\",\n    params={ \"sha1\": SHA1 },\n    expect=commit_json)\n\nfrontend.json(\n    \"repositories/1/commits/%d\" % result[\"id\"],\n    expect=commit_json)\n\nfrontend.json(\n    \"commits/47114711\",\n    params={ \"repository\": \"critic\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid commit id: 47114711\" }},\n    expected_http_status=404)\n\nfrontend.json(\n    \"commits/47114711\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Commit reference must have repository specified.\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"commits\",\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Missing required SHA-1 parameter.\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"commits\",\n    params={ \"sha1\": SHA1 },\n    expect={ \"error\": { 
\"title\": \"Invalid API request\",\n                        \"message\": \"Commit reference must have repository specified.\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"commits\",\n    params={ \"sha1\": \"00\",\n             \"repository\": \"critic\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid SHA-1 parameter: '00'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"commits\",\n    params={ \"sha1\": \"invalid SHA-1\",\n             \"repository\": \"critic\" },\n    expect={ \"error\": { \"title\": \"Invalid API request\",\n                        \"message\": \"Invalid SHA-1 parameter: 'invalid SHA-1'\" }},\n    expected_http_status=400)\n\nfrontend.json(\n    \"commits\",\n    params={ \"sha1\": \"47114711\",\n             \"repository\": \"critic\" },\n    expect={ \"error\": { \"title\": \"No such resource\",\n                        \"message\": \"Resource not found: Invalid commit SHA-1: '47114711'\" }},\n    expected_http_status=404)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/006-changesets.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nFROM_SHA1 = \"573c5ff15ad95cfbc3e2f2efb0a638a4a78c17a7\"\nFROM_SINGLE_SHA1 = \"aabc2b10c930a9e72fe9587a6e8634087bb3efe1\"\nTO_SHA1 = \"6dc8e9c2d952028286d4b83475947bd0b1410860\"\nROOT_SHA1 = \"ee37c47f6f6a14afa6912c1cc58a9f49d2a29acd\"\n\n# Changeset for single commit\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"commit\": TO_SHA1},\n    expected_http_status=202)\ninstance.synchronize_service(\"changeset\") # wait for changeset creation to finish\nsingle_changeset = frontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"commit\": TO_SHA1},\n    expect={ \"files\": [int, int, int],\n             \"type\": \"direct\",\n             \"to_commit\": int,\n             \"id\": int,\n             \"from_commit\": int,\n             \"contributing_commits\": [int],\n             \"review_state\": None })\nequiv_changeset = frontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"from\": FROM_SINGLE_SHA1,\n             \"to\": TO_SHA1},\n    expect={ \"files\": [int, int, int],\n             \"type\": \"direct\",\n             \"to_commit\": int,\n             \"id\": int,\n             \"from_commit\": int,\n             \"contributing_commits\": [int],\n             \"review_state\": None })\nassert (single_changeset == equiv_changeset),\\\n    \"single changeset should equal equivalent changeset\"\n\n# Changeset between two commits\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": \"critic\",\n             \"from\": FROM_SHA1,\n             \"to\": TO_SHA1 },\n    expected_http_status=202)\ninstance.synchronize_service(\"changeset\") # wait for changeset creation to finish\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": \"critic\",\n             \"from\": FROM_SHA1,\n             \"to\": TO_SHA1 },\n    expect={ \"files\": [int, int, int, int, int, int, int, int],\n          
   \"type\": \"custom\",\n             \"to_commit\": int,\n             \"id\": int,\n             \"from_commit\": int,\n             \"contributing_commits\": [int, int, int],\n             \"review_state\": None })\n\n# Changeset from id\nfrontend.json(\n    \"changesets/\" + str(single_changeset[\"id\"]),\n    params={ \"repository\": 1 },\n    expect={ \"files\": [int, int, int],\n             \"type\": \"direct\",\n             \"to_commit\": int,\n             \"id\": single_changeset[\"id\"],\n             \"from_commit\": int,\n             \"contributing_commits\": [int],\n             \"review_state\": None })\n\n# Changeset from partial SHA1\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"commit\": TO_SHA1[:8]},\n    expect={ \"files\": [int, int, int],\n             \"type\": \"direct\",\n             \"to_commit\": int,\n             \"id\": int,\n             \"from_commit\": int,\n             \"contributing_commits\": [int],\n             \"review_state\": None })\n\n# Missing changeset id and commit refs\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1 },\n    expect={ \"error\": {\n        \"message\": \"Missing required parameters from and to, or commit\",\n        \"title\": \"Invalid API request\" }\n         },\n    expected_http_status=400)\n\n# Missing repository\nfrontend.json(\n    \"changesets\",\n    params={ \"commit\": TO_SHA1 },\n    expect={ \"error\": {\n        \"message\": \"repository needs to be specified, ex. 
&repository=<id>\",\n        \"title\": \"Invalid API request\" }\n         },\n    expected_http_status=400)\n\n# Missing to\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"from\": FROM_SHA1 },\n    expect={ \"error\": {\n        \"message\": \"Missing required parameters from and to, only one supplied\",\n        \"title\": \"Invalid API request\" }\n         },\n    expected_http_status=400)\n\n# Invalid SHA1\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"commit\": \"00g0\"},\n    expect={ \"error\": {\n        \"message\": \"Invalid parameter: commit=00g0: Invalid ref: '00g0^{commit}'\",\n        \"title\": \"No such resource\" }\n         },\n    expected_http_status=404)\n\n# Changeset between a commit and itself\nfrontend.json(\n    \"changesets\",\n    params={ \"repository\": 1,\n             \"from\": FROM_SHA1,\n             \"to\": FROM_SHA1},\n    expect={ \"error\": {\n        \"message\": \"from and to can't be the same commit\",\n        \"title\": \"Invalid API request\" }\n         },\n    expected_http_status=400)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/006-comments.py",
    "content": "# @dependency 001-main/003-self/004-createreview.py\n# @dependency 001-main/003-self/100-reviewing/001-comments.basic.py\n\nimport os\n\n# Fetch the id of a review which contains some comments.\nresult = frontend.operation(\n    \"searchreview\",\n    data={ \"query\": \"branch:r/100-reviewing/001-comment.basic\" })\ntesting.expect.check(1, len(result[\"reviews\"]))\nreview_id = result[\"reviews\"][0][\"id\"]\n\nresult = frontend.json(\n    \"reviews/%d\" % review_id,\n    params={ \"fields\": \"issues,notes\" },\n    expect={ \"issues\": [int, int, int, int],\n             \"notes\": [int, int] })\n\nfrontend.json(\n    \"comments/%d\" % result[\"issues\"][0],\n    expect={ \"id\": result[\"issues\"][0],\n             \"type\": \"issue\",\n             \"is_draft\": False,\n             \"state\": \"open\",\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": None,\n             \"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a general issue.\",\n             \"replies\": [int, int, int, int, int],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"comments/%d\" % result[\"issues\"][1],\n    expect={ \"id\": result[\"issues\"][1],\n             \"type\": \"issue\",\n             \"is_draft\": False,\n             \"state\": \"open\",\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": None,\n             \"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a general note.\",\n             \"replies\": [int, int],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"comments/%d\" % result[\"issues\"][2],\n    expect={ \"id\": result[\"issues\"][2],\n             \"type\": \"issue\",\n             \"is_draft\": False,\n             
\"state\": \"resolved\",\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": {\n                 \"type\": \"commit-message\",\n                 \"first_line\": int,\n                 \"last_line\": int,\n                 \"commit\": int\n             },\n             \"resolved_by\": instance.userid(\"dave\"),\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a commit issue.\",\n             \"replies\": [int, int, int],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"comments/%d\" % result[\"issues\"][3],\n    expect={ \"id\": result[\"issues\"][3],\n             \"type\": \"issue\",\n             \"is_draft\": False,\n             \"state\": \"open\",\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": {\n                 \"type\": \"file-version\",\n                 \"first_line\": int,\n                 \"last_line\": int,\n                 \"file\": int,\n                 \"changeset\": int,\n                 \"side\": \"new\",\n                 \"commit\": None,\n                 \"is_translated\": False\n             },\n             \"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a file issue.\",\n             \"replies\": [int],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"comments/%d\" % result[\"notes\"][0],\n    expect={ \"id\": result[\"notes\"][0],\n             \"type\": \"note\",\n             \"is_draft\": False,\n             \"state\": None,\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": {\n                 \"type\": \"commit-message\",\n                 \"first_line\": int,\n                 \"last_line\": int,\n                 \"commit\": int\n             },\n    
         \"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a commit note.\",\n             \"replies\": [],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"comments/%d\" % result[\"notes\"][1],\n    expect={ \"id\": result[\"notes\"][1],\n             \"type\": \"note\",\n             \"is_draft\": False,\n             \"state\": None,\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": {\n                 \"type\": \"file-version\",\n                 \"first_line\": int,\n                 \"last_line\": int,\n                 \"file\": int,\n                 \"changeset\": int,\n                 \"side\": \"new\",\n                 \"commit\": None,\n                 \"is_translated\": False\n             },\n             \"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a file note.\",\n             \"replies\": [],\n             \"draft_changes\": None })\n\nfrontend.json(\n    \"reviews/%d/comments\" % review_id,\n    params={ \"fields\": \"id\" },\n    expect={ \"comments\": [{ \"id\": result[\"issues\"][0] },\n                          { \"id\": result[\"issues\"][1] },\n                          { \"id\": result[\"issues\"][2] },\n                          { \"id\": result[\"notes\"][0] },\n                          { \"id\": result[\"issues\"][3] },\n                          { \"id\": result[\"notes\"][1] }] })\n\nfrontend.json(\n    \"comments/%d\" % result[\"issues\"][0],\n    params={ \"include\": \"users,replies\" },\n    expect={ \"id\": result[\"issues\"][0],\n             \"type\": \"issue\",\n             \"is_draft\": False,\n             \"state\": \"open\",\n             \"review\": review_id,\n             \"author\": instance.userid(\"alice\"),\n             \"location\": None,\n             
\"resolved_by\": None,\n             \"addressed_by\": None,\n             \"timestamp\": float,\n             \"text\": \"This is a general issue.\",\n             \"replies\": [int, int, int, int, int],\n             \"draft_changes\": None,\n             \"linked\": { \"users\": [user_json(\"alice\"),\n                                   user_json(\"bob\"),\n                                   user_json(\"dave\"),\n                                   user_json(\"erin\")],\n                         \"replies\": [reply_json(\"bob\"),\n                                     reply_json(\"dave\"),\n                                     reply_json(\"erin\"),\n                                     reply_json(\"alice\"),\n                                     reply_json(\"bob\")] }})\n\nwith frontend.signin(\"alice\"):\n    # Create comment with review specified via query parameter.\n    created_issue_id_1 = frontend.json(\n        \"comments\",\n        params={\n            \"review\": review_id\n        },\n        post={\n            \"type\": \"issue\",\n            \"text\": \"JSON general issue #1\"\n        },\n        expect={\n            \"id\": int,\n            \"type\": \"issue\",\n            \"is_draft\": True,\n            \"state\": \"open\",\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": None,\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": \"JSON general issue #1\",\n            \"replies\": [],\n            \"draft_changes\": draft_changes_json(\"alice\", is_draft=True),\n        })[\"id\"]\n\n    # Create comment with review specified via POST data. 
Also specify author\n    # explicitly.\n    created_note_id_1 = frontend.json(\n        \"comments\",\n        post={\n            \"type\": \"note\",\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"text\": \"JSON general note #1\"\n        },\n        expect={\n            \"id\": int,\n            \"type\": \"note\",\n            \"is_draft\": True,\n            \"state\": None,\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": None,\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": \"JSON general note #1\",\n            \"replies\": [],\n            \"draft_changes\": draft_changes_json(\"alice\", is_draft=True),\n        })[\"id\"]\n\n    # Create issue with review specified in the path.\n    created_issue_id_2 = frontend.json(\n        \"reviews/%d/issues\" % review_id,\n        post={\n            \"text\": \"JSON general issue #2\"\n        },\n        expect={\n            \"id\": int,\n            \"type\": \"issue\",\n            \"is_draft\": True,\n            \"state\": \"open\",\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": None,\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": \"JSON general issue #2\",\n            \"replies\": [],\n            \"draft_changes\": draft_changes_json(\"alice\", is_draft=True),\n        })[\"id\"]\n\n    # Create note with review specified in the path.\n    created_note_id_2 = frontend.json(\n        \"reviews/%d/notes\" % review_id,\n        post={\n            \"text\": \"JSON general note #2\"\n        },\n        expect={\n            \"id\": int,\n            \"type\": \"note\",\n            \"is_draft\": True,\n            \"state\": None,\n            \"review\": 
review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": None,\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": \"JSON general note #2\",\n            \"replies\": [],\n            \"draft_changes\": draft_changes_json(\"alice\", is_draft=True),\n        })[\"id\"]\n\n    review_data = frontend.json(\n        \"reviews/%d\" % review_id,\n        params={\n            \"fields\": \"issues,notes\"\n        },\n        expect={\n            \"issues\": [int, int, int, int, int, int],\n            \"notes\": [int, int, int, int]\n        })\n\n    testing.expect.true(\n        created_issue_id_1 in review_data[\"issues\"],\n        \"created issue #1 in reviews/N/issues\")\n    testing.expect.true(\n        created_note_id_1 in review_data[\"notes\"],\n        \"created note #1 in reviews/N/notes\")\n    testing.expect.true(\n        created_issue_id_2 in review_data[\"issues\"],\n        \"created issue #2 in reviews/N/issues\")\n    testing.expect.true(\n        created_note_id_2 in review_data[\"notes\"],\n        \"created note #2 in reviews/N/notes\")\n\nwith frontend.signin(\"bob\"):\n    # Check that Bob doesn't see Alice's draft comments.\n    review_data = frontend.json(\n        \"reviews/%d\" % review_id,\n        params={\n            \"fields\": \"issues,notes\"\n        },\n        expect={\n            \"issues\": [int, int, int, int],\n            \"notes\": [int, int]\n        })\n\n    published_issue_ids = review_data[\"issues\"]\n    published_note_ids = review_data[\"notes\"]\n\n    testing.expect.false(\n        created_issue_id_1 in review_data[\"issues\"],\n        \"created issue #1 in reviews/N/issues\")\n    testing.expect.false(\n        created_note_id_1 in review_data[\"notes\"],\n        \"created note #1 in reviews/N/notes\")\n    testing.expect.false(\n        created_issue_id_2 in review_data[\"issues\"],\n        
\"created issue #2 in reviews/N/issues\")\n    testing.expect.false(\n        created_note_id_2 in review_data[\"notes\"],\n        \"created note #2 in reviews/N/notes\")\n\n# Find another review.\nresult = frontend.operation(\n    \"searchreview\",\n    data={ \"query\": \"branch:r/004-createreview\" })\ntesting.expect.check(1, len(result[\"reviews\"]))\nother_review_id = result[\"reviews\"][0][\"id\"]\n\nwith frontend.signin(\"alice\"):\n    frontend.json(\n        \"comments/%d\" % created_issue_id_1,\n        put={\n            \"text\": \"JSON general issue #1 (edited)\"\n        },\n        expect={\n            \"id\": int,\n            \"type\": \"issue\",\n            \"is_draft\": True,\n            \"state\": \"open\",\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": None,\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": \"JSON general issue #1 (edited)\",\n            \"replies\": [],\n            \"draft_changes\": draft_changes_json(\"alice\", is_draft=True),\n        })\n\n    frontend.json(\n        \"comments/%d\" % created_note_id_1,\n        delete=True,\n        expected_http_status=204)\n\n    review_data = frontend.json(\n        \"reviews/%d\" % review_id,\n        params={\n            \"fields\": \"issues,notes\"\n        },\n        expect={\n            \"issues\": [int, int, int, int, int, int],\n            \"notes\": [int, int, int]\n        })\n\n    testing.expect.true(\n        created_issue_id_1 in review_data[\"issues\"],\n        \"created issue #1 in reviews/N/issues\")\n    testing.expect.false(\n        created_note_id_1 in review_data[\"notes\"],\n        \"created note #1 in reviews/N/notes\")\n    testing.expect.true(\n        created_issue_id_2 in review_data[\"issues\"],\n        \"created issue #2 in reviews/N/issues\")\n    testing.expect.true(\n        created_note_id_2 
in review_data[\"notes\"],\n        \"created note #2 in reviews/N/notes\")\n\n    frontend.json(\n        \"comments/%d,%d,%d\" % (created_issue_id_1,\n                               created_issue_id_2,\n                               created_note_id_2),\n        put={\n            \"text\": \"Common text (edited)\"\n        },\n        expect={\n            \"comments\": [\n                {\n                    \"id\": created_issue_id_1,\n                    \"text\": \"Common text (edited)\",\n                    \"*\": \"*\"\n                },\n                {\n                    \"id\": created_issue_id_2,\n                    \"text\": \"Common text (edited)\",\n                    \"*\": \"*\"\n                },\n                {\n                    \"id\": created_note_id_2,\n                    \"text\": \"Common text (edited)\",\n                    \"*\": \"*\"\n                }\n            ]\n        })\n\n    # Error handling.\n\n    # Create comment without specifying a review.\n    frontend.json(\n        \"comments\",\n        post={\n            \"type\": \"issue\",\n            \"text\": \"Invalid issue\"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"No review specified\"\n            }\n        })\n\n    # Create comment specifying conflicting reviews.\n    frontend.json(\n        \"comments\",\n        params={\n            \"review\": review_id\n        },\n        post={\n            \"type\": \"issue\",\n            \"review\": other_review_id,\n            \"text\": \"Invalid issue\"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Conflicting reviews specified\"\n            }\n        })\n\n    # Create comment as another user.\n    frontend.json(\n        
\"comments\",\n        post={\n            \"type\": \"issue\",\n            \"review\": review_id,\n            \"author\": instance.userid(\"bob\"),\n            \"text\": \"Invalid issue\"\n        },\n        expected_http_status=403,\n        expect={\n            \"error\": {\n                \"title\": \"Permission denied\",\n                \"message\": \"Must be an administrator\"\n            }\n        })\n\n    # Try to edit text of published comment.\n    frontend.json(\n        \"comments/%d\" % published_note_ids[0],\n        put={\n            \"text\": \"Invalid edit\"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Published comments cannot be edited\"\n            }\n        })\n\n    # Try to delete a published comment.\n    frontend.json(\n        \"comments/%d\" % published_note_ids[0],\n        delete=True,\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Published comments cannot be deleted\"\n            }\n        })\n\n    frontend.operation(\n        \"abortchanges\",\n        data={\n            \"review_id\": review_id,\n            \"what\": {\n                \"approval\": False,\n                \"comments\": True,\n                \"metacomments\": False\n            }\n        })\n\n#\n# Create review which modifies a file a couple of times.\n#\n\nwith repository.workcopy() as work:\n    review = Review(work, \"alice\", \"200-json/006-comments\")\n    review.addFile(the_file=\"200-json/006-comments.txt\")\n    review.commit(\"reference commit\",\n                  reference=True,\n                  the_file=[\"1st line\",\n                            \"2nd line\",\n                            \"3rd line\",\n                            \"4th line\",\n                            \"5th line\",\n 
                           \"6th line\",\n                            \"7th line\",\n                            \"8th line\"])\n    review.commit(\"first reviewed commit\",\n                  the_file=[\"1st line\",\n                            \"2nd line (edited)\",\n                            \"3rd line (edited)\",\n                            \"4th line\",\n                            \"5th line\",\n                            \"6th line\",\n                            \"7th line\",\n                            \"8th line\"])\n    review.commit(\"second reviewed commit\",\n                     the_file=[\"1st line\",\n                               \"2nd line (edited)\",\n                               \"3rd line (edited)\",\n                               \"4th line\",\n                               \"  1st added line\",\n                               \"  2nd added line\",\n                               \"  3rd added line\",\n                               \"5th line\",\n                               \"6th line (edited)\",\n                               \"7th line (edited)\",\n                               \"8th line\"]),\n    review.commit(\"third reviewed commit\",\n                  the_file=[\"1st line\",\n                            \"2nd line (edited)\",\n                            \"3rd line (edited)\",\n                            \"4th line\",\n                            \"  1st added line\",\n                            \"  2nd added line\",\n                            \"  3rd added line\",\n                            \"5th line (edited)\",\n                            \"6th line (edited) (edited)\",\n                            \"7th line (edited)\",\n                            \"8th line\"])\n    review.submit()\n    review_id = review.id\n    sha1s = review.sha1s\n\nwith frontend.signin(\"alice\"):\n    issue_1 = frontend.json(\n        \"reviews/%d/issues\" % review_id,\n        post={\n            \"text\": \"Issue on 1st line\",\n   
         \"location\": {\n                \"type\": \"file-version\",\n                \"changeset\": fetch_changeset({ \"from\": sha1s[0],\n                                               \"to\": sha1s[1] })[\"id\"],\n                \"file\": \"200-json/006-comments.txt\",\n                \"first_line\": 1,\n                \"last_line\": 1,\n                \"side\": \"new\",\n            },\n        })[\"id\"]\n\n    issue_2 = frontend.json(\n        \"reviews/%d/issues\" % review_id,\n        post={\n            \"text\": \"Issue on 1st-3rd line\",\n            \"location\": {\n                \"type\": \"file-version\",\n                \"changeset\": fetch_changeset({ \"from\": sha1s[0],\n                                               \"to\": sha1s[2] })[\"id\"],\n                \"file\": \"200-json/006-comments.txt\",\n                \"first_line\": 1,\n                \"last_line\": 3,\n                \"side\": \"new\",\n            },\n        })[\"id\"]\n\n    issue_3 = frontend.json(\n        \"reviews/%d/issues\" % review_id,\n        post={\n            \"text\": \"Issue on 8th line\",\n            \"location\": {\n                \"type\": \"file-version\",\n                \"changeset\": fetch_changeset({ \"from\": sha1s[1],\n                                               \"to\": sha1s[3] })[\"id\"],\n                \"file\": \"200-json/006-comments.txt\",\n                \"first_line\": 11,\n                \"last_line\": 11,\n                \"side\": \"new\",\n            },\n        })[\"id\"]\n\n    issue_4 = frontend.json(\n        \"reviews/%d/issues\" % review_id,\n        post={\n            \"text\": \"Issue on 6th-7th line\",\n            \"location\": {\n                \"type\": \"file-version\",\n                \"commit\": sha1s[2],\n                \"file\": \"200-json/006-comments.txt\",\n                \"first_line\": 9,\n                \"last_line\": 10,\n            },\n        })[\"id\"]\n\n    frontend.json(\n        
\"reviews/%d/comments\" % review_id,\n        params={\n            \"commit\": sha1s[1],\n            \"fields\": \"id,location.first_line,location.last_line\",\n        },\n        expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 8,\n                    \"last_line\": 8,\n                }\n            }],\n        })\n\n    frontend.json(\n        \"reviews/%d/comments\" % review_id,\n        params={\n            \"commit\": sha1s[2],\n            \"fields\": \"id,location.first_line,location.last_line\",\n        },\n        expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 11,\n                    \"last_line\": 11,\n                }\n            }, {\n                \"id\": issue_4,\n                \"location\": {\n                    \"first_line\": 9,\n                    \"last_line\": 10,\n                }\n            }],\n        })\n\n    frontend.json(\n        \"reviews/%d/comments\" % review_id,\n        params={\n            \"commit\": sha1s[3],\n            \"fields\": \"id,location.first_line,location.last_line\",\n        },\n        
expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 11,\n                    \"last_line\": 11,\n                }\n            }],\n        })\n\n    frontend.json(\n        \"reviews/%d/comments\" % review_id,\n        params={\n            \"changeset\": fetch_changeset({ \"from\": sha1s[0],\n                                           \"to\": sha1s[2] })[\"id\"],\n            \"fields\": \"id,location.first_line,location.last_line\",\n        },\n        expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 11,\n                    \"last_line\": 11,\n                }\n            }, {\n                \"id\": issue_4,\n                \"location\": {\n                    \"first_line\": 9,\n                    \"last_line\": 10,\n                }\n            }],\n        })\n\n    frontend.json(\n        \"reviews/%d/comments\" % review_id,\n        params={\n            \"changeset\": fetch_changeset({ \"from\": sha1s[1],\n                                           \"to\": sha1s[3] })[\"id\"],\n            \"fields\": 
\"id,location.first_line,location.last_line\",\n        },\n        expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 11,\n                    \"last_line\": 11,\n                }\n            }],\n        })\n\n    frontend.json(\n        \"reviews/%d/comments\" % review_id,\n        params={\n            \"changeset\": fetch_changeset({ \"from\": sha1s[0],\n                                           \"to\": sha1s[3] })[\"id\"],\n            \"fields\": \"id,location.first_line,location.last_line\",\n        },\n        expect={\n            \"comments\": [{\n                \"id\": issue_1,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 1,\n                }\n            }, {\n                \"id\": issue_2,\n                \"location\": {\n                    \"first_line\": 1,\n                    \"last_line\": 3,\n                }\n            }, {\n                \"id\": issue_3,\n                \"location\": {\n                    \"first_line\": 11,\n                    \"last_line\": 11,\n                }\n            }],\n        })\n\n# end of file\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/007-filechanges.py",
    "content": "# @dependency 001-main/002-createrepository.py\n# @dependency 001-main/003-self/200-json/006-changesets.py\n\nFROM_SHA1 = \"573c5ff15ad95cfbc3e2f2efb0a638a4a78c17a7\"\nFROM_SINGLE_SHA1 = \"aabc2b10c930a9e72fe9587a6e8634087bb3efe1\"\nTO_SHA1 = \"6dc8e9c2d952028286d4b83475947bd0b1410860\"\nROOT_SHA1 = \"ee37c47f6f6a14afa6912c1cc58a9f49d2a29acd\"\n\nGENERIC_FILECHANGE = { \"file\": int,\n                       \"changeset\": int,\n                       \"old_sha1\": str,\n                       \"new_sha1\": str,\n                       \"new_mode\": None,\n                       \"old_mode\": None }\n\nfiles = {}\n\ndef fetch_file(path):\n    result = frontend.json(\n        \"files\",\n        params={\n            \"path\": path\n        },\n        expect={\n            \"id\": int,\n            \"path\": path\n        })\n    files[path] = result[\"id\"]\n\nfetch_file(\"testing/__init__.py\")\nfetch_file(\"testing/repository.py\")\nfetch_file(\"testing/virtualbox.py\")\n\n# Filechanges for changeset from single commit\nsingle_changeset = fetch_changeset({ \"commit\": TO_SHA1 })\n\nfrontend.json(\n    \"filechanges\",\n    params={\n        \"repository\": 1,\n        \"changeset\": single_changeset[\"id\"]\n    },\n    expect={\"filechanges\": [\n        {\"changeset\": single_changeset[\"id\"],\n         \"old_sha1\": \"a2ffb3a6cd3b021c34592f4bd8f32905e4dd5830\",\n         \"new_sha1\": \"2d06e47848827d8d8312542f3687f0380ebbc3ed\",\n         \"file\": files[\"testing/__init__.py\"],\n         \"new_mode\": None,\n         \"old_mode\": None},\n        {\"changeset\": single_changeset[\"id\"],\n         \"old_sha1\": \"e285e7c535dd8eee185d71c5adec1a328e586a58\",\n         \"new_sha1\": \"ac6fe72b7ffefb9d5d4c6637aa94c02e756b2665\",\n         \"file\": files[\"testing/repository.py\"],\n         \"new_mode\": None,\n         \"old_mode\": None},\n        {\"changeset\": single_changeset[\"id\"],\n         \"old_sha1\": 
\"0f5b7b313b6152f9c4f342c151fa1038a83e03f4\",\n         \"new_sha1\": \"c2e9ee01afb2b0cdde940532f93a6823013c8a91\",\n         \"file\": files[\"testing/virtualbox.py\"],\n         \"new_mode\": None,\n         \"old_mode\": None}]})\n\n# Single filechange for changeset from two commits\ncustom_changeset = fetch_changeset({\n    \"from\": FROM_SHA1,\n    \"to\": TO_SHA1\n})\n\nfrontend.json(\n    \"filechanges/\" + str(custom_changeset[\"files\"][0]),\n    params={\n        \"repository\": 1,\n        \"changeset\": custom_changeset[\"id\"]\n    },\n    expect=GENERIC_FILECHANGE)\n\n# Invalid filechange id\nfrontend.json(\n    \"filechanges/-1\",\n    params={\n        \"repository\": 1,\n        \"changeset\": custom_changeset[\"id\"]\n    },\n    expect={\n        \"error\": {\n            \"message\": \"Invalid numeric id: '-1'\",\n            \"title\": \"Invalid API request\"\n        }\n    },\n    expected_http_status=400)\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/007-replies.py",
    "content": "# @dependency 001-main/003-self/100-reviewing/001-comments.basic.py\n\n# Fetch the id of a review which contains some comments.\nresult = frontend.operation(\n    \"searchreview\",\n    data={ \"query\": \"branch:r/100-reviewing/001-comment.basic\" })\ntesting.expect.check(1, len(result[\"reviews\"]))\nreview_id = result[\"reviews\"][0][\"id\"]\n\nresult = frontend.json(\n    \"reviews/%d\" % review_id,\n    params={ \"fields\": \"issues,notes\" },\n    expect={ \"issues\": [int, int, int, int],\n             \"notes\": [int, int] })\n\nissue0 = frontend.json(\n    \"comments/%d/replies\" % result[\"issues\"][0],\n    expect={ \"replies\": [reply_json(\"dave\"),\n                         reply_json(\"bob\"),\n                         reply_json(\"erin\"),\n                         reply_json(\"bob\"),\n                         reply_json(\"alice\")] })\n\nissue1 = frontend.json(\n    \"comments/%d/replies\" % result[\"issues\"][1],\n    expect={ \"replies\": [reply_json(\"alice\"),\n                         reply_json(\"bob\")] })\n\nissue2 = frontend.json(\n    \"comments/%d/replies\" % result[\"issues\"][2],\n    expect={ \"replies\": [reply_json(\"bob\"),\n                         reply_json(\"erin\"),\n                         reply_json(\"alice\")] })\n\nnote0 = frontend.json(\n    \"comments/%d/replies\" % result[\"notes\"][0],\n    expect={ \"replies\": [] })\n\nissue3 = frontend.json(\n    \"comments/%d/replies\" % result[\"issues\"][3],\n    expect={ \"replies\": [reply_json(\"bob\")] })\n\nnote1 = frontend.json(\n    \"comments/%d/replies\" % result[\"notes\"][1],\n    expect={ \"replies\": [] })\n\ndef check_with_reply(comment_id, replies):\n    for reply_json in replies:\n        frontend.json(\n            \"comments\",\n            params={ \"with_reply\": reply_json[\"id\"],\n                     \"fields\": \"id\" },\n            expect={ \"id\": comment_id })\n\ncheck_with_reply(result[\"issues\"][0], 
issue0[\"replies\"])\ncheck_with_reply(result[\"issues\"][1], issue1[\"replies\"])\ncheck_with_reply(result[\"issues\"][2], issue2[\"replies\"])\ncheck_with_reply(result[\"issues\"][3], issue3[\"replies\"])\ncheck_with_reply(result[\"notes\"][0], note0[\"replies\"])\ncheck_with_reply(result[\"notes\"][1], note1[\"replies\"])\n\nwith frontend.signin(\"alice\"):\n    # Use a comment with no replies to test with.\n    comment_id = result[\"notes\"][1]\n    published_reply_id = issue0[\"replies\"][-1][\"id\"]\n\n    reply_id = frontend.json(\n        \"comments/%d/replies\" % comment_id,\n        post={\n            \"text\": \"JSON reply #1\",\n        },\n        expect={\n            \"id\": int,\n            \"is_draft\": True,\n            \"author\": instance.userid(\"alice\"),\n            \"timestamp\": float,\n            \"text\": \"JSON reply #1\"\n        })[\"id\"]\n\n    frontend.json(\n        \"replies/%d\" % reply_id,\n        expect={\n            \"id\": reply_id,\n            \"is_draft\": True,\n            \"author\": instance.userid(\"alice\"),\n            \"timestamp\": float,\n            \"text\": \"JSON reply #1\"\n        })\n\n    frontend.json(\n        \"comments/%d\" % comment_id,\n        expect={\n            \"id\": comment_id,\n            \"type\": \"note\",\n            \"is_draft\": False,\n            \"state\": None,\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": {\n                \"type\": \"file-version\",\n                \"first_line\": int,\n                \"last_line\": int,\n                \"file\": int,\n                \"changeset\": int,\n                \"side\": \"new\",\n                \"commit\": None,\n                \"is_translated\": False\n            },\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": str,\n            \"replies\": [],\n            
\"draft_changes\": draft_changes_json(\"alice\", reply=reply_id),\n        })\n\nwith frontend.signin(\"bob\"):\n    # Check that Bob doesn't see the Alice's draft reply.\n    frontend.json(\n        \"comments/%d\" % comment_id,\n        expect={\n            \"id\": comment_id,\n            \"type\": \"note\",\n            \"is_draft\": False,\n            \"state\": None,\n            \"review\": review_id,\n            \"author\": instance.userid(\"alice\"),\n            \"location\": {\n                \"type\": \"file-version\",\n                \"first_line\": int,\n                \"last_line\": int,\n                \"file\": int,\n                \"changeset\": int,\n                \"side\": \"new\",\n                \"commit\": None,\n                \"is_translated\": False\n            },\n            \"resolved_by\": None,\n            \"addressed_by\": None,\n            \"timestamp\": float,\n            \"text\": str,\n            \"replies\": [],\n            \"draft_changes\": None,\n        })\n\nwith frontend.signin(\"alice\"):\n    frontend.json(\n        \"replies/%d\" % reply_id,\n        put={\n            \"text\": \"JSON reply (edited)\"\n        },\n        expect={\n            \"id\": reply_id,\n            \"is_draft\": True,\n            \"author\": instance.userid(\"alice\"),\n            \"timestamp\": float,\n            \"text\": \"JSON reply (edited)\"\n        })\n\n    frontend.json(\n        \"replies/%d\" % reply_id,\n        delete=True,\n        expected_http_status=204)\n\n    reply_id = frontend.json(\n        \"replies\",\n        params={\n            \"comment\": comment_id\n        },\n        post={\n            \"text\": \"JSON reply #2\"\n        },\n        expect={\n            \"id\": int,\n            \"is_draft\": True,\n            \"author\": instance.userid(\"alice\"),\n            \"timestamp\": float,\n            \"text\": \"JSON reply #2\"\n        })[\"id\"]\n\n    frontend.json(\n        
\"replies/%d\" % reply_id,\n        delete=True,\n        expected_http_status=204)\n\n    reply_id = frontend.json(\n        \"replies\",\n        post={\n            \"comment\": comment_id,\n            \"text\": \"JSON reply #3\"\n        },\n        expect={\n            \"id\": int,\n            \"is_draft\": True,\n            \"author\": instance.userid(\"alice\"),\n            \"timestamp\": float,\n            \"text\": \"JSON reply #3\"\n        })[\"id\"]\n\n    frontend.json(\n        \"replies\",\n        post={\n            \"comment\": comment_id,\n            \"text\": \"JSON reply (invalid)\"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Comment already has a draft reply\"\n            }\n        })\n\n    frontend.json(\n        \"replies/%d\" % reply_id,\n        delete=True,\n        expected_http_status=204)\n\n    frontend.json(\n        \"comments/%d/replies\" % comment_id,\n        post={\n            \"text\": \"   \"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Empty reply\"\n            }\n        })\n\n    frontend.json(\n        \"replies/%d\" % published_reply_id,\n        put={\n            \"text\": \"Invalid edit\"\n        },\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Published replies cannot be edited\"\n            }\n        })\n\n    frontend.json(\n        \"replies/%d\" % published_reply_id,\n        delete=True,\n        expected_http_status=400,\n        expect={\n            \"error\": {\n                \"title\": \"Invalid API request\",\n                \"message\": \"Published replies cannot be deleted\"\n            }\n        })\n\n# end of 
file\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/008-batches.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nwith repository.workcopy() as work:\n    review = Review(work, \"alice\", \"200-json/008-batches\")\n    review.addFile(first=\"200-json/008-batches/first.txt\",\n                   second=\"200-json/008-batches/second.txt\",\n                   third=\"200-json/008-batches/third.txt\")\n    review.commit(\"Reference commit\",\n                  reference=True,\n                  first=[\"First\",\n                         \"=====\",\n                         \"Initial line\"],\n                  second=[\"Second\",\n                          \"======\",\n                          \"Initial line\"],\n                  third=[\"Third\",\n                         \"=====\",\n                         \"Initial line\"])\n    review.commit(\"First commit\",\n                  first=[\"First\",\n                         \"=====\",\n                         \"Initial line\",\n                         \"Added line\"])\n    review.commit(\"Second commit\",\n                  second=[\"Second\",\n                          \"======\",\n                          \"Initial line\",\n                          \"Added line\"])\n    review.commit(\"Third commit\",\n                  third=[\"Third\",\n                         \"=====\",\n                         \"Initial line\",\n                         \"Added line\"])\n    review.addFilter(\"bob\", \"reviewer\", \"200-json/008-batches/\")\n    review.addFilter(\"dave\", \"reviewer\", \"200-json/008-batches/\")\n    review.submit()\n\n    changesets = {\n        \"first\": fetch_changeset({\n            \"from\": review.sha1s[0],\n            \"to\": review.sha1s[1],\n        }),\n        \"second\": fetch_changeset({\n            \"from\": review.sha1s[1],\n            \"to\": review.sha1s[2],\n        }),\n        \"third\": fetch_changeset({\n            \"from\": review.sha1s[2],\n            \"to\": review.sha1s[3],\n        }),\n        \"all\": 
fetch_changeset({\n            \"from\": review.sha1s[0],\n            \"to\": review.sha1s[3],\n        }),\n    }\n\n    issues = {\n        \"alice\": [],\n        \"bob\": [],\n        \"dave\": []\n    }\n\n    changes = {}\n\n    def fetch_changes(key):\n        changes[key] = frontend.json(\n            (\"reviews/%d/changesets/%d/reviewablefilechanges\"\n             % (review.id, changesets[key][\"id\"])),\n            expect={\n                \"reviewablefilechanges\": [{\n                    \"id\": int,\n                    \"review\": review.id,\n                    \"changeset\": changesets[key][\"id\"],\n                    \"file\": review.getFileId(key),\n                    \"deleted_lines\": int,\n                    \"inserted_lines\": int,\n                    \"is_reviewed\": False,\n                    \"reviewed_by\": None,\n                    \"assigned_reviewers\": [instance.userid(\"bob\"),\n                                           instance.userid(\"dave\")],\n                    \"draft_changes\": None,\n                }],\n            })[\"reviewablefilechanges\"]\n\n    fetch_changes(\"first\")\n    fetch_changes(\"second\")\n    fetch_changes(\"third\")\n\n    with frontend.signin(\"alice\"):\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"alice\", \"draft\"))\n\n        issues[\"alice\"].append(\n            frontend.json(\n                \"reviews/%d/issues\" % review.id,\n                post={\n                    \"text\": \"Alice's issue #1\",\n                    \"location\": {\n                        \"type\": \"file-version\",\n                        \"changeset\": changesets[\"first\"][\"id\"],\n                        \"side\": \"new\",\n                        \"file\": review.getFilename(\"first\"),\n                        \"first_line\": 1,\n                       
 \"last_line\": 4,\n                    }\n                })[\"id\"])\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"alice\", \"draft\",\n                              created_comments=[issues[\"alice\"][0]]))\n\n        issues[\"alice\"].append(\n            frontend.json(\n                \"reviews/%d/issues\" % review.id,\n                post={\n                    \"text\": \"Alice's issue #2\",\n                    \"location\": {\n                        \"type\": \"file-version\",\n                        \"changeset\": changesets[\"second\"][\"id\"],\n                        \"side\": \"new\",\n                        \"file\": review.getFilename(\"second\"),\n                        \"first_line\": 1,\n                        \"last_line\": 2,\n                    }\n                })[\"id\"])\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"alice\", \"draft\",\n                              created_comments=[issues[\"alice\"][0],\n                                                issues[\"alice\"][1]]))\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            post={},\n            expect=batch_json(review.id, \"alice\", \"published\",\n                              created_comments=[issues[\"alice\"][0],\n                                                issues[\"alice\"][1]]))\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"alice\", \"draft\"))\n\n    with frontend.signin(\"bob\"):\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n  
              \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"bob\", \"draft\"))\n\n        issues[\"bob\"].append(\n            frontend.json(\n                \"reviews/%d/issues\" % review.id,\n                post={\n                    \"text\": \"Bob's issue #1\",\n                    \"location\": {\n                        \"type\": \"file-version\",\n                        \"changeset\": changesets[\"second\"][\"id\"],\n                        \"side\": \"new\",\n                        \"file\": review.getFilename(\"second\"),\n                        \"first_line\": 3,\n                        \"last_line\": 4,\n                    }\n                })[\"id\"])\n\n        frontend.json(\n            \"comments/%d\" % issues[\"alice\"][0],\n            put={\n                \"draft_changes\": {\n                    \"new_state\": \"resolved\",\n                },\n            },\n            expect={\n                \"id\": issues[\"alice\"][0],\n                \"state\": \"open\",\n                \"draft_changes\": draft_changes_json(\n                    \"bob\", new_state=\"resolved\"),\n                \"*\": \"*\",\n            })\n\n        frontend.json(\n            (\"reviews/%d/changesets/%d/reviewablefilechanges\"\n             % (review.id, changesets[\"second\"][\"id\"])),\n            put={\n                \"draft_changes\": {\n                    \"new_is_reviewed\": True,\n                }\n            },\n            expect={\n                \"reviewablefilechanges\": [{\n                    \"id\": int,\n                    \"review\": review.id,\n                    \"changeset\": changesets[\"second\"][\"id\"],\n                    \"file\": review.getFileId(\"second\"),\n                    \"deleted_lines\": int,\n                    \"inserted_lines\": int,\n                    \"is_reviewed\": False,\n                    \"reviewed_by\": None,\n                    
\"assigned_reviewers\": [instance.userid(\"bob\"),\n                                           instance.userid(\"dave\")],\n                    \"draft_changes\": {\n                        \"author\": instance.userid(\"bob\"),\n                        \"new_is_reviewed\": True,\n                        \"new_reviewed_by\": instance.userid(\"bob\"),\n                    },\n                }],\n            })\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"bob\", \"draft\",\n                              created_comments=[issues[\"bob\"][0]],\n                              resolved_issues=[issues[\"alice\"][0]],\n                              reviewed_changes=[changes[\"second\"][0][\"id\"]]))\n\n    with frontend.signin(\"dave\"):\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"dave\", \"draft\"))\n\n        issues[\"dave\"].append(\n            frontend.json(\n                \"reviews/%d/issues\" % review.id,\n                post={\n                    \"text\": \"Dave's issue #1\",\n                    \"location\": {\n                        \"type\": \"file-version\",\n                        \"changeset\": changesets[\"all\"][\"id\"],\n                        \"side\": \"new\",\n                        \"file\": review.getFilename(\"third\"),\n                        \"first_line\": 1,\n                        \"last_line\": 4,\n                    }\n                })[\"id\"])\n\n        frontend.json(\n            \"comments/%d\" % issues[\"alice\"][0],\n            put={\n                \"draft_changes\": {\n                    \"new_state\": \"resolved\",\n                },\n            },\n            expect={\n                \"id\": 
issues[\"alice\"][0],\n                \"state\": \"open\",\n                \"draft_changes\": draft_changes_json(\n                    \"dave\", new_state=\"resolved\"),\n                \"*\": \"*\",\n            })\n\n        frontend.json(\n            \"comments/%d\" % issues[\"alice\"][1],\n            put={\n                \"draft_changes\": {\n                    \"new_state\": \"resolved\",\n                },\n            },\n            expect={\n                \"id\": issues[\"alice\"][1],\n                \"state\": \"open\",\n                \"draft_changes\": draft_changes_json(\n                    \"dave\", new_state=\"resolved\"),\n                \"*\": \"*\",\n            })\n\n        frontend.json(\n            \"reviewablefilechanges/%d,%d\" % (changes[\"second\"][0][\"id\"],\n                                             changes[\"third\"][0][\"id\"]),\n            put={\n                \"draft_changes\": {\n                    \"new_is_reviewed\": True,\n                }\n            },\n            expect={\n                \"reviewablefilechanges\": [{\n                    \"id\": int,\n                    \"review\": review.id,\n                    \"changeset\": changesets[\"second\"][\"id\"],\n                    \"file\": review.getFileId(\"second\"),\n                    \"deleted_lines\": int,\n                    \"inserted_lines\": int,\n                    \"is_reviewed\": False,\n                    \"reviewed_by\": None,\n                    \"assigned_reviewers\": [instance.userid(\"bob\"),\n                                           instance.userid(\"dave\")],\n                    \"draft_changes\": {\n                        \"author\": instance.userid(\"dave\"),\n                        \"new_is_reviewed\": True,\n                        \"new_reviewed_by\": instance.userid(\"dave\"),\n                    },\n                }, {\n                    \"id\": int,\n                    \"review\": review.id,\n             
       \"changeset\": changesets[\"third\"][\"id\"],\n                    \"file\": review.getFileId(\"third\"),\n                    \"deleted_lines\": int,\n                    \"inserted_lines\": int,\n                    \"is_reviewed\": False,\n                    \"reviewed_by\": None,\n                    \"assigned_reviewers\": [instance.userid(\"bob\"),\n                                           instance.userid(\"dave\")],\n                    \"draft_changes\": {\n                        \"author\": instance.userid(\"dave\"),\n                        \"new_is_reviewed\": True,\n                        \"new_reviewed_by\": instance.userid(\"dave\"),\n                    },\n                }],\n            })\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"dave\", \"draft\",\n                              created_comments=[issues[\"dave\"][0]],\n                              resolved_issues=[issues[\"alice\"][0],\n                                               issues[\"alice\"][1]],\n                              reviewed_changes=[changes[\"second\"][0][\"id\"],\n                                                changes[\"third\"][0][\"id\"]]))\n\n    with frontend.signin(\"bob\"):\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"bob\", \"draft\",\n                              created_comments=[issues[\"bob\"][0]],\n                              resolved_issues=[issues[\"alice\"][0]],\n                              reviewed_changes=[changes[\"second\"][0][\"id\"]]))\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            post={\n                \"comment\": \"This looks good!\",\n            },\n            
expect=batch_json(review.id, \"bob\", \"published\",\n                              comment=int,\n                              created_comments=[issues[\"bob\"][0]],\n                              resolved_issues=[issues[\"alice\"][0]],\n                              reviewed_changes=[changes[\"second\"][0][\"id\"]]))\n\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"bob\", \"draft\"))\n\n    with frontend.signin(\"dave\"):\n        frontend.json(\n            \"reviews/%d/batches\" % review.id,\n            params={\n                \"unpublished\": \"yes\",\n            },\n            expect=batch_json(review.id, \"dave\", \"draft\",\n                              created_comments=[issues[\"dave\"][0]],\n                              resolved_issues=[issues[\"alice\"][1]],\n                              reviewed_changes=[changes[\"third\"][0][\"id\"]]))\n\n# eof\n"
  },
  {
    "path": "testing/tests/001-main/003-self/200-json/__init__.py",
    "content": "critic_json = { \"id\": 1,\n                \"name\": \"critic\",\n                \"path\": instance.repository_path(),\n                \"relative_path\": \"critic.git\",\n                \"url\": str }\n\nother_json = { \"id\": 2,\n               \"name\": \"other\",\n               \"path\": instance.repository_path(\"other\"),\n               \"relative_path\": \"other.git\",\n               \"url\": str }\n\ndef user_json(name, fullname=None, status=\"current\", no_email=False):\n    if fullname is None:\n        fullname = name.capitalize() + \" von Testing\"\n    if no_email:\n        email = None\n    else:\n        email = name + \"@example.org\"\n    return { \"id\": instance.userid(name),\n             \"name\": name,\n             \"fullname\": fullname,\n             \"status\": status,\n             \"email\": email }\n\ngeneric_commit_json = {\n    \"id\": int,\n    \"sha1\": str,\n    \"summary\": str,\n    \"message\": str,\n    \"parents\": list,\n    \"author\": {\n        \"name\": str,\n        \"email\": str,\n        \"timestamp\": float\n    },\n    \"committer\": {\n        \"name\": str,\n        \"email\": str,\n        \"timestamp\": float\n    },\n}\n\ndef reply_json(author):\n    return { \"id\": int,\n             \"is_draft\": bool,\n             \"author\": instance.userid(author),\n             \"timestamp\": float,\n             \"text\": \"This is a reply from %s.\" % author.capitalize() }\n\ndef batch_json(review_id, author, batch_type, **fields):\n    expected = {\n        \"id\": int,\n        \"is_empty\": not fields,\n        \"review\": review_id,\n        \"author\": instance.userid(author),\n        \"comment\": None,\n        \"timestamp\": float,\n        \"created_comments\": [],\n        \"written_replies\": [],\n        \"resolved_issues\": [],\n        \"reopened_issues\": [],\n        \"morphed_comments\": [],\n        \"reviewed_changes\": [],\n        \"unreviewed_changes\": [],\n    }\n\n    if 
batch_type == \"draft\":\n        expected.update({\n            \"id\": None,\n            \"timestamp\": None,\n        })\n\n    expected.update(fields)\n\n    return expected\n\ndef fetch_changeset(params, repository=\"critic\"):\n    params.setdefault(\"repository\", repository)\n\n    result = frontend.json(\n        \"changesets\",\n        params=params,\n        expected_http_status=[200, 202])\n\n    if \"error\" in result:\n        instance.synchronize_service(\"changeset\")\n\n        result = frontend.json(\n            \"changesets\",\n            params=params,\n            expect={\n                \"id\": int,\n                \"*\": \"*\"\n            })\n\n    return result\n\ndef draft_changes_json(author, **kwargs):\n    result = {\n        \"author\": instance.userid(author),\n        \"is_draft\": False,\n        \"reply\": None,\n        \"new_type\": None,\n        \"new_state\": None,\n        \"new_location\": None,\n    }\n    result.update(kwargs)\n    return result\n\n# eof\n"
  },
  {
    "path": "testing/tests/001-main/003-self/__init__.py",
    "content": "import os\n\nclass Review(object):\n    def __init__(self, workcopy, as_user, work_branch_name):\n        self.workcopy = workcopy\n        self.work_branch_name = work_branch_name\n        self.review_branch_name = \"r/\" + work_branch_name\n        self.summary = work_branch_name\n        self.as_user = as_user\n        self.sha1s = []\n        self.reference_commits = 0\n        self.pushed_commits = 0\n        self.files = {}\n        self.review_id = None\n        self.filters = []\n        self.users = set([as_user])\n\n        self.workcopy.run([\"checkout\", \"-b\", self.work_branch_name])\n\n    class File(object):\n        def __init__(self, filename):\n            self.filename = filename\n            self.content = None\n\n        def write(self, path):\n            filename = os.path.join(path, self.filename)\n            if self.content is None:\n                if os.path.isfile(filename):\n                    os.unlink(filename)\n            else:\n                if isinstance(self.content, list):\n                    content = \"\\n\".join(self.content) + \"\\n\"\n                else:\n                    content = self.content\n                if not os.path.isdir(os.path.dirname(filename)):\n                    os.makedirs(os.path.dirname(filename))\n                with open(filename, \"w\") as fileobj:\n                    fileobj.write(content)\n\n    def addFile(self, **files):\n        for key, filename in files.items():\n            self.files[key] = Review.File(filename)\n\n    def getFilename(self, key):\n        return self.files[key].filename\n\n    def getFileId(self, key):\n        result = frontend.json(\n            \"files\",\n            params={\n                \"path\": self.files[key].filename,\n            },\n            expect={\n                \"id\": int,\n                \"path\": self.files[key].filename,\n            })\n        return result[\"id\"]\n\n    def commit(self, message, **files):\n       
 if files.pop(\"reference\", False):\n            assert len(self.sha1s) == self.reference_commits\n            self.reference_commits += 1\n        for key, content in files.items():\n            self.files[key].content = content\n        for file_ in self.files.values():\n            file_.write(self.workcopy.path)\n            self.workcopy.run([\"add\", file_.filename])\n        self.workcopy.run([\"commit\", \"-m\", message])\n        self.sha1s.append(self.workcopy.run([\"rev-parse\", \"HEAD\"]).strip())\n\n    def addFilter(self, username, filter_type, path):\n        assert filter_type in (\"reviewer\", \"watcher\")\n        self.users.add(username)\n        self.filters.append({\n            \"username\": username,\n            \"type\": filter_type,\n            \"path\": path\n        })\n\n    def submit(self):\n        assert len(self.sha1s) > self.reference_commits\n\n        self.workcopy.run(\n            [\"push\", instance.repository_url(self.as_user), \"HEAD\"])\n        self.pushed_commits = len(self.sha1s)\n\n        with frontend.signin(self.as_user):\n            result = frontend.operation(\n                \"submitreview\",\n                data={\n                    \"repository\": \"critic\",\n                    \"branch\": self.review_branch_name,\n                    \"summary\": self.work_branch_name,\n                    \"commit_sha1s\": self.sha1s[self.reference_commits:],\n                    \"reviewfilters\": self.filters,\n                    \"frombranch\": self.work_branch_name,\n                })\n\n        # Set both attributes: tests read |review.id|, while push() below\n        # checks |review_id| to ensure submit() has been called.\n        self.id = self.review_id = result[\"review_id\"]\n\n        for username in self.users:\n            mailbox.pop(accept=[\n                testing.mailbox.ToRecipient(username + \"@example.org\"),\n                testing.mailbox.WithSubject(\"New Review: \" + self.summary)\n            ])\n\n    def push(self):\n        assert self.review_id is not None, \"call review.submit() first!\"\n\n        self.workcopy.run(\n            
[\"push\", instance.repository_url(self.as_user),\n             \"HEAD:%s\" % self.review_branch_name])\n\n        for username in self.users:\n            mailbox.pop(accept=[\n                testing.mailbox.ToRecipient(username + \"@example.org\"),\n                testing.mailbox.WithSubject(\"Updated Review: \" + self.summary)\n            ])\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/001-enable.py",
    "content": "# Enable extensions.\ninstance.extend(repository)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/001-tutorial.py",
    "content": "frontend.page(\"tutorial\",\n              expect={ \"document_title\": testing.expect.document_title(u\"Tutorials\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Tutorials\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                           \"extensions\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"extensions\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Critic Extensions\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Critic Extensions\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                           \"extensions\"),\n                       \"script_user\": testing.expect.script_no_user() })\n\nfrontend.page(\"tutorial\",\n              params={ \"item\": \"extensions-api\" },\n              expect={ \"document_title\": testing.expect.document_title(u\"Critic Extensions API\"),\n                       \"content_title\": testing.expect.paleyellow_title(0, u\"Critic Extensions API\"),\n                       \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                           \"extensions\"),\n                       \"script_user\": testing.expect.script_no_user() })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/002-manageextensions.py",
    "content": "frontend.page(\n    \"manageextensions\",\n    expect={ \"document_title\": testing.expect.document_title(u\"Manage Extensions\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"Available Extensions\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                 \"extensions\"),\n             \"script_user\": testing.expect.script_anonymous_user() })\n\nfrontend.page(\n    \"manageextensions\",\n    params={ \"what\": \"available\" },\n    expect={ \"document_title\": testing.expect.document_title(u\"Manage Extensions\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"Available Extensions\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                 \"extensions\"),\n             \"script_user\": testing.expect.script_anonymous_user() })\n\nfrontend.page(\n    \"manageextensions\",\n    params={ \"what\": \"installed\" },\n    expect={ \"document_title\": testing.expect.document_title(u\"Manage Extensions\"),\n             \"content_title\": testing.expect.paleyellow_title(0, u\"Installed Extensions\"),\n             \"pageheader_links\": testing.expect.pageheader_links(\"anonymous\",\n                                                                 \"extensions\"),\n             \"script_user\": testing.expect.script_anonymous_user() })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/003-install-TestExtension.py",
    "content": "def check_extension(installed):\n    def check(document):\n        tr_item = document.find(\"tr\", attrs={ \"class\": \"item\" })\n\n        td_name = tr_item.find(\"td\", attrs={ \"class\": \"name\" })\n        testing.expect.check(\"Extension:\", td_name.string)\n\n        td_value = tr_item.find(\"td\", attrs={ \"class\": \"value\" })\n\n        span_name = td_value.find(\"span\", attrs={ \"class\": \"name\" })\n        testing.expect.check(\"TestExtension\", span_name.contents[0].string)\n        testing.expect.check(\" hosted by Alice von Testing\", span_name.contents[1])\n\n        span_installed = td_value.find(\"span\", attrs={ \"class\": \"installed\" })\n        if installed:\n            testing.expect.check(\" [installed]\", span_installed.string)\n        elif span_installed:\n            testing.expect.check(\"<no installed indicator>\",\n                                 \"<found installed indicator>\")\n\n    return check\n\ntry:\n    instance.execute(\n        [\"sudo\", \"mkdir\", \"~alice/CriticExtensions\",\n         \"&&\",\n         \"sudo\", \"cp\", \"-R\", \"~/critic/testing/input/TestExtension\",\n         \"~alice/CriticExtensions\",\n         \"&&\",\n         \"sudo\", \"chown\", \"-R\", \"alice.critic\", \"~alice/CriticExtensions\",\n         \"&&\",\n         \"sudo\", \"chmod\", \"-R\", \"u+rwX,go+rX\", \"~alice/CriticExtensions\"])\n\n    instance.execute(\n        [\"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"init\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"add\", \".\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"commit\", \"-mInitial\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"checkout\", \"-b\", \"version/stable\",\n         \"&&\",\n         \"sudo\", \"su\", \"-c\", \"'echo stable:1 > version.txt'\", \"alice\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"add\", 
\"version.txt\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"commit\", \"-mStable:1\",\n         \"&&\",\n         \"sudo\", \"su\", \"-c\", \"'echo stable:2 > version.txt'\", \"alice\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"commit\", \"-mStable:2\", \"version.txt\",\n         \"&&\",\n         \"sudo\", \"-H\", \"-u\", \"alice\", \"git\", \"checkout\", \"master\",\n         \"&&\",\n         \"sudo\", \"su\", \"-c\", \"'echo live > version.txt'\", \"alice\"],\n        cwd=\"~alice/CriticExtensions/TestExtension\")\nexcept testing.InstanceError as error:\n    raise testing.TestFailure(error.message)\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"manageextensions\",\n        expect={ \"test_extension\": check_extension(False) })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.page(\n        \"manageextensions\",\n        expect={ \"test_extension\": check_extension(True) })\n\nfrontend.page(\n    \"manageextensions\",\n    expect={ \"test_extension\": check_extension(False) })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/001-echo.py",
    "content": "import json\n\ndef check_arguments(expected):\n    def check(document):\n        try:\n            result = json.loads(document)\n        except ValueError:\n            testing.expect.check(\"<valid json>\", repr(document))\n        else:\n            actual = result[\"arguments\"]\n            testing.expect.check(expected, actual)\n\n    return check\n\ndef check_json(expected):\n    def check(actual):\n        try:\n            return expected, json.loads(actual)\n        except ValueError:\n            return \"<valid JSON>\", actual\n    return check\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"echo\",\n        expected_content_type=\"text/json\",\n        expect={ \"json\": check_arguments(\n            [\"GET\", \"echo\", None]) })\n\n    frontend.page(\n        \"echo?foo=bar\",\n        expected_content_type=\"text/json\",\n        expect={ \"json\": check_arguments(\n            [\"GET\", \"echo\", { \"raw\": \"foo=bar\",\n                              \"params\": { \"foo\": \"bar\" }}]) })\n\n    frontend.page(\n        \"echo?foo=bar&x=10&y=20\",\n        expected_content_type=\"text/json\",\n        expect={ \"json\": check_arguments(\n            [\"GET\", \"echo\", { \"raw\": \"foo=bar&x=10&y=20\",\n                              \"params\": { \"foo\": \"bar\",\n                                          \"x\": \"10\",\n                                          \"y\": \"20\" }}]) })\n\n    frontend.operation(\n        \"echo\",\n        data={},\n        expect={ \"arguments\": [\"POST\", \"echo\", None],\n                 \"stdin\": check_json({}) })\n\n    frontend.operation(\n        \"echo\",\n        data={ \"foo\": \"bar\",\n               \"positions\": [{ \"x\": 10, \"y\": 20 },\n                             { \"x\": 11, \"y\": 21 }]},\n        expect={ \"arguments\": [\"POST\", \"echo\", None],\n                 \"stdin\": check_json({ \"foo\": \"bar\",\n                                       
\"positions\": [{ \"x\": 10, \"y\": 20 },\n                                                     { \"x\": 11, \"y\": 21 }]}) })\n\n# Verify that Alice's extension install doesn't affect Bob.\nwith frontend.signin(\"bob\"):\n    frontend.page(\"echo\", expected_http_status=404)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/002-nothandled.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.page(\"nothandled\", expected_http_status=404)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/003-empty.py",
    "content": "def empty(document):\n    if document != \"\":\n        testing.expect.check(\"<empty string>\", document)\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"empty\",\n        expected_content_type=\"text/plain\",\n        expect={ \"empty\": empty })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/004-Review.list.py",
    "content": "with frontend.signin(\"alice\"):\n    result = frontend.operation(\"Review.list\", data={})\n\n    for failed in result[\"failed\"]:\n        logger.error(\"Review.list: %(test)s: %(message)s\" % failed)\n\n    for passed in result[\"passed\"]:\n        logger.debug(\"Review.list: %(test)s: passed (%(result)s)\" % passed)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/005-MailTransaction.py",
    "content": "mailbox.check_empty()\n\nwith frontend.signin(\"alice\"):\n    to_alice = testing.mailbox.ToRecipient(\"alice@example.org\")\n    to_bob = testing.mailbox.ToRecipient(\"bob@example.org\")\n\n    frontend.operation(\n        \"MailTransaction\",\n        data={ \"mails\": [{ \"to\": [\"alice\", \"bob\"],\n                           \"subject\": \"MailTransaction test #1\",\n                           \"body\": \"This is the mail body.\\n\\nBye, bye.\" }] },\n        expect={ \"message\": None })\n\n    def recipients_equal(expected, actual):\n        return set(expected) == set(map(str.strip, actual.split(\",\")))\n\n    def check_mail1(mail):\n        testing.expect.check(\"Alice von Testing <alice@example.org>\",\n                             mail.header(\"From\"))\n        testing.expect.check([\"Alice von Testing <alice@example.org>\",\n                              \"Bob von Testing <bob@example.org>\"],\n                             mail.header(\"To\"), equal=recipients_equal)\n        testing.expect.check(\"MailTransaction test #1\",\n                             mail.header(\"Subject\"))\n        testing.expect.check([\"This is the mail body.\", \"\", \"Bye, bye.\"],\n                             mail.lines)\n\n    check_mail1(mailbox.pop(to_alice))\n    check_mail1(mailbox.pop(to_bob))\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/006-inject.py",
    "content": "import json\n\nfrom BeautifulSoup import Comment\n\ndef check_injected(key, expected):\n    def check(document):\n        comments = document.findAll(text=lambda text: isinstance(text, Comment))\n\n        for comment in comments:\n            if comment.strip().startswith(\"[alice/TestExtension] Extension error:\"):\n                logger.error(comment.strip())\n                return\n\n        for script in document.findAll(\"script\"):\n            if script.has_key(\"src\"):\n                src = script[\"src\"]\n                if src.startswith(\"data:text/javascript,var %s=\" % key) \\\n                        and src[-1] == \";\":\n                    injected = src[len(\"data:text/javascript,var %s=\" % key):-1]\n                    break\n        else:\n            testing.expect.check(\"<injected script>\",\n                                 \"<expected content not found>\")\n\n        try:\n            actual = json.loads(injected)\n        except ValueError:\n            testing.expect.check(\"<valid json>\", repr(injected))\n        else:\n            testing.expect.check(expected, actual)\n\n    return check\n\ndef check_not_injected(key):\n    def check(document):\n        comments = document.findAll(text=lambda text: isinstance(text, Comment))\n\n        for comment in comments:\n            if comment.strip().startswith(\"[alice/TestExtension] Extension error:\"):\n                logger.error(comment.strip())\n                return\n\n        for script in document.findAll(\"script\"):\n            if script.has_key(\"src\"):\n                src = script[\"src\"]\n                if src.startswith(\"data:text/javascript,var %s=\" % key) \\\n                        and src[-1] == \";\":\n                    testing.expect.check(\"<no injected script>\",\n                                         \"<injected script found>\")\n                    break\n\n    return check\n\ndef check_error(message):\n    def check(document):\n    
    comments = document.findAll(text=lambda text: isinstance(text, Comment))\n\n        for comment in comments:\n            comment = comment.strip()\n            if comment.startswith(\"[alice/TestExtension] Extension error:\"):\n                comment = comment[len(\"[alice/TestExtension] Extension error:\"):]\n                testing.expect.check(message, comment.strip())\n                return\n\n        testing.expect.check(\"<error message>\", \"<expected content not found>\")\n\n    return check\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"home\",\n        expect={ \"injected\": check_injected(\n                \"injected\",\n                [\"home\", None]) })\n\n    frontend.page(\n        \"home?foo=bar\",\n        expect={ \"injected\": check_injected(\n                \"injected\",\n                [\"home\", { \"raw\": \"foo=bar\",\n                           \"params\": { \"foo\": \"bar\" }}]) })\n\n    frontend.page(\n        \"home?foo=bar&x=10&y=20\",\n        expect={ \"injected\": check_injected(\n                \"injected\",\n                [\"home\", { \"raw\": \"foo=bar&x=10&y=20\",\n                           \"params\": { \"foo\": \"bar\",\n                                       \"x\": \"10\",\n                                       \"y\": \"20\" }}]) })\n\n    sha1 = repository.run([\"rev-parse\", \"master\"]).strip()\n\n    frontend.page(\n        \"critic/master\",\n        expect={ \"showcommitShort\": check_injected(\n                \"showcommitShort\",\n                [\"critic/master\", None]),\n                 \"showcommitLong\": check_injected(\n                \"showcommitLong\",\n                [\"showcommit\", { \"raw\": \"repository=critic&sha1=\" + sha1,\n                                 \"params\": { \"repository\": \"critic\",\n                                             \"sha1\": sha1 }}]) })\n\n    frontend.page(\n        \"showcommit?repository=critic&sha1=master\",\n        expect={ 
\"showcommitShort\": check_not_injected(\n                \"showcommitShort\"),\n                \"showcommitLong\": check_injected(\n                \"showcommitLong\",\n                [\"showcommit\", { \"raw\": \"repository=critic&sha1=master\",\n                                 \"params\": { \"repository\": \"critic\",\n                                             \"sha1\": \"master\" }}]) })\n\n    frontend.page(\n        \"home?expr=path\",\n        expect={\n            \"injected\": check_injected(\"injectedCustom\", \"home\")\n        })\n\n    frontend.page(\n        \"home\",\n        params={\n            \"expr\": \"while(true){}\"\n        },\n        expect={\n            \"injected\": check_error(\"Process timed out after 5 seconds\")\n        })\n\n# Verify that Alice's extension install doesn't affect Bob.\nwith frontend.signin(\"bob\"):\n    frontend.page(\n        \"home\",\n        expect={ \"injected\": check_not_injected(\"injected\") })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/007-version.py",
    "content": "def check_version(expected):\n    def check(actual):\n        testing.expect.check(expected, actual.strip())\n    return check\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"live\") })\n\nwith frontend.signin(\"bob\"):\n    frontend.page(\n        \"version\",\n        expected_http_status=404)\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\",\n               \"version\": \"version/stable\" })\n\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"stable:2\") })\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"live\") })\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\",\n               \"version\": \"version/stable\",\n               \"universal\": True },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"notallowed\" })\n\nwith frontend.signin(\"admin\"):\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\",\n               \"version\": 
\"version/stable\",\n               \"universal\": True })\n\nwith frontend.signin(\"bob\"):\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"stable:2\") })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"live\") })\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"stable:2\") })\n\nwith frontend.signin(\"admin\"):\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\",\n               \"universal\": True })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/008-processcommits.py",
    "content": "import os\nimport re\n\ndef to(name):\n    return testing.mailbox.ToRecipient(\"%s@example.org\" % name)\n\ndef about(subject):\n    return testing.mailbox.WithSubject(subject)\n\nFILENAME = \"008-processcommits.txt\"\nSUMMARY = \"Added %s\" % FILENAME\nSETTINGS = { \"review.createViaPush\": True }\n\nreview_id = None\n\nwith testing.utils.settings(\"alice\", SETTINGS), frontend.signin(\"alice\"):\n    with repository.workcopy() as work:\n        base_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n        work.run([\"remote\", \"add\", \"critic\",\n                  \"alice@%s:/var/git/critic.git\" % instance.hostname])\n\n        def commit(fixup_message=None):\n            if fixup_message:\n                full_message = \"fixup! %s\\n\\n%s\" % (SUMMARY, fixup_message)\n            else:\n                full_message = SUMMARY\n            work.run([\"add\", FILENAME])\n            work.run([\"commit\", \"-m\", full_message],\n                     GIT_AUTHOR_NAME=\"Alice von Testing\",\n                     GIT_AUTHOR_EMAIL=\"alice@example.org\",\n                     GIT_COMMITTER_NAME=\"Alice von Testing\",\n                     GIT_COMMITTER_EMAIL=\"alice@example.org\")\n            return work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n        def push():\n            output = work.run(\n                [\"push\", \"-q\", \"critic\",\n                 \"HEAD:refs/heads/r/008-processcommits\"])\n            all_lines = []\n            for line in output.splitlines():\n                if not line.startswith(\"remote:\"):\n                    continue\n                all_lines.append(line[len(\"remote:\"):].split(\"\\x1b\", 1)[0].strip())\n            extension_lines = []\n            for line in all_lines:\n                if line.startswith(\"[TestExtension] \"):\n                    extension_lines.append(line[len(\"[TestExtension] \"):])\n            return all_lines, extension_lines\n\n        with open(os.path.join(work.path, 
FILENAME), \"w\") as text_file:\n            print >>text_file, \"First line.\"\n\n        first_commit = commit()\n        all_lines, extension_lines = push()\n        next_is_review_url = False\n\n        for line in all_lines:\n            if line == \"Submitted review:\":\n                next_is_review_url = True\n            elif next_is_review_url:\n                review_id = int(re.search(r\"/r/(\\d+)$\", line).group(1))\n                break\n\n        testing.expect.check([\"processcommits.js::processcommits()\",\n                              \"===================================\",\n                              \"r/%d\" % review_id,\n                              \"%s..%s\" % (base_sha1[:8], first_commit[:8]),\n                              \"%s\" % first_commit[:8]],\n                             extension_lines)\n\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"New Review: %s\" % SUMMARY)])\n\n        with open(os.path.join(work.path, FILENAME), \"a\") as text_file:\n            print >>text_file, \"Second line.\"\n        second_commit = commit(\"Added second line\")\n\n        with open(os.path.join(work.path, FILENAME), \"a\") as text_file:\n            print >>text_file, \"Third line.\"\n        third_commit = commit(\"Added third line\")\n\n        with open(os.path.join(work.path, FILENAME), \"a\") as text_file:\n            print >>text_file, \"Fourth line.\"\n        fourth_commit = commit(\"Added fourth line\")\n\n        all_lines, extension_lines = push()\n\n        testing.expect.check([\"processcommits.js::processcommits()\",\n                              \"===================================\",\n                              \"r/%d\" % review_id,\n                              \"%s..%s\" % (first_commit[:8], fourth_commit[:8]),\n                              \"%s,%s,%s\" % (fourth_commit[:8],\n                                            third_commit[:8],\n                                            
second_commit[:8])],\n                             extension_lines)\n\n        mailbox.pop(accept=[to(\"alice\"),\n                            about(\"Updated Review: %s\" % SUMMARY)])\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/009-error-messages.py",
    "content": "import re\n\ndef check_compilation(document):\n    testing.expect.check(\n        expected=\"\"\"\\\nExtension failure: returned 1\nFailed to load 'error\\\\.compilation\\\\.js':\n  SyntaxError: Duplicate parameter name not allowed in this context\"\"\",\n        actual=document,\n        equal=re.match)\n\ndef check_runtime(document):\n    testing.expect.check(\n        expected=\"\"\"\\\nExtension failure: returned 1\nFailed to call 'error\\\\.runtime\\\\.js::test\\\\(\\\\)':\n  CriticError: nosuchuser: no such user\n    new CriticUser\\\\(\\\\) at <Library>/critic-user\\\\.js:\\\\d+\"\"\",\n        actual=document,\n        equal=re.match)\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"error.compilation\",\n        expected_content_type=\"text/plain\",\n        expected_http_status=500,\n        expect={ \"message\": check_compilation })\n\n    frontend.page(\n        \"error.runtime\",\n        expected_content_type=\"text/plain\",\n        expected_http_status=500,\n        expect={ \"message\": check_runtime })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/010-restrictions.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.operation(\n        \"restrictions\",\n        data={},\n        expect={ \"database_connection\": \"PostgreSQL is not defined\" })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/011-User.py",
    "content": "DUMP_USER = \"\"\"\\\nvar user = %s;\nreturn [user.name, user.email, user.fullname, user.isAnonymous];\"\"\"\n\ndef dump_user(user):\n    return DUMP_USER % user\n\nwith frontend.signin(\"alice\"):\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"critic.User.current\") },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"new critic.User(%d)\"\n                                   % instance.userid(\"alice\")) },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"new critic.User({ id: %d })\"\n                                   % instance.userid(\"alice\")) },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"new critic.User('alice')\") },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"new critic.User({ name: 'alice' })\") },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": dump_user(\"new critic.User({ id: %d, name: 'alice' })\"\n                                   % instance.userid(\"alice\")) },\n        expect={ \"result\": [\"alice\", \"alice@example.org\", \"Alice von Testing\", False] })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/012-resources.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.page(\n        \"extension-resource/TestExtension/helloworld.html\",\n        expected_content_type=\"text/html\")\n\n    frontend.page(\n        \"extension-resource/TestExtension/helloworld.css\",\n        expected_content_type=\"text/css\")\n\n    # This resource has an extra period in its name, the check that this doesn't\n    # interfere with the content type guessing.\n    frontend.page(\n        \"extension-resource/TestExtension/hello.world.js\",\n        expected_content_type=\"text/javascript\")\n\n    frontend.page(\n        \"extension-resource/TestExtension/helloworld.txt\",\n        expected_http_status=404)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/013-storage.py",
    "content": "with frontend.signin(\"alice\"):\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.has('the key');\" },\n        expect={ \"result\": False })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.get('the key');\" },\n        expect={ \"result\": None })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"critic.storage.set('the key', 'the value');\" },\n        expect={ \"result\": None })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.has('the key');\" },\n        expect={ \"result\": True })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.has('the other key');\" },\n        expect={ \"result\": False })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.get('the key');\" },\n        expect={ \"result\": \"the value\" })\n\n    frontend.operation(\n        \"clearextensionstorage\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.has('the key');\" },\n        expect={ \"result\": False })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.get('the key');\" },\n        expect={ \"result\": None })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"\"\"\ncritic.storage.set('a', '1');\ncritic.storage.set('b', '4');\ncritic.storage.set('aa', '2');\ncritic.storage.set('bb', '5');\ncritic.storage.set('aaa', '3');\"\"\" },\n        expect={ \"result\": None })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.list();\" },\n        expect={ \"result\": [\"a\", \"aa\", \"aaa\", \"b\", \"bb\"] 
})\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.list({ like: 'a%' });\" },\n        expect={ \"result\": [\"a\", \"aa\", \"aaa\"] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.list({ like: 'aa%' });\" },\n        expect={ \"result\": [\"aa\", \"aaa\"] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.list({ regexp: 'a+' });\" },\n        expect={ \"result\": [\"a\", \"aa\", \"aaa\"] })\n\n    frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"return critic.storage.list({ regexp: '[ab]*' });\" },\n        expect={ \"result\": [\"a\", \"aa\", \"aaa\", \"b\", \"bb\"] })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/014-Repository.run.py",
    "content": "import re\n\nRE_NAME_EMAIL = re.compile(r\"(author|committer)\\s+(.*?)\\s+<(.*?)>\")\n\nwith frontend.signin(\"alice\"):\n    result = frontend.operation(\n        \"evaluate\",\n        data={ \"source\": \"\"\"\\\nvar repository = new critic.Repository(\"critic\");\nvar tree_sha1 = repository.revparse(\"HEAD^{tree}\");\nvar parent_sha1 = repository.revparse(\"HEAD^\");\nvar workcopy = repository.getWorkCopy();\nvar sha1 = workcopy.run(\"commit-tree\", tree_sha1, \"-p\", parent_sha1,\n                        { stdin: \"Fix some stuff!\\\\n\\\\nFTW!\\\\n\",\n                          GIT_AUTHOR_NAME: \"Bob von Testing\",\n                          GIT_AUTHOR_EMAIL: \"bob@example.com\",\n                          GIT_COMMITTER_NAME: \"Alice von Testing\",\n                          GIT_COMMITTER_EMAIL: \"alice@example.com\" });\nsha1 = sha1.trim(); // includes a line-break\nworkcopy.run(\"push\", \"origin\", sha1 + \":refs/heads/014-Repository.run-1\");\nreturn sha1;\"\"\" })\n\n    with repository.workcopy() as work:\n        REMOTE_URL = instance.repository_url(\"alice\")\n\n        work.run([\"fetch\", REMOTE_URL, \"refs/heads/014-Repository.run-1\"])\n\n        message = None\n\n        for line in work.run([\"cat-file\", \"commit\", \"FETCH_HEAD\"]).splitlines():\n            if message is None:\n                if not line:\n                    message = []\n                    continue\n\n                match = RE_NAME_EMAIL.match(line)\n                if match:\n                    field, name, email = match.groups()\n\n                    if field == \"author\":\n                        testing.expect.check(\"Bob von Testing\", name)\n                        testing.expect.check(\"bob@example.com\", email)\n                    else:\n                        testing.expect.check(\"Alice von Testing\", name)\n                        testing.expect.check(\"alice@example.com\", email)\n                elif not (line.startswith(\"tree\") or 
line.startswith(\"parent\")):\n                    testing.logger.error(\"Unexpected line: %r\" % line)\n            else:\n                message.append(line)\n\n        testing.expect.check([\"Fix some stuff!\", \"\", \"FTW!\"], message)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/015-filterhook.py",
    "content": "import os\n\nwith_class = testing.expect.with_class\nextract_text = testing.expect.extract_text\n\ndef check_echo_filter_type(document):\n    filterdialog = document.find(\"div\", attrs=with_class(\"filterdialog\"))\n    type_select = filterdialog.find(\"select\", attrs={ \"name\": \"type\" })\n    type_options = type_select.findAll(\"option\")\n\n    testing.expect.check(4, len(type_options))\n    testing.expect.check(\"echo\", extract_text(type_options[-1]))\n    testing.expect.check(\"extensionhook\", type_options[-1][\"value\"])\n    testing.expect.check(\"echo\", type_options[-1][\"data-filterhook-name\"])\n\n    try:\n        int(type_options[-1][\"data-extension-id\"])\n    except (KeyError, ValueError):\n        testing.logger.error(\"invalid or missing data-extension-id attribute\")\n\ndef check_echo_filter(filter_id):\n    def check(document):\n        filters = document.find(\"table\", attrs=with_class(\"filters\"))\n        for tr in filters.findAll(\"tr\"):\n            path_td = tr.find(\"td\", attrs=with_class(\"path\"))\n            if not path_td or extract_text(path_td) != \"015-filterhook/include/\":\n                continue\n            title_td = tr.find(\"td\", attrs=with_class(\"title\"))\n            testing.expect.check(\"echo\", extract_text(title_td))\n            data_td = tr.find(\"td\", attrs=with_class(\"data\"))\n            testing.expect.check(\"this is the data\", extract_text(data_td))\n            break\n        else:\n            testing.logger.error(\"echo extension hook filter not found\")\n\n    return check\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"home\",\n        expect={ \"check echo filter type\": check_echo_filter_type })\n\n    result = frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/TestExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": 
\"echo\",\n               \"path\": \"015-filterhook/include/\",\n               \"data\": \"this is the data\" })\n\n    filter_id = result[\"filter_id\"]\n\n    frontend.page(\n        \"home\",\n        expect={ \"check echo filter\": check_echo_filter(filter_id) })\n\nwith repository.workcopy() as work:\n    work.run([\"checkout\", \"-b\", \"r/015-filterhook\"])\n\n    include = os.path.join(work.path, \"015-filterhook\", \"include\")\n    exclude = os.path.join(work.path, \"015-filterhook\", \"exclude\")\n\n    os.makedirs(include)\n    os.makedirs(exclude)\n\n    def make(directory, filename):\n        with open(os.path.join(directory, filename), \"w\") as file:\n            print >>file, filename\n\n    make(include, \"file1\")\n    make(include, \"file2\")\n    make(exclude, \"file3\")\n\n    work.run([\"add\", \"015-filterhook\"])\n    work.run([\"commit\", \"-mFirst\"])\n\n    review_id = testing.utils.createReviewViaPush(work, \"alice\")\n\n    instance.synchronize_service(\"extensiontasks\")\n    instance.synchronize_service(\"maildelivery\")\n\n    # Ignore this mail; not very interesting in this context.\n    mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"New Review: First\")])\n\n    to_alice = mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"filterhook.js::filterhook\")])\n\n    testing.expect.check(\n        ['data: \"this is the data\"',\n         'review.id: %d' % review_id,\n         'user.name: alice',\n         'commits: [\"First\\\\n\"]',\n         'files: [\"015-filterhook/include/file1\",\"015-filterhook/include/file2\"]'],\n        to_alice.lines)\n\n    mailbox.check_empty()\n\n    make(exclude, \"file4\")\n\n    work.run([\"add\", \"015-filterhook\"])\n    work.run([\"commit\", \"-mSecond\"])\n\n    make(include, \"file5\")\n    make(include, \"file6\")\n\n    work.run([\"add\", \"015-filterhook\"])\n    
work.run([\"commit\", \"-mThird\"])\n\n    work.run([\"push\", \"bob@%s:/var/git/critic.git\" % instance.hostname,\n              \"HEAD\"])\n\n    instance.synchronize_service(\"extensiontasks\")\n    instance.synchronize_service(\"maildelivery\")\n\n    # Ignore this mail; not very interesting in this context.\n    mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"Updated Review: First\")])\n\n    to_alice = mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"filterhook.js::filterhook\")])\n\n    testing.expect.check(\n        ['data: \"this is the data\"',\n         'review.id: %d' % review_id,\n         'user.name: bob',\n         'commits: [\"Third\\\\n\",\"Second\\\\n\"]',\n         'files: [\"015-filterhook/include/file5\",\"015-filterhook/include/file6\"]'],\n        to_alice.lines)\n\n    mailbox.check_empty()\n\n    make(include, \"explode\")\n\n    work.run([\"add\", \"015-filterhook\"])\n    work.run([\"commit\", \"-mExplode\"])\n\n    explode_sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"push\", \"bob@%s:/var/git/critic.git\" % instance.hostname,\n              \"HEAD\"])\n\n    instance.synchronize_service(\"extensiontasks\")\n    instance.synchronize_service(\"maildelivery\")\n\n    # Ignore this mail; not very interesting in this context.\n    mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"Updated Review: First\")])\n\n    to_alice = mailbox.pop(\n        [testing.mailbox.ToRecipient(\"alice@example.org\"),\n         testing.mailbox.WithSubject(\"Failed: echo\")])\n\n    testing.expect.check(\n         ['An error occurred while processing an extension hook filter event!',\n          '',\n          'Filter details:',\n          '',\n          '  Extension:   TestExtension hosted by Alice von Testing',\n          '  Filter hook: 
echo',\n          '  Repository:  critic',\n          '  Path:        015-filterhook/include/',\n          '  Data:        \"this is the data\"',\n          '',\n          'Event details:',\n          '',\n          '  Review:  r/%d \"First\"' % review_id,\n          '  Commits: %s \"Explode\"' % explode_sha1[:8],\n          '',\n          'Error details:',\n          '',\n          '  Error:  Process returned non-zero exit status 1',\n          '  Output:',\n          '',\n          \"    Failed to call 'filterhook.js::filterhook()':\",\n          '      Error: Boom!',\n          '        Error: Boom!',\n          '            at filterhook.js:9:15',\n          '            at filterhook (filterhook.js:6:9)',\n          '',\n          '-- critic'],\n         to_alice.lines)\n\n    mailbox.check_empty()\n\nwith frontend.signin(\"bob\"):\n    frontend.operation(\n        \"deleteextensionhookfilter\",\n        data={ \"subject\": \"bob\",\n               \"filter_id\": filter_id },\n        expect={ \"status\": \"failure\",\n                 \"message\": (\"Filter to delete does not exist \"\n                             \"or belongs to another user!\") })\n\n    frontend.operation(\n        \"deleteextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"filter_id\": filter_id },\n        expect={ \"status\": \"failure\",\n                 \"message\": (\"Operation not permitted, user \"\n                             \"that lacks role 'administrator'.\") })\n\nwith frontend.signin(\"admin\"):\n    frontend.operation(\n        \"deleteextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"filter_id\": filter_id })\n\n    result = frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/TestExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": \"echo\",\n               \"path\": 
\"015-filterhook/include/\",\n               \"data\": \"this is the data\" })\n\n    filter_id = result[\"filter_id\"]\n\nwith frontend.signin(\"alice\"):\n    result = frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/TestExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": \"echo\",\n               \"path\": \"015-filterhook/include/\",\n               \"data\": \"this is the data\",\n               \"replaced_filter_id\": filter_id })\n\n    filter_id = result[\"filter_id\"]\n\n    frontend.operation(\n        \"deleteextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"filter_id\": filter_id })\n\n    frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/TestExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": \"explode\",\n               \"path\": \"/\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidrequest\",\n                 \"message\": (\"The extension doesn't have a filter \"\n                             \"hook role named 'explode'!\") })\n\n    frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/WrongExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": \"echo\",\n               \"path\": \"/\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidextension\",\n                 \"message\": (\"Invalid or inaccessible extension dir: \"\n                             \"/home/alice/CriticExtensions/WrongExtension\") })\n\n    frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": 4711,\n               \"repository\": 
\"critic\",\n               \"filterhook_name\": \"echo\",\n               \"path\": \"/\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidextension\",\n                 \"message\": \"Invalid extension id: 4711\" })\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    frontend.operation(\n        \"addextensionhookfilter\",\n        data={ \"subject\": \"alice\",\n               \"extension\": \"alice/TestExtension\",\n               \"repository\": \"critic\",\n               \"filterhook_name\": \"echo\",\n               \"path\": \"/\" },\n        expect={ \"status\": \"failure\",\n                 \"code\": \"invalidrequest\",\n                 \"message\": (\"The extension \\\"TestExtension hosted by Alice \"\n                             \"von Testing\\\" must be installed first!\") })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/016-accesscontrol.py",
    "content": "# Create an access token, and restrict it to not allow execution of\n# Alice's TestExtension.\nwith frontend.signin(\"alice\"):\n    access_token = frontend.json(\n        \"users/me/accesstokens\",\n        post={ \"title\": \"token for 016-accesscontrol.py\" })\n\n    extension = frontend.json(\n        \"extensions\",\n        params={ \"key\": \"alice/TestExtension\" })\n\n    frontend.json(\n        (\"users/me/accesstokens/%d/profile/extensions/exceptions\"\n         % access_token[\"id\"]),\n        post={ \"access_type\": \"execute\",\n               \"extension\": \"alice/TestExtension\" },\n        expect={\n            \"profile/extensions/exceptions\": [{\n                \"id\": int,\n                \"access_type\": \"execute\",\n                \"extension\": extension[\"id\"]\n            }]\n        })\n\nwith frontend.signin(access_token=access_token):\n    # Trying to execute an extension role should give a \"403 Forbidden\".\n    frontend.page(\n        \"echo\",\n        expected_http_status=403)\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/004-TestExtension/999-missing.py",
    "content": "document_title = testing.expect.document_title\n\ndef check_version(expected):\n    def check(actual):\n        testing.expect.check(expected, actual.strip())\n    return check\n\nwith frontend.signin(\"alice\"):\n    # Check that alice still has the LIVE version installed.\n    frontend.page(\n        \"version\",\n        expected_content_type=\"text/plain\",\n        expect={ \"version\": check_version(\"live\") })\n\ninstance.execute([\"sudo\", \"rm\", \"-rf\", \"~alice/CriticExtensions\"])\n\nwith frontend.signin(\"alice\"):\n    # Check that the extension is ignored, and that /version just returns 404.\n    frontend.page(\n        \"version\",\n        expected_http_status=404)\n\n    # Check that /home, where the extension injects things, loads as expected.\n    frontend.page(\n        \"home\",\n        expect={ \"title\": document_title(u\"Alice von Testing's Home\") })\n\n    # Check that /manageextensions also loads as expected.\n    frontend.page(\n        \"manageextensions\",\n        expect={ \"title\": document_title(u\"Manage Extensions\") })\n\n    # ... even with what=installed.\n    frontend.page(\n        \"manageextensions\",\n        params={ \"what\": \"installed\" },\n        expect={ \"title\": document_title(u\"Manage Extensions\") })\n\n    # Check that the extension can be uninstalled.\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"author_name\": \"alice\",\n               \"extension_name\": \"TestExtension\" })\n\n    # Check that /manageextensions still loads.\n    frontend.page(\n        \"manageextensions\",\n        params={ \"what\": \"installed\" },\n        expect={ \"title\": document_title(u\"Manage Extensions\") })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/005-install-SystemExtension.py",
    "content": "def check_extension(installed):\n    def check(document):\n        tr_items = document.findAll(\"tr\", attrs={ \"class\": \"item\" })\n\n        for tr_item in tr_items:\n            td_name = tr_item.find(\"td\", attrs={ \"class\": \"name\" })\n            testing.expect.check(\"Extension:\", td_name.string)\n\n            td_value = tr_item.find(\"td\", attrs={ \"class\": \"value\" })\n            span_name = td_value.find(\"span\", attrs={ \"class\": \"name\" })\n\n            if span_name.contents[0].string != \"SystemExtension\":\n                # Wrong extension.\n                continue\n\n            testing.expect.check(1, len(span_name.contents))\n\n            span_installed = td_value.find(\"span\", attrs={ \"class\": \"installed\" })\n            if installed:\n                testing.expect.check(\" [installed]\", span_installed.string)\n            elif span_installed:\n                testing.expect.check(\"<no installed indicator>\",\n                                     \"<found installed indicator>\")\n\n            return\n        else:\n            testing.expect.check(\"<SystemExtension entry>\",\n                                 \"<expected content not found>\")\n\n    return check\n\ndef check_helloworld(document):\n    testing.expect.check(\"Hello world!\\n\", document)\n\ntry:\n    instance.execute(\n        [\"sudo\", \"mkdir\", \"/var/lib/critic/extensions\",\n         \"&&\",\n         \"sudo\", \"cp\", \"-R\", \"~/critic/testing/input/SystemExtension\",\n         \"/var/lib/critic/extensions\",\n         \"&&\",\n         \"sudo\", \"chown\", \"-R\", \"critic.critic\",\n         \"/var/lib/critic/extensions\",\n         \"&&\",\n         \"sudo\", \"chmod\", \"-R\", \"u+rwX,go+rX\",\n         \"/var/lib/critic/extensions\"])\nexcept testing.InstanceError as error:\n    raise testing.TestFailure(error.message)\n\nwith frontend.signin(\"alice\"):\n    frontend.page(\n        \"manageextensions\",\n        expect={ 
\"system_extension\": check_extension(installed=False) })\n\n    frontend.operation(\n        \"installextension\",\n        data={ \"extension_name\": \"SystemExtension\" })\n\n    frontend.page(\n        \"manageextensions\",\n        expect={ \"system_extension\": check_extension(installed=True) })\n\n    frontend.operation(\n        \"check\",\n        data={})\n\n    frontend.page(\n        \"extension-resource/SystemExtension/HelloWorld.txt\",\n        expected_content_type=\"text/plain\",\n        expect={ \"hello_world\": check_helloworld })\n\nfrontend.page(\n    \"manageextensions\",\n    expect={ \"system_extension\": check_extension(False) })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/002-tests/006-manifest-checks.py",
    "content": "import os\nimport tempfile\n\ninstance.execute(\n    [\"mkdir\", \"-p\", \"~/CriticExtensions/InvalidExtension\",\n     \"&&\",\n     \"chmod\", \"u+rwX,go+rX\", \"~/CriticExtensions\"],\n    as_user=\"alice\")\n\nEXTENSION_PATH = \"/home/alice/CriticExtensions/InvalidExtension\"\nMANIFEST_PATH = os.path.join(EXTENSION_PATH, \"MANIFEST\")\n\nclass TransferredFile(object):\n    def __init__(self, target_name, source):\n        self.target_path = os.path.join(EXTENSION_PATH, target_name)\n        self.source = source\n    def __enter__(self, *args):\n        source_file = tempfile.NamedTemporaryFile()\n        source_file.write(self.source)\n        source_file.flush()\n        instance.copyto(source_file.name, self.target_path, as_user=\"alice\")\n        instance.execute([\"chmod\", \"g+r\", self.target_path], as_user=\"alice\")\n        source_file.close()\n        return self\n    def __exit__(self, *args):\n        instance.execute([\"rm\", \"-f\", self.target_path], as_user=\"alice\")\n\ndef error_message(linenr, message):\n    expected = \"%s:%d: manifest error: %s\" % (MANIFEST_PATH, linenr, message)\n    def check(actual):\n        testing.expect.check(expected, actual)\n    return check\n\ndef injected_script(document):\n    scripts = document.findAll(\"script\")\n    expected = \"<injected script>\"\n    actual = \"<expected content not found>\"\n    for script in scripts:\n        if script[\"src\"] == \"injected\":\n            actual = expected\n    testing.expect.check(expected, actual)\n\nscript_js = TransferredFile(\"script.js\", \"\"\"\\\nfunction page(method, path, query) {\n  writeln(\"200\");\n  writeln(\"Content-Type: text/json\");\n  writeln();\n  writeln(\"%r\", { status: 'ok', method: method, path: path, query: query });\n}\n\nfunction inject(path, query) {\n  writeln(\"script %r\", \"injected\");\n}\n\"\"\")\n\nwith frontend.signin(\"alice\"):\n    with TransferredFile(\"MANIFEST\", \"\"\"\\\nAuthor = Alice von Testing 
<alice@example.org>\nDescription = Extension with invalid MANIFEST\n\n[Page /foo]\nDescription = Page role with invalid pattern\nScript = script.js\nFunction = page\n\"\"\"):\n        frontend.page(\n            \"loadmanifest\",\n            params={ \"key\": \"alice/InvalidExtension\" },\n            expected_content_type=\"text/plain\",\n            expect={ \"error_message\": error_message(4, \"path pattern should not start with a '/'\") })\n\n    with script_js, TransferredFile(\"MANIFEST\", \"\"\"\\\nAuthor = Alice von Testing <alice@example.org>\nDescription = Extension with soon to be missing MANIFEST\n\n[Page foo]\nDescription = Dummy page role\nScript = script.js\nFunction = page\n\n[Inject tutorial]\nDescription = Dummy inject role\nScript = script.js\nFunction = inject\n\"\"\"):\n        frontend.operation(\n            \"installextension\",\n            data={ \"extension_name\": \"InvalidExtension\",\n                   \"author_name\": \"alice\" })\n\n        frontend.operation(\n            \"foo\",\n            data={},\n            expect={ \"method\": \"POST\",\n                     \"path\": \"foo\" })\n\n        frontend.page(\n            \"tutorial\",\n            expect={ \"injected_script\": injected_script })\n\n    frontend.page(\n        \"foo\",\n        expected_http_status=404)\n\n    frontend.page(\"tutorial\")\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"extension_name\": \"InvalidExtension\",\n               \"author_name\": \"alice\" })\n\n    with TransferredFile(\"MANIFEST\", \"\"\"\\\nAuthor = Alice von Testing <alice@example.org>\nDescription = Soon to be inaccessible extension\n\n[Page foo]\nDescription = Dummy page role\nScript = script.js\nFunction = page\n\"\"\"):\n        frontend.operation(\n            \"installextension\",\n            data={ \"extension_name\": \"InvalidExtension\",\n                   \"author_name\": \"alice\" })\n\n        instance.execute(\n            [\"chmod\", 
\"go-rx\", \"~/CriticExtensions/InvalidExtension\"],\n            as_user=\"alice\")\n\n        frontend.page(\n            \"foo\",\n            expected_http_status=404)\n\n    frontend.operation(\n        \"uninstallextension\",\n        data={ \"extension_name\": \"InvalidExtension\",\n               \"author_name\": \"alice\" })\n"
  },
  {
    "path": "testing/tests/001-main/004-extensions/__init__.py",
    "content": "# @flag extensions\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/001-local/001-independence.py",
    "content": "# These tests simply check that some modules can be imported.\n\ninstance.unittest(\"base\", [\"independence\"])\ninstance.unittest(\"dbutils\", [\"independence\"])\ninstance.unittest(\"textutils\", [\"independence\"])\ninstance.unittest(\"htmlutils\", [\"independence\"])\ninstance.unittest(\"operation\", [\"independence\"])\ninstance.unittest(\"extensions\", [\"independence\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/001-local/002-operation.py",
    "content": "# @dependency 001-main/005-unittests/001-local/001-independence.py\n# @flag local\n\n# These tests run the \"basic\" unit tests of the operation helper modules.\n\ninstance.unittest(\"operation.basictypes\", [\"basic\"])\ninstance.unittest(\"operation.typechecker\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/001-local/005-dbutils.database.py",
    "content": "instance.unittest(\"dbutils.database\", [\"analyzeQuery\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/001-local/__init__.py",
    "content": "# @dependency none\n# @flag local\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/001-commit.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\ninstance.unittest(\"api.commit\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/002-review.py",
    "content": "# @dependency 001-main/003-self/004-createreview.py\n\ninstance.unittest(\"api.review\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/003-user.py",
    "content": "# @dependency 001-main/001-empty/003-criticctl/002-adduser-deluser.py\n# @dependency 001-main/001-empty/004-mixed/003-oauth.py\n# @dependency 001-main/001-empty/004-mixed/004-password.py\n# @dependency 001-main/003-self/028-gitemails.py\n\nargs = []\nif not testing.has_flag(instance.install_commit,\n                        \"reliable-git-emails\"):\n    args.append(\"--unreliable-git-emails\")\nif not testing.has_flag(instance.install_commit,\n                        \"reliable-admin-newswriter\"):\n    args.append(\"--unreliable-admin-newswriter\")\ninstance.unittest(\"api.user\", [\"basic\"], args)\n\nsettings_per_user = testing.utils.settings(\n    \"alice\", { \"commit.diff.visualTabs\": True })\nsettings_per_repository = testing.utils.settings(\n    \"alice\", { \"commit.expandAllFiles\": True },\n    repository=\"critic\")\n\nwith settings_per_user, settings_per_repository:\n    instance.unittest(\"api.user\", [\"preferences\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/004-config.py",
    "content": "# @dependency 001-main/000-install.py\n\ninstance.unittest(\"api.config\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/005-log.partition.py",
    "content": "# @dependency 001-main/003-self/020-reviewrebase.py\n\ninstance.unittest(\"api.log.partition\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/006-log.rebase.py",
    "content": "# @dependency 001-main/003-self/012-replayrebase.py\n# @dependency 001-main/003-self/020-reviewrebase.py\n\ninstance.unittest(\"api.log.rebase\", [\"basic\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/007-repository.py",
    "content": "# @dependency 001-main/003-self/028-gitemails.py\n\nHEAD = repository.run([\"rev-parse\", \"HEAD\"]).strip()\nSHA1 = \"66f25ae79dcc5e200b136388771b5924a1b5ae56\"\n\nwith repository.workcopy() as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    work.run([\"tag\", \"007-repository/simple-tag\"])\n    work.run([\"push\", REMOTE_URL, \"007-repository/simple-tag\"])\n\n    work.run([\"tag\", \"-mAnnotated\", \"007-repository/annotated-tag\"])\n    work.run([\"push\", REMOTE_URL, \"007-repository/annotated-tag\"])\n\n    try:\n        instance.unittest(\"api.repository\", [\"basic\"],\n                          args=[\"--head=\" + HEAD,\n                                \"--sha1=\" + SHA1,\n                                \"--path=\" + instance.repository_path()])\n    finally:\n        work.run([\"push\", REMOTE_URL,\n                  \":007-repository/simple-tag\",\n                  \":007-repository/annotated-tag\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/008-branch.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nSHA1 = \"66f25ae79dcc5e200b136388771b5924a1b5ae56\"\n\nwith repository.workcopy() as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    work.run([\"checkout\", \"-b\", \"008-branch\", SHA1])\n    work.run([\"rebase\", \"--force-rebase\", \"HEAD~5\"])\n    work.run([\"push\", REMOTE_URL, \"008-branch\"])\n\n    sha1 = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    try:\n        instance.unittest(\"api.branch\", [\"basic\"],\n                          args=[\"--sha1=\" + sha1,\n                                \"--name=008-branch\"])\n    finally:\n        work.run([\"push\", REMOTE_URL, \":008-branch\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/009-commitset.py",
    "content": "# @dependency 001-main/002-createrepository.py\n\nimport os\nimport time\n\nwith repository.workcopy() as work:\n    REMOTE_URL = instance.repository_url(\"alice\")\n\n    os.mkdir(os.path.join(work.path, \"009-commitset\"))\n\n    # Generate this set of commits:\n    #\n    #  (X)\n    #   |\n    #   A\n    #  / \\\n    # C   B  (Y)\n    # |   |\\ /\n    # D   | G\n    #  \\ /  |\n    #   E   H\n    #   |   |\\\n    #   F   K \\\n    #    \\ /   I\n    #     M    |\n    #     |    J\n    #     N\n    #     |\n    #     L\n    #\n    # Commits are named in (committer date) chronological order; A is oldest, M\n    # is youngest.  X and Y are the tails of the set.\n\n    commits = {}\n    timestamp = int(time.time()) - 3600\n\n    def commit(letter, delta=0):\n        global timestamp\n        filename = os.path.join(work.path, \"009-commitset\", letter)\n        with open(filename, \"w\") as file:\n            print >>file, letter\n        work.run([\"add\", os.path.join(\"009-commitset\", letter)])\n        work.run([\"commit\", \"-m\" + letter],\n                 GIT_COMMITTER_DATE=\"%d +0000\" % (timestamp + delta))\n        timestamp += 10\n        commits[letter] = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    def merge(letter, what, delta=0):\n        global timestamp\n        work.run([\"merge\", \"-m\" + letter, what],\n                 GIT_COMMITTER_DATE=\"%d +0000\" % (timestamp + delta))\n        timestamp += 10\n        commits[letter] = work.run([\"rev-parse\", \"HEAD\"]).strip()\n\n    work.run([\"checkout\", \"-b\", \"X\"])\n    commit(\"X\")\n\n    work.run([\"checkout\", \"-b\", \"Y\"])\n    commit(\"Y\")\n\n    work.run([\"checkout\", \"-b\", \"ACD\", \"X\"])\n    commit(\"A\")\n\n    work.run([\"checkout\", \"-b\", \"B\"])\n    commit(\"B\")\n\n    work.run([\"checkout\", \"ACD\"])\n    commit(\"C\")\n    commit(\"D\")\n\n    work.run([\"checkout\", \"-b\", \"EF\"])\n    merge(\"E\", \"B\")\n    commit(\"F\")\n\n    
work.run([\"checkout\", \"-b\", \"GHK\", commits[\"B\"]])\n    merge(\"G\", \"Y\")\n    commit(\"H\")\n\n    work.run([\"checkout\", \"-b\", \"IJ\"])\n    commit(\"I\")\n    commit(\"J\")\n\n    work.run([\"checkout\", \"GHK\"])\n    commit(\"K\")\n\n    work.run([\"checkout\", \"-b\", \"MNL\", commits[\"F\"]])\n    merge(\"M\", \"GHK\", delta=10)\n    commit(\"N\", delta=10)\n    commit(\"L\", delta=-20)\n\n    work.run([\"push\", REMOTE_URL] +\n             [\"%s:refs/heads/009-commitset/%s\" % (sha1, letter)\n              for letter, sha1 in commits.items()])\n\n    try:\n        instance.unittest(\"api.commitset\", [\"basic\"],\n                          args=[\"--prefix=009-commitset/\"])\n    finally:\n        work.run([\"push\", REMOTE_URL] +\n                 [\":refs/heads/009-commitset/%s\" % letter\n                  for letter in commits.keys()])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/010-comment.py",
    "content": "# @dependency 001-main/003-self/100-reviewing/001-comments.basic.py\n\nargs = [\"--review=r/100-reviewing/001-comment.basic\"]\n\ninstance.unittest(\"api.comment\", [\"basic\"], args)\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/011-reply.py",
    "content": "# @dependency 001-main/003-self/100-reviewing/001-comments.basic.py\n\nargs = [\"--review=r/100-reviewing/001-comment.basic\"]\n\ninstance.unittest(\"api.reply\", [\"basic\"], args)\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/012-changeset.py",
    "content": "instance.unittest(\"api.changeset\", [\"pre\"])\n\ninstance.synchronize_service(\"changeset\") # wait for changeset creation to finish\n\ninstance.unittest(\"api.changeset\", [\"post\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/013-filechange.py",
    "content": "instance.unittest(\"api.filechange\", [\"pre\"])\n\ninstance.synchronize_service(\"changeset\") # wait for changeset creation to finish\n\ninstance.unittest(\"api.filechange\", [\"post\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/002-api/014-filediff.py",
    "content": "instance.unittest(\"api.filediff\", [\"pre1\"])\n\ninstance.synchronize_service(\"changeset\") # wait for changeset creation to finish\n\ninstance.unittest(\"api.filediff\", [\"pre2\"])\n\ninstance.synchronize_service(\"highlight\") # wait for syntax highlighting to finish\n\ninstance.unittest(\"api.filediff\", [\"post\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/003-other/001-dbutils.database.py",
    "content": "instance.unittest(\"dbutils.database\", [\"cursors\"])\n"
  },
  {
    "path": "testing/tests/001-main/005-unittests/003-other/__init__.py",
    "content": "# @dependency 001-main/000-install.py\n"
  },
  {
    "path": "testing/tests/001-main/900-uninstall-reinstall.py",
    "content": "# @flag uninstall\n\n# Uninstall Critic.\ninstance.uninstall()\n\n# Delete the repository clone (the install() call recreates it).\ninstance.execute([\"rm\", \"-rf\", \"critic\"])\n\n# Install (and optionally upgrade) Critic with the default arguments.\ninstance.install(repository, other_cwd=True)\ninstance.upgrade()\n"
  },
  {
    "path": "testing/tools/__init__.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\n"
  },
  {
    "path": "testing/tools/install.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport argparse\nimport subprocess\n\nimport testing\n\ndef main():\n    parser = argparse.ArgumentParser(\n        description=\"Critic testing framework: Quick install utility\")\n\n    parser.add_argument(\"--debug\",\n                        help=\"Enable DEBUG level logging\", action=\"store_true\")\n    parser.add_argument(\"--quiet\",\n                        help=\"Disable INFO level logging\", action=\"store_true\")\n\n    parser.add_argument(\"--commit\", default=\"HEAD\",\n                        help=\"Commit (symbolic ref or SHA-1) to test [default=HEAD]\")\n    parser.add_argument(\"--upgrade-from\",\n                        help=\"Commit (symbolic ref or SHA-1) to install first and upgrade from\")\n\n    parser.add_argument(\"--vm-identifier\", required=True,\n                        help=\"VirtualBox instance name or UUID\")\n    parser.add_argument(\"--vm-hostname\",\n                        help=\"VirtualBox instance hostname [default=VM_IDENTIFIER]\")\n    parser.add_argument(\"--vm-snapshot\", default=\"clean\",\n                        help=\"VirtualBox snapshot (name or UUID) to upgrade [default=clean]\")\n    parser.add_argument(\"--vm-ssh-port\", type=int, default=22,\n                        help=\"VirtualBox instance SSH port [default=22]\")\n    
parser.add_argument(\"--git-daemon-port\", type=int,\n                        help=\"Port to tell 'git daemon' to bind to\")\n\n    parser.add_argument(\"--interactive\", \"-i\", action=\"store_true\",\n                        help=\"Install interactively (without arguments)\")\n\n    arguments = parser.parse_args()\n\n    logger = testing.configureLogging(arguments)\n    logger.info(\"Critic testing framework: Quick install\")\n\n    tested_commit = subprocess.check_output(\n        [\"git\", \"rev-parse\", \"--verify\", arguments.commit]).strip()\n\n    if arguments.upgrade_from:\n        install_commit = subprocess.check_output(\n            [\"git\", \"rev-parse\", \"--verify\", arguments.upgrade_from]).strip()\n        upgrade_commit = tested_commit\n    else:\n        install_commit = tested_commit\n        upgrade_commit = None\n\n    install_commit_description = subprocess.check_output(\n        [\"git\", \"log\", \"--oneline\", \"-1\", install_commit]).strip()\n\n    if upgrade_commit:\n        upgrade_commit_description = subprocess.check_output(\n            [\"git\", \"log\", \"--oneline\", \"-1\", upgrade_commit]).strip()\n    else:\n        upgrade_commit_description = None\n\n    instance = testing.virtualbox.Instance(\n        arguments,\n        install_commit=(install_commit, install_commit_description),\n        upgrade_commit=(upgrade_commit, upgrade_commit_description))\n\n    repository = testing.repository.Repository(\n        arguments.git_daemon_port,\n        install_commit,\n        arguments.vm_hostname)\n\n    mailbox = testing.mailbox.Mailbox()\n\n    with repository, mailbox, instance:\n        if not repository.export():\n            return\n\n        instance.mailbox = mailbox\n        instance.start()\n\n        if arguments.interactive:\n            print \"\"\"\nNote: To use the simple SMTP server built into the Critic testing framework,\n      enter \"host\" as the SMTP host and \"%d\" as the SMTP port.\n\nAlso note: The 
administrator user's password will be \"testing\" (password\n           input doesn't work over this channel.)\"\"\" % mailbox.port\n\n        instance.install(repository, quick=True,\n                         interactive=arguments.interactive)\n        instance.upgrade(interactive=arguments.interactive)\n\n        testing.pause(\"Press ENTER to stop VM: \")\n\n        try:\n            while True:\n                mail = mailbox.pop()\n                logger.info(\"Mail to <%s>:\\n%s\" % (mail.recipient, mail))\n        except testing.mailbox.MissingMail:\n            pass\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "testing/tools/upgrade.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport argparse\nimport time\n\nimport testing\n\ndef main():\n    parser = argparse.ArgumentParser(\n        description=\"Critic testing framework: instance upgrade utility\")\n    parser.add_argument(\"--debug\", help=\"Enable DEBUG level logging\", action=\"store_true\")\n    parser.add_argument(\"--quiet\", help=\"Disable INFO level logging\", action=\"store_true\")\n    parser.add_argument(\"--vm-identifier\", help=\"VirtualBox instance name or UUID\", required=True)\n    parser.add_argument(\"--vm-hostname\", help=\"VirtualBox instance hostname [default=VM_IDENTIFIER]\")\n    parser.add_argument(\"--vm-snapshot\", help=\"VirtualBox snapshot (name or UUID) to upgrade\", default=\"clean\")\n    parser.add_argument(\"--vm-ssh-port\", help=\"VirtualBox instance SSH port [default=22]\", type=int, default=22)\n    parser.add_argument(\"--pause-before-upgrade\", help=\"Pause before upgrading\", action=\"store_true\")\n    parser.add_argument(\"--pause-after-upgrade\", help=\"Pause after upgrading\", action=\"store_true\")\n    parser.add_argument(\"--no-upgrade\", action=\"store_true\", help=\"Do not upgrade installed packages\")\n    parser.add_argument(\"--install\", action=\"append\", help=\"Install named package\")\n    parser.add_argument(\"--custom\", action=\"store_true\",\n                     
   help=\"Stop for custom maintenance, and always retake snapshot\")\n    parser.add_argument(\"--reboot\", action=\"store_true\",\n                        help=\"Reboot VM before retaking snapshot\")\n\n    arguments = parser.parse_args()\n\n    logger = testing.configureLogging(arguments)\n    logger.info(\"Critic Testing Framework: Instance Upgrade\")\n\n    instance = testing.virtualbox.Instance(arguments)\n\n    with instance:\n        instance.start()\n\n        # Define the flag up front: the --install and --custom handling and\n        # the final check read it even when --no-upgrade is given.\n        retake_snapshot = False\n\n        if not arguments.no_upgrade:\n            logger.debug(\"Upgrading guest OS ...\")\n\n            update_output = instance.execute(\n                [\"sudo\", \"DEBIAN_FRONTEND=noninteractive\",\n                 \"apt-get\", \"-q\", \"-y\", \"update\"])\n\n            logger.debug(\"Output from 'apt-get -q -y update':\\n\" + update_output)\n\n            upgrade_output = instance.execute(\n                [\"sudo\", \"DEBIAN_FRONTEND=noninteractive\",\n                 \"apt-get\", \"-q\", \"-y\", \"upgrade\"])\n\n            logger.debug(\"Output from 'apt-get -q -y upgrade':\\n\" + upgrade_output)\n\n            if \"The following packages will be upgraded:\" in upgrade_output.splitlines():\n                retake_snapshot = True\n\n        if arguments.install:\n            install_output = instance.execute(\n                [\"sudo\", \"DEBIAN_FRONTEND=noninteractive\",\n                 \"apt-get\", \"-q\", \"-y\", \"install\"] + arguments.install)\n\n            logger.debug(\"Output from 'apt-get -q -y install':\\n\" +\n                         install_output)\n\n            retake_snapshot = True\n\n        if arguments.custom:\n            testing.pause()\n            retake_snapshot = True\n\n        if retake_snapshot:\n            if arguments.reboot:\n                instance.execute([\"sudo\", \"reboot\"])\n\n                logger.debug(\"Sleeping 10 seconds ...\")\n                time.sleep(10)\n\n                instance.wait()\n
\n                logger.debug(\"Sleeping 10 seconds ...\")\n                time.sleep(10)\n\n                logger.info(\"Rebooted VM: %s\" % arguments.vm_identifier)\n\n            logger.info(\"Upgraded guest OS\")\n            logger.debug(\"Retaking snapshot ...\")\n\n            instance.retake_snapshot(arguments.vm_snapshot)\n\n            logger.info(\"Snapshot '%s' upgraded!\" % arguments.vm_snapshot)\n        else:\n            logger.info(\"No packages upgraded in guest OS\")\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "testing/utils.py",
    "content": "import re\nimport contextlib\n\nimport testing\n\ninstance = None\nfrontend = None\n\nRE_REVIEW_URL = re.compile(r\"^remote:\\s+http://.*/r/(\\d+)\\s*$\")\n\n@contextlib.contextmanager\ndef settings(user, settings, repository=None):\n    data = { \"settings\": [{ \"item\": item, \"value\": value }\n                          for item, value in settings.items()] }\n    if repository:\n        data[\"repository\"] = repository\n\n    # Set requested settings.\n    with frontend.signin(user):\n        frontend.operation(\"savesettings\", data=data)\n\n    try:\n        yield\n    finally:\n        data = { \"settings\": [{ \"item\": item }\n                              for item, value in settings.items()] }\n        if repository:\n            data[\"repository\"] = repository\n\n        # Reset settings back to the default.\n        with frontend.signin(user):\n            frontend.operation(\"savesettings\", data=data)\n\n@contextlib.contextmanager\ndef access_token(user, profile):\n    with frontend.signin(user):\n        access_token = frontend.json(\n            \"users/me/accesstokens\",\n            post={\n                \"title\": \"by testing.utils.access_token()\",\n                \"profile\": profile\n            },\n            expect={\n                \"id\": int,\n                \"access_type\": \"user\",\n                \"user\": instance.userid(user),\n                \"title\": \"by testing.utils.access_token()\",\n                \"part1\": str,\n                \"part2\": str,\n                \"profile\": dict\n            })\n\n    try:\n        yield access_token\n    finally:\n        with frontend.signin(user):\n            frontend.json(\n                \"accesstokens/%d\" % access_token[\"id\"],\n                delete=True,\n                expected_http_status=204)\n\n
def createReviewViaPush(work, owner, commit=\"HEAD\"):\n    with settings(owner, { \"review.createViaPush\": True }):\n        remote_url = instance.repository_url(owner)\n        output = work.run([\"push\", remote_url, \"HEAD\"], TERM=\"dumb\")\n        for line in output.splitlines():\n            match = RE_REVIEW_URL.match(line)\n            if match:\n                return int(match.group(1))\n        testing.expect.check(\"<review URL in 'git push' output>\",\n                             \"<no review URL found>\")\n"
  },
  {
    "path": "testing/virtualbox.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2013 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport sys\nimport os\nimport subprocess\nimport time\nimport fcntl\nimport select\nimport errno\nimport datetime\nimport signal\nimport json\n\nimport testing\n\n# Directory (on guest system) to store coverage data in.\nCOVERAGE_DIR = \"/var/tmp/critic/coverage\"\n\ndef setnonblocking(fd):\n    fcntl.fcntl(fd, fcntl.F_SETFL, fcntl.fcntl(fd, fcntl.F_GETFL) | os.O_NONBLOCK)\n\nclass HostCommandError(testing.CommandError):\n    pass\n\nclass GuestCommandError(testing.CommandError):\n    pass\n\nclass Instance(testing.Instance):\n    def __init__(self, arguments, install_commit=None, upgrade_commit=None,\n                 frontend=None):\n        super(Instance, self).__init__()\n        self.arguments = arguments\n        self.vboxhost = getattr(arguments, \"vbox_host\", \"host\")\n        self.identifier = arguments.vm_identifier\n        self.snapshot = arguments.vm_snapshot\n        self.hostname = arguments.vm_hostname or self.identifier\n        self.ssh_port = arguments.vm_ssh_port\n        # Not every tool defines --test-extensions; default to False like the\n        # other optional arguments below.\n        self.test_extensions = getattr(arguments, \"test_extensions\", False)\n        if install_commit:\n            self.install_commit, self.install_commit_description = install_commit\n            self.tested_commit = self.install_commit\n        if upgrade_commit:\n            self.upgrade_commit, self.upgrade_commit_description = 
upgrade_commit\n            if self.upgrade_commit:\n                self.tested_commit = self.upgrade_commit\n        self.frontend = frontend\n        self.strict_fs_permissions = getattr(arguments, \"strict_fs_permissions\", False)\n        self.coverage = getattr(arguments, \"coverage\", False)\n        self.mailbox = None\n        self.etc_dir = \"/etc/critic\"\n\n        # Check that the identified VM actually exists:\n        output = subprocess.check_output(\n            [\"VBoxManage\", \"list\", \"vms\"],\n            stderr=subprocess.STDOUT)\n        if not self.__isincluded(output):\n            raise testing.Error(\"Invalid VM identifier: %s\" % self.identifier)\n\n        # Check that the identified snapshot actually exists (and that there\n        # aren't multiple snapshots with the same name):\n        count = self.count_snapshots(self.snapshot)\n        if count == 0:\n            raise testing.Error(\"Invalid VM snapshot: %s (not found)\"\n                                % self.snapshot)\n        elif count > 1:\n            raise testing.Error(\"Invalid VM snapshot: %s (matches multiple snapshots)\"\n                                % self.snapshot)\n\n        # Check that the VM isn't running:\n        state = self.state()\n        if state != \"poweroff\":\n            raise testing.Error(\"Invalid VM state: %s (expected 'poweroff')\"\n                                % state)\n\n        self.__reset()\n\n    def __reset(self):\n        self.__started = False\n        self.__installed = False\n        self.__upgraded = False\n        self.resetusers()\n        self.registeruser(\"admin\")\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        if self.__started:\n            self.stop()\n        self.__reset()\n        return False\n\n    def __vmcommand(self, command, *arguments):\n        argv = [\"VBoxManage\", command, self.identifier] + list(arguments)\n        try:\n            
testing.logger.debug(\"Running: \" + \" \".join(argv))\n            return subprocess.check_output(argv, stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError as error:\n            raise HostCommandError(argv, error.output)\n\n    def __isincluded(self, output):\n        name = '\"%s\"' % self.identifier\n        uuid = '{%s}' % self.identifier\n\n        for line in output.splitlines():\n            if name in line or uuid in line:\n                return True\n        else:\n            return False\n\n    def isrunning(self):\n        output = subprocess.check_output(\n            [\"VBoxManage\", \"list\", \"runningvms\"],\n            stderr=subprocess.STDOUT)\n        return self.__isincluded(output)\n\n    def state(self):\n        output = self.__vmcommand(\"showvminfo\", \"--machinereadable\")\n        for line in output.splitlines():\n            if line.startswith(\"VMState=\"):\n                return eval(line[len(\"VMState=\"):])\n        return \"<not found>\"\n\n    def count_snapshots(self, identifier):\n        try:\n            output = subprocess.check_output(\n                [\"VBoxManage\", \"snapshot\", self.identifier, \"list\"],\n                stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError:\n            # Assuming we've already checked that 'self.identifier' is a valid\n            # VM identifier, the most likely cause of this failure is that the\n            # VM has no snapshots.\n            return 0\n        else:\n            name = \"Name: %s (\" % identifier\n            uuid = \"(UUID: %s)\" % identifier\n            count = 0\n            for line in output.splitlines():\n                if name in line or uuid in line:\n                    count += 1\n            return count\n\n    def wait(self):\n        testing.logger.debug(\"Waiting for VM to come online ...\")\n\n        while True:\n            try:\n                self.execute([\"true\"], timeout=1)\n            except 
GuestCommandError:\n                time.sleep(0.5)\n            else:\n                break\n\n    def start(self):\n        testing.logger.debug(\"Starting VM: %s ...\" % self.identifier)\n\n        self.__vmcommand(\"snapshot\", \"restore\", self.snapshot)\n        self.__vmcommand(\"startvm\", \"--type\", \"headless\")\n        self.__started = True\n\n        self.wait()\n\n        # Set the guest system's clock to match the host system's.  Since we're\n        # restoring the same snapshot over and over, the guest system's clock is\n        # probably quite far from the truth.\n        now = datetime.datetime.utcnow().strftime(\"%m%d%H%M%Y.%S\")\n        self.execute([\"sudo\", \"date\", \"--utc\", now])\n\n        testing.logger.info(\"Started VM: %s\" % self.identifier)\n\n    def stop(self):\n        testing.logger.debug(\"Stopping VM: %s ...\" % self.identifier)\n\n        self.__vmcommand(\"controlvm\", \"poweroff\")\n\n        while self.state() != \"poweroff\":\n            time.sleep(0.1)\n\n        # It appears the VirtualBox \"session\" can be locked for a while after\n        # the \"controlvm poweroff\" command, and possibly after the VM state\n        # changes to \"poweroff\", so sleep a little longer to avoid problems.\n        time.sleep(0.5)\n\n        testing.logger.info(\"Stopped VM: %s\" % self.identifier)\n\n    def retake_snapshot(self, name):\n        index = 1\n\n        while True:\n            temporary_name = \"%s-%d\" % (name, index)\n            if self.count_snapshots(temporary_name) == 0:\n                break\n            index += 1\n\n        self.__vmcommand(\"snapshot\", \"take\", temporary_name, \"--pause\")\n        self.__vmcommand(\"snapshot\", \"delete\", name)\n        self.__vmcommand(\"snapshot\", \"edit\", temporary_name, \"--name\", name)\n\n    def execute(self, argv, cwd=None, timeout=None, interactive=False,\n                as_user=None, log_stdout=True, log_stderr=True):\n        guest_argv = list(argv)\n    
    if cwd is not None:\n            guest_argv[:0] = [\"cd\", cwd, \"&&\"]\n        host_argv = [\"ssh\"]\n        if self.ssh_port != 22:\n            host_argv.extend([\"-p\", str(self.ssh_port)])\n        if timeout is not None:\n            host_argv.extend([\"-o\", \"ConnectTimeout=%d\" % timeout])\n        if not interactive:\n            host_argv.append(\"-n\")\n        if as_user is not None:\n            host_argv.extend([\"-l\", as_user])\n        host_argv.append(self.hostname)\n\n        testing.logger.debug(\"Running: \" + \" \".join(host_argv + guest_argv))\n\n        process = subprocess.Popen(\n            host_argv + guest_argv,\n            stdout=subprocess.PIPE if not interactive else None,\n            stderr=subprocess.PIPE if not interactive else None)\n\n        class BufferedLineReader(object):\n            def __init__(self, source):\n                self.source = source\n                self.buffer = \"\"\n\n            def readline(self):\n                try:\n                    while self.source is not None:\n                        try:\n                            line, self.buffer = self.buffer.split(\"\\n\", 1)\n                        except ValueError:\n                            pass\n                        else:\n                            return line + \"\\n\"\n                        data = self.source.read(1024)\n                        if not data:\n                            self.source = None\n                            break\n                        self.buffer += data\n                    line = self.buffer\n                    self.buffer = \"\"\n                    return line\n                except IOError as error:\n                    if error.errno == errno.EAGAIN:\n                        return None\n                    raise\n\n        stdout_data = \"\"\n        stdout_reader = BufferedLineReader(process.stdout)\n\n        stderr_data = \"\"\n        stderr_reader = 
BufferedLineReader(process.stderr)\n\n        if not interactive:\n            setnonblocking(process.stdout)\n            setnonblocking(process.stderr)\n\n            poll = select.poll()\n            poll.register(process.stdout)\n            poll.register(process.stderr)\n\n            stdout_done = False\n            stderr_done = False\n\n            while not (stdout_done and stderr_done):\n                poll.poll()\n\n                while not stdout_done:\n                    line = stdout_reader.readline()\n                    if line is None:\n                        break\n                    elif not line:\n                        poll.unregister(process.stdout)\n                        stdout_done = True\n                        break\n                    else:\n                        stdout_data += line\n                        if log_stdout:\n                            testing.logger.log(testing.STDOUT, line.rstrip(\"\\n\"))\n\n                while not stderr_done:\n                    line = stderr_reader.readline()\n                    if line is None:\n                        break\n                    elif not line:\n                        poll.unregister(process.stderr)\n                        stderr_done = True\n                        break\n                    else:\n                        stderr_data += line\n                        if log_stderr:\n                            testing.logger.log(testing.STDERR, line.rstrip(\"\\n\"))\n\n        process.wait()\n\n        if process.returncode != 0:\n            raise GuestCommandError(argv, stdout_data, stderr_data)\n\n        return stdout_data\n\n    def copyto(self, source, target, as_user=None):\n        target = \"%s:%s\" % (self.hostname, target)\n        if as_user:\n            target = \"%s@%s\" % (as_user, target)\n        argv = [\"scp\", \"-q\", \"-P\", str(self.ssh_port), source, target]\n        try:\n            testing.logger.debug(\"Running: \" + \" \".join(argv))\n    
        return subprocess.check_output(argv, stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError as error:\n            raise GuestCommandError(argv, error.output)\n\n    def copyfrom(self, source, target, as_user=None):\n        source = \"%s:%s\" % (self.hostname, source)\n        if as_user:\n            source = \"%s@%s\" % (as_user, source)\n        argv = [\"scp\", \"-q\", \"-P\", str(self.ssh_port), source, target]\n        try:\n            testing.logger.debug(\"Running: \" + \" \".join(argv))\n            return subprocess.check_output(argv, stderr=subprocess.STDOUT)\n        except subprocess.CalledProcessError as error:\n            raise GuestCommandError(argv, error.output)\n\n    def criticctl(self, argv):\n        try:\n            return self.execute([\"sudo\", \"criticctl\"] + argv)\n        except GuestCommandError as error:\n            raise testing.CriticctlError(error.command, error.stdout, error.stderr)\n\n    def adduser(self, name, email=None, fullname=None, password=None):\n        if email is None:\n            email = \"%s@example.org\" % name\n        if fullname is None:\n            fullname = \"%s von Testing\" % name.capitalize()\n        if password is None:\n            password = \"testing\"\n\n        self.execute([\n            \"sudo\", \"criticctl\", \"adduser\", \"--name\", name, \"--email\", email,\n            \"--fullname\", \"'%s'\" % fullname, \"--password\", password,\n            \"&&\",\n            \"sudo\", \"adduser\", \"--ingroup\", \"critic\", \"--disabled-password\",\n            \"--gecos\", \"''\", name])\n\n        # Running all commands with a single self.execute() call is just an\n        # optimization; SSH sessions are fairly expensive to start.\n        self.execute([\n            \"sudo\", \"mkdir\", \".ssh\",\n            \"&&\",\n            \"sudo\", \"cp\", \"$HOME/.ssh/authorized_keys\", \".ssh/\",\n            \"&&\",\n            \"sudo\", \"chown\", \"-R\", name, 
\".ssh/\",\n            \"&&\",\n            \"sudo\", \"-H\", \"-u\", name, \"git\", \"config\", \"--global\", \"user.name\",\n            \"'%s'\" % fullname,\n            \"&&\",\n            \"sudo\", \"-H\", \"-u\", name, \"git\", \"config\", \"--global\", \"user.email\",\n            email],\n                     cwd=\"/home/%s\" % name)\n\n        self.registeruser(name)\n\n    def has_flag(self, flag):\n        if self.upgrade_commit and self.__upgraded:\n            check_commit = self.upgrade_commit\n        else:\n            check_commit = self.install_commit\n        return testing.has_flag(check_commit, flag)\n\n    def repository_path(self, repository=\"critic\"):\n        return \"/var/git/%s.git\" % repository\n\n    def repository_url(self, name=None, repository=\"critic\"):\n        if name is None:\n            user_prefix = \"\"\n        else:\n            user_prefix = name + \"@\"\n        return \"%s%s:/var/git/%s.git\" % (user_prefix, self.hostname, repository)\n\n    def restrict_access(self):\n        if not self.strict_fs_permissions:\n            return\n\n        # Set restrictive access bits on home directory of the installing user\n        # and of root, to make sure that no part of Critic's installation\n        # process, or the background processes started by it, depend on being\n        # able to access them as the Critic system user.\n        self.execute([\"sudo\", \"chmod\", \"-R\", \"go-rwx\", \"$HOME\", \"/root\"])\n\n        # Running install.py may have left files owned by root in $HOME.  
The\n        # command above will have made them inaccessible for sure, so change\n        # the ownership back to us.\n        self.execute([\"sudo\", \"chown\", \"-R\", \"$LOGNAME\", \"$HOME\"])\n\n    def install(self, repository, override_arguments={}, other_cwd=False,\n                quick=False, interactive=False):\n        testing.logger.debug(\"Installing Critic ...\")\n\n        if not interactive:\n            use_arguments = { \"--headless\": True,\n                              \"--system-hostname\": self.hostname,\n                              \"--auth-mode\": \"critic\",\n                              \"--session-type\": \"cookie\",\n                              \"--admin-username\": \"admin\",\n                              \"--admin-email\": \"admin@example.org\",\n                              \"--admin-fullname\": \"'Testing Administrator'\",\n                              \"--admin-password\": \"testing\",\n                              \"--smtp-host\": self.vboxhost,\n                              \"--smtp-port\": str(self.mailbox.port),\n                              \"--smtp-no-ssl-tls\": True,\n                              \"--skip-testmail-check\": True }\n\n            if self.mailbox.credentials:\n                use_arguments[\"--smtp-username\"] = self.mailbox.credentials[\"username\"]\n                use_arguments[\"--smtp-password\"] = self.mailbox.credentials[\"password\"]\n\n            if self.coverage:\n                use_arguments[\"--coverage-dir\"] = COVERAGE_DIR\n\n            if self.has_flag(\"system-recipients\"):\n                use_arguments[\"--system-recipient\"] = \"system@example.org\"\n        else:\n            use_arguments = { \"--admin-password\": \"testing\" }\n\n        if self.has_flag(\"minimum-password-hash-time\"):\n            use_arguments[\"--minimum-password-hash-time\"] = \"0.01\"\n\n        if self.has_flag(\"is-testing\"):\n            use_arguments[\"--is-testing\"] = True\n\n        if 
self.has_flag(\"web-server-integration\") and self.arguments.vm_web_server:\n            use_arguments[\"--web-server-integration\"] = self.arguments.vm_web_server\n\n        for name, value in override_arguments.items():\n            if value is None:\n                if name in use_arguments:\n                    del use_arguments[name]\n            else:\n                use_arguments[name] = value\n\n        arguments = []\n\n        for name, value in use_arguments.items():\n            arguments.append(name)\n            if value is not True:\n                arguments.append(value)\n\n        # First install (if necessary) Git.\n        try:\n            self.execute([\"git\", \"--version\"])\n        except GuestCommandError:\n            testing.logger.debug(\"Installing Git ...\")\n\n            self.execute([\"sudo\", \"DEBIAN_FRONTEND=noninteractive\",\n                          \"apt-get\", \"-qq\", \"update\"])\n\n            self.execute([\"sudo\", \"DEBIAN_FRONTEND=noninteractive\",\n                          \"apt-get\", \"-qq\", \"-y\", \"install\", \"git-core\"])\n\n            testing.logger.info(\"Installed Git: %s\" % self.execute([\"git\", \"--version\"]).strip())\n\n        self.execute([\"git\", \"clone\", repository.url, \"critic\"])\n        self.execute([\"git\", \"fetch\", \"--quiet\", \"&&\",\n                      \"git\", \"checkout\", \"--quiet\", self.install_commit],\n                     cwd=\"critic\")\n\n        if self.upgrade_commit:\n            output = subprocess.check_output(\n                [\"git\", \"log\", \"--oneline\", self.install_commit, \"--\",\n                 \"background/servicemanager.py\"])\n\n            for line in output.splitlines():\n                sha1, subject = line.split(\" \", 1)\n                if subject == \"Make sure background services run with correct $HOME\":\n                    self.restrict_access()\n                    break\n        else:\n            self.restrict_access()\n\n      
  if other_cwd and self.has_flag(\"pwd-independence\"):\n            install_py = \"critic/install.py\"\n            cwd = None\n        else:\n            install_py = \"install.py\"\n            cwd = \"critic\"\n\n        self.execute(\n            [\"sudo\", \"python\", \"-u\", install_py] + arguments,\n            cwd=cwd, interactive=\"--headless\" not in use_arguments)\n\n        if not quick:\n            try:\n                testmail = self.mailbox.pop(\n                    testing.mailbox.WithSubject(\"Test email from Critic\"))\n\n                if not testmail:\n                    testing.expect.check(\"<test email>\", \"<no test email received>\")\n                else:\n                    testing.expect.check(\"admin@example.org\",\n                                         testmail.header(\"To\"))\n                    testing.expect.check(\"This is the configuration test email from Critic.\",\n                                         \"\\n\".join(testmail.lines))\n\n                self.mailbox.check_empty()\n                self.check_service_logs()\n            except testing.TestFailure as error:\n                if error.message:\n                    testing.logger.error(\"Basic test: %s\" % error.message)\n\n                # If basic tests fail, there's no reason to further test this\n                # instance; it seems to be properly broken.\n                raise testing.InstanceError\n\n            # Add \"developer\" role to get stacktraces in error messages.\n            self.execute([\"sudo\", \"criticctl\", \"addrole\",\n                          \"--name\", \"admin\",\n                          \"--role\", \"developer\"])\n\n            # Add some regular users.\n            for name in (\"alice\", \"bob\", \"dave\", \"erin\"):\n                self.adduser(name)\n\n            self.adduser(\"howard\")\n            self.execute([\"sudo\", \"criticctl\", \"addrole\",\n                          \"--name\", \"howard\",\n                
          \"--role\", \"newswriter\"])\n\n        self.current_commit = self.install_commit\n\n        if not quick:\n            try:\n                self.frontend.run_basic_tests()\n                self.mailbox.check_empty()\n            except testing.TestFailure as error:\n                if error.message:\n                    testing.logger.error(\"Basic test: %s\" % error.message)\n\n                # If basic tests fail, there's no reason to further test this\n                # instance; it seems to be properly broken.\n                raise testing.InstanceError\n\n        testing.logger.info(\"Installed Critic: %s\" % self.install_commit_description)\n\n        self.__installed = True\n\n    def check_upgrade(self):\n        if not self.upgrade_commit:\n            raise testing.NotSupported(\"--upgrade-from argument not given\")\n\n    def upgrade(self, override_arguments={}, other_cwd=False, quick=False,\n                interactive=False, is_after_test=False):\n        if self.upgrade_commit \\\n                and self.upgrade_commit != self.current_commit \\\n                and (is_after_test or not self.arguments.upgrade_after):\n            testing.logger.debug(\"Upgrading Critic ...\")\n\n            self.restrict_access()\n\n            if not interactive:\n                use_arguments = { \"--headless\": True }\n            else:\n                use_arguments = {}\n\n            if not self.has_flag(\"minimum-password-hash-time\"):\n                use_arguments[\"--minimum-password-hash-time\"] = \"0.01\"\n\n            if not self.has_flag(\"is-testing\"):\n                use_arguments[\"--is-testing\"] = True\n\n            if not self.has_flag(\"system-recipients\"):\n                use_arguments[\"--system-recipient\"] = \"system@example.org\"\n\n            for name, value in override_arguments.items():\n                if value is None:\n                    if name in use_arguments:\n                        del use_arguments[name]\n  
              else:\n                    use_arguments[name] = value\n\n            arguments = []\n\n            for name, value in use_arguments.items():\n                arguments.append(name)\n                if value is not True:\n                    arguments.append(value)\n\n            self.execute([\"git\", \"checkout\", self.upgrade_commit], cwd=\"critic\")\n            self.execute([\"git\", \"submodule\", \"update\", \"--recursive\"], cwd=\"critic\")\n\n            # Setting this will make has_flag() from now on (including when used\n            # in the rest of this function) check the upgraded-to commit rather\n            # than the initially installed commit.\n            self.__upgraded = True\n\n            if other_cwd and self.has_flag(\"pwd-independence\"):\n                upgrade_py = \"critic/upgrade.py\"\n                cwd = None\n            else:\n                upgrade_py = \"upgrade.py\"\n                cwd = \"critic\"\n\n            self.execute([\"sudo\", \"python\", \"-u\", upgrade_py] + arguments,\n                         cwd=cwd, interactive=\"--headless\" not in use_arguments)\n\n            self.current_commit = self.upgrade_commit\n\n            if not quick:\n                self.frontend.run_basic_tests()\n\n            testing.logger.info(\"Upgraded Critic: %s\" % self.upgrade_commit_description)\n\n    def check_extend(self, repository, pre_upgrade=False):\n        commit = self.install_commit if pre_upgrade else self.tested_commit\n        if not testing.exists_at(commit, \"extend.py\"):\n            raise testing.NotSupported(\"tested commit lacks extend.py\")\n        if not self.arguments.test_extensions:\n            raise testing.NotSupported(\"--test-extensions argument not given\")\n        if not repository.v8_jsshell_path:\n            raise testing.NotSupported(\"v8-jsshell sub-module not initialized\")\n\n    def extend(self, repository):\n        self.check_extend(repository)\n\n        
testing.logger.debug(\"Extending Critic ...\")\n\n        def internal(action, extra_argv=None):\n            argv = [\"sudo\", \"python\", \"-u\", \"extend.py\", \"--headless\",\n                    \"--%s\" % action]\n\n            if extra_argv:\n                argv.extend(extra_argv)\n\n            self.execute(argv, cwd=\"critic\")\n\n        internal(\"prereqs\", [\"--libcurl-flavor=gnutls\"])\n\n        submodule_path = \"installation/externals/v8-jsshell\"\n\n        v8_jsshell_sha1 = testing.repository.submodule_sha1(\n            os.getcwd(), self.current_commit, submodule_path)\n        cached_executable = os.path.join(self.arguments.cache_dir,\n                                         self.identifier, \"v8-jsshell\",\n                                         v8_jsshell_sha1 + \"-gnutls\")\n\n        if self.upgrade_commit is not None \\\n                and self.install_commit == self.current_commit \\\n                and self.upgrade_commit != self.current_commit:\n            # We're extending before upgrading.  
Don't use a cached executable\n            # now if the upgrade changes the sub-module reference, since this\n            # breaks upgrade.py's automatic invocation of extend.py.\n\n            upgraded_v8_jsshell_sha1 = testing.repository.submodule_sha1(\n                os.getcwd(), self.upgrade_commit, submodule_path)\n            if upgraded_v8_jsshell_sha1 != v8_jsshell_sha1:\n                testing.logger.debug(\"Caching of v8-jsshell disabled\")\n                cached_executable = None\n\n        if cached_executable and os.path.isfile(cached_executable):\n            self.execute([\"mkdir\", \"installation/externals/v8-jsshell/out\"], cwd=\"critic\")\n            self.copyto(cached_executable,\n                        \"critic/installation/externals/v8-jsshell/out/jsshell\")\n            testing.logger.debug(\"Copied cached v8-jsshell executable to instance\")\n        else:\n            if repository.v8_url:\n                extra_argv = [\"--with-v8=%s\" % repository.v8_url]\n            else:\n                extra_argv = None\n\n            internal(\"fetch\", extra_argv)\n\n            # v8_sha1 = subprocess.check_output(\n            #     [\"git\", \"ls-tree\", \"HEAD\", \"v8\"],\n            #     cwd=\"installation/externals/v8-jsshell\").split()[2]\n            # cached_v8deps = os.path.join(self.arguments.cache_dir,\n            #                              \"v8-dependencies\",\n            #                              \"%s.tar.bz2\" % v8_sha1)\n            # if os.path.isfile(cached_v8deps):\n            #     self.copyto(cached_v8deps, \"v8deps.tar.bz2\")\n            #     internal(\"import-v8-dependencies=~/v8deps.tar.bz2\")\n            # else:\n            #     internal(\"export-v8-dependencies=~/v8deps.tar.bz2\")\n            #     if not os.path.isdir(os.path.dirname(cached_v8deps)):\n            #         os.makedirs(os.path.dirname(cached_v8deps))\n            #     self.copyfrom(\"v8deps.tar.bz2\", cached_v8deps)\n\n            
internal(\"build\")\n\n            if cached_executable:\n                if not os.path.isdir(os.path.dirname(cached_executable)):\n                    os.makedirs(os.path.dirname(cached_executable))\n                self.copyfrom(\"critic/installation/externals/v8-jsshell/out/jsshell\",\n                              cached_executable)\n                testing.logger.debug(\"Copied built v8-jsshell executable from instance\")\n\n        internal(\"install\")\n        internal(\"enable\")\n\n        self.frontend.run_basic_tests()\n\n        testing.logger.info(\"Extensions enabled\")\n\n    def restart(self):\n        self.execute([\"sudo\", \"criticctl\", \"restart\"])\n\n    def uninstall(self):\n        self.execute(\n            [\"sudo\", \"python\", \"uninstall.py\", \"--headless\", \"--keep-going\"],\n            cwd=\"critic\")\n\n        # Delete the regular users.\n        for name in (\"alice\", \"bob\", \"dave\", \"erin\"):\n            self.execute([\"sudo\", \"deluser\", \"--remove-home\", name])\n\n        self.execute([\"sudo\", \"deluser\", \"--remove-home\", \"howard\"])\n\n        self.__installed = False\n        self.__upgraded = False\n\n    def finish(self):\n        if not self.__started:\n            return\n\n        if self.__installed:\n            self.execute([\"sudo\", \"criticctl\", \"stop\"])\n\n        if self.coverage:\n            sys.stdout.write(self.execute(\n                [\"sudo\", \"python\", \"coverage.py\",\n                 \"--coverage-dir\", COVERAGE_DIR,\n                 \"--critic-dir\", \"/etc/critic/main\",\n                 \"--critic-dir\", \"/usr/share/critic\"],\n                cwd=\"/usr/share/critic\"))\n\n        # Check that we didn't leave any files owned by root anywhere in the\n        # directory we installed from.\n        self.execute([\"chmod\", \"-R\", \"a+r\", \"critic\"])\n        self.execute([\"rm\", \"-r\", \"critic\"])\n\n    def run_unittest(self, args):\n        if self.coverage:\n     
       args = [\"--coverage\"] + args\n        return self.execute(\n            [\"cd\", \"/usr/share/critic\", \"&&\",\n             \"sudo\", \"-H\", \"-u\", \"critic\",\n             \"PYTHONPATH=/etc/critic/main:/usr/share/critic\",\n             \"python\", \"-u\", \"-m\", \"run_unittest\"] + args,\n            log_stderr=False)\n\n    def gc(self, repository):\n        self.execute([\"git\", \"gc\", \"--prune=now\"],\n                     cwd=os.path.join(\"/var/git\", repository),\n                     as_user=\"alice\")\n\n    def synchronize_service(self, service_name, force_maintenance=False, timeout=30):\n        helper = \"testing/input/service_synchronization_helper.py\"\n        if not (self.__upgraded or testing.exists_at(self.install_commit, helper)):\n            # We're upgrading from a commit where background services don't\n            # support synchronization, and haven't upgraded yet.  Sleep a (long)\n            # while and pray that the service is idle when we wake up.\n            testing.logger.debug(\"Synchronizing service: %s (sleeping %d seconds)\"\n                                 % (service_name, timeout))\n            time.sleep(timeout)\n            return\n        testing.logger.debug(\"Synchronizing service: %s\" % service_name)\n        pidfile_path = os.path.join(\"/var/run/critic/main\", service_name + \".pid\")\n        if force_maintenance:\n            signum = signal.SIGUSR2\n        else:\n            signum = signal.SIGUSR1\n        before = time.time()\n        self.execute(\n            [\"sudo\", \"python\", \"critic/\" + helper,\n             pidfile_path, str(signum), str(timeout)])\n        after = time.time()\n        testing.logger.debug(\"Synchronized service: %s in %.2f seconds\"\n                             % (service_name, after - before))\n\n    def filter_service_logs(self, level, service_names):\n        helper = \"testing/input/service_log_filter.py\"\n        if not (self.__upgraded or 
testing.exists_at(self.install_commit, helper)):\n            # We're upgrading from a commit where the helper for filtering\n            # service logs isn't supported, and haven't upgraded yet.\n            return\n        logfile_paths = {\n            os.path.join(\"/var/log/critic/main\", service_name + \".log\"): service_name\n            for service_name in service_names }\n        try:\n            data = json.loads(self.execute(\n                [\"sudo\", \"python\", \"critic/\" + helper, level] + logfile_paths.keys(),\n                log_stdout=False))\n            return { logfile_paths[logfile_path]: entries\n                     for logfile_path, entries in sorted(data.items()) }\n        except GuestCommandError:\n            return None\n"
  },
  {
    "path": "uninstall.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Martin Olsson\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport grp\nimport sys\nimport argparse\nimport subprocess\n\nimport installation\n\ndef enum(*sequential, **named):\n    enums = dict(zip(sequential, range(len(sequential))), **named)\n    return type('Enum', (), enums)\n\nExitStatus = enum('EXIT_SUCCESS', 'MUST_RUN_AS_ROOT', 'INVALID_ETC_DIR', 'UNEXPECTED_ERROR')\n\ndef check(value):\n    if value.strip() != \"deletemydata\":\n        return \"to continue with uninstall, enter 'deletemydata', to abort uninstall press CTRL-C\"\n\ndef abort_if_no_keep_going_param(arguments, error_msg):\n    if not arguments.keep_going:\n        print error_msg\n        print \"Unexpected error encountered.  
Critic uninstall aborted.\"\n        print \"Re-run with --keep-going to ignore errors.\"\n        sys.exit(ExitStatus.UNEXPECTED_ERROR)\n\ndef get_all_configurations(arguments):\n    configurations = []\n    etc_dir = arguments.etc_dir\n    original_sys_path = list(sys.path)\n\n    for critic_instance in os.listdir(etc_dir):\n        etc_path = os.path.join(etc_dir, critic_instance)\n\n        if not os.path.isdir(etc_path):\n            abort_if_no_keep_going_param(arguments, \"ERROR: %s is not a directory.\" % etc_path)\n\n        sys.path = list(original_sys_path)\n        sys.path.insert(0, etc_path)\n\n        try:\n            import configuration\n            configurations.append(configuration)\n        except ImportError:\n            abort_if_no_keep_going_param(arguments, \"ERROR: Failed to load Critic instance configuration from %s.\" % etc_path)\n\n    sys.path = list(original_sys_path)\n    return configurations\n\ndef run_command(arguments, command_parts):\n    try:\n        subprocess.check_output(command_parts)\n    except:\n        abort_if_no_keep_going_param(arguments, \"Error while running command: \" + ' '.join(command_parts))\n\ndef rmdir_if_empty(directories):\n    for dir in directories:\n        try:\n            os.rmdir(dir)\n        except OSError:\n            pass\n\ndef main():\n    parser = argparse.ArgumentParser(description=\"Critic uninstall script\")\n    parser.add_argument(\"--headless\", help=argparse.SUPPRESS, action=\"store_true\")\n    parser.add_argument(\"--etc-dir\", default=\"/etc/critic\", help=\"root directory for Critic system configurations i.e. 
specifying /etc/critic will read configuration data from /etc/critic/*/configuration/*.py\", action=\"store\")\n    parser.add_argument(\"--keep-going\", help=\"keep going even if errors are encountered (useful for purging broken installations)\", action=\"store_true\")\n    arguments = parser.parse_args()\n\n    if os.getuid() != 0:\n        print \"\"\"\nERROR: This script must be run as root.\n\"\"\"\n        sys.exit(ExitStatus.MUST_RUN_AS_ROOT)\n\n    if not arguments.headless:\n        print \"\"\"\\\n!!!! WARNING !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\nThis uninstall script will delete Critic, all Critic logs, caches and\nconfiguration files, and it will also DELETE ALL DATA related to Critic.\nIt will drop the entire Critic database from postgresql and it will\npermanently delete the Critic git repositories.  If there are multiple\ninstances of Critic on this system, all of them will be removed.\n!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\n\nThis step cannot be undone! To abort the uninstall script, press CTRL-C now.\n\"\"\"\n        installation.input.string(\"To continue the uninstall script and DELETE ALL YOUR DATA, enter 'deletemydata' here:\", default=\"\", check=check)\n\n    if not os.path.isdir(arguments.etc_dir):\n        print \"%s: no such directory.  
Invalid --etc-dir parameter.\" % arguments.etc_dir\n        sys.exit(ExitStatus.INVALID_ETC_DIR)\n\n    run_command(arguments, [\"service\", \"apache2\", \"stop\"])\n\n    # Sets of system users/groups to delete will be collected (to avoid trying to delete the same user/group twice).\n    users_to_delete = set()\n    groups_to_delete = set()\n\n    for configuration in get_all_configurations(arguments):\n        users_to_delete.add(configuration.base.SYSTEM_USER_NAME)\n        groups_to_delete.add(configuration.base.SYSTEM_GROUP_NAME)\n\n        run_command(arguments, [\"service\", \"critic-%s\" % configuration.base.SYSTEM_IDENTITY, \"stop\"])\n        run_command(arguments, [\"rm\", \"-rf\", configuration.paths.DATA_DIR, configuration.paths.LOG_DIR])\n        run_command(arguments, [\"rm\", \"-rf\", configuration.paths.CACHE_DIR, configuration.paths.RUN_DIR])\n        run_command(arguments, [\"rm\", \"-rf\", configuration.paths.INSTALL_DIR, configuration.paths.GIT_DIR])\n        if configuration.base.WEB_SERVER_INTEGRATION == \"apache\":\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/apache2/sites-available/critic-%s\" % configuration.base.SYSTEM_IDENTITY])\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/apache2/sites-enabled/critic-%s\" % configuration.base.SYSTEM_IDENTITY])\n        elif configuration.base.WEB_SERVER_INTEGRATION == \"nginx+uwsgi\":\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/nginx/sites-available/critic-%s\" % configuration.base.SYSTEM_IDENTITY])\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/nginx/sites-enabled/critic-%s\" % configuration.base.SYSTEM_IDENTITY])\n        else:\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/uwsgi/apps-available/critic-frontend-%s.ini\" % configuration.base.SYSTEM_IDENTITY])\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/uwsgi/apps-enabled/critic-frontend-%s.ini\" % configuration.base.SYSTEM_IDENTITY])\n        if 
configuration.base.WEB_SERVER_INTEGRATION in (\"nginx+uwsgi\", \"uwsgi\"):\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/uwsgi/apps-available/critic-backend-%s.ini\" % configuration.base.SYSTEM_IDENTITY])\n            run_command(arguments, [\"rm\", \"-f\", \"/etc/uwsgi/apps-enabled/critic-backend-%s.ini\" % configuration.base.SYSTEM_IDENTITY])\n        run_command(arguments, [\"rm\", \"-f\", \"/etc/init.d/critic-%s\" % configuration.base.SYSTEM_IDENTITY])\n        run_command(arguments, [\"update-rc.d\", \"critic-%s\" % configuration.base.SYSTEM_IDENTITY, \"remove\"])\n\n        # Typically the postgres user does not have access to the cwd during uninstall so we use \"-i\"\n        # with sudo which makes the command run with the postgres user's homedir as cwd instead.\n        # This avoids a harmless but pointless error message \"could not change directory to X\" when the\n        # /usr/bin/psql client tries to chdir back to the previous cwd before exiting.\n        run_command(arguments, [\"sudo\", \"-u\", \"postgres\", \"-i\", \"psql\", \"-v\", \"ON_ERROR_STOP=1\", \"-c\", \"DROP DATABASE IF EXISTS %s;\" % configuration.database.PARAMETERS[\"database\"]])\n        run_command(arguments, [\"sudo\", \"-u\", \"postgres\", \"-i\", \"psql\", \"-v\", \"ON_ERROR_STOP=1\", \"-c\", \"DROP ROLE IF EXISTS %s;\" % configuration.database.PARAMETERS[\"user\"]])\n\n    for user in users_to_delete:\n        run_command(arguments, [\"deluser\", \"--system\", user])\n\n    for group in groups_to_delete:\n        try:\n            # Revoke push rights for all users that have been added to the Critic system group.\n            # delgroup doesn't do this automatically and we want to avoid users getting errors like:\n            # \"groups: cannot find name for group ID 132\"\n            for group_member in grp.getgrnam(group).gr_mem:\n                subprocess.check_output([\"gpasswd\", \"-d\", group_member, group])\n        except KeyError:\n        
    abort_if_no_keep_going_param(arguments, \"ERROR: Could not find group '%s'.\" % group)\n        run_command(arguments, [\"delgroup\", \"--system\", group])\n\n    # Delete non-instance specific parts.\n    run_command(arguments, [\"rm\", \"-rf\", arguments.etc_dir, \"/usr/bin/criticctl\"])\n\n    if configuration.base.WEB_SERVER_INTEGRATION == \"apache\":\n        run_command(arguments, [\"service\", \"apache2\", \"restart\"])\n    elif configuration.base.WEB_SERVER_INTEGRATION == \"nginx+uwsgi\":\n        run_command(arguments, [\"service\", \"nginx\", \"restart\"])\n    if configuration.base.WEB_SERVER_INTEGRATION in (\"nginx+uwsgi\", \"uwsgi\"):\n        run_command(arguments, [\"service\", \"uwsgi\", \"restart\"])\n\n    # When default paths are used in install.py we put some extra effort into\n    # completely cleaning the system on uninstall, with custom paths it's\n    # trickier to know if the user really wants to delete empty parent dirs.\n    rmdir_if_empty([\"/var/log/critic\", \"/var/run/critic\", \"/var/cache/critic\"])\n\n    run_command(arguments, [\"rm\", \"-f\", os.path.join(installation.root_dir, \".installed\")])\n\n    print\n    print \"SUCCESS: Uninstall complete.\"\n    print\n\n    return ExitStatus.EXIT_SUCCESS\n\nif __name__ == \"__main__\":\n    sys.exit(main())\n"
  },
  {
    "path": "upgrade.py",
    "content": "# -*- mode: python; encoding: utf-8 -*-\n#\n# Copyright 2012 Jens Lindström, Opera Software ASA\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n# use this file except in compliance with the License.  You may obtain a copy of\n# the License at\n#\n#   http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  See the\n# License for the specific language governing permissions and limitations under\n# the License.\n\nimport os\nimport sys\nimport traceback\nimport subprocess\n\n# To avoid accidentally creating files owned by root.\nsys.dont_write_bytecode = True\n\n# Python version check is done before imports below so\n# that python 2.6/2.5 users can see the error message.\nimport pythonversion\npythonversion.check()\n\nif sys.flags.optimize > 0:\n    print \"\"\"\nERROR: Please run this script without -O or -OO options.\n\"\"\"\n    sys.exit(1)\n\nimport argparse\n\nsys.path.insert(0, os.path.join(os.getcwd(), \"src\"))\n\nimport installation\n\nparser = argparse.ArgumentParser(description=\"Critic upgrade script\")\n\n# Uses default values for everything that has a default value (and isn't\n# overridden by other command-line arguments) and signals an error for anything\n# that doesn't have a default value and isn't set by a command-line argument.\nparser.add_argument(\"--headless\", help=argparse.SUPPRESS, action=\"store_true\")\n\nparser.add_argument(\"--etc-dir\", default=\"/etc/critic\", help=\"directory where the Critic system configuration is stored\", action=\"store\")\nparser.add_argument(\"--identity\", \"-i\", default=\"main\", help=\"system identity to upgrade\", action=\"store\")\nparser.add_argument(\"--dry-run\", \"-n\", help=\"produce output but don't modify the system at all\", 
action=\"store_true\")\nparser.add_argument(\"--force\", \"-f\", help=\"force upgrade even if same commit is checked out\", action=\"store_true\")\n\nfor module in installation.modules:\n    if hasattr(module, \"add_arguments\"):\n        module.add_arguments(\"upgrade\", parser)\n\narguments = parser.parse_args()\n\nif arguments.headless:\n    installation.input.headless = True\n\nif os.getuid() != 0:\n    print \"\"\"\nERROR: This script must be run as root.\n\"\"\"\n    sys.exit(1)\n\ndef abort():\n    print\n    print \"ERROR: Upgrade aborted.\"\n    print\n\n    for module in reversed(installation.modules):\n        try:\n            if hasattr(module, \"undo\"):\n                module.undo()\n        except:\n            print >>sys.stderr, \"FAILED: %s.undo()\" % module.__name__\n            traceback.print_exc()\n\n    sys.exit(1)\n\ndata = installation.utils.read_install_data(arguments)\n\nprint \"\"\"\nCritic Upgrade\n==============\n\"\"\"\n\ngit = data[\"installation.prereqs.git\"]\n\nif \"sha1\" not in data:\n    try:\n        guess_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD@{1}\"],\n                                                cwd=installation.root_dir).strip()\n    except:\n        guess_sha1 = None\n\n    print \"\"\"\nThe SHA-1 of the commit you initially installed was not recorded.  This\nmeans you installed a version before the install.py script was changed\nto record the SHA-1 currently checked out.\"\"\"\n\n    if guess_sha1:\n        print \"\"\"\nA reasonable guess is HEAD@{1}, or \"where HEAD was before the last\noperation that changed HEAD\".  Otherwise, please figure out what you\ninstalled.  If you need to guess, guessing on something too old (i.e.\na commit that is an ancestor of the actual commit) is safer than\nguessing on something too recent.\"\"\"\n        default = \"HEAD@{1}\"\n    else:\n        print \"\"\"\nPlease figure out what you installed.  If you need to guess, guessing\non something too old (i.e.  
a commit that is an ancestor of the actual\ncommit) is safer than guessing on something too recent.\"\"\"\n        default = None\n\n    print \"\"\"\nThe commit can be specified as a SHA-1 or any symbolic ref understood\nby \"git rev-parse\".\n\"\"\"\n\n    def revparse(value):\n        return installation.utils.run_git([git, \"rev-parse\", \"--verify\", value],\n                                          cwd=installation.root_dir).strip()\n\n    def valid_commit(value):\n        try:\n            sha1 = revparse(value)\n        except subprocess.CalledProcessError:\n            return \"not a valid ref (checked with \\\"git rev-parse --verify\\\")\"\n\n        try:\n            installation.utils.run_git([git, \"cat-file\", \"commit\", sha1],\n                                       cwd=installation.root_dir)\n        except subprocess.CalledProcessError:\n            return \"not a commit\"\n\n    sha1 = revparse(installation.input.string(prompt=\"What commit was originally installed?\",\n                                              default=default,\n                                              check=valid_commit))\n\n    data[\"sha1\"] = sha1\n\nold_critic_sha1 = data[\"sha1\"]\nnew_critic_sha1 = installation.utils.run_git([git, \"rev-parse\", \"HEAD\"],\n                                             cwd=installation.root_dir).strip()\n\nold_lifecycle = installation.utils.read_lifecycle(git, old_critic_sha1)\nnew_lifecycle = installation.utils.read_lifecycle()\n\nif old_lifecycle[\"stable\"] != new_lifecycle[\"stable\"]:\n    if old_lifecycle[\"stable\"]:\n        print \"\"\"\nWARNING: You're about to switch to an unstable development version of Critic!\n\nIf this is a production system, you are most likely better off staying on the\ncurrent branch, or switching to the latest stable branch, if the current\nbranch isn't it.\n\nThe latest stable branch is the default branch (i.e. 
HEAD) in Critic's GitHub\nrepository at\n\n  https://github.com/jensl/critic.git\n\nTo interrogate it from the command-line, run\n\n  $ git ls-remote --symref https://github.com/jensl/critic.git HEAD\n\nHINT: If you installed from the 'master' branch prior to October 2017, then it\n      was at that time a stable branch (and also the only available option.)\n      At this point in time, a stable branch 'stable/1' was branched off, and\n      'master' became an unstable development branch.\n\n      In other words, if you are currently on 'master', you most likely want to\n      switch to 'stable/1', or the latest stable branch (see above,) now.\n\"\"\"\n\n        if not installation.input.yes_or_no(\n                \"Do you want to continue upgrading to the unstable version?\",\n                default=arguments.headless):\n            print\n            print \"Installation aborted.\"\n            print\n            sys.exit(1)\n    else:\n        print \"\"\"\nNOTE: Switching from an unstable version to a stable version.\n\"\"\"\n\nprint \"\"\"\nPreviously installed version: %s\nWill now upgrade to version:  %s\n\"\"\" % (old_critic_sha1, new_critic_sha1)\n\nif old_critic_sha1 == new_critic_sha1 and not arguments.force:\n    print \"Old and new commit are the same, nothing to do.\"\n    sys.exit(0)\n\nstatus_output = installation.utils.run_git([git, \"status\", \"--porcelain\"],\n                                           cwd=installation.root_dir).strip()\n\nif status_output:\n    print \"\"\"\\\nERROR: This Git repository has local modifications.\"\"\"\n\n    if len(status_output.splitlines()) == 1 \\\n            and \"installation/externals/v8-jsshell\" in status_output:\n        print \"\"\"\\\nHINT: You might just need to run \"git submodule update --recursive\".\"\"\"\n\n    print \"\"\"\nInstalling from a Git repository with local changes is not supported.\nPlease commit or stash the changes and then try again.\n\"\"\"\n    sys.exit(1)\n\ntry:\n    for module 
in installation.modules:\n        try:\n            if hasattr(module, \"prepare\") and not module.prepare(\"upgrade\", arguments, data):\n                abort()\n        except KeyboardInterrupt:\n            abort()\n        except SystemExit:\n            raise\n        except:\n            print >>sys.stderr, \"FAILED: %s.prepare()\" % module.__name__\n            traceback.print_exc()\n            abort()\n\n    if not arguments.dry_run:\n        if not installation.httpd.stop():\n            abort()\n        if not installation.initd.stop():\n            abort()\n\n    for module in installation.modules:\n        try:\n            if hasattr(module, \"upgrade\") and not module.upgrade(arguments, data):\n                abort()\n        except KeyboardInterrupt:\n            abort()\n        except SystemExit:\n            raise\n        except:\n            print >>sys.stderr, \"FAILED: %s.upgrade()\" % module.__name__\n            traceback.print_exc()\n            abort()\n\n    import configuration\n\n    if not arguments.dry_run:\n        # Before bugfix \"Fix recreation of /var/run/critic/IDENTITY after reboot\"\n        # it was possible that /var/run/critic/IDENTITY was accidentally\n        # recreated owned by root:root instead of critic:critic (on reboot).\n        # If this had happened the service manager restart that is done during\n        # upgrade would fail so upgrades always failed. Further, it was not\n        # possible to write a migration script for this because migrations\n        # execute after the service manager restart. 
Because of this the\n        # following 3 line workaround was necessary:\n\n        if os.path.exists(configuration.paths.RUN_DIR):\n            os.chown(configuration.paths.RUN_DIR, installation.system.uid, installation.system.gid)\n\n        if not installation.initd.start():\n            abort()\n        if not installation.httpd.start():\n            abort()\n\n    data[\"sha1\"] = new_critic_sha1\n\n    with installation.utils.as_critic_system_user():\n        import dbaccess\n        db = dbaccess.connect()\n        cursor = db.cursor()\n        cursor.execute(\"UPDATE systemidentities SET installed_sha1=%s, installed_at=NOW() WHERE name=%s\", (new_critic_sha1, arguments.identity))\n        if not arguments.dry_run:\n            db.commit()\n\n    for module in installation.modules:\n        try:\n            if hasattr(module, \"finish\"):\n                module.finish(\"upgrade\", arguments, data)\n        except:\n            print >>sys.stderr, \"WARNING: %s.finish() failed\" % module.__name__\n            traceback.print_exc()\n\n    installation.utils.write_install_data(arguments, data)\n    installation.utils.clean_root_pyc_files()\n\n    print\n    print \"SUCCESS: Upgrade complete!\"\n    print\n\n    if configuration.extensions.ENABLED:\n        try:\n            installation.utils.run_git(\n                [git, \"diff\", \"--quiet\",\n                 \"%s..%s\" % (old_critic_sha1, new_critic_sha1),\n                 \"--\", \"installation/externals/v8-jsshell\"])\n        except subprocess.CalledProcessError:\n            # Non-zero exit status means there were changes.\n            print \"\"\"\nUpdated v8-jsshell submodule\n============================\n\nThe v8-jsshell program used to run extensions has been updated and needs to be\nrebuilt.  If this is not done, the extensions mechanism may malfunction.  
It can\nbe done manually later by running this command as root:\n\n  python extend.py\n\"\"\"\n\n            rebuild_v8_jsshell = installation.input.yes_or_no(\n                \"Do you want to rebuild the v8-jsshell program now?\",\n                default=True)\n\n            if rebuild_v8_jsshell:\n                try:\n                    args = []\n                    if arguments.headless:\n                        args.append(\"--headless\")\n                    subprocess.check_call([sys.executable, \"extend.py\"] + args)\n                except subprocess.CalledProcessError:\n                    # We have already finished the main upgrade, so just\n                    # propagate the exit status if extend.py failed.  It will\n                    # have output enough error messages, for sure.\n                    sys.exit(1)\n\nexcept SystemExit:\n    raise\nexcept:\n    traceback.print_exc()\n    abort()\n"
  }
]