[
  {
    "path": ".circleci/config.yml",
    "content": "version: 2\n\njobs:\n  build:\n    machine:\n      docker_layer_caching: false\n    steps:\n      - checkout\n      - run: docker-compose build fwanalyzer\n      - run: docker-compose run --rm fwanalyzer make deps\n      - run: docker-compose run --rm fwanalyzer make\n\n  test:\n    machine:\n      docker_layer_caching: false\n    steps:\n      - checkout\n      - run: docker-compose build fwanalyzer\n      - run: docker-compose run --rm fwanalyzer make deps\n      - run: docker-compose run --rm fwanalyzer make testsetup ci-tests\n\nworkflows:\n  version: 2\n  test-build:\n    jobs:\n      - test\n      - build:\n          requires:\n            - test\n"
  },
  {
    "path": ".github/workflows/golangci-lint.yml",
    "content": "name: golangci-lint\non:\n  push:\n    tags:\n      - v*\n    branches:\n      - master\n      - main\n  pull_request:\npermissions:\n  contents: read\njobs:\n  golangci:\n    name: lint\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/setup-go@v3\n        with:\n          go-version: 1.13\n      - uses: actions/checkout@v3\n      - name: golangci-lint\n        uses: golangci/golangci-lint-action@v3\n        with:\n          version: v1.29\n"
  },
  {
    "path": ".gitignore",
    "content": "build/**\nrelease/**\n"
  },
  {
    "path": "Building.md",
    "content": "# Building FwAnalyzer\n\n## Requirements\n\n- golang (with mod support) + golang-lint\n- Python\n- filesystem tools such as e2tools, mtools\n\nThe full list of dependencies is tracked in the [Dockerfile](Dockerfile).\n\n## Clone Repository\n\n```sh\ngo get github.com/cruise-automation/fwanalyzer\n```\n\n## Building\n\nBefore building you need to download third party go packages, run `make deps` before the first build.\n\n```sh\ncd go/src/github.com/cruise-automation/fwanalyzer\nmake deps\nmake\n```\n\nThe `fwanalyzer` binary will be in `build/`.\n\n# Testing\n\nWe have two types of tests: unit tests and integration tests, both tests will be triggered by running `make test`.\nRun `make testsetup` once to setup the test environment in `test/`.\nTests rely on e2tools, mtools, squashfs-tools, and ubi_reader, as well as Python.\n\n```sh\ncd go/src/github.com/cruise-automation/fwanalyzer\nmake testsetup\nmake test\n```\n"
  },
  {
    "path": "CODE_OF_CONDUCT.md",
    "content": "# Code of Conduct\n\nThis code of conduct outlines our expectations for participants within the\nCruise LLC (Cruise) community, as well as steps to reporting unacceptable\nbehavior. We are committed to providing a welcoming and inspiring community\nfor all and expect our code of conduct to be honored. Anyone who violates this\ncode of conduct may be banned from the community.\n\n## Our Commitment\n\nIn the interest of fostering an open and welcoming environment, we as\ncontributors and maintainers commit to making participation in our project and\nour community a harassment-free experience for everyone, regardless of age, body\nsize, disability, ethnicity, sex characteristics, gender identity and expression,\nlevel of experience, education, socio-economic status, nationality, personal\nappearance, race, religion, or sexual identity and orientation.\n\n## Our Standards\n\nExamples of behavior that contributes to creating a positive environment\ninclude:\n\n* Using welcoming and inclusive language\n* Being respectful of differing viewpoints and experiences\n* Gracefully accepting constructive criticism\n* Focusing on what is best for the community\n* Showing empathy towards other community members\n\nExamples of unacceptable behavior by participants include:\n\n* The use of sexualized language or imagery and unwelcome sexual attention or\n  advances\n* Trolling, insulting/derogatory comments, and personal or political attacks\n* Public or private harassment\n* Publishing others' private information, such as a physical or electronic\n  address, without explicit permission\n* Other conduct which could reasonably be considered inappropriate in a\n  professional setting\n\n## Our Responsibilities\n\nProject maintainers are responsible for clarifying the standards of acceptable\nbehavior and are expected to take appropriate and fair corrective action in\nresponse to any instances of unacceptable behavior.\n\nProject maintainers have the right and 
responsibility to remove, edit, or\nreject comments, commits, code, wiki edits, issues, and other contributions\nthat are not aligned to this Code of Conduct, or to ban temporarily or\npermanently any contributor for other behaviors that they deem inappropriate,\nthreatening, offensive, or harmful.\n\n## Scope\n\nThis Code of Conduct applies both within project spaces and in public spaces\nwhen an individual is representing the project or its community. Examples of\nrepresenting a project or community include using an official project e-mail\naddress, posting via an official social media account, or acting as an appointed\nrepresentative at an online or offline event. Representation of a project may be\nfurther defined and clarified by project maintainers.\n\n## Enforcement\n\nInstances of abusive, harassing, or otherwise unacceptable behavior may be\nreported by contacting the project team at opensource@getcruise.com. All\ncomplaints will be reviewed and investigated and will result in a response that\nis deemed necessary and appropriate to the circumstances. The project team will\nmaintain confidentiality to the extent possible with regard to the reporter of\nan incident. Further details of specific enforcement policies may be posted\nseparately.\n\nProject maintainers who do not follow or enforce the Code of Conduct in good\nfaith may face temporary or permanent repercussions as determined by other\nmembers of the project's leadership.\n\n## Attribution\n\nThis Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,\navailable at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html\n\n[homepage]: https://www.contributor-covenant.org\n"
  },
  {
    "path": "CONTRIBUTING.md",
    "content": "# Contributing\n\nBy submitting a Contribution this Project (terms defined below), you agree to the following Contributor License Agreement:\n\nThe following terms are used throughout this agreement:\n\n* You - the person or legal entity including its affiliates asked to accept this agreement. An affiliate is any entity that controls or is controlled by the legal entity, or is under common control with it.\n* Project - is an umbrella term that refers to any and all open source projects from Cruise LLC.\n* Contribution - any type of work that is submitted to a Project, including any modifications or additions to existing work.\n* Submitted - conveyed to a Project via a pull request, commit, issue, or any form of electronic, written, or verbal communication with Cruise LLC, contributors or maintainers.\n\n**1. Grant of Copyright License.**\n\nSubject to the terms and conditions of this agreement, You grant to the Projects’ maintainers, contributors, users and to Cruise LLC a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute Your contributions and such derivative works. Except for this license, You reserve all rights, title, and interest in your contributions.\n\n**2. 
Grant of Patent License.**\n\nSubject to the terms and conditions of this agreement, You grant to the Projects’ maintainers, contributors, users and to Cruise LLC a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer your contributions, where such license applies only to those patent claims licensable by you that are necessarily infringed by your contribution or by combination of your contribution with the project to which this contribution was submitted.\nIf any entity institutes patent litigation - including cross-claim or counterclaim in a lawsuit - against You alleging that your contribution or any project it was submitted to constitutes or is responsible for direct or contributory patent infringement, then any patent licenses granted to that entity under this agreement shall terminate as of the date such litigation is filed.\n\n**3. Source of Contribution.**\n\nYour contribution is either your original creation, based upon previous work that, to the best of your knowledge, is covered under an appropriate open source license and you have the right under that license to submit that work with modifications, whether created in whole or in part by you, or you have clearly identified the source of the contribution and any license or other restriction (like related patents, trademarks, and license agreements) of which you are personally aware.\n"
  },
  {
    "path": "Changelog.md",
    "content": "# Change Log\n<!---\nAlways update Version in Makefile\n-->\n\n## Unreleased\n\n## [v1.4.4] - 2022-10-24\n\n### Changed\n- updated Building.md\n- updated Readme.md\n- Scripts now get the full filepath as second argument (before it would pass `bash` now it will pass `/bin/bash`)\n\n### Fixed\n- Fix a bug where incorrect keys in checksec were silently skipped\n\n## [v1.4.3] - 2020-08-17\n\n### Changed\n- support older versions of checksec\n\n## [v1.4.2] - 2020-08-17\n\n### Added\n- checksec wrapper script, see [check_sec.sh](scripts/check_sec.sh) and [Checksec Wrapper Readme](Checksec.md)\n- link support for extfs, this requires `https://github.com/crmulliner/e2tools/tree/link_support` (or later)\n\n### Changed\n- updated `test/test.img.gz` ext2 test filesystem image\n- updated `test/e2cp` binary\n\n## [v1.4.1] - 2020-05-06\n\n### Fixed\n- removed `release/` folder\n- FileStatCheck for links\n- general handling for links\n\n## [v1.4.0] - 2020-04-30\n\n### Added\n- NEW support for Linux Capabilities\n- NEW Capability support for ext2/3/4 and squashfs\n- NEW Selinux support for SquashFS\n\n### Changed\n- _check.py_ cleaned up a bit, avoiding using `shell=True` in subprocess invocations.\n- updated linter version to v1.24\n- switch back to `-lls` for unsquashfs\n- copyright: GM Cruise -> Cruise\n\n### Fixed\n- FileTreeCheck LinkTarget handling\n\n## [v1.3.2] - 2020-01-15\n\n### Fixed\n- _check.py_ fix to support pathnames with spaces\n- _cpiofs_ fix date parsing\n- _cpiofs_ added work around for missing directory entries\n\n## [v1.3.1] - 2020-01-07\n\n### Fixed\n- report status in _check.py_\n- use quiet flag for _cpiofs_\n\n## [v1.3.0] - 2020-01-07\n\n### Added\n- NEW _cpiofs_ for cpio as filesystem\n- NEW universal _check.py_ (so you just need to write a custom unpacker)\n- NEW _android/unpack.sh_ (for _check.py_)\n- better options for scripts (FileContent and DataExtract)\n\n### Fixed\n- $PATH in makefile\n- FileContent file iterator\n- _squashfs_ 
username parsing\n\n## [v1.2.0] - 2019-11-19\n\n### Changed\n- moved to go 1.13\n- only store _current_file_treepath_ if filetree changed\n\n## [v.1.1.0] - 2019-10-15\n\n### Added\n- NEW FileCmp check for full file diff against 'old' version\n- allow multiple matches for regex based DataExtract\n\n### Fixed\n- squashfs username parsing\n\n## [v.1.0.1] - 2019-09-19\n\n### Fixed\n- filename for BadFiles check output\n\n## [v.1.0.0] - 2019-08-15\n\n### Added\n- CI\n- Build instructions\n\n## [initial] - 2019-08-05\n"
  },
  {
    "path": "Checksec.md",
    "content": "# checksec Integration\n\n[checksec](https://github.com/slimm609/checksec.sh) is a bash script for checking security properties of executables (like PIE, RELRO, Canaries, ...).\n\nChecksec is an incredible helpful tool therefore we developed a wrapper script for FwAnalyzer to ease the usage of checksec. Below\nwe go through the steps required to use checksec with FwAnalyzer.\n\n## Installation\n\nThe installation is rather simple. Clone the checksec repository and copy the `checksec` script to a directory in your PATH\nor add the directory containing `checksec` to your PATH.\n\n## Configuration\n\nConfiguration is done in two steps. First step is adding a `FileContent` check that uses the `Script` option.\nThe second step is creating the checksec wrapper configuration. The configuration allows you to selectively skip files\n(e.g. vendor binaries) and fine tune the security features that you want to enforce.\n\n### checksec wrapper configuration\n\nThe checksec wrapper has two options, and uses JSON:\n\n- cfg : checksec config, where you can select acceptable values for each field in the checksec output. The key is the name of the checksec field and the value is an array where each item is an acceptable value (e.g. allow `full` and `partial` RELRO). Omitted fields are not checked.\n- skip : array of fully qualified filenames that should be not checked\n\nexample config:\n```json\n{\n  \"cfg\":\n  {\n    \"pie\": [\"yes\"],\n    \"nx\": [\"yes\"],\n    \"relro\": [\"full\", \"partial\"]\n  },\n  \"skip\": [\"/usr/bin/bla\",\"/bin/blabla\"]\n}\n```\n\n### FwAnalyzer configuration\n\nThe FwAnalyzer configuration uses the checksec wrapper config and looks like in the example below.\nWe define a `FileContent` check and select `/usr/bin` as the target directory.\nThe name of the wrapper script is `check_sec.sh`.\nWe pass two options to the script. 
First argument `*` selects all files in `/usr/bin` and\nthe second argument is the checksec wrapper config we created above.\n\nexample config:\n```ini\n[FileContent.\"checksec_usr_bin\"]\nFile = \"/usr/bin\"\nScript = \"check_sec.sh\"\nScriptOptions = [\"*\",\n\"\"\"\n{\n\"cfg\":{\n  \"pie\": [\"yes\"],\n  \"nx\": [\"yes\"],\n  \"relro\": [\"full\", \"partial\"]\n },\n \"skip\": [\"/usr/bin/bla\",\"/bin/blabla\"]\n}\n\"\"\"]\n```\n\n\n### Example Output\n\n```json\n\"offenders\": {\n  \"/usr/bin/example\": [\n  {\n    \"canary\": \"no\",\n    \"fortified\": \"0\",\n    \"fortify-able\": \"24\",\n    \"fortify_source\": \"no\",\n    \"nx\": \"yes\",\n    \"pie\": \"no\",\n    \"relro\": \"partial\",\n    \"rpath\": \"no\",\n    \"runpath\": \"no\",\n    \"symbols\": \"no\"\n  }\n  ]\n}\n```\n"
  },
  {
    "path": "Dockerfile",
    "content": "FROM golang:1.13\n\nRUN apt update && apt -y install e2tools mtools file squashfs-tools unzip python-setuptools python-lzo cpio sudo\nRUN wget https://github.com/crmulliner/ubi_reader/archive/master.zip -O ubireader.zip && unzip ubireader.zip && cd ubi_reader-master && python setup.py install\n\nWORKDIR $GOPATH/src/github.com/cruise-automation/fwanalyzer\n\nCOPY . ./\n\nRUN make deps\n"
  },
  {
    "path": "LICENSE",
    "content": "\n                                 Apache License\n                           Version 2.0, January 2004\n                        http://www.apache.org/licenses/\n\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n   1. Definitions.\n\n      \"License\" shall mean the terms and conditions for use, reproduction,\n      and distribution as defined by Sections 1 through 9 of this document.\n\n      \"Licensor\" shall mean the copyright owner or entity authorized by\n      the copyright owner that is granting the License.\n\n      \"Legal Entity\" shall mean the union of the acting entity and all\n      other entities that control, are controlled by, or are under common\n      control with that entity. For the purposes of this definition,\n      \"control\" means (i) the power, direct or indirect, to cause the\n      direction or management of such entity, whether by contract or\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\n      outstanding shares, or (iii) beneficial ownership of such entity.\n\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\n      exercising permissions granted by this License.\n\n      \"Source\" form shall mean the preferred form for making modifications,\n      including but not limited to software source code, documentation\n      source, and configuration files.\n\n      \"Object\" form shall mean any form resulting from mechanical\n      transformation or translation of a Source form, including but\n      not limited to compiled object code, generated documentation,\n      and conversions to other media types.\n\n      \"Work\" shall mean the work of authorship, whether in Source or\n      Object form, made available under the License, as indicated by a\n      copyright notice that is included in or attached to the work\n      (an example is provided in the Appendix below).\n\n      \"Derivative Works\" shall mean any work, whether in Source or Object\n      
form, that is based on (or derived from) the Work and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship. For the purposes\n      of this License, Derivative Works shall not include works that remain\n      separable from, or merely link (or bind by name) to the interfaces of,\n      the Work and Derivative Works thereof.\n\n      \"Contribution\" shall mean any work of authorship, including\n      the original version of the Work and any modifications or additions\n      to that Work or Derivative Works thereof, that is intentionally\n      submitted to Licensor for inclusion in the Work by the copyright owner\n      or by an individual or Legal Entity authorized to submit on behalf of\n      the copyright owner. For the purposes of this definition, \"submitted\"\n      means any form of electronic, verbal, or written communication sent\n      to the Licensor or its representatives, including but not limited to\n      communication on electronic mailing lists, source code control systems,\n      and issue tracking systems that are managed by, or on behalf of, the\n      Licensor for the purpose of discussing and improving the Work, but\n      excluding communication that is conspicuously marked or otherwise\n      designated in writing by the copyright owner as \"Not a Contribution.\"\n\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\n      on behalf of whom a Contribution has been received by Licensor and\n      subsequently incorporated within the Work.\n\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      copyright license to reproduce, prepare Derivative Works of,\n      publicly display, publicly perform, sublicense, and distribute the\n      Work and such Derivative Works in Source or Object form.\n\n   3. Grant of Patent License. Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      (except as stated in this section) patent license to make, have made,\n      use, offer to sell, sell, import, and otherwise transfer the Work,\n      where such license applies only to those patent claims licensable\n      by such Contributor that are necessarily infringed by their\n      Contribution(s) alone or by combination of their Contribution(s)\n      with the Work to which such Contribution(s) was submitted. If You\n      institute patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\n      or a Contribution incorporated within the Work constitutes direct\n      or contributory patent infringement, then any patent licenses\n      granted to You under this License for that Work shall terminate\n      as of the date such litigation is filed.\n\n   4. Redistribution. 
You may reproduce and distribute copies of the\n      Work or Derivative Works thereof in any medium, with or without\n      modifications, and in Source or Object form, provided that You\n      meet the following conditions:\n\n      (a) You must give any other recipients of the Work or\n          Derivative Works a copy of this License; and\n\n      (b) You must cause any modified files to carry prominent notices\n          stating that You changed the files; and\n\n      (c) You must retain, in the Source form of any Derivative Works\n          that You distribute, all copyright, patent, trademark, and\n          attribution notices from the Source form of the Work,\n          excluding those notices that do not pertain to any part of\n          the Derivative Works; and\n\n      (d) If the Work includes a \"NOTICE\" text file as part of its\n          distribution, then any Derivative Works that You distribute must\n          include a readable copy of the attribution notices contained\n          within such NOTICE file, excluding those notices that do not\n          pertain to any part of the Derivative Works, in at least one\n          of the following places: within a NOTICE text file distributed\n          as part of the Derivative Works; within the Source form or\n          documentation, if provided along with the Derivative Works; or,\n          within a display generated by the Derivative Works, if and\n          wherever such third-party notices normally appear. The contents\n          of the NOTICE file are for informational purposes only and\n          do not modify the License. 
You may add Your own attribution\n          notices within Derivative Works that You distribute, alongside\n          or as an addendum to the NOTICE text from the Work, provided\n          that such additional attribution notices cannot be construed\n          as modifying the License.\n\n      You may add Your own copyright statement to Your modifications and\n      may provide additional or different license terms and conditions\n      for use, reproduction, or distribution of Your modifications, or\n      for any such Derivative Works as a whole, provided Your use,\n      reproduction, and distribution of the Work otherwise complies with\n      the conditions stated in this License.\n\n   5. Submission of Contributions. Unless You explicitly state otherwise,\n      any Contribution intentionally submitted for inclusion in the Work\n      by You to the Licensor shall be under the terms and conditions of\n      this License, without any additional terms or conditions.\n      Notwithstanding the above, nothing herein shall supersede or modify\n      the terms of any separate license agreement you may have executed\n      with Licensor regarding such Contributions.\n\n   6. Trademarks. This License does not grant permission to use the trade\n      names, trademarks, service marks, or product names of the Licensor,\n      except as required for reasonable and customary use in describing the\n      origin of the Work and reproducing the content of the NOTICE file.\n\n   7. Disclaimer of Warranty. Unless required by applicable law or\n      agreed to in writing, Licensor provides the Work (and each\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n      implied, including, without limitation, any warranties or conditions\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\n      appropriateness of using or redistributing the Work and assume any\n      risks associated with Your exercise of permissions under this License.\n\n   8. Limitation of Liability. In no event and under no legal theory,\n      whether in tort (including negligence), contract, or otherwise,\n      unless required by applicable law (such as deliberate and grossly\n      negligent acts) or agreed to in writing, shall any Contributor be\n      liable to You for damages, including any direct, indirect, special,\n      incidental, or consequential damages of any character arising as a\n      result of this License or out of the use or inability to use the\n      Work (including but not limited to damages for loss of goodwill,\n      work stoppage, computer failure or malfunction, or any and all\n      other commercial damages or losses), even if such Contributor\n      has been advised of the possibility of such damages.\n\n   9. Accepting Warranty or Additional Liability. While redistributing\n      the Work or Derivative Works thereof, You may choose to offer,\n      and charge a fee for, acceptance of support, warranty, indemnity,\n      or other liability obligations and/or rights consistent with this\n      License. However, in accepting such obligations, You may act only\n      on Your own behalf and on Your sole responsibility, not on behalf\n      of any other Contributor, and only if You agree to indemnify,\n      defend, and hold each Contributor harmless for any liability\n      incurred by, or claims asserted against, such Contributor by reason\n      of your accepting any such warranty or additional liability.\n\n   END OF TERMS AND CONDITIONS\n\n   APPENDIX: How to apply the Apache License to your work.\n\n      To apply the Apache License to your work, attach the following\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\n      replaced with your own identifying information. 
(Don't include\n      the brackets!)  The text should be enclosed in the appropriate\n      comment syntax for the file format. We also recommend that a\n      file or class name and description of purpose be included on the\n      same \"printed page\" as the copyright notice for easier\n      identification within third-party archives.\n\n   Copyright [yyyy] [name of copyright owner]\n\n   Licensed under the Apache License, Version 2.0 (the \"License\");\n   you may not use this file except in compliance with the License.\n   You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n   Unless required by applicable law or agreed to in writing, software\n   distributed under the License is distributed on an \"AS IS\" BASIS,\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n   See the License for the specific language governing permissions and\n   limitations under the License.\n"
  },
  {
    "path": "Makefile",
    "content": ".PHONY: build\n\nifeq ($(GOOS),)\nGOOS := \"linux\"\nendif\n\nVERSION=1.4.1\n\nPWD := $(shell pwd)\n\nall: build\n\n.PHONY: build\nbuild:\n\tgo mod verify\n\tmkdir -p build\n\tGOOS=$(GOOS) go build -a -ldflags '-w -s' -o build/fwanalyzer ./cmd/fwanalyzer\n\n.PHONY: release\nrelease: build\n\tmkdir -p release\n\tcp build/fwanalyzer release/fwanalyzer-$(VERSION)-linux-amd64\n\n.PHONY: testsetup\ntestsetup:\n\tgunzip -c test/test.img.gz >test/test.img\n\tgunzip -c test/ubifs.img.gz >test/ubifs.img\n\tgunzip -c test/cap_ext2.img.gz >test/cap_ext2.img\n\tsudo setcap cap_net_admin+p test/test.cap.file\n\tgetcap test/test.cap.file\n\n.PHONY: test\ntest:\n\tPATH=\"$(PWD)/scripts:$(PWD)/test:$(PATH)\" go test -count=3 -cover ./...\n\n.PHONY: integration-test\nintegration-test: build\n\tPATH=\"$(PWD)/scripts:$(PWD)/test:$(PWD)/build:$(PATH)\" ./test/test.py\n\n.PHONY: ci-tests\nci-tests: build test integration-test\n\techo \"done\"\n\n.PHONY: modules\nmodules:\n\tgo mod tidy\n\n.PHONY: deploy\ndeploy: build\n\n.PHONY: clean\nclean:\n\trm -rf build\n\n.PHONY: distclean\ndistclean: clean\n\trm -rf vendor\n\n.PHONY: deps\ndeps:\n\tgo mod download\n"
  },
  {
    "path": "Readme.md",
    "content": "# FwAnalyzer (Firmware Analyzer)\n\n[![CircleCI](https://circleci.com/gh/cruise-automation/fwanalyzer.svg?style=shield)](https://circleci.com/gh/cruise-automation/fwanalyzer)\n\n\nFwAnalyzer is a tool to analyze (ext2/3/4), FAT/VFat, SquashFS, UBIFS filesystem images,\ncpio archives, and directory content using a set of configurable rules.\nFwAnalyzer relies on [e2tools](https://github.com/crmulliner/e2tools/) for ext filesystems,\n[mtools](https://www.gnu.org/software/mtools/) for FAT filesystems,\n[squashfs-tools](https://github.com/plougher/squashfs-tools) for SquashFS filesystems, and\n[ubi_reader](https://github.com/crmulliner/ubi_reader) for UBIFS filesystems.\n[cpio](https://www.gnu.org/software/cpio/) for cpio archives.\nSELinux/Capability support for ext2/3/4 images requires a patched version of [e2tools](https://github.com/crmulliner/e2tools/).\nSELinux/Capability support for SquashFS images requires a patched version of [squashfs-tools](https://github.com/crmulliner/squashfs-tools/).\n\n![fwanalyzer](images/fwanalyzer.png)\n\n## Overview\n\nThe main idea of **FwAnalyzer** is to provide a tool for rapid analysis of\nfilesystem images as part of a firmware security Q&A check suite. FwAnalyzer\ntakes a configuration file that defines various rules for files and directories\nand runs the configured checks against a given filesystem image. The output of\nFwAnalyzer is a report, which contains the list of files that violate any of\nthe rules specified in the configuration. The report further contains meta\ninformation about the filesystem image and, if configured, information\nextracted from files within the analyzed filesystem. 
The report is formatted\nusing JSON so it can be easily integrated as a step in a larger analysis.\n\nExample report:\n\n```json\n{\n    \"fs_type\": \"extfs\",\n    \"image_digest\": \"9d5fd9acc98421b46976f283175cc438cf549bb0607a1bca6e881d3e7f323794\",\n    \"image_name\": \"test/test.img\",\n    \"current_file_tree_path\": \"test/oldtree.json.new\",\n    \"old_file_tree_path\": \"test/oldtree.json\",\n    \"data\": {\n        \"Version\": \"1.2.3\",\n        \"date1 file\": \"Mon Oct  1 16:13:05 EDT 2018\\n\"\n    },\n    \"informational\": {\n        \"/bin\": [\n                \"CheckFileTree: new file: 40755 1001:1001 1024 0 SeLinux label: -\"\n        ],\n    },\n    \"offenders\": {\n        \"/bin/elf_arm32\": [\n                \"script(check_file_elf_stripped.sh) returned=elf_arm32 is not stripped\"\n        ],\n        \"/file1\": [\n                \"File not allowed\"\n        ],\n        \"/file2\": [\n                \"File is WorldWriteable, not allowed\",\n                \"File Uid not allowed, Uid = 123\"\n        ],\n    }\n}\n```\n\n## Building and Development\n\nFollow the steps described in [Building](Building.md) to install all\nrequirements and build FwAnalyzer.\n\n## Using FwAnalyzer\n\nCommand line options\n- `-cfg`         : string, path to the config file\n- `-cfgpath`     : string, path to config file and included files (can be repeated)\n- `-in`          : string, filesystem image file or path to directory\n- `-out`         : string, output report to file or stdout using '-'\n- `-extra`       : string, overwrite directory to read extra data from (e.g. 
filetree, filecmp)\n- `-ee`          : exit with error if offenders are present\n- `-invertMatch` : invert regex matches (for testing)\n\nExample:\n```sh\nfwanalyzer -cfg system_fwa.toml -in system.img -out system_check_output.json\n```\n\nExample for using custom scripts stored in the _scripts/_ directory:\n```sh\nPATH=$PATH:./scripts fwanalyzer -cfg system_fwa.toml -in system.img -out system_check_output.json\n```\n\nThe [_devices/_](devices/) folder contains helper scripts for unpacking and\ndealing with specific device types and firmware package formats such as\n[Android](devices/android). It also includes general configuration files that\ncan be included in target-specific FwAnalyzer configurations.\n\n_check.py_ in the [_devices/_](devices) folder provides a universal script to\neffectively use FwAnalyzer; see [devices/Readme.md](devices/Readme.md) for\ndetails. This is likely how most people will invoke FwAnalyzer.\n\nThe [_scripts/_](scripts/) folder contains helper scripts that can be called\nfrom FwAnalyzer for file content analysis and data extraction. The most interesting\nis our checksec wrapper [_check_sec.sh_](scripts/check_sec.sh); see the\n[Checksec Wrapper Readme](Checksec.md).\n\n## Config Options\n\n### Global Config\n\nThe global config is used to define some general parameters.\n\nThe `FsType` (filesystem type) field selects the backend that is used to access\nthe files in the image. 
The supported options for FsType are:\n\n- `dirfs`: to read files from a directory on the host running fwanalyzer, supports Capabilities (supported FsTypeOptions are: N/A)\n- `extfs`: to read ext2/3/4 filesystem images (supported FsTypeOptions are: `selinux` and `capabilities`)\n- `squashfs`: to read SquashFS filesystem images (supported FsTypeOptions are: `securityinfo`)\n- `ubifs`: to read UBIFS filesystem images (supported FsTypeOptions are: N/A)\n- `vfatfs`: to read VFat filesystem images (supported FsTypeOptions are: N/A)\n- `cpiofs`: to read cpio archives (supported FsTypeOptions are: `fixdirs`)\n\nThe FsTypeOptions allow tuning of the FsType driver:\n- `securityinfo`: will enable selinux and capability support for SquashFS images\n- `capabilities`: will enable capability support when reading ext filesystem images\n- `selinux`: will enable selinux support when reading ext filesystem images\n- `fixdirs`: will attempt to work around a cpio issue where a file exists in a directory while there is no entry for the directory itself\n\nThe `DigestImage` option will generate a SHA-256 digest of the filesystem image\nthat was analyzed; the digest will be included in the output.\n\nExample:\n```toml\n[GlobalConfig]\nFsType        = \"extfs\"\nFsTypeOptions = \"selinux\"\nDigestImage   = true\n```\n\nExample Output:\n```json\n\"fs_type\": \"extfs\",\n\"image_digest\": \"9d5fd9acc98421b46976f283175cc438cf549bb0607a1bca6e881d3e7f323794\",\n\"image_name\": \"test/test.img\",\n```\n\n### Include\n\nThe `Include` statement is used to include other FwAnalyzer configuration files\ninto the configuration containing the statement. 
The `Include` statement can\nappear in any part of the configuration.\n\nThe `-cfgpath` parameter sets the search path for include files.\n\nExample:\n```toml\n[Include.\"fw_base.toml\"]\n```\n\n### Global File Checks\n\nThe `GlobalFileChecks` are more general checks that are applied to the entire filesystem.\n- `Suid`: bool, (optional) if enabled the analysis will fail if any file has the setuid bit set (default: false)\n- `SuidAllowedList`: string array, (optional) allows Suid files (by full path) for the Suid check\n- `WorldWrite`: bool, (optional) if enabled the analysis will fail if any file can be written to by any user (default: false)\n- `SELinuxLabel`: bool, (optional) if enabled the analysis will fail if a file does NOT have an SeLinux label\n- `Uids`: int array, (optional) specifies every allowed UID in the system, every file needs to be owned by a Uid specified in this list\n- `Gids`: int array, (optional) specifies every allowed GID in the system, every file needs to be owned by a Gid specified in this list\n- `BadFiles`: string array, (optional) specifies a list of unwanted files, allows wildcards such as `?`, `*`, and `**` (no file in this list should exist)\n- `BadFilesInformationalOnly`: bool, (optional) the result of the BadFiles check will be Informational only (default: false)\n- `FlagCapabilityInformationalOnly`: bool, (optional) flag files for having a Capability set as Informational (default: false)\n\nExample:\n```toml\n[GlobalFileChecks]\nSuid          = true\nSuidAllowedList = [\"/bin/sudo\"]\nSELinuxLabel  = false\nWorldWrite    = true\nUids          = [0,1001,1002]\nGids          = [0,1001,1002]\nBadFiles      = [\"/file99\", \"/file1\", \"*.h\"]\n```\n\nExample Output:\n```json\n\"offenders\": {\n  \"/bin/su\": [ \"File is SUID, not allowed\" ],\n  \"/file1\":  [ \"File Uid not allowed, Uid = 123\" ],\n  \"/world\":  [ \"File is WorldWriteable, not allowed\" ]\n}\n```\n\n### Link Handling\n\nWith links we refer to symbolic (soft) links. 
Links can point to files on a different\nfilesystem; therefore, we handle them in a special way. Link handling requires\na patched version of e2tools:\n\n- [e2tools](https://github.com/crmulliner/e2tools/tree/link_support) with link support\n\n`FileStatCheck` handles links as you would expect. However, if\n`AllowEmpty` is `false` and the file is a link, the check fails.\n\nAll other checks, as well as `DataExtract`, will fail if the file is a link. Those checks\nneed to be pointed at the actual file (the file the link points to).\n\n### File Stat Check\n\nThe `FileStatCheck` can be used to model the metadata for a specific file or\ndirectory. Any deviation from the configuration will be reported as an offender.\n\n- `AllowEmpty`: bool, (optional) defines whether the file may have zero size; will\n  cause an error if the file is a link (default: false)\n- `Uid`: int, (optional) specifies the UID of the file, not specifying a UID or\n  specifying -1 will skip the check\n- `Gid`: int, (optional) specifies the GID of the file, not specifying a GID or\n  specifying -1 will skip the check\n- `Mode`: string, (optional) specifies the UN*X file mode/permissions in octal,\n  not specifying a mode will skip the check\n- `SELinuxLabel`: string, (optional) the SELinux label of the file (will skip\n  the check if not set)\n- `LinkTarget`: string, (optional) the target of a symlink, not specifying a\n  link target will skip the check. 
This is currently supported for `dirfs`,\n  `squashfs`, `cpiofs`, `ubifs`, and `extfs` filesystems.\n- `Capability`: string array, (optional) list of capabilities (e.g.\n  cap_net_admin+p).\n- `Desc`: string, (optional) is a descriptive string that will be attached to\n  the report if there is a failed check\n- `InformationalOnly`: bool, (optional) the result of the check will be\n  Informational only (default: false)\n\nExample:\n```toml\n[FileStatCheck.\"/etc/passwd\"]\nAllowEmpty = false\nUid        = 0\nGid        = 0\nMode       = \"0644\"\nDesc       = \"this needs to be this way\"\n```\n\nExample Output:\n```json\n\"offenders\": {\n  \"/file2\": [ \"File State Check failed: size: 0 AllowEmpty=false : this needs to be this way\" ]\n}\n```\n\n### File Path Owner Check\n\nThe `FilePathOwner` check can be used to model the file/directory ownership for\nan entire tree of the filesystem. The check fails if any file or directory\nwithin the given directory is not owned by the specified `Uid` and `Gid` (type:\nint).\n\nExample:\n```toml\n[FilePathOwner.\"/bin\"]\nUid = 0\nGid = 0\n```\n\nExample Output:\n```json\n\"offenders\": {\n  \"/dir1/file3\": [ \"FilePathOwner Uid not allowed, Uid = 1002 should be = 0\",\n                   \"FilePathOwner Gid not allowed, Gid = 1002 should be = 0\" ]\n}\n```\n\n### File Content Check\n\nThe `FileContent` check allows inspecting the content of files. The content of\na file can be checked using four different methods. 
The file content check can be\nrun in non-enforcement mode by setting `InformationalOnly` to true (default is false).\nInformationalOnly checks will produce an informational element in place of an\noffender.\n\n#### Example: Regular Expression on entire file body\n\n- `File`: string, the full path of the file\n- `RegEx`: string, posix/golang regular expression\n- `RegExLineByLine`: bool, (optional) apply the regex on a line-by-line basis,\n  the matching line will be included in the result (default: false)\n- `Match`: bool, (optional) indicates whether the regular expression should match or\n  not match (default: false)\n- `Desc`: string, (optional) is a descriptive string that will be attached to a\n  failed check\n- `InformationalOnly`: bool, (optional) the result of the check will be\n  Informational only (default: false)\n\nExample:\n```toml\n[FileContent.\"RegExTest1\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile  = \"/etc/version\"\n```\n\n#### Example: SHA-256 digest calculated over the file body\n\n- `File`: string, the full path of the file\n- `Digest`: string, HEX encoded digest\n- `Desc`: string, (optional) is a descriptive string that will be attached to a\n  failed check\n- `InformationalOnly`: bool, (optional) the result of the check will be\n  Informational only\n\nExample:\n```toml\n[FileContent.\"DigestTest1\"]\nDigest = \"8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd4\"\nFile   = \"/ver\"\n```\n\nExample Output:\n\n```json\n\"offenders\": {\n  \"/ver\": [ \"Digest (sha256) did not match found = 44c77e41961f354f515e4081b12619fdb15829660acaa5d7438c66fc3d326df3 should be = 8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd4.\" ]\n}\n```\n\n#### Example: Run an external script passing the filename to the script\n\nThe file is extracted into a temp directory with a temp name before the script\nis executed. 
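\n\nFor illustration, a minimal check script could look like the following (a hypothetical sketch; the telnetd check is an assumption and not part of FwAnalyzer):\n\n```sh\n#!/bin/sh\n# called as: <tmp filename> <original filename> <uid> <gid> <mode> <selinux label>\n# any output printed here marks the file as an offender\nif grep -q \"telnetd\" \"$1\"; then\n    echo \"$2 contains a reference to telnetd\"\nfi\n```\n\n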
The check produces an offender if the script produces output on\nstdout or stderr.\n\n- `File`: string, the full path of the file or directory; if the path points to a directory,\n  the script is run for every file in the directory and its subdirectories\n- `Script`: string, the full path of the script\n- `ScriptOptions`: string array, (optional) the first element can define a pattern\n  containing wildcards such as `?`, `*`, and `**` that is applied to\n  filenames; if present, only files that match the pattern are checked (this is\n  mostly useful when running the script on a directory). Arguments can be passed\n  to the script using the second and following elements.\n- `Desc`: string, (optional) is a descriptive string that will be attached to a\n  failed check\n- `InformationalOnly`: bool, (optional) the result of the check will be\n  Informational only (default: false)\n\nIf `--` is present it indicates that the following arguments come from\n`ScriptOptions[1..N]`. The script is run with the following arguments:\n\n```\n<tmp filename> <original filename (fullpath)> <uid> <gid> <mode in octal> <selinux label or \"-\" for no label> [--] [script argument 1] ... [script argument N]\n```\n\nExample:\n```toml\n[FileContent.\"ScriptTest1\"]\nScript = \"check_file_x8664.sh\"\nFile   = \"/bin\"\n```\n\nExample Output:\n```json\n\"offenders\": {\n  \"/bin/elf_arm32\": [ \"script(check_file_x8664.sh) returned=elf_arm32 not a x86-64 elf file\" ]\n}\n```\n\n#### Json Field Compare\n\n- `File`: string, the full path of the file\n- `Json`: string, the field name using the dot (.) notation to access a field\n  within an object, with a colon (:) separating the required value. All types\n  will be converted to string and compared as a string. 
Json arrays can be\n  indexed by supplying the index instead of a field name.\n- `Desc`: string, (optional) is a descriptive string that will be attached to a\n  failed check\n- `InformationalOnly`: bool, (optional) the result of the check will be\n  Informational only (default: false)\n\nExample:\n```toml\n[FileContent.\"System_Arch\"]\nJson = \"System.Arch:arm64\"\nFile = \"/system.json\"\nDesc = \"arch test\"\n```\n\nExample Input:\n```json\n{\n  \"System\": {\n    \"Version\": 7,\n    \"Arch\": \"arm32\",\n    \"Info\": \"customized\"\n  }\n}\n```\n\nExample Output:\n```json\n\"offenders\": {\n  \"/system.json\": [ \"Json field System.Arch = arm32 did not match = arm64, System.Arch, arch test\" ]\n}\n```\n\n### File Compare Check\n\nThe `FileCmp` (File Compare) check is a mechanism to compare a file from a\nprevious run with the file from the current run. The main idea behind this\ncheck is to provide more insight into file changes, since it allows comparing\ntwo versions of a file rather than comparing only a digest.\n\nThis works by saving the file as the `OldFilePath` (if it does not exist) and\nskipping the check on the first run. In subsequent runs the current file and\nthe saved old file will be copied to a temp directory. The script will be\nexecuted passing the original filename, the path to the old file, and the path\nto the current file as arguments. If the script prints any output, the check will be\nmarked as failed.\n\n- `File`: string, the full path of the file\n- `Script`: string, path to the script\n- `ScriptOptions`: string array, (optional) arguments passed to the script\n- `OldFilePath`: string, filename (absolute or relative) to use to store the old file\n- `InformationalOnly`: bool, (optional) the result of the check will be Informational only (default: false)\n\nScript runs as:\n```sh\nscript.sh <OrigFilename> <oldFile> <newFile> [--] [argument 1] .. 
[argument N]\n```\n\nExample:\n```toml\n[FileCmp.\"test.txt\"]\nFile = \"/test.txt\"\nScript = \"diff.sh\"\nOldFilePath = \"test.txt\"\nInformationalOnly = true\n```\n\n### File Tree Check\n\nThe `FileTree` check generates a full filesystem tree (a list of every file and directory) and compares it with a previously saved file tree. The check will produce an informational output listing new files, deleted files, and modified files.\n\n`CheckPath` (string array) specifies the paths that should be included in the check. If `CheckPath` is not set it will behave as if it was set to `[\"/\"]` and will include the entire filesystem. If `CheckPath` is set to `[]` it will generate the file tree but will not check any files.\n\n`OldTreeFilePath` specifies the filename to read the old filetree from; if a new filetree is generated (e.g. because the old filetree does not exist yet),\nthe newly generated filetree file is `OldTreeFilePath` with \".new\" appended to it.\n\nThe `OldTreeFilePath` is relative to the configuration file. This means that for '-cfg testdir/test.toml' with OldTreeFilePath = \"test.json\" fwanalyzer will\ntry to read 'testdir/test.json'. The `-extra` command line option can be used to overwrite the path: '-cfg testdir/test.toml -extra test1' will try to\nread 'test1/test.json'. 
Similarly, the newly generated filetree file will be stored in the same directory.\n\nThe file modification check can be customized with:\n\n- `CheckPermsOwnerChange`: bool, (optional) will tag a file as modified if the owner or permissions (mode) changed (default: false)\n- `CheckFileSize`: bool, (optional) will tag a file as modified if the size changed (default: false)\n- `CheckFileDigest`: bool, (optional) will tag a file as modified if the content changed (comparing its SHA-256 digest) (default: false)\n- `SkipFileDigest`: bool, (optional) skip calculating the file digest (useful for dealing with very big files, default: false)\n\nExample:\n```toml\n[FileTreeCheck]\nOldTreeFilePath       = \"testtree.json\"\nCheckPath             = [ \"/etc\", \"/bin\" ]\nCheckPermsOwnerChange = true\nCheckFileSize         = true\nCheckFileDigest       = false\n```\n\nExample Output:\n```json\n\"informational\": {\n    \"/bin/bla\": [ \"CheckFileTree: new file: 40755 1001:1001 1024 0 SeLinux label: -\" ]\n}\n```\n\n### Directory Content Check\n\nThe `DirContent` (directory content) check specifies a set of files that are\nallowed to be, or required to be, in a specified directory. Any other file or\ndirectory found in that directory will be reported as an offender. If an\n`Allowed` file isn't found, the check will pass. If a `Required` file is not\nfound, it will be reported as an offender.\n\nThe file entries can contain wildcards like `?`, `*`, and `**`. The allowed patterns are described in\nthe [golang documentation](https://golang.org/pkg/path/filepath/#Match).\n\nOnly one `DirContent` entry can exist per directory.\n\nExample:\n```toml\n[DirContent.\"/home\"]\nAllowed = [\"collin\", \"jon\"]\nRequired = [\"chris\"]\n```\n\n### Data Extract\n\nThe `DataExtract` option allows extracting data from a file and including it in\nthe report.  Data can be extracted via regular expression, by running an\nexternal script, or by reading a JSON object. 
The extracted data can later be\nused by the post-processing script.\n\nThe Data Extract functionality adds the data to the report as a map of\nkey:value pairs.  The key is defined as the name of the statement or by the\noptional Name parameter.  The value is the result of the regular expression or\nthe output of the script.\n\n#### Example: Regular expression based data extraction\n\nThe output generated by the regular expression will be stored as the value for\nthe name of this statement; the example below is named \"Version\".\n\n- `File`: string, the full path of the file\n- `RegEx`: string, regular expression with one matching field\n- `Name`: string, (optional) the key name\n- `Desc`: string, (optional) description\n\nExample:\n\nThe key \"Version\" will contain the output of the regular expression.\n```toml\n[DataExtract.\"Version\"]\nFile   = \"/etc/versions\"\nRegEx  = \".*Ver=(.+)\\n\"\nDesc   = \"Ver 1337 test\"\n```\n\nExample Output:\n```json\n\"data\": {\n  \"Version\": \"1.2.3\"\n}\n```\n\n#### Example: Script-based data extraction\n\nThe output generated by the script will be stored as the value for the name of\nthis statement; the example below is named \"scripttest\".\n\n- `File`: string, the full path of the file\n- `Script`: string, the full path of the script\n- `ScriptOptions`: string array, (optional) arguments to pass to the script\n- `Name`: string, (optional) the key name\n- `Desc`: string, (optional) description\n\nThe script is run with the following arguments:\n\n```\n<tmp filename> <original filename (fullpath)> <uid> <gid> <mode in octal> <selinux label or \"-\" for no label> [--] [script argument 1] ... [script argument N]\n```\n\nExample:\n\nThe key \"script_test\" will contain the output of the script. 
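\n\nOne possible implementation of such an extraction script (a hypothetical sketch; this script is not shipped with FwAnalyzer) simply prints the extracted value to stdout:\n\n```sh\n#!/bin/sh\n# called as: <tmp filename> <original filename> <uid> <gid> <mode> <selinux label>\n# whatever is printed to stdout becomes the extracted value\ntail -n 1 \"$1\"\n```\n\n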
The name of this\nstatement is \"scripttest\".\n\n```toml\n[DataExtract.scripttest]\nFile   = \"/etc/somefile\"\nScript = \"extractscripttest.sh\"\nName   = \"script_test\"\n```\n\nExample Output:\n\n```json\n\"data\": {\n  \"script_test\": \"some data\"\n}\n```\n\n#### Example: JSON data extraction\n\nThe value of the extracted JSON field will be stored as the value for\nthe name of this statement; the example below is named \"OS_Info\".\n\n- `File`: string, the full path of the file\n- `Json`: string, the field name using the dot (.) notation to access a field\n  within an object\n- `Name`: string, (optional) the key name\n- `Desc`: string, (optional) description\n\nExample:\n\nThe key \"OSinfo\" will contain the content of the Info field from the System\nobject from _/etc/os_version.json_ below.\n\n```json\n{\n  \"System\": {\n    \"Version\": 7,\n    \"Arch\": \"arm32\",\n    \"Info\": \"customized\"\n  }\n}\n```\n\n```toml\n[DataExtract.OS_Info]\nFile   = \"/etc/os_version.json\"\nJson   = \"System.Info\"\nName   = \"OSinfo\"\n```\n\nExample Output:\n```json\n\"data\": {\n  \"OSinfo\": \"customized\"\n}\n```\n\nJson arrays can be indexed by supplying the index instead of a field name.\n\n#### Example: Advanced usage\n\nThe `DataExtract` statement allows multiple entries with the same Name (the\nsame key).  This can be useful for configuring multiple ways to extract the\nsame information.  The first data extract statement that produces valid output\nwill set the value for the given key.  This is supported for regular\nexpressions, scripts, and a mixture of both.\n\nThe example below shows two statements that will both create the key-value pair\nfor the key \"Version\".  
If \"1\" does not produce valid output the next one is\ntried, in this case \"2\".\n\nExample:\n\n```toml\n[DataExtract.\"1\"]\nFile  = \"/etc/versions\"\nRegEx = \".*Ver=(.+)\\n\"\nName  = \"Version\"\n\n[DataExtract.\"2\"]\nFile  = \"/etc/OSVersion\"\nRegEx = \".*OS Version: (.+)\\n\"\nName  = \"Version\"\n```\n\n# License\n\nCopyright 2019-present, Cruise LLC\n\nLicensed under the [Apache License Version 2.0](LICENSE) (the \"License\");\nyou may not use this project except in compliance with the License.\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n# Contributions\n\nContributions are welcome! Please see the agreement for contributions in\n[CONTRIBUTING.md](CONTRIBUTING.md).\n\nCommits must be made with a Sign-off (`git commit -s`) certifying that you\nagree to the provisions in [CONTRIBUTING.md](CONTRIBUTING.md).\n"
  },
  {
    "path": "cmd/fwanalyzer/fwanalyzer.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage main\n\nimport (\n\t\"flag\"\n\t\"fmt\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"path\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/dataextract\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/dircontent\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/filecmp\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/filecontent\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/filepathowner\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/filestatcheck\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/filetree\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer/globalfilechecks\"\n)\n\nfunc readFileWithCfgPath(filepath string, cfgpath []string) (string, error) {\n\tfor _, cp := range cfgpath {\n\t\tdata, err := ioutil.ReadFile(path.Join(cp, filepath))\n\t\tif err == nil {\n\t\t\treturn string(data), nil\n\t\t}\n\t}\n\tdata, err := ioutil.ReadFile(filepath)\n\treturn string(data), err\n}\n\n// read config file and parse Include statement reading all config files that are included\nfunc readConfig(filepath string, cfgpath []string) (string, error) {\n\tcfg := \"\"\n\tcfgBytes, err := readFileWithCfgPath(filepath, cfgpath)\n\tcfg = string(cfgBytes)\n\tif err != nil {\n\t\treturn cfg, 
err\n\t}\n\n\ttype includeCfg struct {\n\t\tInclude map[string]interface{}\n\t}\n\n\tvar include includeCfg\n\t_, err = toml.Decode(cfg, &include)\n\tif err != nil {\n\t\treturn cfg, err\n\t}\n\tfor inc := range include.Include {\n\t\tincCfg, err := readConfig(inc, cfgpath)\n\t\tif err != nil {\n\t\t\treturn cfg, err\n\t\t}\n\t\tcfg = cfg + incCfg\n\t}\n\treturn cfg, nil\n}\n\ntype arrayFlags []string\n\nfunc (af *arrayFlags) String() string {\n\treturn strings.Join(*af, \" \")\n}\n\nfunc (af *arrayFlags) Set(value string) error {\n\t*af = append(*af, value)\n\treturn nil\n}\n\nfunc main() {\n\tvar cfgpath arrayFlags\n\tvar in = flag.String(\"in\", \"\", \"filesystem image file or path to directory\")\n\tvar out = flag.String(\"out\", \"-\", \"output to file (use - for stdout)\")\n\tvar extra = flag.String(\"extra\", \"\", \"overwrite directory to read extra data from (filetree, cmpfile, ...)\")\n\tvar cfg = flag.String(\"cfg\", \"\", \"config file\")\n\tflag.Var(&cfgpath, \"cfgpath\", \"path to config file and included files (can be repeated)\")\n\tvar errorExit = flag.Bool(\"ee\", false, \"exit with error if offenders are present\")\n\tvar invertMatch = flag.Bool(\"invertMatch\", false, \"invert RegEx Match\")\n\tflag.Parse()\n\n\tif *in == \"\" || *cfg == \"\" {\n\t\tfmt.Fprintf(os.Stderr, \"Usage of %s:\\n\", os.Args[0])\n\t\tflag.PrintDefaults()\n\t\tos.Exit(1)\n\t}\n\n\tcfgdata, err := readConfig(*cfg, cfgpath)\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"Could not read config file: %s, error: %s\\n\", *cfg, err)\n\t\tos.Exit(1)\n\t}\n\n\t// if no alternative extra data directory is given use the directory \"config filepath\"\n\tif *extra == \"\" {\n\t\t*extra = path.Dir(*cfg)\n\t}\n\n\tanalyzer := analyzer.NewFromConfig(*in, string(cfgdata))\n\n\tsupported, msg := analyzer.FsTypeSupported()\n\tif !supported {\n\t\tfmt.Fprintf(os.Stderr, \"%s\\n\", msg)\n\t\tos.Exit(1)\n\t}\n\n\tanalyzer.AddAnalyzerPlugin(globalfilechecks.New(string(cfgdata), 
analyzer))\n\tanalyzer.AddAnalyzerPlugin(filecontent.New(string(cfgdata), analyzer, *invertMatch))\n\tanalyzer.AddAnalyzerPlugin(filecmp.New(string(cfgdata), analyzer, *extra))\n\tanalyzer.AddAnalyzerPlugin(dataextract.New(string(cfgdata), analyzer))\n\tanalyzer.AddAnalyzerPlugin(dircontent.New(string(cfgdata), analyzer))\n\tanalyzer.AddAnalyzerPlugin(filestatcheck.New(string(cfgdata), analyzer))\n\tanalyzer.AddAnalyzerPlugin(filepathowner.New(string(cfgdata), analyzer))\n\tanalyzer.AddAnalyzerPlugin(filetree.New(string(cfgdata), analyzer, *extra))\n\n\tanalyzer.RunPlugins()\n\n\treport := analyzer.JsonReport()\n\tif *out == \"\" {\n\t\tfmt.Fprintln(os.Stderr, \"Use '-' for stdout or provide a filename.\")\n\t} else if *out == \"-\" {\n\t\tfmt.Println(report)\n\t} else {\n\t\terr := ioutil.WriteFile(*out, []byte(report), 0644)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"Can't write report to: %s, error: %s\\n\", *out, err)\n\t\t}\n\t}\n\n\t_ = analyzer.CleanUp()\n\n\t// signal offenders by providing a error exit code\n\tif *errorExit && analyzer.HasOffenders() {\n\t\tos.Exit(1)\n\t}\n}\n"
  },
  {
    "path": "cmd/fwanalyzer/fwanalyzer_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage main\n\nimport (\n\t\"io/ioutil\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n)\n\nfunc TestMain(t *testing.T) {\n\ttests := []struct {\n\t\tinclFile string\n\t\ttestFile string\n\t\tcontains []string\n\t}{\n\t\t{\n\t\t\t`\n[GlobalConfig]\nFsType=\"dirfs\"\n# we can have comments\n`,\n\t\t\t\"/tmp/fwa_test_cfg_file.1\",\n\t\t\t[]string{\"GlobalConfig\"},\n\t\t},\n\t\t{\n\t\t\t`\n[Include.\"/tmp/fwa_test_cfg_file.1\"]\n[Test]\na = \"a\"\n`,\n\t\t\t\"/tmp/fwa_test_cfg_file.2\",\n\t\t\t[]string{\"Test\"},\n\t\t},\n\n\t\t{\n\t\t\t`\n[Include.\"/tmp/fwa_test_cfg_file.2\"]\n`,\n\t\t\t\"/tmp/fwa_test_cfg_file.3\",\n\t\t\t[]string{\"Test\", \"GlobalConfig\"},\n\t\t},\n\t}\n\n\tfor _, test := range tests {\n\t\terr := ioutil.WriteFile(test.testFile, []byte(test.inclFile), 0644)\n\t\tif err != nil {\n\t\t\tt.Error(err)\n\t\t}\n\t\tcfg, err := readConfig(test.testFile, []string{})\n\t\tif err != nil {\n\t\t\tt.Error(err)\n\t\t}\n\t\tfor _, c := range test.contains {\n\t\t\tif !strings.Contains(cfg, c) {\n\t\t\t\tt.Errorf(\"include didn't work\")\n\t\t\t}\n\t\t}\n\t\t// this will panic if cfg contains an illegal config\n\t\tanalyzer.NewFromConfig(\"dummy\", cfg)\n\t}\n}\n"
  },
  {
    "path": "devices/Readme.md",
    "content": "# Devices\n\nThis directory contains support tools and popular checks that can be included in FwAnalyzer configs for multiple targets.\n\n- [Android](android)\n- [generic Linux](generic)\n\n## Check.py\n\ncheck.py is a universal script to run FwAnalyzer. It will unpack (with the help of a unpacker; see below) firmware\nand run fwanalyzer against each of the target filesystems, it will combine all of the reports\ninto one big report. In addition it will do some post processing of the filetree files (if present) and\nappend the result to the report.\n\nUsing check.py is straight forward (the example below is for an Android OTA firmware - make sure you have the required Android unpacking tools installed and added to your PATH, see: [Android](android/Readme.md)):\n\n```sh\ncheck.py --unpacker android/unpack.sh --fw some_device_ota.zip --cfg-path android --cfg-include android --fwanalyzer-bin ../build/fwanalyzer\n```\n\nThe full set of options is described below:\n```\nusage: check.py [-h] --fw FW --unpacker UNPACKER --cfg-path CFG_PATH\n                [--cfg-include-path CFG_INCLUDE_PATH] [--report REPORT]\n                [--keep-unpacked] [--fwanalyzer-bin FWANALYZER_BIN]\n                [--fwanalyzer-options FWANALYZER_OPTIONS]\n\noptional arguments:\n  -h, --help            show this help message and exit\n  --fw FW               path to firmware file OR path to unpacked firmware\n  --unpacker UNPACKER   path to unpacking script\n  --cfg-path CFG_PATH   path to directory containing config files\n  --cfg-include-path CFG_INCLUDE_PATH\n                        path to config include files\n  --report REPORT       report file\n  --keep-unpacked       keep unpacked data\n  --fwanalyzer-bin FWANALYZER_BIN\n                        path to fwanalyzer binary\n  --fwanalyzer-options FWANALYZER_OPTIONS\n                        options passed to fwanalyzer\n```\n\nThe _--keep-unpacked_ option will NOT delete the temp directory that contains the unpacked 
files.\nOnce you have the unpacked directory you can pass it to the _--fw_ option to avoid unpacking the\nfirmware for each run (e.g. while you test/modify your configuration files). See the example below.\n\n```sh\ncheck.py --unpacker android/unpack.sh --fw /tmp/tmp987689123 --cfg-path android --cfg-include-path android --fwanalyzer-bin ../build/fwanalyzer\n```\n\n### unpacker\n\nThe unpacker is used by check.py to _unpack_ firmware.\nThe unpacker needs to be an executable file that takes two parameters: first the `file` to unpack,\nand second the `path to the config files` (the path that was provided via --cfg-path).\n\nThe unpacker needs to output a set of targets; the targets map a config file to a filesystem image (or directory).\nThe targets are specified as a JSON object.\n\nThe example below specifies two targets:\n\n- system : use _system.toml_ when analyzing _system.img_\n- boot: use _boot.toml_ when analyzing the content of directory _boot/_\n\n```json\n{ \"system\": \"system.img\" , \"boot\": \"boot/\" }\n```\n\nSee [Android/unpack.sh](android/unpack.sh) for a real-world example.\n"
  },
  {
    "path": "devices/android/Readme.md",
    "content": "# Android OTA Firmware Analysis\n\nThe OTA file is a zip file with various files inside, the one file we care about is _payload.bin_.\nPayload.bin contains the filesystem images such as _system.img_ and _boot.img_.\nThe `check_ota.py` script unpacks an OTA file and runs FwAnalyzer on every filesystem image extracted from the OTA file.\n\n## FwAnalyzer Config\n\nThe OTA check script requires separate FwAnalyzer configuration files for each filesystem image that is extracted from the OTA file.\nThe `check_ota.py` script expects a directory that contains FwAnalyzer config files with the same name as the filesystem image but\nthe toml extensions. For example the config file for _system.img_ needs to be named _[system.toml](system.toml)_.\n\nOTA images contain _system.img_, _vendor.img_, _dsp.img_, and _boot.img_.\nAll images besides the _boot.img_ are ext4 filesystems and therefore the config file needs to have `FsType` set to `extfs`.\nThe _boot.img_ will be unpacked to a directory (using the `mkboot` tool), therefore, the _boot.toml_ file needs to have `FsType` set to `dirfs`.\n\n### Android Checks\n\nThe files _[android_user_build_checks.toml](android_user_build_checks.toml)_\nand _[android_user_build_checks_boot.toml](android_user_build_checks_boot.toml)_\nare a collection of very simple checks for Android production builds (user builds).\nThe config file can be included in custom FwAnalyzer config using the `Include` statement.\n\nThe _[android_properties.toml](android_properties.toml)_ file is a collection of `DataExtract`\nstatements that will extract Android properties from various parts of an Android firmware image.\n\n## Running check\\_ota.py\n\nThe OTA check fails if FwAnalyzer reports an Offender in any of the filesystem images.\nThe reports generated by FwAnalyzer are written to _IMAGENAME_out.json_ (e.g. 
_system_out.json_).\n\n`check_ota.py` arguments:\n- `--ota`              string : path to ota file\n- `--report`           string : path to report file (will be overwritten)\n- `--cfg-path`         string : path to directory containing fwanalyzer config files\n- `--cfg-include-path` string : path to directory containing fwanalyzer config include files\n- `--fwanalyzer-bin`   string : path to fwanalyzer binary\n- `--keep-unpacked`           : keep unpacked data\n- `--targets`          string : filesystem targets (e.g.: system boot)\n\nExample:\n```sh\n$ ls\n system.toml\n\n$ check_ota.py --ota update-ota.zip --cfg-path . --cfg-include-path . --targets system\n```\n\n## Required tools\n- [extract android ota payload](https://github.com/cyxx/extract_android_ota_payload.git) to extract the filesystem images from an OTA update\n- [mkbootimg tools](https://github.com/xiaolu/mkbootimg_tools.git) to unpack boot.img and extract the kernel, initramfs, etc.\n"
  },
  {
    "path": "devices/android/android_properties.toml",
    "content": "\n# -- Android Properties --\n\n# - /system/etc/prop.default -\n\n[DataExtract.\"ro.debuggable__1\"]\nFile = \"/system/etc/prop.default\"\nRegEx = \".*\\\\nro\\\\.debuggable=(.+)\\\\n.*\"\n\n[DataExtract.\"ro.bootimage.build.fingerprint__1\"]\nFile = \"/system/etc/prop.default\"\nRegEx = \".*\\\\nro\\\\.bootimage\\\\.build\\\\.fingerprint=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.bootimage.build.date__1\"]\nFile = \"/system/etc/prop.default\"\nRegEx = \".*\\\\nro\\\\.bootimage\\\\.build\\\\.date=(.+)\\\\n.*\"\n\n# - /system/build.prop -\n\n[DataExtract.\"ro.build.type__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.type=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.tags__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.tags=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.flavor__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.flavor=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.id__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.id=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.security_patch__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.security_patch=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.incremental__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.incremental=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.product.name__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.product\\\\.name=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.product.device__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.product\\\\.device=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.codename__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.codename=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.release__1\"]\nFile = \"/system/build.prop\"\nRegEx = 
\".*\\\\nro\\\\.build\\\\.version\\\\.release=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.date__1\"]\nFile = \"/system/build.prop\"\nRegEx = \".*\\\\nro\\\\.build\\\\.date=(.+)\\\\n.*\"\n\n# - /boot_img/ramdisk/prop.default (from the boot image) -\n\n[DataExtract.\"ro.bootimage.build.fingerprint__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.bootimage\\\\.build\\\\.fingerprint=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.bootimage.build.date__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.bootimage\\\\.build\\\\.date=(.+)\\\\n.*\"\n\n[DataExtract.\"ro.build.type__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.type=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.tags__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.tags=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.flavor__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.flavor=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.id__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.id=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.security_patch__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.security_patch=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.incremental__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.incremental=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.product.name__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.product\\\\.name=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.product.device__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.product\\\\.device=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.codename__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = 
\".*\\\\nro\\\\.build\\\\.version\\\\.codename=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.version.release__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.version\\\\.release=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"ro.build.date__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.build\\\\.date=(.+)\\\\n.*\"\n\n[DataExtract.\"ro.debuggable__2\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegEx = \".*\\\\nro\\\\.debuggable=(.+)\\\\n.*\"\n\n# -- Android Boot Partition Info --\n\n[DataExtract.\"androidboot.selinux__1\"]\nFile = \"/boot_img/img_info\"\nRegEx = \".*androidboot.selinux=(\\\\S+)\\\\s.*\"\n\n[DataExtract.\"buildvariant__1\"]\nFile = \"/boot_img/img_info\"\nRegEx = \".*buildvariant=(\\\\S+)\\\\s.*\"\n\n[DataExtract.\"veritykeyid__1\"]\nFile = \"/boot_img/img_info\"\nRegEx = \".*veritykeyid=id:(\\\\w+).*\"\n"
  },
  {
    "path": "devices/android/android_user_build_checks.toml",
    "content": "# -- Android user build checks --\n# \n# Basic checks for a production build.\n# Checks cover: system.img\n\n[FileContent.\"ro.build=user\"]\nFile = \"/system/build.prop\"\nRegex = \".*\\\\nro\\\\.build\\\\.type=user\\n.*\"\nDesc = \"ro.build.type must be user\"\n\n[FileContent.\"ro.secure=1\"]\nFile = \"/system/etc/prop.default\"\nRegex = \".*\\\\nro\\\\.secure=1.*\"\nDesc = \"ro.secure must be 1\"\n\n[FileContent.\"ro.debuggable=0\"]\nFile = \"/system/etc/prop.default\"\nRegex = \".*\\\\nro\\\\.debuggable=0.*\"\nDesc = \"ro.debuggable must be 0\"\n"
  },
  {
    "path": "devices/android/android_user_build_checks_boot.toml",
    "content": "# -- Android user build checks --\n# \n# Basic checks for a production build\n# checks cover: boot.img\n\n[FileContent.\"selinux enforcement\"]\nFile = \"/boot_img/img_info\"\nRegex = \".*androidboot.selinux=enforcing.*\"\nDesc = \"selinux must be set to enforcing\"\n\n[FileContent.\"buildvariant must be user\"]\nFile = \"/boot_img/img_info\"\nRegex = \".*buildvariant=user.*\"\nDesc = \"build variant must be 'user'\"\n\n[FileContent.\"veritykeyid should make sense\"]\nFile = \"/boot_img/img_info\"\nRegex = \".*veritykeyid=id:[[:alnum:]]+.*\"\nDesc = \"veritykeyid must be present\"\n\n[FileContent.\"ro.secure=1 (ramdisk)\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegex = \".*\\\\nro.secure=1\\\\n.*\"\nDesc = \"ro.secure must be 1\"\n\n[FileContent.\"ro.debuggable=0 (ramdisk)\"]\nFile = \"/boot_img/ramdisk/prop.default\"\nRegex = \".*\\\\nro.debuggable=0\\\\n.*\"\nDesc = \"ro.debuggable must be 0\"\n"
  },
  {
    "path": "devices/android/check_ota.py",
    "content": "#!/usr/bin/env python3\n\n# Copyright 2019-present, Cruise LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\nimport json\nimport tempfile\nimport os\nimport os.path\nimport sys\nimport argparse\nimport subprocess\nimport hashlib\n\n\nclass CheckOTA:\n    def __init__(self, fwanalyzer=\"fwanalyzer\"):\n        self._tmpdir = tempfile.mktemp()\n        self._unpackdir = os.path.join(self._tmpdir, \"unpacked\")\n        self._fwanalyzer = fwanalyzer\n\n    def getTmpDir(self):\n        return self._tmpdir\n\n    def setUnpacked(self, unpacked):\n        self._tmpdir = os.path.realpath(unpacked + \"/..\")\n        self._unpackdir = os.path.realpath(unpacked)\n\n    def runFwAnalyzeFs(self, img, cfg, cfginc, out):\n        cfginclude = \"\"\n        if cfginc:\n            cfginclude = \" -cfgpath \" + cfginc\n        cmd = self._fwanalyzer + \" -in \" + img + cfginclude + \" -cfg \" + cfg + \" -out \" + out\n        subprocess.check_call(cmd, shell=True)\n\n    def unpack(self, otafile, otaunpacker, mkboot):\n        # customize based on firmware\n        #\n        # create tmp + unpackdir\n        cmd = \"mkdir -p \" + self._unpackdir\n        subprocess.check_call(cmd, shell=True)\n        cmd = \"unzip \" + otafile\n        subprocess.check_call(cmd, shell=True, cwd=self._unpackdir)\n        # unpack payload\n        cmd = otaunpacker + \" payload.bin\"\n        subprocess.check_call(cmd, shell=True, cwd=self._unpackdir)\n        # unpack 
boot.img\n        cmd = mkboot + \" boot.img boot_img\"\n        subprocess.check_call(cmd, shell=True, cwd=self._unpackdir)\n\n    def delTmpDir(self):\n        cmd = \"rm -rf \" + self._tmpdir\n        subprocess.check_call(cmd, shell=True)\n\n    # check result json\n    def checkResult(self, result):\n        with open(result) as read_file:\n            data = json.load(read_file)\n\n        if \"offenders\" in data:\n            status = False\n        else:\n            status = True\n\n        return (status, json.dumps(data, sort_keys=True, indent=2))\n\n\ndef getCfg(name):\n    return name + \".toml\"\n\n\ndef getOut(name):\n    return name + \"_out.json\"\n\n\ndef getImg(name):\n    if name == \"boot\":\n        return \"unpacked/\"\n    return \"unpacked/\" + name + \".img\"\n\n\ndef hashfile(fpath):\n    m = hashlib.sha256()\n    with open(fpath, 'rb') as f:\n        while True:\n            data = f.read(65535)\n            if not data:\n                break\n            m.update(data)\n    return m.hexdigest()\n\n\ndef makeReport(ota, data):\n    report = {}\n    report[\"firmware\"] = ota\n    status = True\n    for key in data:\n        # use the passed-in data, not the global 'out'\n        s, r = data[key]\n        if not s:\n            status = s\n        report[key] = json.loads(r)\n\n    report[\"firmware_digest\"] = hashfile(ota)\n    report[\"status\"] = status\n    return json.dumps(report, sort_keys=True, indent=2)\n\n\nif __name__ == \"__main__\":\n    parser = argparse.ArgumentParser()\n    parser.add_argument('--ota', action='store', required=True, help=\"path to ota file\")\n    parser.add_argument('--cfg-path', action='store', required=True, help=\"path to directory containing config files\")\n    parser.add_argument('--cfg-include-path', action='store', help=\"path to config include files\")\n    parser.add_argument('--report', action='store', help=\"report file\")\n    parser.add_argument('--keep-unpacked', action='store_true', help=\"keep unpacked data\")\n    
parser.add_argument('--targets', nargs='+', action='store', help=\"image targets e.g.: system vendor boot\")\n    parser.add_argument('--fwanalyzer-bin', action='store', default=\"fwanalyzer\", help=\"path to fwanalyzer binary\")\n    args = parser.parse_args()\n\n    # target file system images, a fwanalyzer config file is required for each of those\n    targets = [\"system\", \"vendor\", \"dsp\", \"boot\"]\n\n    # use target list from cmdline\n    if args.targets:\n        targets = args.targets\n\n    out = {}\n\n    for tgt in targets:\n        if not os.path.isfile(os.path.join(args.cfg_path, getCfg(tgt))):\n            print(\"OTA Check skipped, config file does not exist\")\n            sys.exit(0)\n\n    ota = os.path.realpath(args.ota)\n    cfg = os.path.realpath(args.cfg_path)\n    otaunpacker = \"extract_android_ota_payload.py\"\n    bootunpacker = \"mkboot\"\n\n    check = CheckOTA(args.fwanalyzer_bin)\n    if not ota.endswith(\"unpacked\"):\n        check.unpack(ota, otaunpacker, bootunpacker)\n    else:\n        check.setUnpacked(ota)\n        args.keep_unpacked = True\n        print(\"already unpacked\")\n\n    all_checks_ok = True\n    for tgt in targets:\n        check.runFwAnalyzeFs(os.path.join(check.getTmpDir(), getImg(tgt)),\n                             os.path.join(cfg, getCfg(tgt)), args.cfg_include_path, getOut(tgt))\n        ok, data = check.checkResult(getOut(tgt))\n        out[tgt] = ok, data\n        if not ok:\n            all_checks_ok = False\n\n    if args.keep_unpacked:\n        print(\"unpacked: {0}\\n\".format(check.getTmpDir()))\n    else:\n        check.delTmpDir()\n\n    report = makeReport(args.ota, out)\n    if args.report != None:\n        fp = open(args.report, \"w+\")\n        fp.write(report)\n        fp.close()\n        print(\"report written to: \" + args.report)\n\n    if not all_checks_ok:\n        print(report)\n        print(\"OTA Check Failed\")\n        sys.exit(1)\n    else:\n        print(\"OTA Check 
Success\")\n        sys.exit(0)\n"
  },
  {
    "path": "devices/android/system.toml",
    "content": "# -- Basic Config for Android's system.img --\n\n[GlobalConfig]\nFsType = \"extfs\"\n# enable SeLinux\nFsTypeOptions = \"selinux\"\nDigestImage = true\n\n[GlobalFileChecks]\nSuid = true\n# run-as is a common suid binary\nSuidAllowedList = [\"/system/bin/runs-as\"]\n# enable SeLinux checks\nSeLinuxLabel = true\n# system is mounted read-only\nWorldWrite = false\n# UIDs and GIDs need to be adjusted for each device\nUids = [0,1000,1003,1028,1036,2000]\nGids = [0,1000,1003,1028,1036,2000]\nBadFiles = [ \"/system/xbin/su\" ]\n\n[FileTreeCheck]\nOldTreeFilePath = \"system_filetree.json\"\nCheckPermsOwnerChange = true\nCheckFileSize = false\n\n[FilePathOwner.\"/system/etc\"]\nUid = 0\nGid = 0\n\n[Include.\"android_user_build_checks.toml\"]\n[Include.\"android_properties.toml\"]\n"
  },
  {
    "path": "devices/android/unpack.sh",
    "content": "#!/bin/sh\n\n# -- unpack android OTA --\n\nif [ -z \"$1\" ]; then\n    echo \"syntax: $0 <android_ota.zip>\"\n    exit 1\nfi\nOTAFILE=$1\n\n# tmpdir should contained 'unpacked' as last path element\nTMPDIR=$(pwd)\nif [ \"$(basename $TMPDIR)\" != \"unpacked\" ]; then\n    echo \"run script in directory named 'unpacked'\"\n    exit 1\nfi\n\n# unpack\nunzip $OTAFILE >../unpack.log 2>&1\nextract_android_ota_payload.py payload.bin >>../unpack.log 2>&1\nmkboot boot.img boot_img >>../unpack.log 2>&1\n\n# output targets, targets are consumed by check.py\n# key = name of fwanalyzer config file without extension\n#   e.g. 'system' => will look for 'system.toml'\n# value = path to filesystem image (or directory)\n\n# analyze system.img using system.toml\necho -n '{ \"system\": \"unpacked/system.img\" }'\n"
  },
  {
    "path": "devices/check.py",
    "content": "#!/usr/bin/env python3\n\n# Copyright 2020-present, Cruise LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport hashlib\nimport json\nimport os\nimport sys\nimport subprocess\nimport tempfile\n\nclass CheckFirmware:\n    def __init__(self, fwanalyzer=\"fwanalyzer\"):\n        self._tmpdir = \"\"\n        self._unpackdir = \"\"\n        self._fwanalyzer = fwanalyzer\n        self._unpacked = False\n\n    def get_tmp_dir(self):\n        return self._tmpdir\n\n    def run_fwanalyzer_fs(self, img, cfg, cfginc, out, options=\"\"):\n        cfginclude = []\n        if cfginc:\n            cfginclude = [\"-cfgpath\", cfginc]\n        cmd = [self._fwanalyzer, \"-in\", img, *cfginclude, \"-cfg\", cfg, \"-out\", out, options]\n        return subprocess.check_call(cmd)\n\n    def unpack(self, fwfile, unpacker, cfgpath):\n        TARGETS_FILE = \"targets.json\"\n        try:\n            if os.path.exists(os.path.join(fwfile, \"unpacked\")) and os.path.exists(os.path.join(fwfile, TARGETS_FILE)):\n                self._tmpdir = fwfile\n                self._unpackdir = os.path.join(self._tmpdir, \"unpacked\")\n                print(\"{0}: is a directory containing an 'unpacked' path, skipping\".format(fwfile))\n                cmd = [\"cat\", os.path.join(fwfile, TARGETS_FILE)]\n                self._unpacked = True\n            else:\n                self._tmpdir = tempfile.mkdtemp()\n                self._unpackdir = 
os.path.join(self._tmpdir, \"unpacked\")\n                os.mkdir(self._unpackdir)\n                cmd = [unpacker, fwfile, cfgpath]\n            res = subprocess.check_output(cmd, cwd=self._unpackdir)\n            targets = json.loads(res.decode(\"utf-8\"))\n            with open(os.path.join(self._tmpdir, TARGETS_FILE), \"w\") as f:\n                f.write(res.decode(\"utf-8\"))\n            return targets\n        except Exception as e:\n            print(\"Exception: {0}\".format(e))\n            print(\"can't load targets from output of '{0}' check your script\".format(unpacker))\n            return None\n\n    def del_tmp_dir(self):\n        if not self._unpacked:\n            return subprocess.check_call([\"rm\", \"-rf\", self._tmpdir])\n\n    def files_by_ext_stat(self, data):\n        allext = {}\n        for file in data[\"files\"]:\n            fn, ext = os.path.splitext(file[\"name\"])\n            if ext in allext:\n                count, ext = allext[ext]\n                allext[ext] = count + 1, ext\n            else:\n                allext[ext] = (1, ext)\n        return len(data[\"files\"]), allext\n\n    def analyze_filetree(self, filetreefile):\n        with open(filetreefile) as f:\n            data = json.load(f)\n        num_files, stats = self.files_by_ext_stat(data)\n        out = {}\n        percent = num_files / 100\n        # only keep entries with count > 1% and files that have an extension\n        for i in stats:\n            count, ext = stats[i]\n            if count > percent and ext != \"\":\n                out[ext] = count, ext\n\n        return {\n            \"total_files\": num_files,\n            \"file_extension_stats_inclusion_if_more_than\": percent,\n            \"file_extension_stats\": sorted(out.values(), reverse=True)\n        }\n\n    # check result and run post analysis\n    def check_result(self, result):\n        with open(result) as read_file:\n            data = json.load(read_file)\n\n        if 
\"offenders\" in data:\n            status = False\n        else:\n            status = True\n\n        CURRENT_FILE_TREE = \"current_file_tree_path\"\n\n        if CURRENT_FILE_TREE in data:\n            if os.path.isfile(data[CURRENT_FILE_TREE]):\n                data[\"file_tree_analysis\"] = self.analyze_filetree(data[CURRENT_FILE_TREE])\n\n        return (status, json.dumps(data, sort_keys=True, indent=2))\n\n\ndef hashfile(fpath):\n    m = hashlib.sha256()\n    with open(fpath, \"rb\") as f:\n        while True:\n            data = f.read(65535)\n            if not data:\n                break\n            m.update(data)\n    return m.hexdigest()\n\n\ndef make_report(fwfile, data):\n    \"\"\"Return a json report built from image reports.\"\"\"\n    report = {}\n    status = True\n    for key in data:\n        img_status, img_report  = out[key]\n        if status != False:\n            status = img_status\n        report[key] = json.loads(img_report)\n    report[\"firmware\"] = fwfile\n    if os.path.isfile(fwfile):\n        report[\"firmware_digest\"] = hashfile(fwfile)\n    report[\"status\"] = status\n    return json.dumps(report, sort_keys=True, indent=2)\n\n\nif __name__ == \"__main__\":\n    parser = argparse.ArgumentParser()\n    parser.add_argument(\"--fw\", action=\"store\", required=True, help=\"path to firmware file OR path to unpacked firmware\")\n    parser.add_argument(\"--unpacker\", action=\"store\", required=True, help=\"path to unpacking script\")\n    parser.add_argument(\"--cfg-path\", action=\"store\", required=True, help=\"path to directory containing config files\")\n    parser.add_argument(\"--cfg-include-path\", action=\"store\", help=\"path to config include files\")\n    parser.add_argument(\"--report\", action=\"store\", help=\"report file\")\n    parser.add_argument(\"--keep-unpacked\", action=\"store_true\", help=\"keep unpacked data\")\n    parser.add_argument(\"--fwanalyzer-bin\", action=\"store\", default=\"fwanalyzer\", 
help=\"path to fwanalyzer binary\")\n    parser.add_argument(\"--fwanalyzer-options\", action=\"store\", default=\"\", help=\"options passed to fwanalyzer\")\n    args = parser.parse_args()\n\n    fw = os.path.realpath(args.fw)\n    cfg = os.path.realpath(args.cfg_path)\n\n    check = CheckFirmware(args.fwanalyzer_bin)\n    targets = check.unpack(fw, os.path.realpath(args.unpacker), cfg)\n    print(\"using tmp directory: {0}\".format(check.get_tmp_dir()))\n    if not targets:\n        print(\"no targets defined\")\n        sys.exit(1)\n\n    # target file system images, a fwanalyzer config file is required for each of those\n    for tgt in targets:\n        cfg_file_name = \"{0}.toml\".format(tgt)\n        if not os.path.isfile(os.path.join(args.cfg_path, cfg_file_name)):\n            print(\"skipped, config file '{0}' for '{1}' does not exist\\n\".format(\n                os.path.join(args.cfg_path, cfg_file_name), targets[tgt]))\n            sys.exit(0)\n        else:\n            print(\"using config file '{0}' for '{1}'\".format(\n                os.path.join(args.cfg_path, cfg_file_name), targets[tgt]))\n\n    out = {}\n    all_checks_ok = True\n    for tgt in targets:\n        cfg_file_name = \"{0}.toml\".format(tgt)\n        out_file_name = \"{0}_out.json\".format(tgt)\n        check.run_fwanalyzer_fs(os.path.join(check.get_tmp_dir(), targets[tgt]),\n            os.path.join(cfg, cfg_file_name), args.cfg_include_path, out_file_name,\n            options=args.fwanalyzer_options)\n        ok, data = check.check_result(out_file_name)\n        out[tgt] = ok, data\n        if not ok:\n            all_checks_ok = False\n\n    if args.keep_unpacked:\n        print(\"unpacked: {0}\\n\".format(check.get_tmp_dir()))\n    else:\n        check.del_tmp_dir()\n\n    report = make_report(args.fw, out)\n    if args.report != None:\n        with open(args.report, \"w+\") as f:\n            f.write(report)\n        print(\"report written to '{0}'\".format(args.report))\n    
else:\n        print(report)\n\n    if not all_checks_ok:\n        print(\"Firmware Analysis: checks failed\")\n        sys.exit(1)\n    else:\n        print(\"Firmware Analysis: checks passed\")\n        sys.exit(0)\n"
  },
  {
    "path": "devices/generic/Readme.md",
    "content": "# Generic Linux Devices\n\nThe [root.toml](root.toml) provides a basic FwAnalyzer configuration for a generic Linux root filesystem.\n"
  },
  {
    "path": "devices/generic/root.toml",
    "content": "# -- Basic Config for a generic Linux device --\n\n[GlobalConfig]\nFsType = \"extfs\"\nDigestImage = true\n\n[GlobalFileChecks]\nSuid = true\nSuidAllowedList = []\n# disable SELinux checks\nSeLinuxLabel = false\n# flag world writable files\nWorldWrite = true\n# UIDs and GIDs need to be adjusted for each device\nUids = [0]\nGids = [0]\n# files we do not want in the filesystem\nBadFiles = [ \"/usr/sbin/sshd\", \"/usr/sbin/tcpdump\" ]\n\n[FileTreeCheck]\nOldTreeFilePath = \"root_filetree.json\"\nCheckPermsOwnerChange = true\n\n# -- root should own all binaries --\n\n[FilePathOwner.\"/bin\"]\nUid = 0\nGid = 0\n\n[FilePathOwner.\"/sbin\"]\nUid = 0\nGid = 0\n\n[FilePathOwner.\"/usr/bin\"]\nUid = 0\nGid = 0\n\n[FilePathOwner.\"/usr/sbin\"]\nUid = 0\nGid = 0\n\n# -- check that elf files are stripped --\n\n[FileContent.bins_stripped]\nFile = \"/\"\nScript = \"check_file_elf_stripped.sh\"\nDesc = \"elf file not stripped\"\n\n# -- check mount flags --\n# Note: adjust the device and mount point, example uses: /dev/sda1 at /mnt\n\n[FileContent.\"mount_flag_noexec\"]\nFile = \"/etc/fstab\"\nRegEx = \".*\\\\n/dev/sda1[\\\\t ]+/mnt[\\\\t ]+ext4[\\\\t a-z,]+noexec.*\\\\n.*\"\nDesc = \"sda1 should be mounted noexec\"\n\n[FileContent.\"mount_flag_ro\"]\nFile = \"/etc/fstab\"\nRegEx = \".*\\\\n/dev/sda1[\\\\t ]+/mnt[\\\\t ]+ext4[\\\\t a-z,]+ro.*\\\\n.*\"\nDesc = \"sda1 should be mounted ro\"\n\n[FileContent.\"mount_flag_nodev\"]\nFile = \"/etc/fstab\"\nRegEx = \".*\\\\n/dev/sda1[\\\\t ]+/mnt[\\\\t ]+ext4[\\\\t a-z,]+nodev.*\\\\n.*\"\nDesc = \"sda1 should be mounted nodev\"\n\n[FileContent.\"mount_flag_nosuid\"]\nFile = \"/etc/fstab\"\nRegEx = \".*\\\\n/dev/sda1[\\\\t ]+/mnt[\\\\t ]+vfat[ \\\\ta-z,]+nosuid.*\\\\n.*\"\nDesc = \"sda1 should be mounted nosuid\"\n"
  },
  {
    "path": "docker-compose.yml",
    "content": "version: \"3\"\nservices:\n  fwanalyzer:\n    build: .\n    working_dir: /go/src/github.com/cruise-automation/fwanalyzer\n    volumes:\n      - .:/go/src/github.com/cruise-automation/fwanalyzer\n"
  },
  {
    "path": "go.mod",
    "content": "module github.com/cruise-automation/fwanalyzer\n\ngo 1.13\n\nrequire (\n\tgithub.com/BurntSushi/toml v0.3.1\n\tgithub.com/bmatcuk/doublestar v1.1.4\n\tgithub.com/google/go-cmp v0.2.0\n)\n"
  },
  {
    "path": "go.sum",
    "content": "github.com/BurntSushi/toml v0.3.1 h1:WXkYYl6Yr3qBf1K79EBnL4mak0OimBfB0XUf9Vl28OQ=\ngithub.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=\ngithub.com/bmatcuk/doublestar v1.1.4 h1:OiC5vFUceSTlgPeJdxVJGNIXTLxCBVPO7ozqJjXbE9M=\ngithub.com/bmatcuk/doublestar v1.1.4/go.mod h1:wiQtGV+rzVYxB7WIlirSN++5HPtPlXEo9MEoZQC/PmE=\ngithub.com/google/go-cmp v0.2.0 h1:+dTQ8DZQJz0Mb/HjFlkptS1FeQ4cWSnN941F8aEG4SQ=\ngithub.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=\n"
  },
  {
    "path": "pkg/analyzer/analyzer.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage analyzer\n\nimport (\n\t\"bytes\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"errors\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"path\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/cpioparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/dirparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/extparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/squashfsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/ubifsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/util\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/vfatparser\"\n)\n\ntype AnalyzerPluginType interface {\n\tName() string\n\tStart()\n\tFinalize() string\n\tCheckFile(fi *fsparser.FileInfo, path string) error\n}\n\ntype AnalyzerType interface {\n\tGetFileInfo(filepath string) (fsparser.FileInfo, error)\n\tRemoveFile(filepath string) error\n\tFileGetSha256(filepath string) ([]byte, error)\n\tFileGet(filepath string) (string, error)\n\tAddOffender(filepath string, reason string)\n\tAddInformational(filepath string, reason string)\n\tCheckAllFilesWithPath(cb AllFilesCallback, cbdata AllFilesCallbackData, filepath string)\n\tAddData(key, value string)\n\tImageInfo() AnalyzerReport\n}\n\ntype AllFilesCallbackData interface{}\ntype AllFilesCallback func(fi *fsparser.FileInfo, 
fullpath string, data AllFilesCallbackData)\n\ntype globalConfigType struct {\n\tFSType        string\n\tFSTypeOptions string\n\tDigestImage   bool\n}\n\ntype AnalyzerReport struct {\n\tFSType        string                   `json:\"fs_type\"`\n\tImageName     string                   `json:\"image_name\"`\n\tImageDigest   string                   `json:\"image_digest,omitempty\"`\n\tData          map[string]interface{}   `json:\"data,omitempty\"`\n\tOffenders     map[string][]interface{} `json:\"offenders,omitempty\"`\n\tInformational map[string][]interface{} `json:\"informational,omitempty\"`\n}\n\ntype Analyzer struct {\n\tfsparser      fsparser.FsParser\n\ttmpdir        string\n\tconfig        globalConfigType\n\tanalyzers     []AnalyzerPluginType\n\tPluginReports map[string]interface{}\n\tAnalyzerReport\n}\n\nfunc New(fsp fsparser.FsParser, cfg globalConfigType) *Analyzer {\n\tvar a Analyzer\n\ta.config = cfg\n\ta.fsparser = fsp\n\ta.FSType = cfg.FSType\n\ta.ImageName = fsp.ImageName()\n\ta.tmpdir, _ = util.MkTmpDir(\"analyzer\")\n\ta.Offenders = make(map[string][]interface{})\n\ta.Informational = make(map[string][]interface{})\n\ta.Data = make(map[string]interface{})\n\ta.PluginReports = make(map[string]interface{})\n\n\tif cfg.DigestImage {\n\t\ta.ImageDigest = hex.EncodeToString(util.DigestFileSha256(a.ImageName))\n\t}\n\n\treturn &a\n}\n\nfunc NewFromConfig(imagepath string, cfgdata string) *Analyzer {\n\ttype globalconfig struct {\n\t\tGlobalConfig globalConfigType\n\t}\n\tvar config globalconfig\n\n\t_, err := toml.Decode(cfgdata, &config)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\tvar fsp fsparser.FsParser\n\t// Set the parser based on the FSType in the config\n\tif strings.EqualFold(config.GlobalConfig.FSType, \"extfs\") {\n\t\tfsp = extparser.New(imagepath,\n\t\t\tstrings.Contains(config.GlobalConfig.FSTypeOptions, \"selinux\"),\n\t\t\tstrings.Contains(config.GlobalConfig.FSTypeOptions, \"capabilities\"))\n\t} 
else if strings.EqualFold(config.GlobalConfig.FSType, \"dirfs\") {\n\t\tfsp = dirparser.New(imagepath)\n\t} else if strings.EqualFold(config.GlobalConfig.FSType, \"vfatfs\") {\n\t\tfsp = vfatparser.New(imagepath)\n\t} else if strings.EqualFold(config.GlobalConfig.FSType, \"squashfs\") {\n\t\tfsp = squashfsparser.New(imagepath,\n\t\t\tstrings.Contains(config.GlobalConfig.FSTypeOptions, \"securityinfo\"))\n\t} else if strings.EqualFold(config.GlobalConfig.FSType, \"ubifs\") {\n\t\tfsp = ubifsparser.New(imagepath)\n\t} else if strings.EqualFold(config.GlobalConfig.FSType, \"cpiofs\") {\n\t\tfsp = cpioparser.New(imagepath,\n\t\t\tstrings.Contains(config.GlobalConfig.FSTypeOptions, \"fixdirs\"))\n\t} else {\n\t\tpanic(\"Cannot find an appropriate parser: \" + config.GlobalConfig.FSType)\n\t}\n\n\treturn New(fsp, config.GlobalConfig)\n}\n\nfunc (a *Analyzer) FsTypeSupported() (bool, string) {\n\tif !a.fsparser.Supported() {\n\t\treturn false, a.config.FSType + \": requires additional tools, please refer to documentation.\"\n\t}\n\treturn true, \"\"\n}\n\nfunc (a *Analyzer) ImageInfo() AnalyzerReport {\n\t// only provide the meta information, don't include offenders and other report data\n\treturn AnalyzerReport{\n\t\tFSType:      a.FSType,\n\t\tImageName:   a.ImageName,\n\t\tImageDigest: a.ImageDigest,\n\t}\n}\n\nfunc (a *Analyzer) AddAnalyzerPlugin(aplug AnalyzerPluginType) {\n\ta.analyzers = append(a.analyzers, aplug)\n}\n\nfunc (a *Analyzer) iterateFiles(curpath string) error {\n\tdir, err := a.fsparser.GetDirInfo(curpath)\n\tif err != nil {\n\t\treturn err\n\t}\n\tcp := curpath\n\tfor _, fi := range dir {\n\t\tfor _, ap := range a.analyzers {\n\t\t\terr = ap.CheckFile(&fi, cp)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\tif fi.IsDir() {\n\t\t\terr = a.iterateFiles(path.Join(curpath, fi.Name))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc (a *Analyzer) checkRoot() error {\n\tfi, err := 
a.fsparser.GetFileInfo(\"/\")\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, ap := range a.analyzers {\n\t\terr = ap.CheckFile(&fi, \"/\")\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (a *Analyzer) addPluginReport(report string) {\n\tvar data map[string]interface{}\n\n\terr := json.Unmarshal([]byte(report), &data)\n\tif err != nil {\n\t\treturn\n\t}\n\tfor k := range data {\n\t\ta.PluginReports[k] = data[k]\n\t}\n}\n\nfunc (a *Analyzer) RunPlugins() {\n\tfor _, ap := range a.analyzers {\n\t\tap.Start()\n\t}\n\n\terr := a.checkRoot()\n\tif err != nil {\n\t\tpanic(\"RunPlugins error: \" + err.Error())\n\t}\n\n\terr = a.iterateFiles(\"/\")\n\tif err != nil {\n\t\tpanic(\"RunPlugins error: \" + err.Error())\n\t}\n\n\tfor _, ap := range a.analyzers {\n\t\tres := ap.Finalize()\n\t\ta.addPluginReport(res)\n\t}\n}\n\nfunc (a *Analyzer) CleanUp() error {\n\treturn os.RemoveAll(a.tmpdir)\n}\n\nfunc (a *Analyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn a.fsparser.GetFileInfo(filepath)\n}\n\nfunc (a *Analyzer) FileGet(filepath string) (string, error) {\n\ttmpfile, err := ioutil.TempFile(a.tmpdir, \"\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\ttmpname := tmpfile.Name()\n\ttmpfile.Close()\n\tif a.fsparser.CopyFile(filepath, tmpname) {\n\t\treturn tmpname, nil\n\t}\n\treturn \"\", errors.New(\"error copying file\")\n}\n\nfunc (a *Analyzer) FileGetSha256(filepath string) ([]byte, error) {\n\ttmpname, err := a.FileGet(filepath)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tdefer os.Remove(tmpname)\n\tdigest := util.DigestFileSha256(tmpname)\n\treturn digest, nil\n}\n\nfunc (a *Analyzer) RemoveFile(filepath string) error {\n\treturn os.Remove(filepath)\n}\n\nfunc (a *Analyzer) iterateAllDirs(curpath string, cb AllFilesCallback, cbdata AllFilesCallbackData) error {\n\tdir, err := a.fsparser.GetDirInfo(curpath)\n\tif err != nil {\n\t\treturn err\n\t}\n\tfor _, fi := range dir {\n\t\tcb(&fi, curpath, cbdata)\n\t\tif fi.IsDir() 
{\n\t\t\terr := a.iterateAllDirs(path.Join(curpath, fi.Name), cb, cbdata)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc (a *Analyzer) CheckAllFilesWithPath(cb AllFilesCallback, cbdata AllFilesCallbackData, filepath string) {\n\tif cb == nil {\n\t\treturn\n\t}\n\terr := a.iterateAllDirs(filepath, cb, cbdata)\n\tif err != nil {\n\t\tpanic(\"iterateAllDirs failed\")\n\t}\n}\n\nfunc (a *Analyzer) AddOffender(filepath string, reason string) {\n\tvar data map[string]interface{}\n\t// this is valid json?\n\tif err := json.Unmarshal([]byte(reason), &data); err == nil {\n\t\t// yes: store as json\n\t\ta.Offenders[filepath] = append(a.Offenders[filepath], json.RawMessage(reason))\n\t} else {\n\t\t// no: store as plain text\n\t\ta.Offenders[filepath] = append(a.Offenders[filepath], reason)\n\t}\n}\n\nfunc (a *Analyzer) AddInformational(filepath string, reason string) {\n\tvar data map[string]interface{}\n\t// this is valid json?\n\tif err := json.Unmarshal([]byte(reason), &data); err == nil {\n\t\t// yes: store as json\n\t\ta.Informational[filepath] = append(a.Informational[filepath], json.RawMessage(reason))\n\t} else {\n\t\t// no: store as plain text\n\t\ta.Informational[filepath] = append(a.Informational[filepath], reason)\n\t}\n}\n\nfunc (a *Analyzer) HasOffenders() bool {\n\treturn len(a.Offenders) > 0\n}\n\nfunc (a *Analyzer) AddData(key string, value string) {\n\t// this is a valid json object?\n\tvar data map[string]interface{}\n\tif err := json.Unmarshal([]byte(value), &data); err == nil {\n\t\t// yes: store as json\n\t\ta.Data[key] = json.RawMessage(value)\n\t\treturn\n\t}\n\n\t// this is valid json array?\n\tvar array []interface{}\n\tif err := json.Unmarshal([]byte(value), &array); err == nil {\n\t\t// yes: store as json\n\t\ta.Data[key] = json.RawMessage(value)\n\t} else {\n\t\t// no: store as plain text\n\t\ta.Data[key] = value\n\t}\n}\n\nfunc (a *Analyzer) addReportData(report []byte) ([]byte, error) {\n\tvar data 
map[string]interface{}\n\n\terr := json.Unmarshal(report, &data)\n\tif err != nil {\n\t\treturn report, err\n\t}\n\n\tfor k := range a.PluginReports {\n\t\tdata[k] = a.PluginReports[k]\n\t}\n\n\tjdata, err := json.Marshal(&data)\n\treturn jdata, err\n}\n\nfunc (a *Analyzer) JsonReport() string {\n\tar := AnalyzerReport{\n\t\tFSType:        a.FSType,\n\t\tOffenders:     a.Offenders,\n\t\tInformational: a.Informational,\n\t\tData:          a.Data,\n\t\tImageName:     a.ImageName,\n\t\tImageDigest:   a.ImageDigest,\n\t}\n\n\tjdata, _ := json.Marshal(ar)\n\tjdata, _ = a.addReportData(jdata)\n\n\t// make json look pretty\n\tvar prettyJson bytes.Buffer\n\t_ = json.Indent(&prettyJson, jdata, \"\", \"\\t\")\n\treturn prettyJson.String()\n}\n"
  },
  {
    "path": "pkg/analyzer/analyzer_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage analyzer\n\nimport (\n\t\"os\"\n\t\"testing\"\n)\n\nfunc TestBasic(t *testing.T) {\n\tcfg := `\n[GlobalConfig]\nFsType = \"dirfs\"\nDigestImage = false\n`\n\n\t// check tmp file test\n\tanalyzer := NewFromConfig(\"../../test/testdir\", cfg)\n\t_ = analyzer.CleanUp()\n\tif _, err := os.Stat(analyzer.tmpdir); !os.IsNotExist(err) {\n\t\tt.Errorf(\"tmpdir was not removed\")\n\t}\n\n\t// file test\n\tanalyzer = NewFromConfig(\"../../test/testdir\", cfg)\n\tfi, err := analyzer.GetFileInfo(\"/file1.txt\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif !fi.IsFile() {\n\t\tt.Errorf(\"GetFileInfo failed, should be regular file\")\n\t}\n\tif fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, not a dir\")\n\t}\n\tif fi.Name != \"file1.txt\" {\n\t\tt.Errorf(\"filename does not match\")\n\t}\n\n\t// directory test\n\tfi, err = analyzer.GetFileInfo(\"/dir1\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif fi.IsFile() {\n\t\tt.Errorf(\"GetFileInfo failed, not a file\")\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, should be a directory\")\n\t}\n\tif fi.Name != \"dir1\" {\n\t\tt.Errorf(\"filename does not match\")\n\t}\n\n\terr = analyzer.checkRoot()\n\tif err != nil {\n\t\tt.Errorf(\"checkroot failed with %s\", err)\n\t}\n\n\t_ = analyzer.CleanUp()\n}\n"
  },
  {
    "path": "pkg/analyzer/dataextract/dataextract.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dataextract\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io/ioutil\"\n\t\"os/exec\"\n\t\"path\"\n\t\"regexp\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/util\"\n)\n\ntype dataType struct {\n\tFile          string\n\tScript        string\n\tScriptOptions []string // options for script execution\n\tRegEx         string\n\tJson          string\n\tDesc          string\n\tName          string // the name can be set directly otherwise the key will be used\n}\n\ntype dataExtractType struct {\n\tconfig map[string][]dataType\n\ta      analyzer.AnalyzerType\n}\n\nfunc New(config string, a analyzer.AnalyzerType) *dataExtractType {\n\ttype dataExtractListType struct {\n\t\tDataExtract map[string]dataType\n\t}\n\tcfg := dataExtractType{a: a, config: make(map[string][]dataType)}\n\n\tvar dec dataExtractListType\n\t_, err := toml.Decode(config, &dec)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\t// convert name based map to filename based map with an array of dataType\n\tfor name, item := range dec.DataExtract {\n\t\tvar items []dataType\n\t\tif _, ok := cfg.config[item.File]; ok {\n\t\t\titems = cfg.config[item.File]\n\t\t}\n\t\tif item.Name == \"\" {\n\t\t\t// if 
the key ends with __[0-9], remove the suffix and use the rest as the name\n\t\t\tif name[len(name)-1] >= '0' && name[len(name)-1] <= '9' && strings.HasSuffix(name[:len(name)-1], \"__\") {\n\t\t\t\titem.Name = name[:len(name)-3]\n\t\t\t} else {\n\t\t\t\titem.Name = name\n\t\t\t}\n\t\t}\n\t\titem.File = path.Clean(item.File)\n\t\titems = append(items, item)\n\t\tcfg.config[item.File] = items\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *dataExtractType) Start() {}\nfunc (state *dataExtractType) Finalize() string {\n\treturn \"\"\n}\n\nfunc (state *dataExtractType) Name() string {\n\treturn \"DataExtract\"\n}\n\nfunc (state *dataExtractType) CheckFile(fi *fsparser.FileInfo, filepath string) error {\n\tif !fi.IsFile() {\n\t\treturn nil\n\t}\n\n\tfn := path.Join(filepath, fi.Name)\n\tif _, ok := state.config[fn]; !ok {\n\t\treturn nil\n\t}\n\n\titems := state.config[fn]\n\n\t// we record if the specific Name was already added with a non error value\n\tnameFilled := make(map[string]bool)\n\n\tfor _, item := range items {\n\t\t// Name already set?\n\t\tif _, ok := nameFilled[item.Name]; ok {\n\t\t\tcontinue\n\t\t}\n\n\t\tif fi.IsLink() {\n\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: file is Link (extract data from actual file): %s : %s\",\n\t\t\t\titem.Name, item.Desc))\n\t\t\tcontinue\n\t\t}\n\n\t\tif item.RegEx != \"\" {\n\t\t\treg, err := regexp.Compile(item.RegEx)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: regex compile error: %s : %s %s\",\n\t\t\t\t\titem.RegEx, item.Name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\ttmpfn, err := state.a.FileGet(fn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: file read error, file get: %s : %s : %s\",\n\t\t\t\t\terr, item.Name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tfdata, err := ioutil.ReadFile(tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: file read error, file read: %s 
: %s : %s\",\n\t\t\t\t\terr, item.Name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\t_ = state.a.RemoveFile(tmpfn)\n\t\t\tres := reg.FindAllStringSubmatch(string(fdata), -1)\n\t\t\tif len(res) < 1 {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: regex match error, regex: %s : %s : %s\",\n\t\t\t\t\titem.RegEx, item.Name, item.Desc))\n\t\t\t} else {\n\t\t\t\t// only one match\n\t\t\t\tif len(res) == 1 && len(res[0]) == 2 {\n\t\t\t\t\tstate.a.AddData(item.Name, res[0][1])\n\t\t\t\t\tnameFilled[item.Name] = true\n\t\t\t\t} else if len(res) > 1 {\n\t\t\t\t\t// multiple matches\n\t\t\t\t\tdata := []string{}\n\t\t\t\t\tfor _, i := range res {\n\t\t\t\t\t\tif len(i) == 2 {\n\t\t\t\t\t\t\tdata = append(data, i[1])\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\t// convert to JSON array \n\t\t\t\t\tjdata, _ := json.Marshal(data)\n\t\t\t\t\tstate.a.AddData(item.Name, string(jdata))\n\t\t\t\t\tnameFilled[item.Name] = true\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: regex match error : %s : %s\",\n\t\t\t\t\t\titem.Name, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\n\t\tif item.Script != \"\" {\n\t\t\tout, err := runScriptOnFile(state.a, item.Script, item.ScriptOptions, fi, fn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: script error: %s : %s : %s\",\n\t\t\t\t\terr, item.Name, item.Desc))\n\t\t\t} else {\n\t\t\t\tstate.a.AddData(item.Name, out)\n\t\t\t\tnameFilled[item.Name] = true\n\t\t\t}\n\t\t}\n\n\t\tif item.Json != \"\" {\n\t\t\ttmpfn, err := state.a.FileGet(fn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: file read error, file get: %s : %s : %s\",\n\t\t\t\t\terr, item.Name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tfdata, err := ioutil.ReadFile(tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: file read error, file read: %s : %s : %s\",\n\t\t\t\t\terr, item.Name, 
item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\t_ = state.a.RemoveFile(tmpfn)\n\n\t\t\tout, err := util.XtractJsonField(fdata, strings.Split(item.Json, \".\"))\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddData(item.Name, fmt.Sprintf(\"DataExtract ERROR: JSON decode error: %s : %s : %s\",\n\t\t\t\t\terr, item.Name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tstate.a.AddData(item.Name, out)\n\t\t\tnameFilled[item.Name] = true\n\t\t}\n\t}\n\treturn nil\n}\n\n// runScriptOnFile runs the provided script with the following parameters:\n// <filename> <filename in filesystem> <uid> <gid> <mode> <selinux label - can be empty> -- scriptOptions[0] scriptOptions[1]\nfunc runScriptOnFile(a analyzer.AnalyzerType, script string, scriptOptions []string, fi *fsparser.FileInfo, fpath string) (string, error) {\n\tfname, err := a.FileGet(fpath)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\toptions := []string{fname, fpath, fmt.Sprintf(\"%d\", fi.Uid), fmt.Sprintf(\"%d\", fi.Gid),\n\t\tfmt.Sprintf(\"%o\", fi.Mode), fi.SELinuxLabel}\n\tif len(scriptOptions) > 0 {\n\t\toptions = append(options, \"--\")\n\t\toptions = append(options, scriptOptions...)\n\t}\n\tout, err := exec.Command(script, options...).Output()\n\t_ = a.RemoveFile(fname)\n\n\treturn string(out), err\n}\n"
  },
  {
    "path": "pkg/analyzer/dataextract/dataextract_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dataextract\n\nimport (\n\t\"io/ioutil\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/util\"\n)\n\ntype testAnalyzer struct {\n\tData     map[string]string\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {\n\ta.Data[key] = value\n}\n\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte(\"\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc makeFile(data string, fn string) fsparser.FileInfo {\n\terr := ioutil.WriteFile(\"/tmp/\"+fn, []byte(data), 0666)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\treturn fsparser.FileInfo{Name: 
fn, Size: 1, Mode: 0100644}\n}\n\nfunc TestRegex1(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version\"]\nFile = \"/tmp/datatestfileX.1\"\nRegEx = \".*Ver=(.+)\\n\"\nDesc=\"Ver 1337 test\"\n`\n\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\t// must match\n\tfi := makeFile(\"sadkljhlksaj Ver=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\t// must not match\n\tfi = makeFile(\"sadkljhlksaj ver=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; ok && data == \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\n\tg.Finalize()\n}\n\nfunc TestScript1(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.LastLine]\nFile = \"/tmp/datatestfileX.1\"\nScript=\"/tmp/extractscripttest.sh\"\nDesc=\"last line test\"\n`\n\n\tscript := `#!/bin/sh\ntail -n 1 $1\n`\n\n\terr := ioutil.WriteFile(\"/tmp/extractscripttest.sh\", []byte(script), 0777)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(\"lskjadh\\naskhj23832\\n\\nkjhf21987\\nhello world\\n\", \"datatestfileX.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"LastLine\"]; !ok || data != \"hello world\\n\" {\n\t\tt.Errorf(\"data extract failed 
script\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tos.Remove(\"/tmp/extractscripttest.sh\")\n\n\tg.Finalize()\n}\n\nfunc TestMulti(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"1\"]\nFile = \"/tmp/datatestfileX.1\"\nRegEx = \".*Ver=(.+)\\n\"\nName = \"Version\"\n\n[DataExtract.\"2\"]\nFile = \"/tmp/datatestfileX.1\"\nRegEx = \".*Version=(.+)\\n\"\nName = \"Version\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(\"sadkljhlksaj Version=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tfi = makeFile(\"sadkljhlksaj Ver=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\n\tg.Finalize()\n}\n\nfunc TestAutoNaming(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nRegEx = \".*Ver=(.+)\\n\"\n\n[DataExtract.\"Version__0\"]\nFile = \"/tmp/datatestfileX.1\"\nRegEx = \".*Version=(.+)\\n\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(\"sadkljhlksaj Version=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tfi = 
makeFile(\"sadkljhlksaj Ver=1337\\naasas\\n \", \"datatestfileX.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"1337\" {\n\t\tt.Errorf(\"data extract failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\n\tg.Finalize()\n}\n\nfunc TestJson1(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":\"lalala\"}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"lalala\" {\n\t\tt.Errorf(\"data extract failed Json\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJson2(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.b\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":{\"b\": \"lalala123\"}}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"lalala123\" {\n\t\tt.Errorf(\"data extract failed Json\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJson3Bool(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.c\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":{\"c\": true}}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, 
\"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"true\" {\n\t\tt.Errorf(\"data extract failed Json\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJsonError(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.c\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":{\"c\": true}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data == \"true\" {\n\t\tt.Errorf(\"data extract failed Json: %s\", a.Data[\"Version\"])\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJson4Num(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.d\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":{\"d\": 123}}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"123.000000\" {\n\t\tt.Errorf(\"data extract failed Json, %s\", a.Data[\"Version\"])\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJson5Deep(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.b.c.d.e.f\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\": \"deep\"}}}}}}`, 
\"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"deep\" {\n\t\tt.Errorf(\"data extract failed Json, %s\", a.Data[\"Version\"])\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJson6array(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.Data = make(map[string]string)\n\n\tcfg := `\n[DataExtract.\"Version__9\"]\nFile = \"/tmp/datatestfileX.1\"\nJson = \"a.0.c\"\n`\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfileX.1\"\n\n\tfi := makeFile(`{\"a\":[{\"c\": true}]}`, \"datatestfileX.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif data, ok := a.Data[\"Version\"]; !ok || data != \"true\" {\n\t\tt.Errorf(\"data extract failed Json\")\n\t}\n\tos.Remove(\"/tmp/datatestfileX.1\")\n\tdelete(a.Data, \"Version\")\n\n\tg.Finalize()\n}\n\nfunc TestJsonContent(t *testing.T) {\n\tcfg := `\n[GlobalConfig]\nFsType = \"dirfs\"\n\n[DataExtract.\"jsonfile.json\"]\nFile = \"/jsonfile.json\"\nRegEx = \"(.*)\\\\n\"\n`\n\tanalyzer := analyzer.NewFromConfig(\"../../../test/testdir\", cfg)\n\tanalyzer.AddAnalyzerPlugin(New(string(cfg), analyzer))\n\tanalyzer.RunPlugins()\n\n\treport := analyzer.JsonReport()\n\n\titem, err := util.XtractJsonField([]byte(report), []string{\"data\", \"jsonfile.json\", \"test_str\"})\n\tif err != nil {\n\t\tt.Errorf(\"error %s\", err)\n\t}\n\tif item != \"yolo\" {\n\t\tt.Errorf(\"data was not json encoded: %s\", report)\n\t}\n\n\t_ = analyzer.CleanUp()\n}\n"
  },
  {
    "path": "pkg/analyzer/dircontent/dircontent.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dircontent\n\nimport (\n\t\"fmt\"\n\t\"path\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/bmatcuk/doublestar\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype dirContentType struct {\n\tPath     string          // path of directory to check\n\tAllowed  []string        // list of files that are allowed to be there\n\tRequired []string        // list of files that must be there\n\tfound    map[string]bool // whether or not there was a match for this file\n}\n\ntype dirContentCheckType struct {\n\tdirs map[string]dirContentType\n\ta    analyzer.AnalyzerType\n}\n\nfunc addTrailingSlash(path string) string {\n\tif path[len(path)-1] != '/' {\n\t\treturn path + \"/\"\n\t}\n\treturn path\n}\n\nfunc validateItem(item dirContentType) bool {\n\t// ensure items in the Allowed/Required lists are valid for doublestar.Match()\n\tfor _, allowed := range item.Allowed {\n\t\t_, err := doublestar.Match(allowed, \"\")\n\t\tif err != nil {\n\t\t\treturn false\n\t\t}\n\t}\n\tfor _, required := range item.Required {\n\t\t_, err := doublestar.Match(required, \"\")\n\t\tif err != nil {\n\t\t\treturn false\n\t\t}\n\t}\n\treturn true\n}\n\nfunc New(config string, a analyzer.AnalyzerType) *dirContentCheckType {\n\ttype dirCheckListType struct {\n\t\tDirContent map[string]dirContentType\n\t}\n\n\tcfg := 
dirContentCheckType{a: a, dirs: make(map[string]dirContentType)}\n\n\tvar dec dirCheckListType\n\t_, err := toml.Decode(config, &dec)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\tfor name, item := range dec.DirContent {\n\t\tif !validateItem(item) {\n\t\t\ta.AddOffender(name, \"invalid DirContent entry\")\n\t\t}\n\t\titem.Path = addTrailingSlash(name)\n\t\tif _, ok := cfg.dirs[item.Path]; ok {\n\t\t\ta.AddOffender(name, \"only one DirContent is allowed per path\")\n\t\t}\n\t\titem.found = make(map[string]bool)\n\t\tfor _, req := range item.Required {\n\t\t\titem.found[req] = false\n\t\t}\n\t\tcfg.dirs[item.Path] = item\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *dirContentCheckType) Start() {}\n\nfunc (state *dirContentCheckType) Finalize() string {\n\tfor _, item := range state.dirs {\n\t\tfor fn, found := range item.found {\n\t\t\tif !found {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"DirContent: required file %s not found in directory %s\", fn, item.Path))\n\t\t\t}\n\t\t}\n\t}\n\treturn \"\"\n}\n\nfunc (state *dirContentCheckType) Name() string {\n\treturn \"DirContent\"\n}\n\nfunc (state *dirContentCheckType) CheckFile(fi *fsparser.FileInfo, dirpath string) error {\n\tdp := addTrailingSlash(dirpath)\n\n\titem, ok := state.dirs[dp]\n\tif !ok {\n\t\treturn nil\n\t}\n\tfound := false\n\tfor _, fn := range item.Allowed {\n\t\t// allow globs for Allowed\n\t\tm, err := doublestar.Match(fn, fi.Name)\n\t\tif err != nil {\n\t\t\t// shouldn't happen because we check these in validateItem()\n\t\t\treturn err\n\t\t}\n\t\tif m {\n\t\t\tfound = true\n\t\t}\n\t}\n\n\tfor _, fn := range item.Required {\n\t\tm, err := doublestar.Match(fn, fi.Name)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif m {\n\t\t\titem.found[fn] = true\n\t\t\tfound = true\n\t\t}\n\t}\n\n\tif !found {\n\t\tstate.a.AddOffender(path.Join(dirpath, fi.Name), fmt.Sprintf(\"DirContent: File %s not allowed in directory %s\", fi.Name, dirpath))\n\t}\n\treturn 
nil\n}\n"
  },
  {
    "path": "pkg/analyzer/dircontent/dircontent_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dircontent\n\nimport (\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string)\n\ntype testAnalyzer struct {\n\tocb      OffenderCallback\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte(\"\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc TestDirCheck(t *testing.T) {\n\ta := &testAnalyzer{}\n\tcfg := `\n[DirContent.\"/temp\"]\nAllowed = [\"file1\", \"file2\"]\nRequired = [\"file10\"]\n\t`\n\n\ttests := []struct {\n\t\tpath               string\n\t\tfile               
string\n\t\tshouldTrigger      bool\n\t\tshouldTriggerFinal bool\n\t}{\n\t\t{\n\t\t\t\"/temp\", \"file1\", false, true, // file allowed\n\t\t},\n\t\t{\n\t\t\t\"/temp\", \"file4\", true, true, // file not allowed\n\t\t},\n\t\t{\n\t\t\t\"/temp1\", \"file4\", false, true, // wrong dir, shouldn't matter\n\t\t},\n\t\t{\n\t\t\t\"/temp\", \"file10\", false, false, // file is required\n\t\t},\n\t}\n\n\tg := New(cfg, a)\n\tg.Start()\n\n\tfor _, test := range tests {\n\t\ttriggered := false\n\t\ta.ocb = func(fp string) { triggered = true }\n\t\tfi := fsparser.FileInfo{Name: test.file}\n\t\terr := g.CheckFile(&fi, test.path)\n\t\tif err != nil {\n\t\t\tt.Errorf(\"CheckFile returned error for %s\", fi.Name)\n\t\t}\n\t\tif triggered != test.shouldTrigger {\n\t\t\tt.Errorf(\"incorrect result for %s/%s, wanted %v got %v\", test.path, test.file, test.shouldTrigger, triggered)\n\t\t}\n\n\t\ttriggered = false\n\t\tg.Finalize()\n\t\tif triggered != test.shouldTriggerFinal {\n\t\t\tt.Errorf(\"incorrect result for %s/%s on Finalize(), wanted %v got %v\", test.path, test.file, test.shouldTriggerFinal, triggered)\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filecmp/filecmp.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filecmp\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path\"\n\t\"syscall\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype cmpType struct {\n\tFile              string // filename\n\tOldFilePath       string\n\tScript            string\n\tScriptOptions     []string\n\tInformationalOnly bool   // put result into Informational (not Offenders)\n\tname              string // name of this check (needs to be unique)\n}\n\ntype fileCmpType struct {\n\tfiles map[string][]cmpType\n\ta     analyzer.AnalyzerType\n}\n\nfunc New(config string, a analyzer.AnalyzerType, fileDirectory string) *fileCmpType {\n\ttype fileCmpListType struct {\n\t\tFileCmp map[string]cmpType\n\t}\n\tcfg := fileCmpType{a: a, files: make(map[string][]cmpType)}\n\n\tvar fcc fileCmpListType\n\t_, err := toml.Decode(config, &fcc)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\t// convert text name based map to filename based map with an array of checks\n\tfor name, item := range fcc.FileCmp {\n\t\t// make sure required options are set\n\t\tif item.OldFilePath == \"\" || item.Script == \"\" {\n\t\t\tcontinue\n\t\t}\n\t\tvar items []cmpType\n\t\tif _, ok := cfg.files[item.File]; ok {\n\t\t\titems = 
cfg.files[item.File]\n\t\t}\n\n\t\tif fileDirectory != \"\" {\n\t\t\titem.OldFilePath = path.Join(fileDirectory, item.OldFilePath)\n\t\t}\n\n\t\titem.name = name\n\t\titem.File = path.Clean(item.File)\n\t\titems = append(items, item)\n\t\tcfg.files[item.File] = items\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *fileCmpType) Start() {}\n\nfunc (state *fileCmpType) Finalize() string {\n\treturn \"\"\n}\n\nfunc (state *fileCmpType) Name() string {\n\treturn \"FileCmp\"\n}\n\nfunc fileExists(filePath string) error {\n\tvar fileState syscall.Stat_t\n\treturn syscall.Lstat(filePath, &fileState)\n}\n\nfunc copyFile(out string, in string) error {\n\tsrc, err := os.Open(in)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer src.Close()\n\tdst, err := os.Create(out)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer dst.Close()\n\t_, err = io.Copy(dst, src)\n\treturn err\n}\n\nfunc makeTmpFromOld(filePath string) (string, error) {\n\ttmpfile, err := ioutil.TempFile(\"\", \"\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer tmpfile.Close()\n\tsrc, err := os.Open(filePath)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer src.Close()\n\t_, err = io.Copy(tmpfile, src)\n\treturn tmpfile.Name(), err\n}\n\nfunc (state *fileCmpType) CheckFile(fi *fsparser.FileInfo, filepath string) error {\n\tfn := path.Join(filepath, fi.Name)\n\tif _, ok := state.files[fn]; !ok {\n\t\treturn nil\n\t}\n\n\tfor _, item := range state.files[fn] {\n\t\tif !fi.IsFile() || fi.IsLink() {\n\t\t\tstate.a.AddOffender(fn, \"FileCmp: is not a file or is a link\")\n\t\t\tcontinue\n\t\t}\n\n\t\ttmpfn, err := state.a.FileGet(fn)\n\t\tif err != nil {\n\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileCmp: error getting file: %s\", err))\n\t\t\tcontinue\n\t\t}\n\n\t\t// we don't have a saved file so save it now and skip this check\n\t\tif fileExists(item.OldFilePath) != nil {\n\t\t\terr := copyFile(item.OldFilePath+\".new\", tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileCmp: 
error saving file: %s\", err))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tstate.a.AddInformational(fn, \"FileCmp: saved file for next run\")\n\t\t\tcontinue\n\t\t}\n\n\t\toldTmp, err := makeTmpFromOld(item.OldFilePath)\n\t\tif err != nil {\n\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileCmp: error getting old file: %s\", err))\n\t\t\tcontinue\n\t\t}\n\t\targs := []string{fi.Name, oldTmp, tmpfn}\n\t\tif len(item.ScriptOptions) > 0 {\n\t\t\targs = append(args, \"--\")\n\t\t\targs = append(args, item.ScriptOptions...)\n\t\t}\n\n\t\tout, err := exec.Command(item.Script, args...).CombinedOutput()\n\t\tif err != nil {\n\t\t\tstate.a.AddOffender(path.Join(filepath, fi.Name), fmt.Sprintf(\"script(%s) error=%s\", item.Script, err))\n\t\t}\n\n\t\terr = state.a.RemoveFile(tmpfn)\n\t\tif err != nil {\n\t\t\tpanic(\"removeFile failed\")\n\t\t}\n\t\terr = state.a.RemoveFile(oldTmp)\n\t\tif err != nil {\n\t\t\tpanic(\"removeFile failed\")\n\t\t}\n\n\t\tif len(out) > 0 {\n\t\t\tif item.InformationalOnly {\n\t\t\t\tstate.a.AddInformational(path.Join(filepath, fi.Name), string(out))\n\t\t\t} else {\n\t\t\t\tstate.a.AddOffender(path.Join(filepath, fi.Name), string(out))\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/analyzer/filecmp/filecmp_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filecmp\n\nimport (\n\t\"io/ioutil\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string, info bool)\n\ntype testAnalyzer struct {\n\tocb      OffenderCallback\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte(\"\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(reason, false)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {\n\ta.ocb(reason, true)\n}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc TestCmp(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileCmp.\"Test1\"]\nFile =\"/cmp_test_1\"\nScript = \"diff.sh\"\nScriptOptions = [\"\"]\nOldFilePath = 
\"/tmp/analyzer_filecmp_1\"\n`\n\n\tg := New(cfg, a, \"\")\n\tg.Start()\n\n\tcalled := false\n\tinfoText := \"\"\n\ta.ocb = func(name string, info bool) {\n\t\tcalled = true\n\t\tinfoText = name\n\t}\n\n\t// same file should not produce output\n\n\tdata := `\n\taaa\n\tbbb\n\tccc\n\tddd\n\t`\n\n\terr := ioutil.WriteFile(\"/tmp/analyzer_filecmp_1\", []byte(data), 0755)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\ta.testfile = \"/tmp/analyzer_filecmp_1\"\n\n\tcalled = false\n\tinfoText = \"\"\n\n\tfi := fsparser.FileInfo{Name: \"cmp_test_1\", Mode: 100755}\n\terr = g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tif called {\n\t\tt.Errorf(\"should not produce offender: %s\", infoText)\n\t}\n\n\t// should cause an offender\n\n\tcalled = false\n\tinfoText = \"\"\n\n\tdata = `\n\taaa\n\tbbb\n\tccc\n\tddd\n\t`\n\n\terr = ioutil.WriteFile(\"/tmp/analyzer_filecmp_1\", []byte(data), 0755)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tdata = `\n\taaa\n\tddd\n\tccc\n\t`\n\n\terr = ioutil.WriteFile(\"/tmp/analyzer_filecmp_2\", []byte(data), 0755)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\ta.testfile = \"/tmp/analyzer_filecmp_2\"\n\n\tfi = fsparser.FileInfo{Name: \"cmp_test_1\", Mode: 100755}\n\terr = g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tif !called {\n\t\tt.Errorf(\"should produce offender: %s\", infoText)\n\t}\n}\n\nfunc TestCmpInfo(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileCmp.\"Test1\"]\nFile =\"/cmp_test_1\"\nScript = \"diff.sh\"\nScriptOptions = [\"\"]\nInformationalOnly = true\nOldFilePath = \"/tmp/analyzer_filecmp_1\"\n`\n\n\tg := New(cfg, a, \"\")\n\tg.Start()\n\n\tcalled := false\n\tinfoText := \"\"\n\tinfoO := false\n\ta.ocb = func(name string, info bool) {\n\t\tcalled = true\n\t\tinfoText = name\n\t\tinfoO = info\n\t}\n\n\t// should cause an informational\n\n\tdata := `\n\taaa\n\tbbb\n\tccc\n\tddd\n\t`\n\n\terr := ioutil.WriteFile(\"/tmp/analyzer_filecmp_1\", []byte(data), 0755)\n\tif 
err != nil {\n\t\tt.Error(err)\n\t}\n\n\tdata = `\n\taaa\n\tddd\n\tccc\n\t`\n\n\terr = ioutil.WriteFile(\"/tmp/analyzer_filecmp_2\", []byte(data), 0755)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\ta.testfile = \"/tmp/analyzer_filecmp_2\"\n\n\tfi := fsparser.FileInfo{Name: \"cmp_test_1\", Mode: 100755}\n\terr = g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tif !called || !infoO {\n\t\tt.Errorf(\"should produce informational: %s\", infoText)\n\t}\n}\n\nfunc TestCmpNoOld(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileCmp.\"Test1\"]\nFile =\"/cmp_test_1\"\nScript = \"diff.sh\"\nScriptOptions = [\"\"]\nOldFilePath = \"/tmp/analyzer_filecmp_99\"\n`\n\n\tg := New(cfg, a, \"\")\n\tg.Start()\n\n\tcalled := false\n\tinfoText := \"\"\n\tinfoO := false\n\ta.ocb = func(name string, info bool) {\n\t\tcalled = true\n\t\tinfoText = name\n\t\tinfoO = info\n\t}\n\n\t// should cause an informational\n\n\tdata := `\n\taaa\n\tbbb\n\tccc\n\tddd\n\t`\n\n\terr := ioutil.WriteFile(\"/tmp/analyzer_filecmp_1\", []byte(data), 0755)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\ta.testfile = \"/tmp/analyzer_filecmp_1\"\n\n\tos.Remove(\"/tmp/analyzer_filecmp_99.new\")\n\n\tfi := fsparser.FileInfo{Name: \"cmp_test_1\", Mode: 100755}\n\terr = g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tif !called || !infoO {\n\t\tt.Errorf(\"should produce informational: %s\", infoText)\n\t}\n\n\tinData, err := ioutil.ReadFile(\"/tmp/analyzer_filecmp_99.new\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif string(inData) != data {\n\t\tt.Errorf(\"files not equal after save\")\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filecontent/filecontent.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filecontent\n\nimport (\n\t\"encoding/hex\"\n\t\"fmt\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path\"\n\t\"regexp\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/bmatcuk/doublestar\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/util\"\n)\n\ntype contentType struct {\n\tFile              string   // filename\n\tInformationalOnly bool     // put result into Informational (not Offenders)\n\tRegEx             string   // regex to match against the file content\n\tRegExLineByLine   bool     // match regex line by line vs whole file\n\tMatch             bool     // define if regex should match or not\n\tDigest            string   // used for SHA256 matching\n\tScript            string   // used for script execution\n\tScriptOptions     []string // options for script execution\n\tJson              string   // used for json field matching\n\tDesc              string   // description\n\tname              string   // name of this check (needs to be unique)\n\tchecked           bool     // if this file was checked or not\n}\n\ntype fileContentType struct {\n\tfiles map[string][]contentType\n\ta     analyzer.AnalyzerType\n}\n\nfunc validateItem(item contentType) bool {\n\tif item.RegEx != \"\" && (item.Digest == \"\" && 
item.Script == \"\" && item.Json == \"\") {\n\t\treturn true\n\t}\n\tif item.Digest != \"\" && (item.RegEx == \"\" && item.Script == \"\" && item.Json == \"\") {\n\t\treturn true\n\t}\n\tif item.Script != \"\" && (item.RegEx == \"\" && item.Digest == \"\" && item.Json == \"\") {\n\t\treturn true\n\t}\n\tif item.Json != \"\" && (item.RegEx == \"\" && item.Digest == \"\" && item.Script == \"\") {\n\t\treturn true\n\t}\n\treturn false\n}\n\nfunc New(config string, a analyzer.AnalyzerType, MatchInvert bool) *fileContentType {\n\ttype fileContentListType struct {\n\t\tFileContent map[string]contentType\n\t}\n\tcfg := fileContentType{a: a, files: make(map[string][]contentType)}\n\n\tvar fcc fileContentListType\n\t_, err := toml.Decode(config, &fcc)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\t// convert text name based map to filename based map with an array of checks\n\tfor name, item := range fcc.FileContent {\n\t\tif !validateItem(item) {\n\t\t\ta.AddOffender(name, \"FileContent: check must include one of Digest, RegEx, Json, or Script\")\n\t\t\tcontinue\n\t\t}\n\t\tvar items []contentType\n\t\tif _, ok := cfg.files[item.File]; ok {\n\t\t\titems = cfg.files[item.File]\n\t\t}\n\t\titem.name = name\n\t\tif MatchInvert {\n\t\t\titem.Match = !item.Match\n\t\t}\n\t\titems = append(items, item)\n\t\titem.File = path.Clean(item.File)\n\t\tcfg.files[item.File] = items\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *fileContentType) Start() {}\n\nfunc (state *fileContentType) Finalize() string {\n\tfor fn, items := range state.files {\n\t\tfor _, item := range items {\n\t\t\tif !item.checked {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: file %s not found\", fn))\n\t\t\t}\n\t\t}\n\t}\n\treturn \"\"\n}\n\nfunc (state *fileContentType) Name() string {\n\treturn \"FileContent\"\n}\n\nfunc regexCompile(rx string) (*regexp.Regexp, error) {\n\treg, err := regexp.CompilePOSIX(rx)\n\tif err != nil {\n\t\treg, err = 
regexp.Compile(rx)\n\t}\n\treturn reg, err\n}\n\nfunc (state *fileContentType) canCheckFile(fi *fsparser.FileInfo, fn string, item contentType) bool {\n\tif !fi.IsFile() {\n\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: '%s' file is NOT a file : %s\", item.name, item.Desc))\n\t\treturn false\n\t}\n\tif fi.IsLink() {\n\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: '%s' file is a link (check actual file) : %s\", item.name, item.Desc))\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (state *fileContentType) CheckFile(fi *fsparser.FileInfo, filepath string) error {\n\tfn := path.Join(filepath, fi.Name)\n\tif _, ok := state.files[fn]; !ok {\n\t\treturn nil\n\t}\n\n\titems := state.files[fn]\n\n\tfor n, item := range items {\n\t\titems[n].checked = true\n\t\t//fmt.Printf(\"name: %s file: %s (%s)\\n\", item.name, item.File, fn)\n\t\tif item.RegEx != \"\" {\n\t\t\tif !state.canCheckFile(fi, fn, item) {\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\treg, err := regexCompile(item.RegEx)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: regex compile error: %s : %s : %s\", item.RegEx, item.name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\ttmpfn, err := state.a.FileGet(fn)\n\t\t\t// this should never happen since this function is called for every existing file\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: error reading file: %s\", err))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tfdata, _ := ioutil.ReadFile(tmpfn)\n\t\t\terr = state.a.RemoveFile(tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tpanic(\"RemoveFile failed\")\n\t\t\t}\n\t\t\tif item.RegExLineByLine {\n\t\t\t\tfor _, line := range strings.Split(strings.TrimSuffix(string(fdata), \"\\n\"), \"\\n\") {\n\t\t\t\t\tif reg.MatchString(line) == item.Match {\n\t\t\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"RegEx check failed, for: %s : %s : line: %s\", item.name, item.Desc, line))\n\t\t\t\t\t\t} else 
{\n\t\t\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"RegEx check failed, for: %s : %s : line: %s\", item.name, item.Desc, line))\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tif reg.Match(fdata) == item.Match {\n\t\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"RegEx check failed, for: %s : %s\", item.name, item.Desc))\n\t\t\t\t\t} else {\n\t\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"RegEx check failed, for: %s : %s\", item.name, item.Desc))\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\tcontinue\n\t\t}\n\n\t\tif item.Digest != \"\" {\n\t\t\tif !state.canCheckFile(fi, fn, item) {\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tdigestRaw, err := state.a.FileGetSha256(fn)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tdigest := hex.EncodeToString(digestRaw)\n\t\t\tsaved, _ := hex.DecodeString(item.Digest)\n\t\t\tsavedStr := hex.EncodeToString(saved)\n\t\t\tif digest != savedStr {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"Digest (sha256) did not match found = %s should be = %s. %s : %s \", digest, savedStr, item.name, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"Digest (sha256) did not match found = %s should be = %s. 
%s : %s \", digest, savedStr, item.name, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tcontinue\n\t\t}\n\n\t\tif item.Script != \"\" {\n\t\t\tcbd := callbackDataType{state, item.Script, item.ScriptOptions, item.InformationalOnly}\n\t\t\tif fi.IsDir() {\n\t\t\t\tstate.a.CheckAllFilesWithPath(checkFileScript, &cbd, fn)\n\t\t\t} else {\n\t\t\t\tif !state.canCheckFile(fi, fn, item) {\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t\tcheckFileScript(fi, filepath, &cbd)\n\t\t\t}\n\t\t}\n\n\t\tif item.Json != \"\" {\n\t\t\tif !state.canCheckFile(fi, fn, item) {\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\ttmpfn, err := state.a.FileGet(fn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: error getting file: %s\", err))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tfdata, err := ioutil.ReadFile(tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: error reading file: %s\", err))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\terr = state.a.RemoveFile(tmpfn)\n\t\t\tif err != nil {\n\t\t\t\tpanic(\"RemoveFile failed\")\n\t\t\t}\n\n\t\t\tfield := strings.SplitAfterN(item.Json, \":\", 2)\n\t\t\tif len(field) != 2 {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: error Json config bad = %s, %s, %s\", item.Json, item.name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\t// remove \":\" so we just have the value we want to check\n\t\t\tfield[0] = strings.Replace(field[0], \":\", \"\", 1)\n\n\t\t\tfieldData, err := util.XtractJsonField(fdata, strings.Split(field[0], \".\"))\n\t\t\tif err != nil {\n\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FileContent: error Json bad field = %s, %s, %s\", field[0], item.name, item.Desc))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tif fieldData != field[1] {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"Json field %s = %s did not match = %s, %s, %s\", field[0], fieldData, field[1], item.name, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"Json field %s 
= %s did not match = %s, %s, %s\", field[0], fieldData, field[1], item.name, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\ntype callbackDataType struct {\n\tstate             *fileContentType\n\tscript            string\n\tscriptOptions     []string\n\tinformationalOnly bool\n}\n\n/*\n * Extract the file and run the script, passing the file name as the argument to the script.\n * Only regular files that are not empty are processed; the script is for checking content.\n * The script output is used to indicate an issue; the output is saved in the offender record.\n *\n * The first element in scriptOptions (from the callback data) defines a path match string.\n * This allows specifying a pattern that the filename has to match. Files with names that do not match will\n * not be analyzed by the script. This is to speed up execution time since files have to be extracted\n * to analyze them with the external script.\n *\n * The following elements in scriptOptions will be passed to the script as command line arguments.\n *\n * The script is run with the following parameters:\n * script.sh <filename> <filename in filesystem> <uid> <gid> <mode> <selinux label - can be empty> -- <ScriptOptions[1]> <ScriptOptions[2]>\n */\nfunc checkFileScript(fi *fsparser.FileInfo, fullpath string, cbData analyzer.AllFilesCallbackData) {\n\tcbd := cbData.(*callbackDataType)\n\n\tfullname := path.Join(fullpath, fi.Name)\n\n\t// skip/ignore anything but normal files\n\tif !fi.IsFile() || fi.IsLink() {\n\t\treturn\n\t}\n\n\tif len(cbd.scriptOptions) >= 1 {\n\t\tm, err := doublestar.Match(cbd.scriptOptions[0], fi.Name)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"Match error: %s\\n\", err)\n\t\t\treturn\n\t\t}\n\t\t// file name didn't match the specifications in scriptOptions[0]\n\t\tif !m {\n\t\t\treturn\n\t\t}\n\t}\n\n\tfname, _ := cbd.state.a.FileGet(fullname)\n\targs := []string{fname,\n\t\tfullname,\n\t\tfmt.Sprintf(\"%d\", fi.Uid),\n\t\tfmt.Sprintf(\"%d\", 
fi.Gid),\n\t\tfmt.Sprintf(\"%o\", fi.Mode),\n\t\tfi.SELinuxLabel,\n\t}\n\tif len(cbd.scriptOptions) >= 2 {\n\t\targs = append(args, \"--\")\n\t\targs = append(args, cbd.scriptOptions[1:]...)\n\t}\n\n\tout, err := exec.Command(cbd.script, args...).CombinedOutput()\n\tif err != nil {\n\t\tcbd.state.a.AddOffender(fullname, fmt.Sprintf(\"script(%s) error=%s\", cbd.script, err))\n\t}\n\n\terr = cbd.state.a.RemoveFile(fname)\n\tif err != nil {\n\t\tpanic(\"removeFile failed\")\n\t}\n\n\tif len(out) > 0 {\n\t\tif cbd.informationalOnly {\n\t\t\tcbd.state.a.AddInformational(fullname, string(out))\n\t\t} else {\n\t\t\tcbd.state.a.AddOffender(fullname, string(out))\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filecontent/filecontent_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filecontent\n\nimport (\n\t\"io/ioutil\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string)\n\ntype testAnalyzer struct {\n\tocb      OffenderCallback\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\tif filepath == \"/tmp/datatestfile.1\" {\n\t\treturn []byte(\"AABBCCDDEEFF11223344\"), nil\n\t}\n\treturn []byte(\"AABBCCDDEEFF11223341\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc makeFile(data string, fn string) fsparser.FileInfo {\n\terr := 
ioutil.WriteFile(\"/tmp/\"+fn, []byte(data), 0666)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\treturn fsparser.FileInfo{Name: fn, Size: 1, Mode: 100666}\n}\n\nfunc TestRegex(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"RegExTest1\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile =\"/tmp/datatestfile.1\"\n\n[FileContent.\"RegExTest2\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile =\"/tmp/datatestfile.1\"\n`\n\n\tg := New(cfg, a, false)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\t// match\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(\"sadkljhlksaj Ver=1337  \\naasas\\n \", \"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif !triggered {\n\t\tt.Errorf(\"file content failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\t// do not match\n\ttriggered = false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi = makeFile(\"sadkljhlksaj Ver=1338\\nasdads\\nadaasd\\n\", \"datatestfile.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif triggered {\n\t\tt.Errorf(\"file content failed regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\t// ensure file isn't flagged as not-found\n\tg.Finalize()\n\tif triggered {\n\t\tt.Errorf(\"file content failed, found file flagged as not-found\")\n\t}\n}\n\nfunc TestDigest(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"digest test 1\"]\nDigest = \"4141424243434444454546463131323233333434\"\nFile = \"/tmp/datatestfile.1\"\n\n[FileContent.\"digest test 2\"]\nDigest = \"4141424243434444454546463131323233333435\"\nFile =\"/tmp/datatestfile.2\"\n`\n\n\tg := New(cfg, a, false)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\t// match\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(\"sadkljhlksaj Ver=1337  \\naasas\\n \", 
\"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif triggered {\n\t\tt.Errorf(\"file content failed digest\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\t// do not match\n\ttriggered = false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi = makeFile(\"sadkljhlksaj Ver=1338\\nasdads\\nadaasd\\n\", \"datatestfile.2\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif !triggered {\n\t\tt.Errorf(\"file content failed digest\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.2\")\n\n\tg.Finalize()\n}\n\nfunc TestScript(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"script test 1\"]\nScript=\"/tmp/testfilescript.sh\"\nFile = \"/tmp/datatestfile.1\"\n`\n\n\tscript := `#!/bin/sh\ncat $1\n`\n\n\terr := ioutil.WriteFile(\"/tmp/testfilescript.sh\", []byte(script), 0777)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tg := New(cfg, a, false)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\t// match\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(\"sadkljhlksaj Ver=1337  \\naasas\\n \", \"datatestfile.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif !triggered {\n\t\tt.Errorf(\"file content script test failed\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\tos.Remove(\"/tmp/testfilescript.sh\")\n\n\tg.Finalize()\n}\n\nfunc TestValidateItem(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"digest test 1\"]\nDigest = \"4141424243434444454546463131323233333434\"\nScript = \"asdf.sh\"\nFile = \"/tmp/datatestfile.1\"\n`\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tg := New(cfg, a, false)\n\tif !triggered {\n\t\tt.Errorf(\"file content failed validate with multiple check types\")\n\t}\n\tg.Finalize()\n\n\ttriggered = false\n\tcfg = `\n[FileContent.\"digest test 1\"]\nFile = 
\"/tmp/datatestfile.1\"\n`\n\n\tNew(cfg, a, false)\n\tif !triggered {\n\t\tt.Errorf(\"file content failed validate without check type\")\n\t}\n}\n\nfunc TestMissingFile(t *testing.T) {\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"RegExTest1\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile =\"/tmp/datatestfile.notfound\"\n`\n\tg := New(cfg, a, false)\n\tg.Start()\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\t// match\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(\"sadkljhlksaj Ver=1337  \\naasas\\n \", \"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\t// pass should still be false here because CheckFile did not see the file\n\tif triggered {\n\t\tt.Errorf(\"file content failed, missing file checked\")\n\t}\n\n\tos.Remove(\"/tmp/datatestfile.1\")\n\tg.Finalize()\n\t// triggered should be true here because Finalize should call AddOffender\n\tif !triggered {\n\t\tt.Errorf(\"file content failed, missing file not found\")\n\t}\n}\n\nfunc TestJson(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"json test 1\"]\nJson=\"a.b:test123\"\nFile = \"/tmp/datatestfile.1\"\n`\n\tg := New(cfg, a, false)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(`{\"a\":{\"b\": \"test123\"}}`, \"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif triggered {\n\t\tt.Errorf(\"file content json failed\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\tg.Finalize()\n}\n\nfunc TestJsonDoesNotMatch(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileContent.\"json test 1\"]\nJson=\"a.b:test12A\"\nFile = \"/tmp/datatestfile.1\"\n`\n\tg := New(cfg, a, false)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true 
}\n\tfi := makeFile(`{\"a\":{\"b\": \"test123\"}}`, \"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif !triggered {\n\t\tt.Errorf(\"file content json failed\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\tg.Finalize()\n}\n\nfunc TestGlobalInvert(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `[FileContent.\"RegExTest1\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile =\"/tmp/datatestfile.1\"\n\n[FileContent.\"RegExTest2\"]\nRegEx = \".*Ver=1337.*\"\nMatch = true\nFile =\"/tmp/datatestfile.1\"\n`\n\n\tg := New(cfg, a, true)\n\n\tg.Start()\n\n\ta.testfile = \"/tmp/datatestfile.1\"\n\n\t// match\n\ttriggered := false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi := makeFile(\"sadkljhlksaj Ver=1337  \\naasas\\n \", \"datatestfile.1\")\n\terr := g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif triggered {\n\t\tt.Errorf(\"file content failed Regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\t// don't match\n\ttriggered = false\n\ta.ocb = func(fn string) { triggered = true }\n\tfi = makeFile(\"sadkljhlksaj Ver=1338\\nasdads\\nadaasd\\n\", \"datatestfile.1\")\n\terr = g.CheckFile(&fi, \"/tmp\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\tif !triggered {\n\t\tt.Errorf(\"file content failed regex\")\n\t}\n\tos.Remove(\"/tmp/datatestfile.1\")\n\n\t// ensure file isn't flagged as not-found\n\tg.Finalize()\n\tif !triggered {\n\t\tt.Errorf(\"file content failed, found file flagged as not-found\")\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filepathowner/filepathowner.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filepathowner\n\nimport (\n\t\"fmt\"\n\t\"path\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype filePathOwner struct {\n\tUid int\n\tGid int\n}\n\ntype filePathOwnerList struct {\n\tFilePathOwner map[string]filePathOwner\n}\n\ntype fileownerpathType struct {\n\tfiles filePathOwnerList\n\ta     analyzer.AnalyzerType\n}\n\nfunc New(config string, a analyzer.AnalyzerType) *fileownerpathType {\n\tcfg := fileownerpathType{a: a}\n\n\t_, err := toml.Decode(config, &cfg.files)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *fileownerpathType) Start() {}\nfunc (state *fileownerpathType) CheckFile(fi *fsparser.FileInfo, filepath string) error {\n\treturn nil\n}\n\nfunc (state *fileownerpathType) Name() string {\n\treturn \"FilePathOwner\"\n}\n\ntype cbDataCheckOwnerPath struct {\n\ta   analyzer.AnalyzerType\n\tfop filePathOwner\n}\n\nfunc (state *fileownerpathType) Finalize() string {\n\tfor fn, item := range state.files.FilePathOwner {\n\t\tfilelist := cbDataCheckOwnerPath{a: state.a, fop: item}\n\t\tdf, err := state.a.GetFileInfo(fn)\n\t\tif err != nil {\n\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"FilePathOwner, directory not found: %s\", fn))\n\t\t\tcontinue\n\t\t}\n\t\t// check the 
directory itself\n\t\tcbCheckOwnerPath(&df, fn, &filelist)\n\t\t// check anything within the directory\n\t\tstate.a.CheckAllFilesWithPath(cbCheckOwnerPath, &filelist, fn)\n\t}\n\n\treturn \"\"\n}\n\n// check that every file within a given directory is owned by the given UID and GID\nfunc cbCheckOwnerPath(fi *fsparser.FileInfo, fullpath string, data analyzer.AllFilesCallbackData) {\n\tvar filelist *cbDataCheckOwnerPath = data.(*cbDataCheckOwnerPath)\n\n\tppath := fullpath\n\tif len(fi.Name) > 0 {\n\t\tppath = path.Join(ppath, fi.Name)\n\t}\n\n\tif fi.Uid != filelist.fop.Uid {\n\t\tfilelist.a.AddOffender(ppath, fmt.Sprintf(\"FilePathOwner Uid not allowed, Uid = %d should be = %d\", fi.Uid, filelist.fop.Uid))\n\t}\n\tif fi.Gid != filelist.fop.Gid {\n\t\tfilelist.a.AddOffender(ppath, fmt.Sprintf(\"FilePathOwner Gid not allowed, Gid = %d should be = %d\", fi.Gid, filelist.fop.Gid))\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filepathowner/filepathowner_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filepathowner\n\nimport (\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string)\n\ntype testAnalyzer struct {\n\tocb      OffenderCallback\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte(\"\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc Test(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FilePathOwner.\"/bin\"]\nUid = 0\nGid = 0\n`\n\n\tg := New(cfg, a)\n\n\tg.Start()\n\n\t// uid/gid match\n\ttriggered := false\n\ta.ocb = func(fp string) { triggered = true 
}\n\tfi := fsparser.FileInfo{Name: \"test1\", Uid: 0, Gid: 0}\n\tcbCheckOwnerPath(&fi, \"/bin\", &cbDataCheckOwnerPath{a, filePathOwner{0, 0}})\n\tif triggered {\n\t\tt.Errorf(\"checkOwnerPath failed\")\n\t}\n\n\t// gid does not match\n\ttriggered = false\n\ta.ocb = func(fp string) { triggered = true }\n\tfi = fsparser.FileInfo{Name: \"test1\", Uid: 0, Gid: 1}\n\tcbCheckOwnerPath(&fi, \"/bin\", &cbDataCheckOwnerPath{a, filePathOwner{0, 0}})\n\tif !triggered {\n\t\tt.Errorf(\"checkOwnerPath failed\")\n\t}\n\n\t// do not call finalize() since we do not have a real source\n}\n"
  },
  {
    "path": "pkg/analyzer/filestatcheck/filestatcheck.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filestatcheck\n\nimport (\n\t\"fmt\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/capability\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype fileexistType struct {\n\tAllowEmpty        bool\n\tMode              string\n\tUid               int\n\tGid               int\n\tSELinuxLabel      string\n\tLinkTarget        string\n\tCapabilities      []string\n\tDesc              string\n\tInformationalOnly bool\n}\n\ntype fileExistListType struct {\n\tFileStatCheck map[string]fileexistType\n}\n\ntype fileExistType struct {\n\tfiles fileExistListType\n\ta     analyzer.AnalyzerType\n}\n\nfunc New(config string, a analyzer.AnalyzerType) *fileExistType {\n\tcfg := fileExistType{a: a}\n\n\tmd, err := toml.Decode(config, &cfg.files)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\tfor fn, item := range cfg.files.FileStatCheck {\n\t\tif !md.IsDefined(\"FileStatCheck\", fn, \"Uid\") {\n\t\t\titem.Uid = -1\n\t\t\tcfg.files.FileStatCheck[fn] = item\n\t\t}\n\n\t\tif !md.IsDefined(\"FileStatCheck\", fn, \"Gid\") {\n\t\t\titem.Gid = -1\n\t\t\tcfg.files.FileStatCheck[fn] = item\n\t\t}\n\t}\n\n\treturn &cfg\n}\n\nfunc (state *fileExistType) Start() {}\n\nfunc (state *fileExistType) CheckFile(fi 
*fsparser.FileInfo, filepath string) error {\n\treturn nil\n}\n\nfunc (state *fileExistType) Name() string {\n\treturn \"FileStatCheck\"\n}\n\nfunc (state *fileExistType) Finalize() string {\n\tfor fn, item := range state.files.FileStatCheck {\n\t\tfi, err := state.a.GetFileInfo(fn)\n\t\tif err != nil {\n\t\t\tstate.a.AddOffender(fn, \"file does not exist\")\n\t\t} else {\n\t\t\tcheckMode := false\n\t\t\tvar mode uint64\n\t\t\tif item.Mode != \"\" {\n\t\t\t\tcheckMode = true\n\t\t\t\tmode, _ = strconv.ParseUint(item.Mode, 8, 0)\n\t\t\t}\n\t\t\tif item.LinkTarget != \"\" {\n\t\t\t\tif !fi.IsLink() {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed LinkTarget set but file is not a link : %s\", item.Desc))\n\t\t\t\t} else if item.LinkTarget != fi.LinkTarget {\n\t\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed LinkTarget does not match '%s' found '%s' : %s\", item.LinkTarget, fi.LinkTarget, item.Desc))\n\t\t\t\t\t} else {\n\t\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed LinkTarget does not match '%s' found '%s' : %s\", item.LinkTarget, fi.LinkTarget, item.Desc))\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\t// if AllowEmpty is false, check that the file size is not zero\n\t\t\tif !item.AllowEmpty && fi.Size == 0 {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: size: %d AllowEmpty=false : %s\", fi.Size, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: size: %d AllowEmpty=false : %s\", fi.Size, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\t// if AllowEmpty is false, check that the file is not a link\n\t\t\tif !item.AllowEmpty && fi.IsLink() {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: AllowEmpty=false but file is Link (check link target instead) : %s\", item.Desc))\n\t\t\t\t} else 
{\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: AllowEmpty=false but file is Link (check link target instead) : %s\", item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif checkMode && fi.Mode != mode {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: mode found %o should be %s : %s\", fi.Mode, item.Mode, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: mode found %o should be %s : %s\", fi.Mode, item.Mode, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif item.Gid >= 0 && fi.Gid != item.Gid {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: group found %d should be %d : %s\", fi.Gid, item.Gid, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: group found %d should be %d : %s\", fi.Gid, item.Gid, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif item.Uid >= 0 && fi.Uid != item.Uid {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: owner found %d should be %d : %s\", fi.Uid, item.Uid, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: owner found %d should be %d : %s\", fi.Uid, item.Uid, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif item.SELinuxLabel != \"\" && !strings.EqualFold(item.SELinuxLabel, fi.SELinuxLabel) {\n\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"File State Check failed: selinux label found = %s should be = %s : %s\", fi.SELinuxLabel, item.SELinuxLabel, item.Desc))\n\t\t\t\t} else {\n\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"File State Check failed: selinux label found = %s should be = %s : %s\", fi.SELinuxLabel, item.SELinuxLabel, item.Desc))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif len(item.Capabilities) > 0 {\n\t\t\t\tif 
!capability.CapsEqual(item.Capabilities, fi.Capabilities) {\n\t\t\t\t\tif item.InformationalOnly {\n\t\t\t\t\t\tstate.a.AddInformational(fn, fmt.Sprintf(\"Capabilities found: %s expected: %s\", fi.Capabilities, item.Capabilities))\n\t\t\t\t\t} else {\n\t\t\t\t\t\tstate.a.AddOffender(fn, fmt.Sprintf(\"Capabilities found: %s expected: %s\", fi.Capabilities, item.Capabilities))\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\treturn \"\"\n}\n"
  },
  {
    "path": "pkg/analyzer/filestatcheck/filestatcheck_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filestatcheck\n\nimport (\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string)\n\ntype testAnalyzer struct {\n\tocb OffenderCallback\n\tfi  fsparser.FileInfo\n\terr error\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\n\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn a.fi, a.err\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte{}, nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn \"\", nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc TestGlobal(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.err = nil\n\n\tcfg := `\n[FileStatCheck.\"/file1111\"]\nAllowEmpty = false\nUid = 1\nMode = \"0755\"\nDesc = \"this need to be this way\"`\n\n\tg := New(cfg, a)\n\n\t// ensure 
gid/uid are set to correct values\n\tfor _, item := range g.files.FileStatCheck {\n\t\tif item.Gid != -1 {\n\t\t\tt.Errorf(\"Gid should default to -1, is %d\", item.Gid)\n\t\t}\n\n\t\tif item.Uid != 1 {\n\t\t\tt.Errorf(\"Uid should be 1, is %d\", item.Uid)\n\t\t}\n\t}\n\n\tg.Start()\n\n\tfi := fsparser.FileInfo{}\n\tif g.CheckFile(&fi, \"/\") != nil {\n\t\tt.Errorf(\"checkfile failed\")\n\t}\n\n\ttests := []struct {\n\t\tfi            fsparser.FileInfo\n\t\terr           error\n\t\tshouldTrigger bool\n\t}{\n\n\t\t{fsparser.FileInfo{Name: \"file1111\", Uid: 0, Gid: 0, Mode: 0755, Size: 1}, nil, true},\n\t\t{fsparser.FileInfo{Name: \"file1111\", Uid: 1, Gid: 0, Mode: 0755, Size: 0}, nil, true},\n\t\t{fsparser.FileInfo{Name: \"file1111\", Uid: 1, Gid: 1, Mode: 0755, Size: 1}, nil, false},\n\t\t{fsparser.FileInfo{Name: \"file1111\", Uid: 1, Gid: 0, Mode: 0754, Size: 1}, nil, true},\n\t\t{\n\t\t\tfsparser.FileInfo{Name: \"filedoesnotexist\", Uid: 0, Gid: 0, Mode: 0755, Size: 1},\n\t\t\tfmt.Errorf(\"file does not exist\"),\n\t\t\ttrue,\n\t\t},\n\t}\n\tvar triggered bool\n\tfor _, test := range tests {\n\t\ttriggered = false\n\t\ta.fi = test.fi\n\t\ta.err = test.err\n\t\ta.ocb = func(fn string) { triggered = true }\n\t\tg.Finalize()\n\t\tif triggered != test.shouldTrigger {\n\t\t\tt.Errorf(\"FileStatCheck failed\")\n\t\t}\n\t}\n}\nfunc TestLink(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.err = nil\n\n\tcfg := `\n[FileStatCheck.\"/filelink\"]\nAllowEmpty = true\nLinkTarget = \"hello\"\nUid = 1\nDesc = \"this need to be this way\"\n`\n\n\tg := New(cfg, a)\n\n\t// ensure gid/uid are set to correct values\n\tfor _, item := range g.files.FileStatCheck {\n\t\tif item.Gid != -1 {\n\t\t\tt.Errorf(\"Gid should default to -1, is %d\", item.Gid)\n\t\t}\n\n\t\tif item.Uid != 1 {\n\t\t\tt.Errorf(\"Uid should be 1, is %d\", item.Uid)\n\t\t}\n\t}\n\n\tg.Start()\n\n\tfi := fsparser.FileInfo{}\n\tif g.CheckFile(&fi, \"/\") != nil {\n\t\tt.Errorf(\"checkfile failed\")\n\t}\n\n\ttests := 
[]struct {\n\t\tfi            fsparser.FileInfo\n\t\terr           error\n\t\tshouldTrigger bool\n\t}{\n\t\t{fsparser.FileInfo{Name: \"filelink\", Uid: 1, Gid: 0, Mode: 0120000, LinkTarget: \"hello\", Size: 1}, nil, false},\n\t\t{fsparser.FileInfo{Name: \"filelink\", Uid: 1, Gid: 0, Mode: 0120000, LinkTarget: \"hello1\", Size: 1}, nil, true},\n\t\t{fsparser.FileInfo{Name: \"filelink\", Uid: 1, Gid: 0, Mode: 0755, Size: 1}, nil, true},\n\t}\n\n\tvar triggered bool\n\tfor _, test := range tests {\n\t\ttriggered = false\n\t\ta.fi = test.fi\n\t\ta.err = test.err\n\t\ta.ocb = func(fn string) { triggered = true }\n\t\tg.Finalize()\n\t\tif triggered != test.shouldTrigger {\n\t\t\tt.Errorf(\"FileStatCheck failed\")\n\t\t}\n\t}\n}\n\nfunc TestLinkEmpty(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\ta.err = nil\n\n\tcfg := `\n[FileStatCheck.\"/filelink\"]\nAllowEmpty = false\nLinkTarget = \"hello\"\nUid = 1\nDesc = \"this need to be this way\"\n`\n\n\tg := New(cfg, a)\n\n\t// ensure gid/uid are set to correct values\n\tfor _, item := range g.files.FileStatCheck {\n\t\tif item.Gid != -1 {\n\t\t\tt.Errorf(\"Gid should default to -1, is %d\", item.Gid)\n\t\t}\n\n\t\tif item.Uid != 1 {\n\t\t\tt.Errorf(\"Uid should be 1, is %d\", item.Uid)\n\t\t}\n\t}\n\n\tg.Start()\n\n\tfi := fsparser.FileInfo{}\n\tif g.CheckFile(&fi, \"/\") != nil {\n\t\tt.Errorf(\"checkfile failed\")\n\t}\n\n\ttests := []struct {\n\t\tfi            fsparser.FileInfo\n\t\terr           error\n\t\tshouldTrigger bool\n\t}{\n\t\t{fsparser.FileInfo{Name: \"filelink\", Uid: 1, Gid: 0, Mode: 0120000, LinkTarget: \"hello\", Size: 1}, nil, true},\n\t}\n\n\tvar triggered bool\n\tfor _, test := range tests {\n\t\ttriggered = false\n\t\ta.fi = test.fi\n\t\ta.err = test.err\n\t\ta.ocb = func(fn string) { triggered = true }\n\t\tg.Finalize()\n\t\tif triggered != test.shouldTrigger {\n\t\t\tt.Errorf(\"FileStatCheck failed\")\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filetree/filetree.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filetree\n\nimport (\n\t\"bytes\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io/ioutil\"\n\t\"path\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/capability\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/util\"\n)\n\nconst (\n\tnewFileTreeExt string = \".new\"\n)\n\ntype fileTreeConfig struct {\n\tOldTreeFilePath       string\n\tCheckPath             []string\n\tCheckPermsOwnerChange bool\n\tCheckFileSize         bool\n\tCheckFileDigest       bool\n\tSkipFileDigest        bool\n}\n\ntype fileTreeType struct {\n\tconfig fileTreeConfig\n\ta      analyzer.AnalyzerType\n\n\ttree    map[string]fileInfoSaveType\n\toldTree map[string]fileInfoSaveType\n}\n\ntype fileInfoSaveType struct {\n\tfsparser.FileInfo\n\tDigest string `json:\"digest,omitempty\"`\n}\ntype imageInfoSaveType struct {\n\tImageName   string             `json:\"image_name\"`\n\tImageDigest string             `json:\"image_digest\"`\n\tFiles       []fileInfoSaveType `json:\"files\"`\n}\n\nfunc New(config string, a analyzer.AnalyzerType, outputDirectory string) *fileTreeType {\n\ttype ftcfg struct {\n\t\tFileTreeCheck fileTreeConfig\n\t}\n\tvar conf ftcfg\n\tmd, err := toml.Decode(config, &conf)\n\tif err != nil 
{\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\t// if CheckPath is undefined set CheckPath to root\n\tif !md.IsDefined(\"FileTreeCheck\", \"CheckPath\") {\n\t\tconf.FileTreeCheck.CheckPath = []string{\"/\"}\n\t}\n\n\tfor i := range conf.FileTreeCheck.CheckPath {\n\t\tconf.FileTreeCheck.CheckPath[i] = util.CleanPathDir(conf.FileTreeCheck.CheckPath[i])\n\t}\n\n\tcfg := fileTreeType{config: conf.FileTreeCheck, a: a}\n\n\t// if an output directory is set concat the path of the old filetree\n\tif outputDirectory != \"\" && cfg.config.OldTreeFilePath != \"\" {\n\t\tcfg.config.OldTreeFilePath = path.Join(outputDirectory, cfg.config.OldTreeFilePath)\n\t}\n\n\treturn &cfg\n}\n\nfunc inPath(checkPath string, cfgPath []string) bool {\n\tfor _, p := range cfgPath {\n\t\tif strings.HasPrefix(checkPath, p) {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc (state *fileTreeType) Start() {\n\tstate.tree = make(map[string]fileInfoSaveType)\n}\n\nfunc (state *fileTreeType) Name() string {\n\treturn \"FileTreeChecks\"\n}\n\nfunc (tree *fileTreeType) readOldTree() error {\n\tdata, err := ioutil.ReadFile(tree.config.OldTreeFilePath)\n\tif err != nil {\n\t\treturn err\n\t}\n\tvar oldTree imageInfoSaveType\n\terr = json.Unmarshal(data, &oldTree)\n\tif err != nil {\n\t\treturn err\n\t}\n\ttree.oldTree = make(map[string]fileInfoSaveType)\n\tfor _, fi := range oldTree.Files {\n\t\ttree.oldTree[fi.Name] = fi\n\t}\n\treturn nil\n}\n\nfunc (tree *fileTreeType) saveTree() error {\n\timageInfo := tree.a.ImageInfo()\n\toldtree := imageInfoSaveType{\n\t\tImageName:   imageInfo.ImageName,\n\t\tImageDigest: imageInfo.ImageDigest,\n\t}\n\n\tfor _, fi := range tree.tree {\n\t\toldtree.Files = append(oldtree.Files, fi)\n\t}\n\n\tjdata, err := json.Marshal(oldtree)\n\tif err != nil {\n\t\treturn err\n\t}\n\t// make json look pretty\n\tvar prettyJson bytes.Buffer\n\terr = json.Indent(&prettyJson, jdata, \"\", \"\\t\")\n\tif err != nil {\n\t\treturn err\n\t}\n\n\terr = 
ioutil.WriteFile(tree.config.OldTreeFilePath+newFileTreeExt, prettyJson.Bytes(), 0644)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (state *fileTreeType) CheckFile(fi *fsparser.FileInfo, filepath string) error {\n\tif state.config.OldTreeFilePath == \"\" {\n\t\treturn nil\n\t}\n\n\tfn := path.Join(filepath, fi.Name)\n\n\tdigest := \"0\"\n\tif fi.IsFile() && !state.config.SkipFileDigest {\n\t\tdigestRaw, err := state.a.FileGetSha256(fn)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tdigest = hex.EncodeToString(digestRaw)\n\t}\n\n\tstate.tree[fn] = fileInfoSaveType{\n\t\tfsparser.FileInfo{\n\t\t\tName:         fn,\n\t\t\tSize:         fi.Size,\n\t\t\tUid:          fi.Uid,\n\t\t\tGid:          fi.Gid,\n\t\t\tMode:         fi.Mode,\n\t\t\tSELinuxLabel: fi.SELinuxLabel,\n\t\t\tCapabilities: fi.Capabilities,\n\t\t\tLinkTarget:   fi.LinkTarget,\n\t\t},\n\t\tdigest,\n\t}\n\n\treturn nil\n}\n\nfunc (state *fileTreeType) Finalize() string {\n\tif state.config.OldTreeFilePath == \"\" {\n\t\treturn \"\"\n\t}\n\n\tvar added []fileInfoSaveType\n\tvar removed []fileInfoSaveType\n\tvar changed []string\n\n\t_ = state.readOldTree()\n\n\t// find modified files\n\tfor filepath, fi := range state.oldTree {\n\t\t// skip files if not in configured path\n\t\tif !inPath(filepath, state.config.CheckPath) {\n\t\t\tcontinue\n\t\t}\n\t\t_, ok := state.tree[filepath]\n\t\tif !ok {\n\t\t\tremoved = append(removed, fi)\n\t\t} else {\n\t\t\toFi := fi\n\t\t\tcFi := state.tree[filepath]\n\t\t\tif oFi.Mode != cFi.Mode ||\n\t\t\t\toFi.Uid != cFi.Uid ||\n\t\t\t\toFi.Gid != cFi.Gid ||\n\t\t\t\toFi.LinkTarget != cFi.LinkTarget ||\n\t\t\t\toFi.SELinuxLabel != cFi.SELinuxLabel ||\n\t\t\t\t!capability.CapsEqual(oFi.Capabilities, cFi.Capabilities) ||\n\t\t\t\t((oFi.Size != cFi.Size) && state.config.CheckFileSize) ||\n\t\t\t\t((oFi.Digest != cFi.Digest) && state.config.CheckFileDigest) {\n\t\t\t\tchanged = append(changed, filepath)\n\t\t\t}\n\t\t}\n\t}\n\n\t// find new files\n\tfor 
filepath, fi := range state.tree {\n\t\t// skip files if not in configured path\n\t\tif !inPath(filepath, state.config.CheckPath) {\n\t\t\tcontinue\n\t\t}\n\t\t_, ok := state.oldTree[filepath]\n\t\tif !ok {\n\t\t\tadded = append(added, fi)\n\t\t}\n\t}\n\n\ttreeUpdated := false\n\tif len(added) > 0 || len(removed) > 0 || (len(changed) > 0 && state.config.CheckPermsOwnerChange) {\n\t\terr := state.saveTree()\n\t\tif err != nil {\n\t\t\tpanic(\"saveTree failed\")\n\t\t}\n\t\ttreeUpdated = true\n\t}\n\n\tfor _, fi := range added {\n\t\tfileInfoStr := fiToString(fi, true) //a.config.GlobalConfig.FsTypeOptions == \"selinux\")\n\t\tstate.a.AddInformational(fi.Name, fmt.Sprintf(\"CheckFileTree: new file: %s\", fileInfoStr))\n\t}\n\tfor _, fi := range removed {\n\t\tfileInfoStr := fiToString(fi, true) //a.config.GlobalConfig.FsTypeOptions == \"selinux\")\n\t\tstate.a.AddInformational(fi.Name, fmt.Sprintf(\"CheckFileTree: file removed: %s\", fileInfoStr))\n\t}\n\tif state.config.CheckPermsOwnerChange {\n\t\tfor _, filepath := range changed {\n\t\t\tfileInfoStrOld := fiToString(state.oldTree[filepath], true) //state.config..FsTypeOptions == \"selinux\")\n\t\t\tfileInfoStrCur := fiToString(state.tree[filepath], true)    //a.config.GlobalConfig.FsTypeOptions == \"selinux\")\n\t\t\tstate.a.AddInformational(state.tree[filepath].Name,\n\t\t\t\tfmt.Sprintf(\"CheckFileTree: file perms/owner/size/digest changed from: %s to: %s\", fileInfoStrOld, fileInfoStrCur))\n\t\t}\n\t}\n\n\tif state.config.OldTreeFilePath != \"\" {\n\t\ttype reportData struct {\n\t\t\tOldFileTreePath     string `json:\"old_file_tree_path\"`\n\t\t\tCurrentFileTreePath string `json:\"current_file_tree_path,omitempty\"`\n\t\t}\n\t\tnewPath := \"\"\n\t\tif treeUpdated {\n\t\t\tnewPath = state.config.OldTreeFilePath + newFileTreeExt\n\t\t}\n\n\t\tdata := reportData{state.config.OldTreeFilePath, newPath}\n\t\tjdata, _ := json.Marshal(&data)\n\t\treturn string(jdata)\n\t}\n\n\treturn \"\"\n}\n\n// provide fileinfo as a 
human readable string\nfunc fiToString(fi fileInfoSaveType, selinux bool) string {\n\tif selinux {\n\t\treturn fmt.Sprintf(\"%o %d:%d %d %s SELinux label: %s\", fi.Mode, fi.Uid, fi.Gid, fi.Size, fi.Digest, fi.SELinuxLabel)\n\t} else {\n\t\treturn fmt.Sprintf(\"%o %d:%d %d %s\", fi.Mode, fi.Uid, fi.Gid, fi.Size, fi.Digest)\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/filetree/filetree_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage filetree\n\nimport (\n\t\"os\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallback func(fn string, reason string)\n\ntype testAnalyzer struct {\n\tocb      OffenderCallback\n\ttestfile string\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte(\"\"), nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn a.testfile, nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {\n\ta.ocb(filepath, reason)\n}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc TestGlobal(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileTreeCheck]\nOldTreeFilePath = \"/tmp/blatreetest1337.json\"\nCheckPath = [\"/\"]\nCheckPermsOwnerChange = 
true\nCheckFileSize         = true\nCheckFileDigest       = false\n`\n\n\tg := New(cfg, a, \"\")\n\tg.Start()\n\n\ttriggered := false\n\ta.ocb = func(fn string, reason string) {\n\t\tif strings.HasPrefix(reason, \"CheckFileTree: new file:\") {\n\t\t\ttriggered = true\n\t\t}\n\t}\n\tfi := fsparser.FileInfo{Name: \"test1\"}\n\terr := g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\n\tresult := g.Finalize()\n\tif !triggered {\n\t\tt.Errorf(\"filetree check failed\")\n\t}\n\n\tif result == \"\" {\n\t\tt.Errorf(\"Finalize should not return empty string\")\n\t}\n\n\t// rename so we have input for the next test\n\terr = os.Rename(\"/tmp/blatreetest1337.json.new\", \"/tmp/blatreetest1337.json\")\n\tif err != nil {\n\t\tt.Errorf(\"rename %s %s: failed\", \"/tmp/blatreetest1337.json.new\", \"/tmp/blatreetest1337.json\")\n\t}\n\n\t// diff test\n\tg = New(cfg, a, \"\")\n\n\tg.Start()\n\n\ttriggered = false\n\ta.ocb = func(fn string, reason string) {\n\t\tif strings.HasPrefix(reason, \"CheckFileTree: file perms/owner/size/digest changed\") {\n\t\t\ttriggered = true\n\t\t}\n\t}\n\tfi = fsparser.FileInfo{Name: \"test1\", Uid: 1}\n\terr = g.CheckFile(&fi, \"/\")\n\tif err != nil {\n\t\tt.Errorf(\"CheckFile failed\")\n\t}\n\n\tg.Finalize()\n\tif !triggered {\n\t\tt.Errorf(\"filetree check failed\")\n\t}\n\n\t// delete test\n\tg = New(cfg, a, \"\")\n\n\tg.Start()\n\n\ttriggered = false\n\ta.ocb = func(fn string, reason string) {\n\t\tif fn == \"/test1\" && strings.HasPrefix(reason, \"CheckFileTree: file removed\") {\n\t\t\ttriggered = true\n\t\t}\n\t}\n\n\tg.Finalize()\n\tif !triggered {\n\t\tt.Errorf(\"filetree check failed\")\n\t}\n\n\tos.Remove(\"/tmp/blatreetest1337.json\")\n\tos.Remove(\"/tmp/blatreetest1337.json.new\")\n\tos.Remove(\"/tmp/blatreetest1337.json.new.new\")\n}\n\nfunc TestGlobalCheckPath1(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileTreeCheck]\nOldTreeFilePath = \"/tmp/blatreetest1337.json\"\nCheckPath = 
[]\nCheckPermsOwnerChange = true\nCheckFileSize         = true\nCheckFileDigest       = false\n`\n\tg := New(cfg, a, \"\")\n\n\tif len(g.config.CheckPath) != 0 {\n\t\tt.Error(\"CheckPath should be empty\")\n\t}\n}\n\nfunc TestGlobalCheckPath2(t *testing.T) {\n\n\ta := &testAnalyzer{}\n\n\tcfg := `\n[FileTreeCheck]\nOldTreeFilePath = \"/tmp/blatreetest1337.json\"\nCheckPermsOwnerChange = true\nCheckFileSize         = true\nCheckFileDigest       = false\n`\n\tg := New(cfg, a, \"\")\n\n\tif len(g.config.CheckPath) != 1 || g.config.CheckPath[0] != \"/\" {\n\t\tt.Error(\"CheckPath should be: /\")\n\t}\n}\n"
  },
  {
    "path": "pkg/analyzer/globalfilechecks/globalfilechecks.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage globalfilechecks\n\nimport (\n\t\"fmt\"\n\t\"path\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/bmatcuk/doublestar\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype filePermsConfigType struct {\n\tSuid                            bool\n\tSuidAllowedList                 map[string]bool\n\tWorldWrite                      bool\n\tSELinuxLabel                    bool\n\tUids                            map[int]bool\n\tGids                            map[int]bool\n\tBadFiles                        map[string]bool\n\tBadFilesInformationalOnly       bool\n\tFlagCapabilityInformationalOnly bool\n}\n\ntype filePermsType struct {\n\tconfig *filePermsConfigType\n\ta      analyzer.AnalyzerType\n}\n\nfunc New(config string, a analyzer.AnalyzerType) *filePermsType {\n\ttype filePermsConfig struct {\n\t\tSuid                            bool\n\t\tSuidWhiteList                   []string // keep for backward compatibility\n\t\tSuidAllowedList                 []string\n\t\tWorldWrite                      bool\n\t\tSELinuxLabel                    bool\n\t\tUids                            []int\n\t\tGids                            []int\n\t\tBadFiles                        []string\n\t\tBadFilesInformationalOnly       bool\n\t\tFlagCapabilityInformationalOnly bool\n\t}\n\ttype fpc struct 
{\n\t\tGlobalFileChecks filePermsConfig\n\t}\n\tvar conf fpc\n\t_, err := toml.Decode(config, &conf)\n\tif err != nil {\n\t\tpanic(\"can't read config data: \" + err.Error())\n\t}\n\n\tconfiguration := filePermsConfigType{\n\t\tSuid:                            conf.GlobalFileChecks.Suid,\n\t\tWorldWrite:                      conf.GlobalFileChecks.WorldWrite,\n\t\tSELinuxLabel:                    conf.GlobalFileChecks.SELinuxLabel,\n\t\tBadFilesInformationalOnly:       conf.GlobalFileChecks.BadFilesInformationalOnly,\n\t\tFlagCapabilityInformationalOnly: conf.GlobalFileChecks.FlagCapabilityInformationalOnly,\n\t}\n\tconfiguration.SuidAllowedList = make(map[string]bool)\n\tfor _, alfn := range conf.GlobalFileChecks.SuidAllowedList {\n\t\tconfiguration.SuidAllowedList[path.Clean(alfn)] = true\n\t}\n\t// keep for backward compatibility\n\tfor _, wlfn := range conf.GlobalFileChecks.SuidWhiteList {\n\t\tconfiguration.SuidAllowedList[path.Clean(wlfn)] = true\n\t}\n\tconfiguration.Uids = make(map[int]bool)\n\tfor _, uid := range conf.GlobalFileChecks.Uids {\n\t\tconfiguration.Uids[uid] = true\n\t}\n\tconfiguration.Gids = make(map[int]bool)\n\tfor _, gid := range conf.GlobalFileChecks.Gids {\n\t\tconfiguration.Gids[gid] = true\n\t}\n\n\tconfiguration.BadFiles = make(map[string]bool)\n\tfor _, bf := range conf.GlobalFileChecks.BadFiles {\n\t\tconfiguration.BadFiles[path.Clean(bf)] = true\n\t}\n\n\tcfg := filePermsType{&configuration, a}\n\n\treturn &cfg\n}\n\nfunc (state *filePermsType) Start() {}\nfunc (state *filePermsType) Finalize() string {\n\treturn \"\"\n}\n\nfunc (state *filePermsType) Name() string {\n\treturn \"GlobalFileChecks\"\n}\n\nfunc (state *filePermsType) CheckFile(fi *fsparser.FileInfo, fpath string) error {\n\tif state.config.Suid {\n\t\tif fi.IsSUid() || fi.IsSGid() {\n\t\t\tif _, ok := state.config.SuidAllowedList[path.Join(fpath, fi.Name)]; !ok {\n\t\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), \"File is SUID, not 
allowed\")\n\t\t\t}\n\t\t}\n\t}\n\tif state.config.WorldWrite {\n\t\tif fi.IsWorldWrite() && !fi.IsLink() && !fi.IsDir() {\n\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), \"File is WorldWriteable, not allowed\")\n\t\t}\n\t}\n\tif state.config.SELinuxLabel {\n\t\tif fi.SELinuxLabel == fsparser.SELinuxNoLabel {\n\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), \"File does not have SELinux label\")\n\t\t}\n\t}\n\n\tif len(state.config.Uids) > 0 {\n\t\tif _, ok := state.config.Uids[fi.Uid]; !ok {\n\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), fmt.Sprintf(\"File Uid not allowed, Uid = %d\", fi.Uid))\n\t\t}\n\t}\n\n\tif len(state.config.Gids) > 0 {\n\t\tif _, ok := state.config.Gids[fi.Gid]; !ok {\n\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), fmt.Sprintf(\"File Gid not allowed, Gid = %d\", fi.Gid))\n\t\t}\n\t}\n\n\tif state.config.FlagCapabilityInformationalOnly {\n\t\tif len(fi.Capabilities) > 0 {\n\t\t\tstate.a.AddInformational(path.Join(fpath, fi.Name), fmt.Sprintf(\"Capabilities found: %s\", fi.Capabilities))\n\t\t}\n\t}\n\n\tfor item := range state.config.BadFiles {\n\t\tfullpath := fi.Name\n\t\t// match the fullpath if it starts with \"/\"\n\t\tif item[0] == '/' {\n\t\t\tfullpath = path.Join(fpath, fi.Name)\n\t\t}\n\t\tm, err := doublestar.Match(item, fullpath)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif m {\n\t\t\tmsg := \"File not allowed\"\n\t\t\tif item != fullpath {\n\t\t\t\tmsg = fmt.Sprintf(\"File not allowed for pattern: %s\", item)\n\t\t\t}\n\n\t\t\tif state.config.BadFilesInformationalOnly {\n\t\t\t\tstate.a.AddInformational(path.Join(fpath, fi.Name), msg)\n\t\t\t} else {\n\t\t\t\tstate.a.AddOffender(path.Join(fpath, fi.Name), msg)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/analyzer/globalfilechecks/globalfilechecks_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage globalfilechecks\n\nimport (\n\t\"testing\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/analyzer\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype OffenderCallack func(fn string)\n\ntype testAnalyzer struct {\n\tocb OffenderCallack\n}\n\nfunc (a *testAnalyzer) AddData(key, value string) {}\nfunc (a *testAnalyzer) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\treturn fsparser.FileInfo{}, nil\n}\nfunc (a *testAnalyzer) RemoveFile(filepath string) error {\n\treturn nil\n}\nfunc (a *testAnalyzer) FileGetSha256(filepath string) ([]byte, error) {\n\treturn []byte{}, nil\n}\nfunc (a *testAnalyzer) FileGet(filepath string) (string, error) {\n\treturn \"\", nil\n}\nfunc (a *testAnalyzer) AddOffender(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) AddInformational(filepath string, reason string) {\n\ta.ocb(filepath)\n}\nfunc (a *testAnalyzer) CheckAllFilesWithPath(cb analyzer.AllFilesCallback, cbdata analyzer.AllFilesCallbackData, filepath string) {\n}\nfunc (a *testAnalyzer) ImageInfo() analyzer.AnalyzerReport {\n\treturn analyzer.AnalyzerReport{}\n}\n\nfunc TestGlobal(t *testing.T) {\n\ta := &testAnalyzer{}\n\tcfg := `\n[GlobalFileChecks]\nSuid = true\nSuidAllowedList = [\"/shouldbesuid\"]\nSeLinuxLabel = true\nWorldWrite = true\nUids = [0]\nGids = [0]\nBadFiles = [\"/file99\", \"/file1\", 
\"**.h\"]\nFlagCapabilityInformationalOnly = true\n`\n\n\tg := New(cfg, a)\n\tg.Start()\n\n\ttests := []struct {\n\t\tfi            fsparser.FileInfo\n\t\tpath          string\n\t\tshouldTrigger bool\n\t}{\n\t\t{fsparser.FileInfo{Name: \"suid\", Mode: 0004000}, \"/\", true},\n\t\t{fsparser.FileInfo{Name: \"sgid\", Mode: 0002000}, \"/\", true},\n\t\t{fsparser.FileInfo{Name: \"sgid\", Mode: 0000000}, \"/\", false},\n\t\t// allowed suid files\n\t\t{fsparser.FileInfo{Name: \"shouldbesuid\", Mode: 0004000}, \"/\", false},\n\t\t// World write\n\t\t{fsparser.FileInfo{Name: \"ww\", Mode: 0007}, \"/\", true},\n\t\t{fsparser.FileInfo{Name: \"ww\", Mode: 0004}, \"/\", false},\n\t\t{fsparser.FileInfo{Name: \"label\", SELinuxLabel: \"-\"}, \"/\", true},\n\t\t{fsparser.FileInfo{Name: \"label\", SELinuxLabel: \"label\"}, \"/\", false},\n\t\t{fsparser.FileInfo{Name: \"uidfile\", SELinuxLabel: \"uidfile\", Uid: 1, Gid: 1}, \"/\", true},\n\t\t// Bad files\n\t\t{fsparser.FileInfo{Name: \"file99\", SELinuxLabel: \"uidfile\"}, \"/\", true},\n\t\t{fsparser.FileInfo{Name: \"test.h\", SELinuxLabel: \"uidfile\"}, \"/usr/\", true},\n\t\t// Capability\n\t\t{fsparser.FileInfo{Name: \"ping\", Capabilities: []string{\"cap_net_admin+p\"}}, \"/usr/bin\", true},\n\t}\n\n\tvar triggered bool\n\tvar err error\n\tfor _, test := range tests {\n\t\ttriggered = false\n\t\ta.ocb = func(fn string) { triggered = true }\n\t\terr = g.CheckFile(&test.fi, test.path)\n\t\tif err != nil {\n\t\t\tt.Errorf(\"CheckFile failed\")\n\t\t}\n\t\tif triggered != test.shouldTrigger {\n\t\t\tt.Errorf(\"%s test failed\", test.fi.Name)\n\t\t}\n\t}\n\n\tg.Finalize()\n}\n"
  },
  {
    "path": "pkg/capability/capability.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage capability\n\nimport (\n\t\"bytes\"\n\t\"encoding/binary\"\n\t\"fmt\"\n\t\"strconv\"\n\t\"strings\"\n)\n\n/*\n * Consts and structs are based on the linux kernel headers for capabilities\n * see: https://github.com/torvalds/linux/blob/master/include/uapi/linux/capability.h\n */\n\nconst (\n\tCAP_CHOWN            = 0\n\tCAP_DAC_OVERRIDE     = 1\n\tCAP_DAC_READ_SEARCH  = 2\n\tCAP_FOWNER           = 3\n\tCAP_FSETID           = 4\n\tCAP_KILL             = 5\n\tCAP_SETGID           = 6\n\tCAP_SETUID           = 7\n\tCAP_SETPCAP          = 8\n\tCAP_LINUX_IMMUTABLE  = 9\n\tCAP_NET_BIND_SERVICE = 10\n\tCAP_NET_BROADCAST    = 11\n\tCAP_NET_ADMIN        = 12\n\tCAP_NET_RAW          = 13\n\tCAP_IPC_LOCK         = 14\n\tCAP_IPC_OWNER        = 15\n\tCAP_SYS_MODULE       = 16\n\tCAP_SYS_RAWIO        = 17\n\tCAP_SYS_CHROOT       = 18\n\tCAP_SYS_PTRACE       = 19\n\tCAP_SYS_PACCT        = 20\n\tCAP_SYS_ADMIN        = 21\n\tCAP_SYS_BOOT         = 22\n\tCAP_SYS_NICE         = 23\n\tCAP_SYS_RESOURCE     = 24\n\tCAP_SYS_TIME         = 25\n\tCAP_SYS_TTY_CONFIG   = 26\n\tCAP_MKNOD            = 27\n\tCAP_LEASE            = 28\n\tCAP_AUDIT_WRITE      = 29\n\tCAP_AUDIT_CONTROL    = 30\n\tCAP_SETFCAP          = 31\n\tCAP_MAC_OVERRIDE     = 32\n\tCAP_MAC_ADMIN        = 33\n\tCAP_SYSLOG           = 34\n\tCAP_WAKE_ALARM       = 35\n\tCAP_BLOCK_SUSPEND    = 36\n\tCAP_AUDIT_READ       = 
37\n\tCAP_LAST_CAP         = CAP_AUDIT_READ\n)\n\nvar CapabilityNames = []string{\n\t\"CAP_CHOWN\",\n\t\"CAP_DAC_OVERRIDE\",\n\t\"CAP_DAC_READ_SEARCH\",\n\t\"CAP_FOWNER\",\n\t\"CAP_FSETID\",\n\t\"CAP_KILL\",\n\t\"CAP_SETGID\",\n\t\"CAP_SETUID\",\n\t\"CAP_SETPCAP\",\n\t\"CAP_LINUX_IMMUTABLE\",\n\t\"CAP_NET_BIND_SERVICE\",\n\t\"CAP_NET_BROADCAST\",\n\t\"CAP_NET_ADMIN\",\n\t\"CAP_NET_RAW\",\n\t\"CAP_IPC_LOCK\",\n\t\"CAP_IPC_OWNER\",\n\t\"CAP_SYS_MODULE\",\n\t\"CAP_SYS_RAWIO\",\n\t\"CAP_SYS_CHROOT\",\n\t\"CAP_SYS_PTRACE\",\n\t\"CAP_SYS_PACCT\",\n\t\"CAP_SYS_ADMIN\",\n\t\"CAP_SYS_BOOT\",\n\t\"CAP_SYS_NICE\",\n\t\"CAP_SYS_RESOURCE\",\n\t\"CAP_SYS_TIME\",\n\t\"CAP_SYS_TTY_CONFIG\",\n\t\"CAP_MKNOD\",\n\t\"CAP_LEASE\",\n\t\"CAP_AUDIT_WRITE\",\n\t\"CAP_AUDIT_CONTROL\",\n\t\"CAP_SETFCAP\",\n\t\"CAP_MAC_OVERRIDE\",\n\t\"CAP_MAC_ADMIN\",\n\t\"CAP_SYSLOG\",\n\t\"CAP_WAKE_ALARM\",\n\t\"CAP_BLOCK_SUSPEND\",\n\t\"CAP_AUDIT_READ\"}\n\nconst capOffset = 2\nconst CapByteSizeMax = 24\n\nconst (\n\tCAP_PERMITTED   = 0\n\tCAP_INHERITABLE = 1\n)\n\n/*\n * capabilities are stored in the vfs_cap_data struct\n *\n\nstruct vfs_cap_data {\n\t__le32 magic_etc;            // Little endian\n\tstruct {\n\t\t__le32 permitted;    // Little endian\n\t\t__le32 inheritable;  // Little endian\n\t} data[VFS_CAP_U32];\n};\n*/\n\n// https://github.com/torvalds/linux/blob/master/include/uapi/linux/capability.h#L373\nfunc capValid(cap uint32) bool {\n\t// cap >= 0 && cap <= CAP_LAST_CAP\n\treturn cap <= CAP_LAST_CAP\n}\n\n// https://github.com/torvalds/linux/blob/master/include/uapi/linux/capability.h#L379\nfunc capIndex(cap uint32) int {\n\treturn int(cap>>5) * capOffset\n}\n\n// https://github.com/torvalds/linux/blob/master/include/uapi/linux/capability.h#L380\nfunc capMask(cap uint32) uint32 {\n\treturn (1 << ((cap) & 31))\n}\n\nfunc capHasCap(caps []uint32, cap uint32, capPerm int) bool {\n\treturn caps[capIndex(cap)+capPerm]&capMask(cap) == capMask(cap)\n}\n\n// perm = 0 -> permitted\n// perm = 1 -> 
inheritable\nfunc capSet(caps []uint32, cap uint32, capPerm int) ([]uint32, error) {\n\tif !capValid(cap) {\n\t\treturn nil, fmt.Errorf(\"capability is invalid\")\n\t}\n\tcaps[capIndex(cap)+capPerm] = caps[capIndex(cap)+capPerm] | capMask(cap)\n\treturn caps, nil\n}\n\nfunc capToText(cap []uint32) []string {\n\tout := []string{}\n\tfor i := range CapabilityNames {\n\t\tcapPermitted := capHasCap(cap, uint32(i), CAP_PERMITTED)\n\t\tcapInheritable := capHasCap(cap, uint32(i), CAP_INHERITABLE)\n\n\t\tif capPermitted || capInheritable {\n\t\t\tvar capStr strings.Builder\n\t\t\tcapStr.WriteString(strings.ToLower(CapabilityNames[i]))\n\t\t\tcapStr.WriteString(\"+\")\n\t\t\tif capPermitted {\n\t\t\t\tcapStr.WriteString(\"p\")\n\t\t\t}\n\t\t\tif capInheritable {\n\t\t\t\tcapStr.WriteString(\"i\")\n\t\t\t}\n\t\t\tout = append(out, capStr.String())\n\t\t}\n\t}\n\treturn out\n}\n\nfunc New(caps interface{}) ([]string, error) {\n\tcap := []string{}\n\tvar capabilities []uint32\n\tvar err error\n\tswitch capsVal := caps.(type) {\n\tcase []byte:\n\t\tcapabilities, err = capsParse(capsVal, 20)\n\tcase string:\n\t\tcapabilities, err = capsParseFromText(capsVal)\n\tdefault:\n\t\treturn cap, nil\n\t}\n\n\tif err != nil {\n\t\treturn cap, nil\n\t}\n\n\treturn capToText(capabilities), nil\n}\n\nfunc capsParse(caps []byte, capsLen uint32) ([]uint32, error) {\n\tif capsLen%4 != 0 {\n\t\treturn nil, fmt.Errorf(\"capability length bad\")\n\t}\n\t// capabilities are stored in uint32\n\trealCap := make([]uint32, capsLen/4)\n\n\tfor i := 0; i < int(capsLen)/4; i++ {\n\t\tbuf := bytes.NewBuffer(caps[i*4 : (i+1)*4])\n\t\tvar num uint32\n\t\terr := binary.Read(buf, binary.LittleEndian, &num)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\trealCap[i] = uint32(num)\n\t}\n\t// strip magic (first uint32 in the array)\n\treturn realCap[1:], nil\n}\n\n// parse caps from string: 0x2000001,0x1000,0x0,0x0,0x0\n// this is the format produced by e2tools and unsquashfs\nfunc 
capsParseFromText(capsText string) ([]uint32, error) {\n\tcapsInts := strings.Split(capsText, \",\")\n\tcapsParsedInts := make([]uint32, 5)\n\tfor i, val := range capsInts {\n\t\tintVal, err := strconv.ParseUint(val[2:], 16, 32)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tcapsParsedInts[i] = uint32(intVal)\n\t}\n\tcapsBytes := make([]byte, 20)\n\tfor i := range capsParsedInts {\n\t\tbinary.LittleEndian.PutUint32(capsBytes[(i)*4:], capsParsedInts[i])\n\t}\n\treturn capsParse(capsBytes, 20)\n}\n\nfunc CapsEqual(a, b []string) bool {\n\tif len(a) != len(b) {\n\t\treturn false\n\t}\n\taM := make(map[string]bool)\n\tfor _, cap := range a {\n\t\taM[cap] = true\n\t}\n\tbM := make(map[string]bool)\n\tfor _, cap := range b {\n\t\tbM[cap] = true\n\t}\n\n\tfor cap := range aM {\n\t\tif _, ok := bM[cap]; !ok {\n\t\t\treturn false\n\t\t}\n\t}\n\treturn true\n}\n"
  },
  {
    "path": "pkg/capability/capability_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage capability\n\nimport (\n\t\"strings\"\n\t\"testing\"\n)\n\nfunc TestCap(t *testing.T) {\n\tif !strings.EqualFold(\"cap_net_admin+p\", capToText([]uint32{0x1000, 0x0, 0x0, 0x0})[0]) {\n\t\tt.Error(\"bad cap\")\n\t}\n\n\tcap := []uint32{0, 0, 0, 0}\n\tcap, _ = capSet(cap, CAP_DAC_OVERRIDE, CAP_PERMITTED)\n\tcap, _ = capSet(cap, CAP_AUDIT_READ, CAP_INHERITABLE)\n\tif !capHasCap(cap, CAP_DAC_OVERRIDE, CAP_PERMITTED) {\n\t\tt.Error(\"bad cap\")\n\t}\n\tif !capHasCap(cap, CAP_AUDIT_READ, CAP_INHERITABLE) {\n\t\tt.Error(\"bad cap\")\n\t}\n}\n\nfunc TestCapsParse(t *testing.T) {\n\tcaps, err := capsParse([]byte{0, 0, 0, 0, 0, 0x10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}, 20)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !capHasCap(caps, CAP_NET_ADMIN, CAP_PERMITTED) {\n\t\tt.Error(\"bad cap\")\n\t}\n}\n\nfunc TestCapsStringParse(t *testing.T) {\n\tcaps, err := capsParseFromText(\"0x2000001,0x1000,0x0,0x0,0x0\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !capHasCap(caps, CAP_NET_ADMIN, CAP_PERMITTED) {\n\t\tt.Error(\"bad cap\")\n\t}\n}\n\nfunc TestCapMain(t *testing.T) {\n\tcaps, err := New(\"0x2000001,0x1000,0x0,0x0,0x0\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !strings.EqualFold(caps[0], \"cap_net_admin+p\") {\n\t\tt.Error(\"bad cap\")\n\t}\n\n\tcaps2, err := New([]byte{0, 0, 0, 0, 0, 0x10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0})\n\tif err != nil 
{\n\t\tt.Error(err)\n\t}\n\tif !strings.EqualFold(caps2[0], \"cap_net_admin+p\") {\n\t\tt.Error(\"bad cap\")\n\t}\n}\n"
  },
  {
    "path": "pkg/cpioparser/cpioparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage cpioparser\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\nconst (\n\tcpioCmd         = \"cpio\"\n\tcpCmd           = \"cp\"\n\tMIN_LINE_LENGTH = 25\n)\n\ntype CpioParser struct {\n\tfileInfoReg *regexp.Regexp\n\tdevInfoReg  *regexp.Regexp\n\tfileLinkReg *regexp.Regexp\n\timagepath   string\n\tfiles       map[string][]fsparser.FileInfo\n\tfixDirs     bool\n}\n\nfunc New(imagepath string, fixDirs bool) *CpioParser {\n\tparser := &CpioParser{\n\t\t//lrwxrwxrwx   1 0        0              19 Apr 24  2019 lib/libnss_dns.so.2 -> libnss_dns-2.18.so\n\t\t//-rwxrwxrwx   1 0        0              19 Apr 24 13:37 lib/lib.c\n\t\tfileInfoReg: regexp.MustCompile(\n\t\t\t`^([\\w-]+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\w+\\s+\\d+\\s+[\\d:]+)\\s+(.*)$`),\n\t\t// crw-r--r--   1 0        0          4,  64 Apr 24  2019 dev/ttyS0\n\t\tdevInfoReg: regexp.MustCompile(\n\t\t\t`^([\\w-]+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+),\\s+(\\d+)\\s+(\\w+\\s+\\d+\\s+[\\d:]+)\\s+(.*)$`),\n\t\tfileLinkReg: regexp.MustCompile(`(\\S+)\\s->\\s(\\S+)`),\n\t\timagepath:   imagepath,\n\t\tfixDirs:     fixDirs,\n\t}\n\n\treturn parser\n}\n\nfunc (p *CpioParser) ImageName() string {\n\treturn p.imagepath\n}\n\nvar modeFlags = []struct {\n\tpos int\n\tchr 
byte\n\tval uint64\n}{\n\t{0, '-', fsparser.S_IFREG},\n\t{0, 's', fsparser.S_IFSOCK},\n\t{0, 'l', fsparser.S_IFLNK},\n\t{0, 'b', fsparser.S_IFBLK},\n\t{0, 'd', fsparser.S_IFDIR},\n\t{0, 'c', fsparser.S_IFCHR},\n\t{0, 'p', fsparser.S_IFIFO},\n\t{1, 'r', fsparser.S_IRUSR},\n\t{2, 'w', fsparser.S_IWUSR},\n\t{3, 'x', fsparser.S_IXUSR},\n\t{3, 's', fsparser.S_IXUSR | fsparser.S_ISUID},\n\t{3, 'S', fsparser.S_ISUID},\n\t{4, 'r', fsparser.S_IRGRP},\n\t{5, 'w', fsparser.S_IWGRP},\n\t{6, 'x', fsparser.S_IXGRP},\n\t{6, 's', fsparser.S_IXGRP | fsparser.S_ISGID},\n\t{6, 'S', fsparser.S_ISGID},\n\t{7, 'r', fsparser.S_IROTH},\n\t{8, 'w', fsparser.S_IWOTH},\n\t{9, 'x', fsparser.S_IXOTH},\n\t{9, 't', fsparser.S_IXOTH | fsparser.S_ISVTX},\n\t{9, 'T', fsparser.S_ISVTX},\n}\n\nconst (\n\tFILE_MODE_STR_LEN = 10 // such as \"-rw-r--r--\"\n)\n\nfunc parseMode(mode string) (uint64, error) {\n\tvar m uint64\n\tif len(mode) != FILE_MODE_STR_LEN {\n\t\treturn 0, fmt.Errorf(\"parseMode: invalid mode string %s\", mode)\n\t}\n\tfor _, f := range modeFlags {\n\t\tif mode[f.pos] == f.chr {\n\t\t\tm |= f.val\n\t\t}\n\t}\n\treturn m, nil\n}\n\n// Ensure directory and file names are consistent, with no relative parts\n// or trailing slash on directory names.\nfunc normalizePath(filepath string) (dir string, name string) {\n\tdir, name = path.Split(path.Clean(filepath))\n\tdir = path.Clean(dir)\n\treturn\n}\n\nconst (\n\tNAME_IDX_NORMAL_FILE = 7\n\tNAME_IDX_DEVICE_FILE = 8\n)\n\nfunc (p *CpioParser) parseFileLine(line string) (string, fsparser.FileInfo, error) {\n\treg := p.fileInfoReg\n\tnameIdx := NAME_IDX_NORMAL_FILE\n\tdirpath := \"\"\n\tif strings.HasPrefix(line, \"b\") || strings.HasPrefix(line, \"c\") {\n\t\treg = p.devInfoReg\n\t\tnameIdx = NAME_IDX_DEVICE_FILE\n\t}\n\tres := reg.FindAllStringSubmatch(line, -1)\n\tvar fi fsparser.FileInfo\n\t// only normal files have a size\n\tif nameIdx == NAME_IDX_NORMAL_FILE {\n\t\tsize, _ := strconv.Atoi(res[0][5])\n\t\tfi.Size = 
int64(size)\n\t}\n\tfi.Mode, _ = parseMode(res[0][1])\n\tfi.Uid, _ = strconv.Atoi(res[0][3])\n\tfi.Gid, _ = strconv.Atoi(res[0][4])\n\t// cpio returns relative pathnames so add leading \"/\"\n\tfi.Name = \"/\" + res[0][nameIdx]\n\n\t// fill in linktarget\n\tif fi.IsLink() && strings.Contains(fi.Name, \"->\") {\n\t\trlnk := p.fileLinkReg.FindAllStringSubmatch(fi.Name, -1)\n\t\tif rlnk == nil {\n\t\t\treturn \"\", fsparser.FileInfo{}, fmt.Errorf(\"can't parse LinkTarget from %s\", fi.Name)\n\t\t}\n\t\tfi.Name = rlnk[0][1]\n\t\tfi.LinkTarget = rlnk[0][2]\n\t}\n\n\t// handle root directory\n\tif fi.Name == \"/.\" {\n\t\tdirpath = \".\"\n\t\tfi.Name = \".\"\n\t} else {\n\t\tdirpath, fi.Name = normalizePath(fi.Name)\n\t}\n\n\treturn dirpath, fi, nil\n\n}\n\n// GetDirInfo returns information on the specified directory.\nfunc (p *CpioParser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tif err := p.loadFileList(); err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn p.files[path.Clean(dirpath)], nil\n}\n\n// GetFileInfo returns information on the specified file.\nfunc (p *CpioParser) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\tif err := p.loadFileList(); err != nil {\n\t\treturn fsparser.FileInfo{}, err\n\t}\n\n\tdirpath, name := normalizePath(filepath)\n\t// the root is stored as \".\"\n\tif dirpath == \"/\" && name == \"\" {\n\t\tdirpath = \".\"\n\t\tname = \".\"\n\t}\n\tdir := p.files[dirpath]\n\tfor _, fi := range dir {\n\t\tif fi.Name == name {\n\t\t\treturn fi, nil\n\t\t}\n\t}\n\treturn fsparser.FileInfo{}, fmt.Errorf(\"Can't find file %s\", filepath)\n}\n\nfunc (p *CpioParser) loadFileList() error {\n\tif p.files != nil {\n\t\treturn nil\n\t}\n\n\tout, err := exec.Command(\"sh\", \"-c\", cpioCmd+\" -tvn --quiet < \"+p.imagepath).CombinedOutput()\n\tif err != nil {\n\t\tif err.Error() != errors.New(\"exit status 2\").Error() {\n\t\t\tfmt.Fprintf(os.Stderr, \"loadFileList: >%s<\", err)\n\t\t\treturn err\n\t\t}\n\t}\n\treturn 
p.loadFileListFromString(string(out))\n}\n\nfunc (p *CpioParser) loadFileListFromString(rawFileList string) error {\n\tp.files = make(map[string][]fsparser.FileInfo)\n\n\tlines := strings.Split(rawFileList, \"\\n\")\n\tfor _, line := range lines {\n\t\tif len(line) < MIN_LINE_LENGTH {\n\t\t\tcontinue\n\t\t}\n\t\tif strings.HasPrefix(line, \"cpio\") {\n\t\t\tcontinue\n\t\t}\n\t\tpath, fi, err := p.parseFileLine(line)\n\t\tif err == nil {\n\t\t\tdirfiles := p.files[path]\n\t\t\tdirfiles = append(dirfiles, fi)\n\t\t\tp.files[path] = dirfiles\n\n\t\t\tif p.fixDirs {\n\t\t\t\tp.fixDir(path, fi.Name)\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\n/*\n * With cpio it is possible that a file exists in a directory that does not have its own entry.\n *  e.g. \"dev/tty6\" exists in the cpio but there is no entry for \"dev\"\n * This function creates the missing directories in the internal structure.\n */\nfunc (p *CpioParser) fixDir(dir string, name string) {\n\tif dir == \"/\" {\n\t\treturn\n\t}\n\tbasename := path.Base(dir)\n\tdirname := path.Dir(dir)\n\n\t// check that all dirname parts exist\n\tif strings.Contains(dirname, \"/\") {\n\t\tp.fixDir(dirname, basename)\n\t}\n\n\tdirExists := false\n\tfor _, f := range p.files[dirname] {\n\t\tif f.Name == basename {\n\t\t\tdirExists = true\n\t\t}\n\t}\n\tif !dirExists {\n\t\tdirfiles := p.files[dirname]\n\t\tdirfiles = append(dirfiles, fsparser.FileInfo{Name: basename, Mode: 040755, Uid: 0, Gid: 0, Size: 0})\n\t\tp.files[dirname] = dirfiles\n\t}\n}\n\n// CopyFile copies the specified file to the specified destination.\nfunc (p *CpioParser) CopyFile(filepath string, dstdir string) bool {\n\tout, err := exec.Command(\"sh\", \"-c\", cpioCmd+\" -i --to-stdout \"+filepath[1:]+\" < \"+p.imagepath+\" > \"+dstdir).CombinedOutput()\n\tif err != nil {\n\t\tif err.Error() != errors.New(\"exit status 2\").Error() {\n\t\t\tfmt.Fprintf(os.Stderr, \"cpio failed: %v: %s\\n\", err, out)\n\t\t\treturn false\n\t\t}\n\t}\n\treturn true\n}\n\nfunc (p 
*CpioParser) Supported() bool {\n\tif _, err := exec.LookPath(cpioCmd); err != nil {\n\t\treturn false\n\t}\n\tif _, err := exec.LookPath(cpCmd); err != nil {\n\t\treturn false\n\t}\n\treturn true\n}\n"
  },
  {
    "path": "pkg/cpioparser/cpioparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage cpioparser\n\nimport (\n\t\"os\"\n\t\"testing\"\n)\n\ntype testData struct {\n\tLine       string\n\tMode       uint64\n\tDir        string\n\tName       string\n\tIsFile     bool\n\tLinkTarget string\n}\n\nfunc TestParseLine(t *testing.T) {\n\ttestImage := \"../../test/test.cpio\"\n\tp := New(testImage, true)\n\n\ttestdata := []testData{\n\t\t{`-rw-r--r--   1 0        0              21 Apr 11  2008 etc/motd`, 0100644, \"/etc/\", \"motd\", true, \"\"},\n\t\t{`-rw-r--r--   1 0        0              21 Apr 11 13:37 etc/mxtd`, 0100644, \"/etc/\", \"mxtd\", true, \"\"},\n\t\t{`crw-r--r--   1 0        0          4,  64 Apr 24  2019 dev/ttyS0`, 020644, \"/dev\", \"ttyS0\", false, \"\"},\n\t\t{`lrwxrwxrwx   1 0        0              19 Apr 24  2019 lib/libcrypto.so.1.0.0 -> libcrypto-1.0.0.so`, 0120777, \"/lib\", \"libcrypto.so.1.0.0\", false, \"libcrypto-1.0.0.so\"},\n\t\t{`drwxr-xr-x   2 0        0               0 Aug  8 18:53 .`, 040755, \".\", \".\", false, \"\"},\n\t}\n\n\tfor _, test := range testdata {\n\t\tdir, res, err := p.parseFileLine(test.Line)\n\t\tif err != nil {\n\t\t\tt.Error(err)\n\t\t}\n\t\tif res.Mode != test.Mode {\n\t\t\tt.Errorf(\"bad file mode: %o\", res.Mode)\n\t\t}\n\t\tif dir != test.Dir && res.Name != test.Name {\n\t\t\tt.Errorf(\"name error: %s %s\", dir, res.Name)\n\t\t}\n\t\tif res.IsFile() != test.IsFile {\n\t\t\tt.Error(\"isFile 
bad\")\n\t\t}\n\t\tif test.LinkTarget != res.LinkTarget {\n\t\t\tt.Errorf(\"bad link target: %s\", res.LinkTarget)\n\t\t}\n\t}\n}\n\nfunc TestFixDir(t *testing.T) {\n\ttestImage := \"../../test/test.cpio\"\n\tp := New(testImage, true)\n\n\ttestdata := `\ncrw-r--r--   1 0        0          3,   1 Jan 13 17:57 dev/ttyp1\ncrw-r--r--   1 0        0          3,   1 Jan 13 17:57 dev/x/ttyp1`\n\n\terr := p.loadFileListFromString(testdata)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tok := false\n\tfor _, fn := range p.files[\"/\"] {\n\t\tif fn.Name == \"dev\" {\n\t\t\tok = true\n\t\t}\n\t}\n\tif !ok {\n\t\tt.Errorf(\"dir '/dev' not found\")\n\t}\n\n\tok = false\n\tfor _, fn := range p.files[\"/dev\"] {\n\t\tif fn.Name == \"x\" {\n\t\t\tok = true\n\t\t}\n\t}\n\tif !ok {\n\t\tt.Errorf(\"dir '/dev/x' not found\")\n\t}\n}\n\nfunc TestFull(t *testing.T) {\n\ttestImage := \"../../test/test.cpio\"\n\tp := New(testImage, false)\n\n\tfi, err := p.GetFileInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"/ should be dir\")\n\t}\n\n\tdir, err := p.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif len(dir) < 1 {\n\t\tt.Errorf(\"/ should not be empty\")\n\t}\n\n\tfi, err = p.GetFileInfo(\"/etc/fstab\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsFile() {\n\t\tt.Error(\"should be a file\")\n\t}\n\tif fi.Name != \"fstab\" {\n\t\tt.Errorf(\"name bad: %s\", fi.Name)\n\t}\n\tif fi.IsDir() {\n\t\tt.Error(\"should be a file\")\n\t}\n\tif fi.Size != 385 {\n\t\tt.Error(\"bad size\")\n\t}\n\tif fi.Uid != 1000 || fi.Gid != 1000 {\n\t\tt.Error(\"bad owner/group\")\n\t}\n\n\tfi, err = p.GetFileInfo(\"/dev/tty6\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif fi.IsFile() {\n\t\tt.Error(\"should not be a file\")\n\t}\n\tif fi.Name != \"tty6\" {\n\t\tt.Errorf(\"name bad: %s\", fi.Name)\n\t}\n\tif fi.IsDir() {\n\t\tt.Error(\"should not be a dir\")\n\t}\n\tif fi.Size != 0 {\n\t\tt.Error(\"bad size\")\n\t}\n\tif fi.Uid != 0 || fi.Gid 
!= 0 {\n\t\tt.Error(\"bad owner/group\")\n\t}\n\n\ttestfilename := \"testfile123\"\n\tif !p.CopyFile(\"/etc/fstab\", testfilename) {\n\t\tt.Error(\"failed to copy fstab\")\n\t}\n\n\tstat, err := os.Stat(testfilename)\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif stat.Size() != 385 {\n\t\tt.Error(\"bad file size after copy out\")\n\t}\n\tos.Remove(testfilename)\n}\n"
  },
  {
    "path": "pkg/dirparser/dirparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dirparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"syscall\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/capability\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\nconst (\n\tcpCli string = \"cp\"\n)\n\ntype DirParser struct {\n\timagepath string\n}\n\nfunc New(imagepath string) *DirParser {\n\tvar global DirParser\n\tglobal.imagepath = imagepath\n\treturn &global\n}\n\nfunc (dir *DirParser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tfiles := make([]fsparser.FileInfo, 0)\n\tfilepath := path.Join(dir.imagepath, dirpath)\n\tfp, err := os.Open(filepath)\n\tif err != nil {\n\t\treturn files, err\n\t}\n\tdefer fp.Close()\n\tnames, err := fp.Readdirnames(0)\n\tif err != nil {\n\t\treturn files, err\n\t}\n\tfor _, fname := range names {\n\t\tfi, err := dir.GetFileInfo(path.Join(dirpath, fname))\n\t\tfi.Name = fname\n\t\tif err != nil {\n\t\t\treturn files, err\n\t\t}\n\t\tfiles = append(files, fi)\n\t}\n\treturn files, nil\n}\n\nfunc (dir *DirParser) GetFileInfo(dirpath string) (fsparser.FileInfo, error) {\n\tvar fi fsparser.FileInfo\n\tfpath := path.Join(dir.imagepath, dirpath)\n\tvar fileStat syscall.Stat_t\n\terr := syscall.Lstat(fpath, &fileStat)\n\tif err != nil {\n\t\treturn fi, err\n\t}\n\tfi.Name = filepath.Base(dirpath)\n\tfi.Mode = uint64(fileStat.Mode)\n\tfi.Uid = 
int(fileStat.Uid)\n\tfi.Gid = int(fileStat.Gid)\n\tfi.SELinuxLabel = fsparser.SELinuxNoLabel\n\tfi.Size = fileStat.Size\n\n\tcapsBytes := make([]byte, capability.CapByteSizeMax)\n\tcapsSize, _ := syscall.Getxattr(fpath, \"security.capability\", capsBytes)\n\t// ignore err since we only care about the returned size\n\tif capsSize > 0 {\n\t\tfi.Capabilities, err = capability.New(capsBytes)\n\t\tif err != nil {\n\t\t\tfmt.Println(err)\n\t\t}\n\t}\n\n\tif fi.IsLink() {\n\t\tfi.LinkTarget, err = os.Readlink(fpath)\n\t\tif err != nil {\n\t\t\treturn fi, err\n\t\t}\n\t}\n\treturn fi, nil\n}\n\n// copy (extract) file out of the FS into dest dir\nfunc (dir *DirParser) CopyFile(filepath string, dstdir string) bool {\n\t_, err := dir.GetFileInfo(filepath)\n\tif err != nil {\n\t\treturn false\n\t}\n\terr = exec.Command(cpCli, \"-a\", path.Join(dir.imagepath, filepath), dstdir).Run()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"%s -a %s %s: failed\", cpCli, path.Join(dir.imagepath, filepath), dstdir)\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (dir *DirParser) ImageName() string {\n\treturn dir.imagepath\n}\n\nfunc (f *DirParser) Supported() bool {\n\t_, err := exec.LookPath(cpCli)\n\treturn err == nil\n}\n"
  },
  {
    "path": "pkg/dirparser/dirparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage dirparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"strings\"\n\t\"testing\"\n)\n\nvar d *DirParser\n\nfunc TestMain(t *testing.T) {\n\ttestImage := \"../../test/\"\n\td = New(testImage)\n\n\tif d.ImageName() != testImage {\n\t\tt.Errorf(\"ImageName returned bad name\")\n\t}\n}\n\nfunc TestGetDirInfo(t *testing.T) {\n\tdir, err := d.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Errorf(\"GetDirInfo failed\")\n\t}\n\tfor _, i := range dir {\n\t\tif i.Name == \".\" || i.Name == \"..\" {\n\t\t\tt.Errorf(\". or .. 
should not appear in dir listing\")\n\t\t}\n\t}\n\n\toutput_file := \"/tmp/dirfs_test_file\"\n\tif !d.CopyFile(\"test.img\", output_file) {\n\t\tt.Errorf(\"copyfile returned false\")\n\t}\n\tif _, err := os.Stat(output_file); os.IsNotExist(err) {\n\t\tt.Errorf(\"%s\", err)\n\t} else {\n\t\tos.Remove(output_file)\n\t}\n}\n\nfunc TestGetFileInfo(t *testing.T) {\n\ttests := []struct {\n\t\tfilePath string\n\t\tisFile   bool\n\t\tisDir    bool\n\t\tfilename string\n\t}{\n\t\t{\"/test.img\", true, false, \"test.img\"},\n\t\t{\"/\", false, true, \"/\"},\n\t\t{\"/testdir\", false, true, \"testdir\"},\n\t}\n\tfor _, test := range tests {\n\t\tfi, err := d.GetFileInfo(test.filePath)\n\t\tif err != nil {\n\t\t\tt.Errorf(\"GetFileInfo failed\")\n\t\t}\n\t\tif fi.IsFile() != test.isFile {\n\t\t\tt.Errorf(\"GetFileInfo failed, isFile != %v\", test.isFile)\n\t\t}\n\t\tif fi.IsDir() != test.isDir {\n\t\t\tt.Errorf(\"GetFileInfo failed, isDir != %v\", test.isDir)\n\t\t}\n\t\tif fi.Name != test.filename {\n\t\t\tt.Errorf(\"filename does not match: %s\", fi.Name)\n\t\t}\n\t}\n\n\tfi, err := d.GetFileInfo(\"/testlink\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif !fi.IsLink() {\n\t\tt.Errorf(\"GetFileInfo failed, not a link\")\n\t}\n\tif fi.Name != \"testlink\" {\n\t\tt.Errorf(\"GetFileInfo failed, incorrect link name: %s\", fi.Name)\n\t}\n\tif fi.LinkTarget != \"testdir\" {\n\t\tt.Errorf(\"GetFileInfo failed, incorrect link target: %s\", fi.LinkTarget)\n\t}\n}\n\nfunc TestCapability(t *testing.T) {\n\tfi, err := d.GetFileInfo(\"/test.cap.file\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tfmt.Println(fi.Capabilities)\n\tif len(fi.Capabilities) == 0 || !strings.EqualFold(fi.Capabilities[0], \"cap_net_admin+p\") {\n\t\tt.Error(\"capability test failed: likely need to run 'sudo setcap cap_net_admin+p test/test.cap.file'\")\n\t}\n}\n"
  },
  {
    "path": "pkg/extparser/extparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage extparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/capability\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype Ext2Parser struct {\n\tfileinfoReg  *regexp.Regexp\n\tregexString  string\n\tselinux      bool\n\tcapabilities bool\n\timagepath    string\n}\n\nconst (\n\te2ToolsCp = \"e2cp\"\n\te2ToolsLs = \"e2ls\"\n)\n\nfunc New(imagepath string, selinux, capabilities bool) *Ext2Parser {\n\tparser := &Ext2Parser{\n\t\t// 365  120777     0     0        7 12-Jul-2018 10:15 true\n\t\tregexString:  `^\\s*(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+-\\w+-\\d+)\\s+(\\d+:\\d+)\\s+([\\S ]+)`,\n\t\timagepath:    imagepath,\n\t\tselinux:      false,\n\t\tcapabilities: false,\n\t}\n\n\tif selinux && seLinuxSupported() {\n\t\tparser.enableSeLinux()\n\t}\n\tif capabilities && capabilitiesSupported() {\n\t\tparser.enableCapabilities()\n\t}\n\tparser.fileinfoReg = regexp.MustCompile(parser.regexString)\n\treturn parser\n}\n\nfunc (e *Ext2Parser) ImageName() string {\n\treturn e.imagepath\n}\n\nfunc (e *Ext2Parser) enableSeLinux() {\n\t// with selinux support (-Z)\n\t// 2600  100750     0  2000     1041   1-Jan-2009 03:00 init.environ.rc   u:object_r:rootfs:s0\n\t// 
`^\\s*(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+-\\w+-\\d+)\\s+(\\d+:\\d+)\\s+(\\S+)\\s+(\\S+)`)\n\n\t// append selinux part\n\te.regexString = e.regexString + `\\t\\s+(\\S+)`\n\te.selinux = true\n}\n\nfunc (e *Ext2Parser) enableCapabilities() {\n\t// with capabilities support (-C)\n\t// 2600  100750     0  2000     1041   1-Jan-2009 03:00 init.environ.rc 0x2000001,0x0,0x0,0x0,0x0\n\t// `^\\s*(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+-\\w+-\\d+)\\s+(\\d+:\\d+)\\s+(\\S+)\\s+(\\S+)`)\n\n\t// tab is the separator for security options\n\tif !e.selinux {\n\t\te.regexString = e.regexString + `\\t`\n\t}\n\t// append capability part\n\te.regexString = e.regexString + `\\s+(\\S+)`\n\te.capabilities = true\n}\n\nfunc (e *Ext2Parser) parseFileLine(line string) fsparser.FileInfo {\n\tres := e.fileinfoReg.FindAllStringSubmatch(line, -1)\n\tvar fi fsparser.FileInfo\n\tsize, _ := strconv.Atoi(res[0][5])\n\tfi.Size = int64(size)\n\tfi.Mode, _ = strconv.ParseUint(res[0][2], 8, 32)\n\tfi.Uid, _ = strconv.Atoi(res[0][3])\n\tfi.Gid, _ = strconv.Atoi(res[0][4])\n\tfi.Name = res[0][8]\n\n\tif fi.IsLink() && strings.Contains(fi.Name, \" -> \") {\n\t\tparts := strings.Split(fi.Name, \" -> \")\n\t\tfi.Name = parts[0]\n\t\tfi.LinkTarget = parts[1]\n\t}\n\n\tif e.selinux {\n\t\tfi.SELinuxLabel = res[0][9]\n\t} else {\n\t\tfi.SELinuxLabel = fsparser.SELinuxNoLabel\n\t}\n\n\tif e.capabilities {\n\t\tidx := 9\n\t\tif e.selinux {\n\t\t\tidx = 10\n\t\t}\n\t\tif res[0][idx] != \"-\" {\n\t\t\tfi.Capabilities, _ = capability.New(res[0][idx])\n\t\t}\n\t}\n\treturn fi\n}\n\n// ignoreDot=true: will filter out \".\" and \"..\" files from the directory listing\nfunc (e *Ext2Parser) getDirList(dirpath string, ignoreDot bool) ([]fsparser.FileInfo, error) {\n\targ := fmt.Sprintf(\"%s:%s\", e.imagepath, dirpath)\n\tparams := \"-la\"\n\tif e.selinux {\n\t\tparams += \"Z\"\n\t}\n\tif e.capabilities {\n\t\tparams += \"C\"\n\t}\n\tout, err := exec.Command(e2ToolsLs, params, 
arg).CombinedOutput()\n\tif err != nil {\n\t\t// do NOT print file not found error\n\t\tif !strings.EqualFold(string(out), \"File not found by ext2_lookup\") {\n\t\t\tfmt.Fprintln(os.Stderr, err)\n\t\t}\n\t\treturn nil, err\n\t}\n\tvar dir []fsparser.FileInfo\n\tlines := strings.Split(string(out), \"\\n\")\n\tfor _, fline := range lines {\n\t\tif len(fline) > 1 && fline[0] != '>' {\n\t\t\tfi := e.parseFileLine(fline)\n\t\t\t// filter: . and ..\n\t\t\tif !ignoreDot || (fi.Name != \".\" && fi.Name != \"..\") {\n\t\t\t\tdir = append(dir, fi)\n\t\t\t}\n\t\t}\n\t}\n\treturn dir, nil\n}\n\nfunc (e *Ext2Parser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tdir, err := e.getDirList(dirpath, true)\n\treturn dir, err\n}\n\nfunc (e *Ext2Parser) GetFileInfo(dirpath string) (fsparser.FileInfo, error) {\n\tvar fi fsparser.FileInfo\n\tdir, err := e.getDirList(dirpath, false)\n\tif len(dir) == 1 {\n\t\treturn dir[0], err\n\t}\n\t// GetFileInfo was called on a directory only return entry for \".\"\n\tfor _, info := range dir {\n\t\tif info.Name == \".\" {\n\t\t\tinfo.Name = filepath.Base(dirpath)\n\t\t\treturn info, nil\n\t\t}\n\t}\n\treturn fi, fmt.Errorf(\"file not found: %s\", dirpath)\n}\n\nfunc (e *Ext2Parser) CopyFile(filepath string, dstdir string) bool {\n\tsrc := fmt.Sprintf(\"%s:%s\", e.imagepath, filepath)\n\t_, err := exec.Command(e2ToolsCp, src, dstdir).Output()\n\tif err != nil {\n\t\tfmt.Fprintln(os.Stderr, err)\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (f *Ext2Parser) Supported() bool {\n\t_, err := exec.LookPath(e2ToolsLs)\n\tif err != nil {\n\t\treturn false\n\t}\n\t_, err = exec.LookPath(e2ToolsCp)\n\treturn err == nil\n}\n\nfunc seLinuxSupported() bool {\n\tout, _ := exec.Command(e2ToolsLs).CombinedOutput()\n\t// look for Z (selinux support) in \"Usage: e2ls [-acDfilrtZ][-d dir] file\"\n\tif strings.Contains(string(out), \"Z\") {\n\t\treturn true\n\t}\n\tfmt.Fprintln(os.Stderr, \"extparser: selinux not supported by your version of 
e2ls\")\n\treturn false\n}\n\nfunc capabilitiesSupported() bool {\n\tout, _ := exec.Command(e2ToolsLs).CombinedOutput()\n\t// look for C (capability support) in \"Usage: e2ls [-acDfilrtZC][-d dir] file\"\n\tif strings.Contains(string(out), \"C\") {\n\t\treturn true\n\t}\n\tfmt.Fprintln(os.Stderr, \"extparser: capabilities not supported by your version of e2ls\")\n\treturn false\n}\n"
  },
  {
    "path": "pkg/extparser/extparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage extparser\n\nimport (\n\t\"os\"\n\t\"strings\"\n\t\"testing\"\n)\n\nvar e *Ext2Parser\n\nfunc TestMain(t *testing.T) {\n\ttestImage := \"../../test/test.img\"\n\n\te = New(testImage, false, false)\n\n\tif e.ImageName() != testImage {\n\t\tt.Errorf(\"ImageName returned bad name\")\n\t}\n}\n\nfunc TestGetDirList(t *testing.T) {\n\tdir, err := e.getDirList(\"/\", true)\n\tif err != nil {\n\t\tt.Errorf(\"getDirList failed\")\n\t}\n\tfor _, i := range dir {\n\t\tif i.Name == \".\" || i.Name == \"..\" {\n\t\t\tt.Errorf(\". or .. should not appear in dir listing\")\n\t\t}\n\t}\n\n\tdir, err = e.getDirList(\"/\", false)\n\tif err != nil {\n\t\tt.Errorf(\"getDirList failed\")\n\t}\n\n\tdot := false\n\tdotdot := false\n\tfor _, i := range dir {\n\t\tif i.Name == \".\" {\n\t\t\tdot = true\n\t\t}\n\t\tif i.Name == \"..\" {\n\t\t\tdotdot = true\n\t\t}\n\t}\n\tif !dot || !dotdot {\n\t\tt.Errorf(\". and .. should appear in dir listing\")\n\t}\n}\n\nfunc TestGetDirInfo(t *testing.T) {\n\tdir, err := e.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Errorf(\"GetDirInfo failed\")\n\t}\n\tfor _, i := range dir {\n\t\tif i.Name == \".\" || i.Name == \"..\" {\n\t\t\tt.Errorf(\". or .. 
should not appear in dir listing\")\n\t\t}\n\t}\n\tif len(dir) == 0 {\n\t\tt.Errorf(\"root needs to be >= 1 entries due to lost+found\")\n\t}\n\n\tif !e.CopyFile(\"/date1\", \".\") {\n\t\tt.Errorf(\"copyfile returned false\")\n\t}\n\tif _, err := os.Stat(\"date1\"); os.IsNotExist(err) {\n\t\tt.Errorf(\"%s\", err)\n\t} else {\n\t\tos.Remove(\"date1\")\n\t}\n}\n\nfunc TestGetFileInfo(t *testing.T) {\n\ttests := []struct {\n\t\tfilePath   string\n\t\tisFile     bool\n\t\tisDir      bool\n\t\tisLink     bool\n\t\tlinkTarget string\n\t\tfilename   string\n\t}{\n\t\t{\"/date1\", true, false, false, \"\", \"date1\"},\n\t\t{\"/\", false, true, false, \"\", \"/\"},\n\t\t{\"/dir1\", false, true, false, \"\", \"dir1\"},\n\t\t{\"/file_link\", false, false, true, \"file2\", \"file_link\"},\n\t}\n\tfor _, test := range tests {\n\t\tfi, err := e.GetFileInfo(test.filePath)\n\t\tif err != nil {\n\t\t\tt.Errorf(\"GetFileInfo failed: %v\", err)\n\t\t}\n\t\tif fi.IsFile() != test.isFile {\n\t\t\tt.Errorf(\"GetFileInfo failed, isFile != %v\", test.isFile)\n\t\t}\n\t\tif fi.IsLink() != test.isLink {\n\t\t\tt.Errorf(\"GetFileInfo failed, isLink != %v\", test.isLink)\n\t\t}\n\t\tif fi.LinkTarget != test.linkTarget {\n\t\t\tt.Errorf(\"GetFileInfo failed, link target bad\")\n\t\t}\n\t\tif fi.IsDir() != test.isDir {\n\t\t\tt.Errorf(\"GetFileInfo failed, isDir != %v\", test.isDir)\n\t\t}\n\t\tif fi.Name != test.filename {\n\t\t\tt.Errorf(\"filename does not match: %s\", fi.Name)\n\t\t}\n\t}\n}\n\nfunc TestCap(t *testing.T) {\n\ttestImage := \"../../test/cap_ext2.img\"\n\n\te = New(testImage, false, true)\n\tif !capabilitiesSupported() {\n\t\tt.Error(\"capabilities are not supported by e2ls\")\n\t\treturn\n\t}\n\n\tif e.ImageName() != testImage {\n\t\tt.Errorf(\"ImageName returned bad name\")\n\t}\n\n\tfi, err := e.GetFileInfo(\"/test\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !strings.EqualFold(fi.Capabilities[0], \"cap_net_admin+p\") {\n\t\tt.Errorf(\"Capabilities %s don't match\", 
fi.Capabilities)\n\t}\n}\n"
  },
  {
    "path": "pkg/fsparser/fsparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage fsparser\n\ntype FsParser interface {\n\t// get directory listing. only returns files in the given directory and\n\t// does not recurse into subdirectories.\n\tGetDirInfo(dirpath string) ([]FileInfo, error)\n\t// get file/dir info\n\tGetFileInfo(dirpath string) (FileInfo, error)\n\t// copy (extract) file out of the FS into dest dir\n\tCopyFile(filepath string, dstDir string) bool\n\t// get imagename\n\tImageName() string\n\t// determine if FS type is supported\n\tSupported() bool\n}\n\ntype FileInfo struct {\n\tSize         int64    `json:\"size\"`\n\tMode         uint64   `json:\"mode\"`\n\tUid          int      `json:\"uid\"`\n\tGid          int      `json:\"gid\"`\n\tSELinuxLabel string   `json:\"se_linux_label,omitempty\"`\n\tCapabilities []string `json:\"capabilities,omitempty\"`\n\tName         string   `json:\"name\"`\n\tLinkTarget   string   `json:\"link_target,omitempty\"`\n}\n\nconst (\n\tSELinuxNoLabel string = \"-\"\n)\n\nconst (\n\tS_IFMT   = 0170000 // bit mask for the file type bit fields\n\tS_IFSOCK = 0140000 // socket\n\tS_IFLNK  = 0120000 // symbolic link\n\tS_IFREG  = 0100000 // regular file\n\tS_IFBLK  = 0060000 // block device\n\tS_IFDIR  = 0040000 // directory\n\tS_IFCHR  = 0020000 // character device\n\tS_IFIFO  = 0010000 // FIFO\n\tS_ISUID  = 0004000 // set-user-ID bit\n\tS_ISGID  = 0002000 // set-group-ID bit (see below)\n\tS_ISVTX  = 0001000 
// sticky bit (see below)\n\tS_IRWXU  = 00700   // mask for file owner permissions\n\tS_IRUSR  = 00400   // owner has read permission\n\tS_IWUSR  = 00200   // owner has write permission\n\tS_IXUSR  = 00100   // owner has execute permission\n\tS_IRWXG  = 00070   // mask for group permissions\n\tS_IRGRP  = 00040   // group has read permission\n\tS_IWGRP  = 00020   // group has write permission\n\tS_IXGRP  = 00010   // group has execute permission\n\tS_IRWXO  = 00007   // mask for permissions for others (not in group)\n\tS_IROTH  = 00004   // others have read permission\n\tS_IWOTH  = 00002   // others have write permission\n\tS_IXOTH  = 00001   // others have execute permission\n)\n\nfunc (fi *FileInfo) IsSUid() bool {\n\treturn (fi.Mode & S_ISUID) != 0\n}\n\nfunc (fi *FileInfo) IsSGid() bool {\n\treturn (fi.Mode & S_ISGID) != 0\n}\n\nfunc (fi *FileInfo) IsWorldWrite() bool {\n\treturn (fi.Mode & S_IWOTH) != 0\n}\n\nfunc (fi *FileInfo) IsFile() bool {\n\treturn (fi.Mode & S_IFMT) == S_IFREG\n}\n\nfunc (fi *FileInfo) IsDir() bool {\n\treturn (fi.Mode & S_IFMT) == S_IFDIR\n}\n\nfunc (fi *FileInfo) IsLink() bool {\n\treturn (fi.Mode & S_IFMT) == S_IFLNK\n}\n"
  },
  {
    "path": "pkg/squashfsparser/squashfsparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage squashfsparser\n\nimport (\n\t\"fmt\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"os/exec\"\n\t\"os/user\"\n\t\"path\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/capability\"\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\nconst (\n\tunsquashfsCmd = \"unsquashfs\"\n\tcpCmd         = \"cp\"\n)\n\n// SquashFSParser parses SquashFS filesystem images.\ntype SquashFSParser struct {\n\tfileLineRegex *regexp.Regexp\n\timagepath     string\n\tfiles         map[string][]fsparser.FileInfo\n\tsecurityInfo  bool\n}\n\nfunc uidForUsername(username string) (int, error) {\n\t// First check to see if it's an int. If not, look it up by name.\n\tuid, err := strconv.Atoi(username)\n\tif err == nil {\n\t\treturn uid, nil\n\t}\n\tu, err := user.Lookup(username)\n\tif err != nil {\n\t\treturn 0, err\n\t}\n\treturn strconv.Atoi(u.Uid)\n}\n\nfunc gidForGroup(group string) (int, error) {\n\t// First check to see if it's an int. 
If not, look it up by name.\n\tgid, err := strconv.Atoi(group)\n\tif err == nil {\n\t\treturn gid, nil\n\t}\n\tg, err := user.LookupGroup(group)\n\tif err != nil {\n\t\treturn 0, err\n\t}\n\treturn strconv.Atoi(g.Gid)\n}\n\n// From table[] in https://github.com/plougher/squashfs-tools/blob/master/squashfs-tools/unsquashfs.c\nvar modeFlags = []struct {\n\tpos int\n\tchr byte\n\tval uint64\n}{\n\t{0, '-', fsparser.S_IFREG},\n\t{0, 's', fsparser.S_IFSOCK},\n\t{0, 'l', fsparser.S_IFLNK},\n\t{0, 'b', fsparser.S_IFBLK},\n\t{0, 'd', fsparser.S_IFDIR},\n\t{0, 'c', fsparser.S_IFCHR},\n\t{0, 'p', fsparser.S_IFIFO},\n\t{1, 'r', fsparser.S_IRUSR},\n\t{2, 'w', fsparser.S_IWUSR},\n\t{3, 'x', fsparser.S_IXUSR},\n\t{3, 's', fsparser.S_IXUSR | fsparser.S_ISUID},\n\t{3, 'S', fsparser.S_ISUID},\n\t{4, 'r', fsparser.S_IRGRP},\n\t{5, 'w', fsparser.S_IWGRP},\n\t{6, 'x', fsparser.S_IXGRP},\n\t{6, 's', fsparser.S_IXGRP | fsparser.S_ISGID},\n\t{6, 'S', fsparser.S_ISGID},\n\t{7, 'r', fsparser.S_IROTH},\n\t{8, 'w', fsparser.S_IWOTH},\n\t{9, 'x', fsparser.S_IXOTH},\n\t{9, 't', fsparser.S_IXOTH | fsparser.S_ISVTX},\n\t{9, 'T', fsparser.S_ISVTX},\n}\n\nfunc parseMode(mode string) (uint64, error) {\n\tvar m uint64\n\tif len(mode) != 10 {\n\t\treturn 0, fmt.Errorf(\"parseMode: invalid mode string %s\", mode)\n\t}\n\tfor _, f := range modeFlags {\n\t\tif mode[f.pos] == f.chr {\n\t\t\tm |= f.val\n\t\t}\n\t}\n\treturn m, nil\n}\n\nfunc getExtractFile(dirpath string) (string, error) {\n\textractFile, err := ioutil.TempFile(\"\", \"squashfsparser\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\t_, err = extractFile.Write([]byte(dirpath))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\terr = extractFile.Close()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\treturn extractFile.Name(), nil\n}\n\nfunc (s *SquashFSParser) enableSecurityInfo() {\n\t// drwxr-xr-x administrator/administrator 66 2019-04-08 18:49 squashfs-root\t- -\n\ts.fileLineRegex = 
regexp.MustCompile(`^([A-Za-z-]+)\\s+([\\-\\.\\w]+|\\d+)/([\\-\\.\\w]+|\\d+)\\s+(\\d+)\\s+(\\d+-\\d+-\\d+)\\s+(\\d+:\\d+)\\s+([\\S ]+)\\t(\\S+)\\s+(\\S)`)\n\ts.securityInfo = true\n}\n\n// New returns a new SquashFSParser instance for the given image file.\nfunc New(imagepath string, securityInfo bool) *SquashFSParser {\n\tparser := &SquashFSParser{\n\t\t// drwxr-xr-x administrator/administrator 66 2019-04-08 18:49 squashfs-root\n\t\tfileLineRegex: regexp.MustCompile(`^([A-Za-z-]+)\\s+([\\-\\.\\w]+|\\d+)/([\\-\\.\\w]+|\\d+)\\s+(\\d+)\\s+(\\d+-\\d+-\\d+)\\s+(\\d+:\\d+)\\s+(.*)$`),\n\t\timagepath:     imagepath,\n\t\tsecurityInfo:  false,\n\t}\n\n\tif securityInfo && securityInfoSupported() {\n\t\tparser.enableSecurityInfo()\n\t}\n\n\treturn parser\n}\n\nfunc normalizePath(filepath string) (dir string, name string) {\n\t// Ensure directory and file names are consistent, with no relative parts\n\t// or trailing slash on directory names.\n\tdir, name = path.Split(path.Clean(filepath))\n\tdir = path.Clean(dir)\n\treturn\n}\n\nfunc (s *SquashFSParser) parseFileLine(line string) (string, fsparser.FileInfo, error) {\n\t// TODO(jlarimer): add support for reading xattrs. 
unsquashfs can read\n\t// and write xattrs, but it doesn't display them when just listing files.\n\tvar fi fsparser.FileInfo\n\tdirpath := \"\"\n\tres := s.fileLineRegex.FindStringSubmatch(line)\n\tif res == nil {\n\t\treturn dirpath, fi, fmt.Errorf(\"Can't match line %s\\n\", line)\n\t}\n\tvar err error\n\tfi.Mode, err = parseMode(res[1])\n\tif err != nil {\n\t\treturn dirpath, fi, err\n\t}\n\t// unsquashfs converts the uid/gid to a username/group on this system, so\n\t// we need to convert it back to the numeric values.\n\tfi.Uid, err = uidForUsername(res[2])\n\tif err != nil {\n\t\treturn dirpath, fi, err\n\t}\n\tfi.Gid, err = gidForGroup(res[3])\n\tif err != nil {\n\t\treturn dirpath, fi, err\n\t}\n\tfi.Size, err = strconv.ParseInt(res[4], 10, 64)\n\tif err != nil {\n\t\treturn dirpath, fi, err\n\t}\n\t// links show up with a name like \"./dir2/file3 -> file1\"\n\tif fi.Mode&fsparser.S_IFLNK == fsparser.S_IFLNK {\n\t\tparts := strings.Split(res[7], \" -> \")\n\t\tdirpath, fi.Name = normalizePath(parts[0])\n\t\tfi.LinkTarget = parts[1]\n\t} else {\n\t\tdirpath, fi.Name = normalizePath(res[7])\n\t}\n\n\tif s.securityInfo {\n\t\tif res[8] != \"-\" {\n\t\t\tfi.Capabilities, _ = capability.New(res[8])\n\t\t}\n\t\tfi.SELinuxLabel = res[9]\n\t}\n\n\treturn dirpath, fi, nil\n}\n\nfunc (s *SquashFSParser) loadFileList() error {\n\tif s.files != nil {\n\t\treturn nil\n\t}\n\ts.files = make(map[string][]fsparser.FileInfo)\n\n\t// we want to use -lln (numeric output) but that is only available in 4.4 and later\n\targs := []string{\"-d\", \"\", \"-lls\", s.imagepath}\n\tif s.securityInfo {\n\t\t// -llS is only available in our patched version\n\t\targs = append([]string{\"-llS\"}, args...)\n\t}\n\n\tout, err := exec.Command(unsquashfsCmd, args...).CombinedOutput()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"getDirList: %s\", err)\n\t\treturn err\n\t}\n\tlines := strings.Split(string(out), \"\\n\")\n\tfor _, line := range lines {\n\t\tpath, fi, err := 
s.parseFileLine(line)\n\t\tif err == nil {\n\t\t\tdirfiles := s.files[path]\n\t\t\tdirfiles = append(dirfiles, fi)\n\t\t\ts.files[path] = dirfiles\n\t\t}\n\t}\n\treturn nil\n}\n\n// GetDirInfo returns information on the specified directory.\nfunc (s *SquashFSParser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tif err := s.loadFileList(); err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn s.files[path.Clean(dirpath)], nil\n}\n\n// GetFileInfo returns information on the specified file.\nfunc (s *SquashFSParser) GetFileInfo(filepath string) (fsparser.FileInfo, error) {\n\tif err := s.loadFileList(); err != nil {\n\t\treturn fsparser.FileInfo{}, err\n\t}\n\n\tdirpath, name := normalizePath(filepath)\n\t// the root is stored as \".\"\n\tif dirpath == \"/\" && name == \"\" {\n\t\tdirpath = \".\"\n\t\tname = \".\"\n\t}\n\tdir := s.files[dirpath]\n\tfor _, fi := range dir {\n\t\tif fi.Name == name {\n\t\t\treturn fi, nil\n\t\t}\n\t}\n\treturn fsparser.FileInfo{}, fmt.Errorf(\"Can't find file %s\", filepath)\n}\n\n// CopyFile copies the specified file to the specified destination.\nfunc (s *SquashFSParser) CopyFile(filepath string, dstdir string) bool {\n\t// The list of files/directories to extract needs to be in a file...\n\textractFile, err := getExtractFile(filepath)\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"Failed to create temporary file: %v\\n\", err)\n\t\treturn false\n\t}\n\tdefer os.Remove(extractFile)\n\n\t// The -d argument to unsquashfs specifies a directory to unsquash to, but\n\t// the directory can't exist. It also extracts the full path. 
To fit the\n\t// semantics of CopyFile, we need to extract to a new temporary directly and\n\t// then copy the file to the specified destination.\n\ttmpdir, err := ioutil.TempDir(\"\", \"squashfsparser\")\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"Failed to create temporary directory: %v\\n\", err)\n\t\treturn false\n\t}\n\tdefer os.RemoveAll(tmpdir)\n\ttmpdir = path.Join(tmpdir, \"files\")\n\n\tout, err := exec.Command(unsquashfsCmd, \"-d\", tmpdir, \"-e\", extractFile, s.imagepath).CombinedOutput()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"unsquashfs failed: %v: %s\\n\", err, out)\n\t\treturn false\n\t}\n\n\terr = exec.Command(cpCmd, \"-a\", path.Join(tmpdir, filepath), dstdir).Run()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"%s -a %s %s: failed\", cpCmd, path.Join(tmpdir, filepath), dstdir)\n\t\treturn false\n\t}\n\treturn true\n}\n\n// ImageName returns the name of the filesystem image.\nfunc (s *SquashFSParser) ImageName() string {\n\treturn s.imagepath\n}\n\nfunc (f *SquashFSParser) Supported() bool {\n\t_, err := exec.LookPath(unsquashfsCmd)\n\tif err != nil {\n\t\treturn false\n\t}\n\t_, err = exec.LookPath(cpCmd)\n\treturn err == nil\n}\n\nfunc securityInfoSupported() bool {\n\tout, _ := exec.Command(unsquashfsCmd).CombinedOutput()\n\t// look for -ll[S] (securityInfo support) in output\n\tif strings.Contains(string(out), \"-ll[S]\") {\n\t\treturn true\n\t}\n\tfmt.Fprintln(os.Stderr, \"squashfsparser: security info (selinux + capabilities) not supported by your version of unsquashfs\")\n\treturn false\n}\n"
  },
  {
    "path": "pkg/squashfsparser/squashfsparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage squashfsparser\n\nimport (\n\t\"io/ioutil\"\n\t\"os\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/google/go-cmp/cmp\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\n// This could be considered environment-specific...\nfunc TestUidForUsername(t *testing.T) {\n\tuid, err := uidForUsername(\"root\")\n\tif err != nil {\n\t\tt.Errorf(\"uidForUsername(\\\"root\\\") returned error: %v\", err)\n\t\treturn\n\t}\n\tif uid != 0 {\n\t\tt.Errorf(\"uidForUsername(\\\"root\\\") returned %d, should be 0\", uid)\n\t}\n\n\t_, err = uidForUsername(\"asdfASDFxxx999\")\n\tif err == nil {\n\t\tt.Errorf(\"uidForUsername(\\\"asdfASDFxxx\\\") did not return error\")\n\t\treturn\n\t}\n}\n\n// This could be considered environment-specific...\nfunc TestGidForGroup(t *testing.T) {\n\tuid, err := gidForGroup(\"root\")\n\tif err != nil {\n\t\tt.Errorf(\"gidForGroup(\\\"root\\\") returned error: %v\", err)\n\t\treturn\n\t}\n\tif uid != 0 {\n\t\tt.Errorf(\"gidForGroup(\\\"root\\\") returned %d, should be 0\", uid)\n\t}\n\n\t_, err = gidForGroup(\"asdfASDFxxx999\")\n\tif err == nil {\n\t\tt.Errorf(\"gidForGroup(\\\"asdfASDFxxx999\\\") did not return error\")\n\t}\n}\n\nfunc TestParseMode(t *testing.T) {\n\ttests := []struct {\n\t\tmode   string\n\t\tresult uint64\n\t\terr    bool\n\t}{\n\t\t{\n\t\t\tmode: \"drwxr-xr-x\",\n\t\t\tresult: fsparser.S_IFDIR | fsparser.S_IRWXU | 
fsparser.S_IRGRP |\n\t\t\t\tfsparser.S_IXGRP | fsparser.S_IROTH | fsparser.S_IXOTH,\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\tmode: \"-rw-r--r--\",\n\t\t\tresult: fsparser.S_IFREG | fsparser.S_IRUSR | fsparser.S_IWUSR |\n\t\t\t\tfsparser.S_IRGRP | fsparser.S_IROTH,\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\tmode: \"lrwxrwxrwx\",\n\t\t\tresult: fsparser.S_IFLNK | fsparser.S_IRWXU | fsparser.S_IRWXG |\n\t\t\t\tfsparser.S_IRWXO,\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\tmode: \"drwxrwxrwt\",\n\t\t\tresult: fsparser.S_IFDIR | fsparser.S_IRWXU | fsparser.S_IRWXG |\n\t\t\t\tfsparser.S_IRWXO | fsparser.S_ISVTX,\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\t// too short\n\t\t\tmode:   \"blahblah\",\n\t\t\tresult: 0,\n\t\t\terr:    true,\n\t\t},\n\t\t{\n\t\t\t// too long\n\t\t\tmode:   \"blahblahblah\",\n\t\t\tresult: 0,\n\t\t\terr:    true,\n\t\t},\n\t}\n\n\tfor _, test := range tests {\n\t\tresult, err := parseMode(test.mode)\n\t\tif err != nil && !test.err {\n\t\t\tt.Errorf(\"parseMode(\\\"%s\\\") returned error but shouldn't have: %s\", test.mode, err)\n\t\t\tcontinue\n\t\t}\n\t\tif result != test.result {\n\t\t\tt.Errorf(\"parseMode(\\\"%s\\\") should be %#o, is %#o\", test.mode, test.result, result)\n\t\t}\n\t}\n}\n\nfunc TestParseFileLine(t *testing.T) {\n\ttests := []struct {\n\t\tline    string\n\t\tdirpath string\n\t\tfi      fsparser.FileInfo\n\t\terr     bool\n\t}{\n\t\t{\n\t\t\tline:    \"-rw-r--r-- root/root         32 2019-04-10 14:41 /Filey McFileFace\",\n\t\t\tdirpath: \"/\",\n\t\t\tfi: fsparser.FileInfo{\n\t\t\t\tSize: 32,\n\t\t\t\tMode: 0100644,\n\t\t\t\tUid:  0,\n\t\t\t\tGid:  0,\n\t\t\t\tName: \"Filey McFileFace\",\n\t\t\t},\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\tline:    \"lrwxrwxrwx 1010/2020         5 2019-04-10 14:36 /dir2/file3 -> file1\",\n\t\t\tdirpath: \"/dir2\",\n\t\t\tfi: fsparser.FileInfo{\n\t\t\t\tSize:       5,\n\t\t\t\tMode:       0120777,\n\t\t\t\tUid:        1010,\n\t\t\t\tGid:        2020,\n\t\t\t\tName:       \"file3\",\n\t\t\t\tLinkTarget: 
\"file1\",\n\t\t\t},\n\t\t\terr: false,\n\t\t},\n\t\t{\n\t\t\tline:    \"blah blah blah!\",\n\t\t\tdirpath: \"\",\n\t\t\tfi:      fsparser.FileInfo{},\n\t\t\terr:     true,\n\t\t},\n\t}\n\n\ts := New(\"\", false)\n\n\tfor _, test := range tests {\n\t\tdirpath, fi, err := s.parseFileLine(test.line)\n\t\tif err != nil && !test.err {\n\t\t\tt.Errorf(\"parseFileLine(\\\"%s\\\") returned error but shouldn't have: %s\", test.line, err)\n\t\t\tcontinue\n\t\t}\n\t\tif dirpath != test.dirpath {\n\t\t\tt.Errorf(\"parseFileLine(\\\"%s\\\") dirpath got \\\"%s\\\", wanted \\\"%s\\\"\", test.line, dirpath, test.dirpath)\n\t\t}\n\t\tif diff := cmp.Diff(fi, test.fi); diff != \"\" {\n\t\t\tt.Errorf(\"parseFileLine(\\\"%s\\\") result mismatch (-got, +want):\\n%s\", test.line, diff)\n\t\t}\n\t}\n}\n\nfunc TestImageName(t *testing.T) {\n\ttestImage := \"../../test/squashfs.img\"\n\tf := New(testImage, false)\n\n\timageName := f.ImageName()\n\tif imageName != testImage {\n\t\tt.Errorf(\"ImageName() returned %s, wanted %s\", imageName, testImage)\n\t}\n}\n\nfunc TestDirInfoRoot(t *testing.T) {\n\ttestImage := \"../../test/squashfs.img\"\n\tf := New(testImage, false)\n\n\t/*\n\t\t$ unsquashfs -d \"\" -ll test/squashfs.img\n\t\tParallel unsquashfs: Using 8 processors\n\t\t5 inodes (4 blocks) to write\n\n\t\tdrwxr-xr-x jlarimer/jlarimer        63 2019-04-11 08:06\n\t\t-rw-r--r-- root/jlarimer             0 2019-04-10 14:41 /Filey McFileFace\n\t\tdrwxr-x--- 1007/1008                 3 2019-04-10 14:36 /dir1\n\t\tdrwxr-xr-x jlarimer/jlarimer        69 2019-04-10 14:40 /dir2\n\t\t---------- jlarimer/jlarimer         7 2019-04-10 14:36 /dir2/file1\n\t\t-rwsr-xr-x jlarimer/jlarimer         5 2019-04-10 14:36 /dir2/file2\n\t\tlrwxrwxrwx jlarimer/jlarimer         5 2019-04-10 14:36 /dir2/file3 -> file1\n\t\tdrwx------ 1005/1005                28 2019-04-10 14:40 /dir2/subdir2\n\t\t-rw-r--r-- jlarimer/jlarimer        20 2019-04-10 14:40 /dir2/subdir2/file4\n\t*/\n\n\ttests := 
map[string]map[string]fsparser.FileInfo{\n\t\t\"/\": {\n\t\t\t\"Filey McFileFace\": fsparser.FileInfo{\n\t\t\t\tName: \"Filey McFileFace\",\n\t\t\t\tMode: 0100644,\n\t\t\t\tUid:  0,\n\t\t\t\tGid:  1001,\n\t\t\t\tSize: 0,\n\t\t\t},\n\t\t\t\"dir1\": fsparser.FileInfo{\n\t\t\t\tName: \"dir1\",\n\t\t\t\tMode: 0040750,\n\t\t\t\tUid:  1007,\n\t\t\t\tGid:  1008,\n\t\t\t\tSize: 3,\n\t\t\t},\n\t\t\t\"dir2\": fsparser.FileInfo{\n\t\t\t\tName: \"dir2\",\n\t\t\t\tMode: 0040755,\n\t\t\t\tUid:  1001,\n\t\t\t\tGid:  1001,\n\t\t\t\tSize: 69,\n\t\t\t},\n\t\t},\n\t\t\"/dir2\": {\n\t\t\t\"file1\": fsparser.FileInfo{\n\t\t\t\tName: \"file1\",\n\t\t\t\tMode: 0100000,\n\t\t\t\tUid:  1001,\n\t\t\t\tGid:  1001,\n\t\t\t\tSize: 7,\n\t\t\t},\n\t\t\t\"file2\": fsparser.FileInfo{\n\t\t\t\tName: \"file2\",\n\t\t\t\tMode: 0104755,\n\t\t\t\tUid:  1001,\n\t\t\t\tGid:  1001,\n\t\t\t\tSize: 5,\n\t\t\t},\n\t\t\t\"file3\": fsparser.FileInfo{\n\t\t\t\tName:       \"file3\",\n\t\t\t\tMode:       0120777,\n\t\t\t\tUid:        1001,\n\t\t\t\tGid:        1001,\n\t\t\t\tSize:       5,\n\t\t\t\tLinkTarget: \"file1\",\n\t\t\t},\n\t\t\t\"subdir2\": fsparser.FileInfo{\n\t\t\t\tName: \"subdir2\",\n\t\t\t\tMode: 0040700,\n\t\t\t\tUid:  1005,\n\t\t\t\tGid:  1005,\n\t\t\t\tSize: 28,\n\t\t\t},\n\t\t},\n\t\t\"/dir2/subdir2\": {\n\t\t\t\"file4\": fsparser.FileInfo{\n\t\t\t\tName: \"file4\",\n\t\t\t\tMode: 0100644,\n\t\t\t\tUid:  1001,\n\t\t\t\tGid:  1001,\n\t\t\t\tSize: 20,\n\t\t\t},\n\t\t},\n\t}\n\n\tfor _, testdir := range []string{\"/\", \"/dir2\", \"/dir2/subdir2\"} {\n\t\tdirtests := tests[testdir]\n\t\tdir, err := f.GetDirInfo(testdir)\n\t\tif err != nil {\n\t\t\tt.Errorf(\"GetDirInfo() returned error: %v\", err)\n\t\t\treturn\n\t\t}\n\t\tfor _, fi := range dir {\n\t\t\t//fmt.Printf(\"Directory: %s, Name: %s, Size: %d, Mode: %o\\n\", testdir, fi.Name, fi.Size, fi.Mode)\n\t\t\ttfi, ok := dirtests[fi.Name]\n\t\t\tif !ok {\n\t\t\t\tt.Errorf(\"File \\\"%s\\\" not found in test map\", 
fi.Name)\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tif diff := cmp.Diff(fi, tfi); diff != \"\" {\n\t\t\t\tt.Errorf(\"GetDirInfo() result mismatch for \\\"%s\\\" (-got, +want):\\n%s\", fi.Name, diff)\n\t\t\t}\n\t\t\tdelete(dirtests, fi.Name)\n\t\t}\n\t\tfor name := range dirtests {\n\t\t\tt.Errorf(\"File \\\"%s\\\" exists in test map but not in test filesystem\", name)\n\t\t}\n\t}\n}\n\nfunc TestGetFileInfo(t *testing.T) {\n\ttestImage := \"../../test/squashfs.img\"\n\tf := New(testImage, false)\n\n\tfi, err := f.GetFileInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"/ should be dir\")\n\t}\n\n\tdir, err := f.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif len(dir) < 1 {\n\t\tt.Errorf(\"/ should not be empty\")\n\t}\n\n\tfi, err = f.GetFileInfo(\"/dir2/file3\")\n\tif err != nil {\n\t\tt.Errorf(\"GetDirInfo() returned error: %v\", err)\n\t\treturn\n\t}\n\n\ttfi := fsparser.FileInfo{\n\t\tName:       \"file3\",\n\t\tMode:       0120777,\n\t\tUid:        1001,\n\t\tGid:        1001,\n\t\tSize:       5,\n\t\tLinkTarget: \"file1\",\n\t}\n\n\tif diff := cmp.Diff(fi, tfi); diff != \"\" {\n\t\tt.Errorf(\"GetFileInfo() result mismatch (-got, +want):\\n%s\", diff)\n\t}\n}\n\nfunc TestCopyFile(t *testing.T) {\n\ttestImage := \"../../test/squashfs.img\"\n\tf := New(testImage, false)\n\n\tif !f.CopyFile(\"/dir2/subdir2/file4\", \".\") {\n\t\tt.Errorf(\"CopyFile() returned false\")\n\t\treturn\n\t}\n\tdefer os.Remove(\"file4\")\n\n\tdata, err := ioutil.ReadFile(\"file4\")\n\tif err != nil {\n\t\tt.Errorf(\"can't read file4: %v\", err)\n\t\treturn\n\t}\n\n\texpected := \"feed me a stray cat\\n\"\n\tif string(data) != expected {\n\t\tt.Errorf(\"file4 expected \\\"%s\\\" but got \\\"%s\\\"\", expected, data)\n\t}\n}\n\nfunc TestSecurityInfo(t *testing.T) {\n\ttestImage := \"../../test/squashfs_cap.img\"\n\tf := New(testImage, true)\n\n\tfi, err := f.GetFileInfo(\"/ifconfig\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\n\tif 
fi.SELinuxLabel != \"-\" {\n\t\tt.Error(\"no selinux label should be present\")\n\t}\n\n\tif !strings.EqualFold(fi.Capabilities[0], \"cap_net_admin+p\") {\n\t\tt.Errorf(\"bad capabilities: %s\", fi.Capabilities)\n\t}\n\n}\n"
  },
  {
    "path": "pkg/ubifsparser/ubifsparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage ubifsparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype UbifsParser struct {\n\tfileinfoReg *regexp.Regexp\n\tfileLinkReg *regexp.Regexp\n\timagepath   string\n}\n\nconst (\n\tubifsReaderCmd = \"ubireader_list_files\"\n)\n\nfunc New(imagepath string) *UbifsParser {\n\tparser := &UbifsParser{\n\t\t// 120777  1 0 0       0 Mar 13 08:53 tmp -> /var/tmp\n\t\tfileinfoReg: regexp.MustCompile(\n\t\t\t`^\\s*(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\d+)\\s+(\\S+\\s+\\d+)\\s+(\\d+:\\d+)\\s+(.+)$`),\n\t\tfileLinkReg: regexp.MustCompile(\n\t\t\t`(\\S+)\\s->\\s(\\S+)`),\n\t\timagepath: imagepath,\n\t}\n\n\treturn parser\n}\n\nfunc (e *UbifsParser) ImageName() string {\n\treturn e.imagepath\n}\n\nfunc (e *UbifsParser) parseFileLine(line string) (fsparser.FileInfo, error) {\n\tres := e.fileinfoReg.FindAllStringSubmatch(line, -1)\n\tvar fi fsparser.FileInfo\n\tif res == nil {\n\t\treturn fi, fmt.Errorf(\"can't parse: %s\", line)\n\t}\n\tsize, _ := strconv.Atoi(res[0][5])\n\tfi.Size = int64(size)\n\tfi.Mode, _ = strconv.ParseUint(res[0][1], 8, 32)\n\tfi.Uid, _ = strconv.Atoi(res[0][3])\n\tfi.Gid, _ = strconv.Atoi(res[0][4])\n\tfi.Name = res[0][8]\n\n\tfi.SELinuxLabel = fsparser.SELinuxNoLabel\n\n\t// fill in linktarget\n\tif fi.IsLink() && 
strings.Contains(fi.Name, \"->\") {\n\t\trlnk := e.fileLinkReg.FindAllStringSubmatch(fi.Name, -1)\n\t\tif rlnk == nil {\n\t\t\treturn fsparser.FileInfo{}, fmt.Errorf(\"can't parse LinkTarget from %s\", fi.Name)\n\t\t}\n\t\tfi.Name = rlnk[0][1]\n\t\tfi.LinkTarget = rlnk[0][2]\n\t}\n\n\treturn fi, nil\n}\n\nfunc (e *UbifsParser) getDirList(dirpath string) ([]fsparser.FileInfo, error) {\n\tout, err := exec.Command(ubifsReaderCmd, \"-P\", dirpath, e.imagepath).CombinedOutput()\n\tif err != nil {\n\t\tfmt.Fprintln(os.Stderr, err)\n\t\treturn nil, err\n\t}\n\tvar dir []fsparser.FileInfo\n\tlines := strings.Split(string(out), \"\\n\")\n\tfor _, fline := range lines {\n\t\tif fline == \"\" {\n\t\t\tcontinue\n\t\t}\n\t\tfi, err := e.parseFileLine(fline)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tdir = append(dir, fi)\n\t}\n\treturn dir, nil\n}\n\nfunc (e *UbifsParser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tdir, err := e.getDirList(dirpath)\n\treturn dir, err\n}\n\nfunc (e *UbifsParser) GetFileInfo(dirpath string) (fsparser.FileInfo, error) {\n\t// return fake entry for root (/)\n\tif dirpath == \"/\" {\n\t\treturn fsparser.FileInfo{Name: \"/\", Mode: fsparser.S_IFDIR}, nil\n\t}\n\n\tlistpath := path.Dir(dirpath)\n\tlistfile := path.Base(dirpath)\n\tvar fi fsparser.FileInfo\n\tdir, err := e.getDirList(listpath)\n\tif err != nil {\n\t\treturn fi, err\n\t}\n\n\tfor _, info := range dir {\n\t\tif info.Name == listfile {\n\t\t\treturn info, nil\n\t\t}\n\t}\n\treturn fi, fmt.Errorf(\"file not found: %s\", dirpath)\n}\n\nfunc (e *UbifsParser) CopyFile(filepath string, dstdir string) bool {\n\terr := exec.Command(ubifsReaderCmd, \"--copy\", filepath, \"--copy-dest\", dstdir, e.imagepath).Run()\n\tif err != nil {\n\t\tfmt.Fprintln(os.Stderr, err)\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (f *UbifsParser) Supported() bool {\n\t_, err := exec.LookPath(ubifsReaderCmd)\n\treturn err == nil\n}\n"
  },
  {
    "path": "pkg/ubifsparser/ubifsparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage ubifsparser\n\nimport (\n\t\"os\"\n\t\"testing\"\n)\n\nfunc TestCleanup(t *testing.T) {\n\ttestImage := \"../../test/ubifs.img\"\n\n\te := New(testImage)\n\n\tif e.ImageName() != testImage {\n\t\tt.Errorf(\"ImageName returned bad name\")\n\t}\n\n\tfi, err := e.GetFileInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"/ should be dir\")\n\t}\n\n\tdir, err := e.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Errorf(\"getDirList failed\")\n\t}\n\tif len(dir) != 5 {\n\t\tt.Errorf(\"should be 5 files, but %d found\", len(dir))\n\t}\n\n\tfi, err = e.GetFileInfo(\"/file1.txt\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif !fi.IsFile() {\n\t\tt.Errorf(\"GetFileInfo failed, not a file\")\n\t}\n\tif fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, not a dir\")\n\t}\n\tif fi.Name != \"file1.txt\" {\n\t\tt.Errorf(\"filename does not match: %s\", fi.Name)\n\t}\n\n\tfi, err = e.GetFileInfo(\"/bin/elf_arm64\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif !fi.IsFile() {\n\t\tt.Errorf(\"GetFileInfo failed, not a file\")\n\t}\n\tif fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, not a dir\")\n\t}\n\tif fi.Size != 3740436 {\n\t\tt.Errorf(\"file size does not match: %s\", fi.Name)\n\t}\n\n\tfi, err = e.GetFileInfo(\"/dateX\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif fi.IsFile() 
{\n\t\tt.Errorf(\"GetFileInfo failed, should not be a file\")\n\t}\n\tif fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, should not be a dir\")\n\t}\n\tif !fi.IsLink() {\n\t\tt.Errorf(\"GetFileInfo failed, should be a link\")\n\t}\n\tif fi.LinkTarget != \"date1.txt\" {\n\t\tt.Errorf(\"link target does not match: %s\", fi.LinkTarget)\n\t}\n\n\tfi, err = e.GetFileInfo(\"/dir1\")\n\tif err != nil {\n\t\tt.Errorf(\"GetFileInfo failed\")\n\t}\n\tif fi.IsFile() {\n\t\tt.Errorf(\"GetFileInfo failed, should not be a file\")\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"GetFileInfo failed, should be a dir\")\n\t}\n\tif fi.Name != \"dir1\" {\n\t\tt.Errorf(\"filename does not match: %s\", fi.Name)\n\t}\n\n\tif !e.CopyFile(\"/bin/elf_arm32\", \"xxx-test-xxx\") {\n\t\tt.Errorf(\"CopyFile returned false\")\n\t}\n\tif _, err := os.Stat(\"xxx-test-xxx\"); os.IsNotExist(err) {\n\t\tt.Errorf(\"%s\", err)\n\t} else {\n\t\tos.Remove(\"xxx-test-xxx\")\n\t}\n}\n"
  },
  {
    "path": "pkg/util/util.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage util\n\nimport (\n\t\"crypto/sha256\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"io/ioutil\"\n\t\"os\"\n\t\"path\"\n\t\"reflect\"\n\t\"strconv\"\n)\n\nfunc MkTmpDir(prefix string) (string, error) {\n\ttmpDir, err := ioutil.TempDir(os.TempDir(), prefix)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\treturn tmpDir, err\n}\n\nfunc DigestFileSha256(filepath string) []byte {\n\tf, err := os.Open(filepath)\n\tif err != nil {\n\t\treturn nil\n\t}\n\tdefer f.Close()\n\n\th := sha256.New()\n\tif _, err := io.Copy(h, f); err != nil {\n\t\treturn nil\n\t}\n\n\treturn h.Sum(nil)\n}\n\nfunc loadJson(data []byte, item string) (interface{}, error) {\n\tvar jd map[string]interface{}\n\terr := json.Unmarshal(data, &jd)\n\treturn jd[item], err\n}\n\nfunc XtractJsonField(data []byte, items []string) (string, error) {\n\tidx := 0\n\tid, err := loadJson(data, items[idx])\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tif id == nil {\n\t\treturn \"\", fmt.Errorf(\"JSON field not found: %s\", items[idx])\n\t}\n\tidx++\n\tfor {\n\t\tif id == nil {\n\t\t\treturn \"\", fmt.Errorf(\"JSON field not found: %s\", items[idx-1])\n\t\t}\n\t\t// keep for debugging\n\t\t//fmt.Printf(\"idx=%d, type=%s\\n\", idx, reflect.TypeOf(id).String())\n\t\tif reflect.TypeOf(id).String() == \"map[string]interface {}\" {\n\t\t\tidc := id.(map[string]interface{})\n\t\t\tid = idc[items[idx]]\n\t\t\tidx++\n\t\t} 
else if reflect.TypeOf(id).String() == \"[]interface {}\" {\n\t\t\tidc := id.([]interface{})\n\t\t\tindex, _ := strconv.Atoi(items[idx])\n\t\t\tid = idc[index]\n\t\t\tidx++\n\t\t} else {\n\t\t\tswitch id := id.(type) {\n\t\t\tcase bool:\n\t\t\t\tif id {\n\t\t\t\t\treturn \"true\", nil\n\t\t\t\t} else {\n\t\t\t\t\treturn \"false\", nil\n\t\t\t\t}\n\t\t\tcase float32, float64:\n\t\t\t\treturn fmt.Sprintf(\"%f\", id), nil\n\t\t\tcase string:\n\t\t\t\treturn id, nil\n\t\t\tdefault:\n\t\t\t\treturn \"\", fmt.Errorf(\"can't handle type\")\n\t\t\t}\n\t\t}\n\t}\n}\n\nfunc CleanPathDir(pathName string) string {\n\tcleaned := path.Clean(pathName)\n\tif cleaned[len(cleaned)-1] != '/' {\n\t\tcleaned += \"/\"\n\t}\n\treturn cleaned\n}\n"
  },
  {
    "path": "pkg/vfatparser/vfatparser.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage vfatparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\n\t\"github.com/cruise-automation/fwanalyzer/pkg/fsparser\"\n)\n\ntype mDirReg struct {\n\trx     *regexp.Regexp\n\thasExt bool\n}\n\ntype VFatParser struct {\n\tmDirRegex []mDirReg\n\timagepath string\n}\n\nconst (\n\tvFatLsCmd string = \"mdir\"\n\tvFatCpCmd string = \"mcopy\"\n)\n\nfunc New(imagepath string) *VFatParser {\n\tvar regs []mDirReg\n\t// BZIMAGE        5853744 2018-11-12  10:04  bzImage\n\tregs = append(regs, mDirReg{regexp.MustCompile(`^([~\\w]+)\\s+(\\d+)\\s\\d+-\\d+-\\d+\\s+\\d+:\\d+\\s+(\\w+)$`), false})\n\t// BZIMAGE  SIG       287 2018-11-12  10:04  bzImage.sig\n\tregs = append(regs, mDirReg{regexp.MustCompile(`^([~\\w]+)\\s+(\\w+)\\s+(\\d+)\\s\\d+-\\d+-\\d+\\s+\\d+:\\d+\\s+(.+).*`), true})\n\t// EFI          <DIR>     2018-11-12  10:04\n\tregs = append(regs, mDirReg{regexp.MustCompile(`^([~\\.\\w]+)\\s+<DIR>.*`), false})\n\t// startup  nsh        12 2018-11-12  10:04\n\tregs = append(regs, mDirReg{regexp.MustCompile(`^([~\\w]+)\\s+(\\w+)\\s+(\\d+).*`), true})\n\t// grubenv           1024 2018-11-12  10:04\n\tregs = append(regs, mDirReg{regexp.MustCompile(`^([~\\w]+)\\s+(\\d+).*`), false})\n\n\tparser := &VFatParser{\n\t\tmDirRegex: regs,\n\t\timagepath: imagepath,\n\t}\n\t// configure mtools to skip size checks on VFAT 
images\n\tos.Setenv(\"MTOOLS_SKIP_CHECK\", \"1\")\n\treturn parser\n}\n\nfunc (f *VFatParser) ImageName() string {\n\treturn f.imagepath\n}\n\nfunc (f *VFatParser) parseFileLine(line string) (fsparser.FileInfo, error) {\n\tvar fi fsparser.FileInfo\n\tfor _, reg := range f.mDirRegex {\n\t\tres := reg.rx.FindAllStringSubmatch(line, -1)\n\t\tif res != nil {\n\t\t\tsize := 0\n\t\t\tif len(res[0]) == 2 {\n\t\t\t\tfi.Mode = fsparser.S_IFDIR\n\t\t\t} else {\n\t\t\t\tfi.Mode = fsparser.S_IFREG\n\t\t\t\tif reg.hasExt {\n\t\t\t\t\tsize, _ = strconv.Atoi(res[0][3])\n\t\t\t\t} else {\n\t\t\t\t\tsize, _ = strconv.Atoi(res[0][2])\n\t\t\t\t}\n\t\t\t}\n\t\t\tfi.Mode |= fsparser.S_IRWXU | fsparser.S_IRWXG | fsparser.S_IRWXO\n\t\t\tfi.Size = int64(size)\n\t\t\tfi.Uid = 0\n\t\t\tfi.Gid = 0\n\t\t\tfi.SELinuxLabel = fsparser.SELinuxNoLabel\n\t\t\tfi.Name = res[0][1]\n\t\t\tif reg.hasExt {\n\t\t\t\tfi.Name = fmt.Sprintf(\"%s.%s\", res[0][1], res[0][2])\n\t\t\t}\n\t\t\t// use long name\n\t\t\tif (!reg.hasExt && len(res[0]) > 3) || (reg.hasExt && len(res[0]) > 4) {\n\t\t\t\tfi.Name = res[0][len(res[0])-1]\n\t\t\t}\n\t\t\treturn fi, nil\n\t\t}\n\t}\n\treturn fi, fmt.Errorf(\"not a file/dir\")\n}\n\nfunc (f *VFatParser) getDirList(dirpath string, ignoreDot bool) ([]fsparser.FileInfo, error) {\n\tvar dir []fsparser.FileInfo\n\tout, err := exec.Command(vFatLsCmd, \"-i\", f.imagepath, dirpath).CombinedOutput()\n\tif err != nil {\n\t\tfmt.Fprintln(os.Stderr, err)\n\t\treturn nil, err\n\t}\n\tlines := strings.Split(string(out), \"\\n\")\n\tfor _, fline := range lines {\n\t\tif len(fline) > 1 {\n\t\t\tfi, err := f.parseFileLine(fline)\n\t\t\tif err == nil {\n\t\t\t\t// filter: . 
and ..\n\t\t\t\tif !ignoreDot || (fi.Name != \".\" && fi.Name != \"..\") {\n\t\t\t\t\tdir = append(dir, fi)\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\treturn dir, nil\n}\n\nfunc (f *VFatParser) GetDirInfo(dirpath string) ([]fsparser.FileInfo, error) {\n\tif dirpath == \"\" {\n\t\tdirpath = \"/\"\n\t}\n\treturn f.getDirList(dirpath, true)\n}\n\nfunc (f *VFatParser) GetFileInfo(dirpath string) (fsparser.FileInfo, error) {\n\t// return fake entry for root (/)\n\tif dirpath == \"/\" {\n\t\treturn fsparser.FileInfo{Name: \"/\", Mode: fsparser.S_IFDIR}, nil\n\t}\n\n\tvar fifake fsparser.FileInfo\n\tdir, err := f.getDirList(dirpath, false)\n\tif err != nil {\n\t\treturn fifake, err\n\t}\n\n\t// GetFileInfo was called on non directory\n\tif len(dir) == 1 {\n\t\treturn dir[0], nil\n\t}\n\n\tfor _, info := range dir {\n\t\tif info.Name == \".\" {\n\t\t\tinfo.Name = filepath.Base(dirpath)\n\t\t\treturn info, nil\n\t\t}\n\t}\n\n\treturn fifake, fmt.Errorf(\"file not found: %s\", dirpath)\n}\n\nfunc (f *VFatParser) CopyFile(filepath string, dstdir string) bool {\n\tsrc := fmt.Sprintf(\"::%s\", filepath)\n\t_, err := exec.Command(vFatCpCmd, \"-bni\", f.imagepath, src, dstdir).Output()\n\tif err != nil {\n\t\tfmt.Fprintln(os.Stderr, err)\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (f *VFatParser) Supported() bool {\n\t_, err := exec.LookPath(vFatCpCmd)\n\tif err != nil {\n\t\treturn false\n\t}\n\t_, err = exec.LookPath(vFatLsCmd)\n\treturn err == nil\n}\n"
  },
  {
    "path": "pkg/vfatparser/vfatparser_test.go",
    "content": "/*\nCopyright 2019-present, Cruise LLC\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n\thttp://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage vfatparser\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"testing\"\n)\n\nvar f *VFatParser\n\nfunc TestMain(t *testing.T) {\n\ttestImage := \"../../test/vfat.img\"\n\tf = New(testImage)\n\n\tif f.ImageName() != testImage {\n\t\tt.Errorf(\"ImageName returned bad name\")\n\t}\n}\n\nfunc TestGetDirInfo(t *testing.T) {\n\tdir, err := f.GetDirInfo(\"/\")\n\tif err != nil {\n\t\tt.Errorf(\"GetDirInfo failed: %s\", err)\n\t}\n\tfor _, fi := range dir {\n\t\tif fi.Name == \"dir1\" {\n\t\t\tif !fi.IsDir() {\n\t\t\t\tt.Errorf(\"dir1 must be Dir\")\n\t\t\t}\n\t\t}\n\t}\n\n\tdir, err = f.GetDirInfo(\"dir1\")\n\tif err != nil {\n\t\tt.Errorf(\"GetDirInfo failed: %s\", err)\n\t}\n\tfor _, fi := range dir {\n\t\tfmt.Printf(\"Name: %s Size: %d Mode: %o\\n\", fi.Name, fi.Size, fi.Mode)\n\t\tif fi.Name == \"file1\" {\n\t\t\tif !fi.IsFile() {\n\t\t\t\tt.Errorf(\"file1 needs to be a file\")\n\t\t\t}\n\t\t\tif fi.Uid != 0 || fi.Gid != 0 {\n\t\t\t\tt.Errorf(\"file1 needs to be owned by 0:0\")\n\t\t\t}\n\t\t\tif fi.Size != 5 {\n\t\t\t\tt.Errorf(\"file1 size needs to be 5\")\n\t\t\t}\n\t\t\tif !fi.IsWorldWrite() {\n\t\t\t\tt.Errorf(\"file1 needs to be world writable\")\n\t\t\t}\n\t\t}\n\t}\n\n\tif !f.CopyFile(\"dir1/file1\", \".\") {\n\t\tt.Errorf(\"CopyFile returned false\")\n\t}\n\tif _, err := os.Stat(\"file1\"); os.IsNotExist(err) {\n\t\tt.Errorf(\"%s\", err)\n\t} else 
{\n\t\tos.Remove(\"file1\")\n\t}\n}\n\nfunc TestGetFileInfo(t *testing.T) {\n\tfi, err := f.GetFileInfo(\"/DIR1/FILE1\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsFile() {\n\t\tt.Errorf(\"/DIR1/FILE1 should be file\")\n\t}\n\n\tfi, err = f.GetFileInfo(\"/\")\n\tif err != nil {\n\t\tt.Error(err)\n\t}\n\tif !fi.IsDir() {\n\t\tt.Errorf(\"/ should be dir\")\n\t}\n}\n"
  },
  {
    "path": "scripts/catfile.sh",
    "content": "#!/bin/sh\n\ncat \"$1\"\n"
  },
  {
    "path": "scripts/check_apkcert.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nAPK=$(echo ${ORIG_FILENAME}|grep -e \"\\.apk\")\n\nif [ -n \"$APK\" ]; then\n\n  DIR=$(dirname ${FILEPATH})\n  mkdir ${DIR}/apkdata\n  (cd ${DIR}/apkdata && unzip ${FILEPATH} >/dev/null 2>&1; openssl cms -cmsout -noout -text -print -in META-INF/CERT.RSA -inform DER |grep subject:|sed 's/^ *//')\n  rm -rf ${DIR}/apkdata\n\nfi\n"
  },
  {
    "path": "scripts/check_cert.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\n\nISPEM=$(file ${FILEPATH} |grep PEM)\n\nif [ -n \"$ISPEM\" ]; then\n\n  openssl x509 -noout -text -in ${FILEPATH} | grep Issuer:|sed 's/^ *//'\n  openssl x509 -noout -text -in ${FILEPATH} | grep Subject:|sed 's/^ *//'\n\nfi\n"
  },
  {
    "path": "scripts/check_file_arm32.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nINFO=$(file ${FILEPATH}|grep \"ELF 32-bit LSB  executable, ARM, EABI5\")\n\nif [ -z \"$INFO\" ]; then\n    echo -n ${ORIG_FILENAME} \"not an ARM32 elf file\"\nfi\n"
  },
  {
    "path": "scripts/check_file_arm64.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nINFO=$(file ${FILEPATH}|grep \"ELF 64-bit LSB  executable, ARM aarch64\")\n\nif [ -z \"$INFO\" ]; then\n    echo -n ${ORIG_FILENAME} \"not an ARM aarch64 elf file\"\nfi\n"
  },
  {
    "path": "scripts/check_file_elf_stripped.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nINFO=$(file ${FILEPATH}|grep \"not stripped\")\n\nif [ -n \"$INFO\" ]; then\n    echo -n ${ORIG_FILENAME} \"is not stripped\"\nfi\n"
  },
  {
    "path": "scripts/check_file_x8664.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nINFO=$(file ${FILEPATH}|grep \"ELF 64-bit LSB  executable, x86-64\")\n\nif [ -z \"$INFO\" ]; then\n    echo -n '{\"reason\": \"not a x86-64 elf file\"}'\nfi\n"
  },
  {
    "path": "scripts/check_otacert.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nZIP=$(echo ${ORIG_FILENAME}|grep -e \"\\.zip\")\n\nif [ -n \"$ZIP\" ]; then\n\n  DIR=$(dirname ${FILEPATH})\n  mkdir ${DIR}/otacertdata\n  (cd ${DIR}/otacertdata && unzip ${FILEPATH} >/dev/null 2>&1)\n  find ${DIR}/otacertdata -name \"*\" -exec scripts/check_cert.sh {} {} \\;\n  rm -rf ${DIR}/otacertdata\n\nfi\n"
  },
  {
    "path": "scripts/check_privatekey.sh",
    "content": "#!/bin/sh\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\nINFO=$(file ${FILEPATH} | grep \"private key\")\nPERMS=$(echo ${ORIG_MODE} | grep -E \".*r[-w][-x]$\")\n\nif [ -n \"$INFO\" ]; then\n    echo -n ${ORIG_FILENAME} \"is a private key\"\n    if [ -n \"$PERMS\" ]; then\n        echo -n \" that is world readable\"\n    fi\nfi\n"
  },
  {
    "path": "scripts/check_sec.sh",
    "content": "#!/bin/bash\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_LABEL=$6\nCONFIG=$8\n\nRESULT=$(checksec --output=json --file=\"$1\")\n\nexport RESULT\nexport FILEPATH\nexport CONFIG\nexport ORIG_FILENAME\n\n# Config format is JSON\n# array for values allows multiple acceptable values\n# {\"cfg\":\n#  {\n#    \"pie\": [\"yes\"],\n#    \"relro\": [\"full\", \"partial\"]\n#  },\n#  \"skip\": [\"/usr/bin/bla\"]\n# }\n#\n# usable cfg fields, omitted fields are not checked:\n# {\n#\t\"canary\": \"no\",\n#\t\"fortify_source\": \"no\",\n#\t\"nx\": \"yes\",\n#\t\"pie\": \"no\",\n#\t\"relro\": \"partial\",\n#\t\"rpath\": \"no\",\n#\t\"runpath\": \"no\",\n#\t\"symbols\": \"no\"\n# }\n\n\npython -c 'import json\nimport sys\nimport os\n\ncfg = os.getenv(\"CONFIG\")\nres = os.getenv(\"RESULT\")\nfp = os.getenv(\"FILEPATH\")\norig_name = os.getenv(\"ORIG_FILENAME\")\n\nexpected = {}\n\ntry:\n  expected = json.loads(cfg.rstrip())\nexcept Exception:\n  print(\"bad config: {}\".format(cfg.rstrip()))\n  sys.exit(1)\n\ntry:\n  result = json.loads(res.rstrip())\n\n  if \"skip\" in expected:\n    if orig_name in expected[\"skip\"]:\n      sys.exit(0)\n\n  if not fp in result:\n    fp = \"file\"\n\n  bad_keys = []\n  for k in expected[\"cfg\"]:\n    if k in result[fp]:\n      passed = False\n      for expected_value in expected[\"cfg\"][k]:\n        if expected_value == result[fp][k]:\n          passed = True\n          break\n      if not passed:\n        print(json.dumps(result[fp]).rstrip())\n        sys.exit(0)\n    else:\n      bad_keys.append(k)\n\n  if bad_keys:\n    print(\"results were missing expected keys: {}\".format(\", \".join(bad_keys)))\n    sys.exit(0)\n\nexcept Exception as e:\n  if not \"Not an ELF file:\" in res:\n     print(e)\n\nsys.exit(0)\n'\n"
  },
  {
    "path": "scripts/diff.sh",
    "content": "#!/bin/sh\n\norigname=$1\noldfile=$2\ncurfile=$3\n\ndiff -u \"$oldfile\" \"$curfile\"\n\nexit 0\n"
  },
  {
    "path": "scripts/prop2json.py",
    "content": "#!/usr/bin/python3\n\n#\n# read Android property file and convert it to JSON\n#\n\nimport json\nimport sys\n\nprops = {}\n\nwith open(sys.argv[1], 'r') as fp:\n    for line in fp:\n        if line.startswith('#'):\n            continue\n        line = line.rstrip(\"\\n\")\n        parts = line.split(\"=\", 1)\n        if len(parts) == 2:\n            props[parts[0]] = parts[1]\nprint(json.dumps(props))\n"
  },
  {
    "path": "test/elf_main.go",
    "content": "package main\n\nimport \"fmt\"\n\nfunc main() {\n\tfmt.Println(\"hello world\")\n}\n"
  },
  {
    "path": "test/oldtree.json",
    "content": "{\n    \"files\": [\n        {\n            \"name\": \"/world\", \n            \"gid\": 0, \n            \"mode\": 33206, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/bin/elf_arm64\", \n            \"gid\": 1001, \n            \"mode\": 33277, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"5e18d8042a38789761dc950cd3aa73b3562d69a4dce2a5ef8530301e71494168\", \n            \"size\": 2110768\n        }, \n        {\n            \"name\": \"/dir1\", \n            \"gid\": 0, \n            \"mode\": 16877, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 1024\n        }, \n        {\n            \"name\": \"/dir1/dir11\", \n            \"gid\": 0, \n            \"mode\": 16877, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 1024\n        }, \n        {\n            \"name\": \"/dir3/file31\", \n            \"gid\": 1001, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/bin/elf_arm32\", \n            \"gid\": 1001, \n            \"mode\": 33277, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"682dfed81319befa3fcec071d4d86e7093f19929ef75dea7b16abb196a2aa667\", \n            
\"size\": 1957752\n        }, \n        {\n            \"name\": \"/date1\", \n            \"gid\": 0, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1, \n            \"link_target\": \"\", \n            \"digest\": \"8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd5\", \n            \"size\": 29\n        }, \n        {\n            \"name\": \"/dir3\", \n            \"gid\": 1001, \n            \"mode\": 16877, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 1024\n        }, \n        {\n            \"name\": \"/dir1/file11\", \n            \"gid\": 0, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/dir2\", \n            \"gid\": 1001, \n            \"mode\": 16877, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 1024\n        }, \n        {\n            \"name\": \"/dir2/file21\", \n            \"gid\": 0, \n            \"mode\": 36333, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/file1\", \n            \"gid\": 0, \n            \"mode\": 33178, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n  
          \"name\": \"/file2\", \n            \"gid\": 0, \n            \"mode\": 32851, \n            \"se_linux_label\": \"-\", \n            \"uid\": 123, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/bin\", \n            \"gid\": 1001, \n            \"mode\": 16877, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 1024\n        }, \n        {\n            \"name\": \"/bin/elf_x8664\", \n            \"gid\": 1001, \n            \"mode\": 33277, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"9de3537418e232700ea8d187762beb6f1862a999372a47e5d04101ca754f2000\", \n            \"size\": 2011612\n        }, \n        {\n            \"name\": \"/bin/elf_x8664_stripped\", \n            \"gid\": 1001, \n            \"mode\": 33204, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"fb969a744228ee4471377930d6f31d3bb2c41a6f6b04c0253ebdf1f57180a421\", \n            \"size\": 1211048\n        }, \n        {\n            \"name\": \"/lost+found\", \n            \"gid\": 0, \n            \"mode\": 16832, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"0\", \n            \"size\": 12288\n        }, \n        {\n            \"name\": \"/dir3/file33\", \n            \"gid\": 1001, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        
{\n            \"name\": \"/ver\", \n            \"gid\": 0, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"44c77e41961f354f515e4081b12619fdb15829660acaa5d7438c66fc3d326df3\", \n            \"size\": 15\n        }, \n        {\n            \"name\": \"/dir1/dir11/file12\", \n            \"gid\": 0, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 0, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/dir2/file22\", \n            \"gid\": 1002, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1002, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }, \n        {\n            \"name\": \"/dir3/file32\", \n            \"gid\": 1001, \n            \"mode\": 33188, \n            \"se_linux_label\": \"-\", \n            \"uid\": 1001, \n            \"link_target\": \"\", \n            \"digest\": \"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\", \n            \"size\": 0\n        }\n    ], \n    \"image_name\": \"test/test.img\", \n    \"image_digest\": \"9d5fd9acc98421b46976f283175cc438cf549bb0607a1bca6e881d3e7f323794\"\n}\n"
  },
  {
    "path": "test/script_test.sh",
    "content": "#!/bin/bash\n\nFILEPATH=$1\nORIG_FILENAME=$2\nORIG_UID=$3\nORIG_GID=$4\nORIG_MODE=$5\nORIG_SELINUXLABEL=$6\n\n# this is an artificial test\nif [ \"$7\" = \"--\" ]; then\n    echo -n $9 $8\nfi\n"
  },
  {
    "path": "test/test.cap.file",
    "content": ""
  },
  {
    "path": "test/test.py",
    "content": "#!/usr/bin/env python\n\n\n# Copyright 2019-present, Cruise LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\nimport json\nimport sys\nimport os\n\nerror = False\n\ndef SetError(log):\n    global error\n    print log\n    error = True\n\ndef test(cfgfile, e2toolspath=\"\"):\n    os.system(e2toolspath+\" fwanalyzer -in test/test.img -cfg \" + cfgfile + \" >test/test_out.json 2>&1\")\n\n    with open(\"test/test_out.json\") as read_file:\n        try:\n            data = json.load(read_file)\n        except:\n            data = {}\n\n    if data.get(\"image_name\") != \"test/test.img\":\n        SetError(\"image_name\")\n\n    if \"data\" not in data:\n        SetError(\"data\")\n\n    if data.get(\"data\", {}).get(\"Version\") != \"1.2.3\":\n        SetError(\"Data Version\")\n\n    if data.get(\"data\", {}).get(\"extract_test\") != \"test extract\":\n        SetError(\"extract test\")\n\n    if \"offenders\" not in data:\n        SetError(\"offenders\")\n    else:\n        if not \"/dir2/file21\" in data[\"offenders\"]:\n            SetError(\"dir2/file21\")\n\n        if not \"/dir2/file22\" in data[\"offenders\"]:\n            SetError(\"dir2/file22\")\n\n        if not \"File is WorldWriteable, not allowed\" in data[\"offenders\"][\"/world\"]:\n            SetError(\"WorldWriteable\")\n\n        if not \"File is SUID, not allowed\" in data[\"offenders\"][\"/dir2/file21\"]:\n            SetError(\"SUID\")\n\n        if not \"DirContent: 
File file22 not allowed in directory /dir2\" in data[\"offenders\"][\"/dir2/file22\"]:\n            SetError(\"DirContent\")\n\n        if not \"nofile\" in data[\"offenders\"]:\n            SetError(\"DirContent\")\n\n        if not \"test script\" in data[\"offenders\"][\"/file2\"]:\n            SetError(\"file2\")\n\n        if not \"Digest (sha256) did not match found = 44c77e41961f354f515e4081b12619fdb15829660acaa5d7438c66fc3d326df3 should be = 8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd4. ver needs to be specific :  \" in data[\"offenders\"][\"/ver\"]:\n            SetError(\"ver digest\")\n\n        if \"File State Check failed: group found 1002 should be 0 : this needs to be this way\" in data[\"offenders\"][\"/dir2/file22\"]:\n            SetError(\"FileStatCheck shouldn't default to uid/guid 0\")\n\n        if not \"File not allowed for pattern: *1\" in data[\"offenders\"][\"/file1\"]:\n            SetError(\"file1 not allowed\")\n\n        if not \"File State Check failed: size: 0 AllowEmpyt=false : this needs to be this way\" in data[\"offenders\"][\"/file1\"]:\n            SetError(\"file1 exists but size 0\")\n\n        if not \"/bin/elf_x8664 is not stripped\" in data[\"offenders\"][\"/bin/elf_x8664\"]:\n            SetError(\"script failed\")\n\n    if not \"informational\" in data:\n        SetError(\"informational\")\n    else:\n        if not \"/file1\" in data[\"informational\"]:\n            SetError(\"/file1\")\n        else:\n            if not \"changed\" in data[\"informational\"][\"/file1\"][0]:\n                SetError(\"file1 not changed\")\n        if not \"/date1\" in data[\"informational\"]:\n            SetError(\"/date1\")\n        else:\n            if not \"changed\" in data[\"informational\"][\"/date1\"][0]:\n                SetError(\"date1 not changed\")\n\nif __name__ == \"__main__\":\n    test(\"test/test_cfg.toml\")\n    if error:\n       os.system(\"cat test/test_out.json\")\n       sys.exit(error)\n\n  
  # disable if your e2ls version does not support selinux (-Z) option\n    test(\"test/test_cfg_selinux.toml\")\n\n    if error:\n       os.system(\"cat test/test_out.json\")\n       sys.exit(error)\n"
  },
  {
    "path": "test/test_cfg.base.toml",
    "content": "\n# all checks for the integration test\n\n[GlobalFileChecks]\nSuid = true\nSuidAllowedList = []\nSeLinuxLabel = false\nWorldWrite = true\nUids = [0,1001,1002]\nGids = [0,1001,1002]\nBadFiles = [\"/file99\", \"*1\", \"/bin/elf_x8664\"]\n\n[FileTreeCheck]\nOldTreeFilePath = \"oldtree.json\"\nCheckPath = [\"/\"]\nCheckPermsOwnerChange = true\nCheckFileSize         = true\nCheckFileDigest       = true\n\n[FilePathOwner.\"/dir2\"]\nUid = 0\nGid = 0\n\n[FilePathOwner.\"/dir3\"]\nUid = 1001\nGid = 1001\n\n[FileStatCheck.\"/file2\"]\nAllowEmpty = true\nUid = 123\nGid = 0\nMode = \"100123\"\nDesc = \"this needs to be this way\"\n\n[FileStatCheck.\"/dir2/file22\"]\nAllowEmpty = true\nDesc = \"this needs to be this way\"\n\n[FileStatCheck.\"/ver\"]\nAllowEmpty = false\nUid = -1\nGid = -1\nMode = \"\"\nDesc = \"this needs to be this way\"\n\n[FileStatCheck.\"/file1\"]\nAllowEmpty = false\nUid = -1\nGid = -1\nMode = \"\"\nDesc = \"this needs to be this way\"\n\n[FileContent.\"ensure all elf files are x86 64bit\"]\nFile = \"/bin\"\nScript=\"check_file_x8664.sh\"\n\n[FileContent.\"ensure bins are stripped\"]\nFile = \"/\"\nScriptOptions = [\"*\"]\nScript=\"check_file_elf_stripped.sh\"\n\n[FileContent.\"script_test\"]\nFile = \"/file2\"\nScriptOptions = [\"*\", \"script\", \"test\"]\nScript=\"script_test.sh\"\n\n[DataExtract.\"extract_test\"]\nFile = \"/file2\"\nScriptOptions = [\"extract\", \"test\"]\nScript=\"script_test.sh\"\n\n[FileContent.\"date1 needs to be specific\"]\nFile = \"/date1\"\nDigest=\"8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd5\"\n\n[FileContent.\"ver needs to be specific\"]\nFile = \"/ver\"\nDigest=\"8b15095ed1af38d5e383af1c4eadc5ae73cab03964142eb54cb0477ccd6a8dd4\"\n\n[FileContent.\"version check\"]\nFile = \"/ver\"\nRegex= \".*version=1.2.3.*\"\nMatch = false\n\n[DirContent.\"/dir1\"]\nAllowed=[\"dir11\", 
\"file11\"]\nRequired=[\"nofile\"]\n\n[DirContent.\"/dir2\"]\nAllowed=[\"file?1\"]\nRequired=[\"file21\"]\n\n[DataExtract.\"Version\"]\nFile = \"/ver\"\nRegEx = \"^version=(\\\\S+)\\\\n.*\"\n\n[DataExtract.\"date1_file\"]\nFile = \"/date1\"\nScript = \"catfile.sh\"\n"
  },
  {
    "path": "test/test_cfg.toml",
    "content": "\n# test with old e2tools without selinux support\n\n[GlobalConfig]\nFsType = \"extfs\"\nFsTypeOptions = \"\"\nDigestImage = true\n\n# load actual config\n[Include.\"test/test_cfg.base.toml\"]\n"
  },
  {
    "path": "test/test_cfg_selinux.toml",
    "content": "\n# test with NEW e2tools with selinux support\n\n[GlobalConfig]\nFsType = \"extfs\"\nFsTypeOptions = \"selinux\"\nDigestImage = true\n\n# load actual config\n[Include.\"test/test_cfg.base.toml\"]\n"
  },
  {
    "path": "test/testdir/dir1/file2",
    "content": ""
  },
  {
    "path": "test/testdir/file1.txt",
    "content": ""
  },
  {
    "path": "test/testdir/jsonfile.json",
    "content": "{\"test_var\": 1, \"test_str\": \"yolo\"}\n"
  }
]