Add files using upload-large-folder tool
- local-test-libxml2-delta-02/fuzz-tooling/docs/README.md +19 -0
- local-test-libxml2-delta-02/fuzz-tooling/docs/favicon.ico +0 -0
- local-test-libxml2-delta-02/fuzz-tooling/docs/index.md +88 -0
- local-test-libxml2-delta-02/fuzz-tooling/docs/new_project_guide.md +1 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/__pycache__/constants.cpython-312.pyc +0 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/base-images/aixcc_build_all.sh +59 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/base-images/all.sh +28 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/ci/build.py +292 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/ci/build_test.py +124 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/affected_fuzz_targets.py +113 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/build-images.sh +34 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/clusterfuzz_deployment.py +385 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/docker.py +127 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/requirements.txt +4 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/run_fuzzers_entrypoint.py +97 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/sarif_utils.py +251 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/test_helpers.py +117 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/workspace_utils.py +85 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/repo_manager.py +272 -0
- local-test-libxml2-delta-02/fuzz-tooling/infra/tools/hold_back_images.py +128 -0
local-test-libxml2-delta-02/fuzz-tooling/docs/README.md
ADDED
@@ -0,0 +1,19 @@
# Readme

Use the following instructions to make documentation changes locally.

## Prerequisites
```bash
$ sudo apt install ruby bundler
$ bundle config set path 'vendor/bundle'
$ bundle install
```

## Serving locally
```bash
$ bundle exec jekyll serve
```

## Theme documentation
We are using the [just the docs](https://just-the-docs.github.io/just-the-docs/)
theme.
local-test-libxml2-delta-02/fuzz-tooling/docs/favicon.ico
ADDED
local-test-libxml2-delta-02/fuzz-tooling/docs/index.md
ADDED
@@ -0,0 +1,88 @@
---
layout: default
title: OSS-Fuzz
permalink: /
nav_order: 1
has_children: true
has_toc: false
---

# OSS-Fuzz

[Fuzz testing] is a well-known technique for uncovering programming errors in
software. Many of these detectable errors, like [buffer overflow], can have
serious security implications. Google has found [thousands] of security
vulnerabilities and stability bugs by deploying [guided in-process fuzzing of
Chrome components], and we now want to share that service with the open source
community.

[Fuzz testing]: https://en.wikipedia.org/wiki/Fuzz_testing
[buffer overflow]: https://en.wikipedia.org/wiki/Buffer_overflow
[thousands]: https://bugs.chromium.org/p/chromium/issues/list?q=label%3AStability-LibFuzzer%2CStability-AFL%20-status%3ADuplicate%2CWontFix&can=1
[guided in-process fuzzing of Chrome components]: https://security.googleblog.com/2016/08/guided-in-process-fuzzing-of-chrome.html

In cooperation with the [Core Infrastructure Initiative] and the [OpenSSF],
OSS-Fuzz aims to make common open source software more secure and stable by
combining modern fuzzing techniques with scalable, distributed execution.
Projects that do not qualify for OSS-Fuzz (e.g. closed source) can run their own
instances of [ClusterFuzz] or [ClusterFuzzLite].

[Core Infrastructure Initiative]: https://www.coreinfrastructure.org/
[OpenSSF]: https://www.openssf.org/

We support the [libFuzzer], [AFL++], [Honggfuzz], and [Centipede] fuzzing engines in
combination with [Sanitizers], as well as [ClusterFuzz], a distributed fuzzer
execution environment and reporting tool.

[libFuzzer]: https://llvm.org/docs/LibFuzzer.html
[AFL++]: https://github.com/AFLplusplus/AFLplusplus
[Honggfuzz]: https://github.com/google/honggfuzz
[Centipede]: https://github.com/google/centipede
[Sanitizers]: https://github.com/google/sanitizers
[ClusterFuzz]: https://github.com/google/clusterfuzz
[ClusterFuzzLite]: https://google.github.io/clusterfuzzlite/

Currently, OSS-Fuzz supports C/C++, Rust, Go, Python and Java/JVM code. Other
languages supported by [LLVM] may work too. OSS-Fuzz supports fuzzing x86_64
and i386 builds.

[LLVM]: https://llvm.org


## Project history
OSS-Fuzz was launched in 2016 in response to the
[Heartbleed] vulnerability, discovered in [OpenSSL], one of the
most popular open source projects for encrypting web traffic. The vulnerability
had the potential to affect almost every internet user, yet was caused by a
relatively simple memory buffer overflow bug that could have been detected by
fuzzing—that is, by running the code on randomized inputs to intentionally cause
unexpected behaviors or crashes. At the time, though, fuzzing
was not widely used and was cumbersome for developers, requiring extensive
manual effort.

Google created OSS-Fuzz to fill this gap: it's a free service that runs fuzzers
for open source projects and privately alerts developers to the bugs detected.
Since its launch, OSS-Fuzz has become a critical service for the open source
community, growing beyond C/C++ to
detect problems in memory-safe languages such as Go, Rust, and Python.

[Heartbleed]: https://heartbleed.com/
[OpenSSL]: https://www.openssl.org/

## Learn more about fuzzing

This documentation describes how to use the OSS-Fuzz service for your open source
project. To learn more about fuzzing in general, we recommend reading the
[libFuzzer tutorial] and the other docs in the [google/fuzzing] repository. These
and some other resources are listed on the [useful links] page.

[google/fuzzing]: https://github.com/google/fuzzing/tree/master/docs
[libFuzzer tutorial]: https://github.com/google/fuzzing/blob/master/tutorial/libFuzzerTutorial.md
[useful links]: {{ site.baseurl }}/reference/useful-links/#tutorials

## Trophies
As of August 2023, OSS-Fuzz has helped identify and fix over [10,000] vulnerabilities and [36,000] bugs across [1,000] projects.

[10,000]: https://bugs.chromium.org/p/oss-fuzz/issues/list?q=Type%3DBug-Security%20label%3Aclusterfuzz%20-status%3ADuplicate%2CWontFix&can=1
[36,000]: https://bugs.chromium.org/p/oss-fuzz/issues/list?q=Type%3DBug%20label%3Aclusterfuzz%20-status%3ADuplicate%2CWontFix&can=1
[1,000]: https://github.com/google/oss-fuzz/tree/master/projects
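
The engines named in index.md above all drive the same kind of entry point: a function that receives a pseudo-random byte buffer and feeds it to the code under test. As a rough illustration only (this is not part of the files in this change, and `json.loads` merely stands in for real project code), a minimal Python harness for the Atheris engine looks something like the sketch below; C/C++ projects express the same idea through `LLVMFuzzerTestOneInput`.

```python
# Illustrative sketch of a libFuzzer-style fuzz target in Python (Atheris).
# Not part of this change; json.loads stands in for the library under test.
import sys

import atheris

with atheris.instrument_imports():
  import json


def TestOneInput(data):
  """Called by the fuzzing engine with a pseudo-random byte string."""
  fdp = atheris.FuzzedDataProvider(data)
  text = fdp.ConsumeUnicodeNoSurrogates(4096)
  try:
    json.loads(text)
  except json.JSONDecodeError:
    pass  # Malformed input is expected; crashes and hangs are the real finds.


if __name__ == '__main__':
  atheris.Setup(sys.argv, TestOneInput)
  atheris.Fuzz()
```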
local-test-libxml2-delta-02/fuzz-tooling/docs/new_project_guide.md
ADDED
@@ -0,0 +1 @@
This page has moved [here](https://google.github.io/oss-fuzz/getting-started/new-project-guide/)
local-test-libxml2-delta-02/fuzz-tooling/infra/__pycache__/constants.cpython-312.pyc
ADDED
Binary file (802 Bytes).
local-test-libxml2-delta-02/fuzz-tooling/infra/base-images/aixcc_build_all.sh
ADDED
@@ -0,0 +1,59 @@
#!/bin/bash -eux

if [ "$1" = "--cache-from" ]; then
  PULL_CACHE=1
  shift
  CACHE_TAG="${1//\//-}" # s/\//-/g -> for branch names that contain slashes
  shift
elif [ "$1" = "--cache-to" ]; then
  PUSH_CACHE=1
  shift
  CACHE_TAG="${1//\//-}" # s/\//-/g -> for branch names that contain slashes
  shift
fi

ARG_TAG="$1"
shift

BASE_IMAGES=(
  "ghcr.io/aixcc-finals/base-image infra/base-images/base-image"
  "ghcr.io/aixcc-finals/base-clang infra/base-images/base-clang"
  "ghcr.io/aixcc-finals/base-builder infra/base-images/base-builder"
  "ghcr.io/aixcc-finals/base-builder-go infra/base-images/base-builder-go"
  "ghcr.io/aixcc-finals/base-builder-jvm infra/base-images/base-builder-jvm"
  "ghcr.io/aixcc-finals/base-builder-python infra/base-images/base-builder-python"
  "ghcr.io/aixcc-finals/base-builder-rust infra/base-images/base-builder-rust"
  "ghcr.io/aixcc-finals/base-builder-ruby infra/base-images/base-builder-ruby"
  "ghcr.io/aixcc-finals/base-builder-swift infra/base-images/base-builder-swift"
  "ghcr.io/aixcc-finals/base-runner infra/base-images/base-runner"
  "ghcr.io/aixcc-finals/base-runner-debug infra/base-images/base-runner-debug"
)

for tuple in "${BASE_IMAGES[@]}"; do
  read -r image path <<< "$tuple"

  if [ "${PULL_CACHE+x}" ]; then

    docker buildx build \
      --build-arg IMG_TAG="${ARG_TAG}" \
      --cache-from=type=registry,ref="${image}:${CACHE_TAG}" \
      --tag "${image}:${ARG_TAG}" --push "$@" "${path}"

  elif [ "${PUSH_CACHE+x}" ]; then

    docker buildx build \
      --build-arg IMG_TAG="${ARG_TAG}" \
      --cache-from=type=registry,ref="${image}:${CACHE_TAG}" \
      --cache-to=type=registry,ref="${image}:${CACHE_TAG}",mode=max \
      --tag "${image}:${ARG_TAG}" --push "$@" "${path}"

  else

    docker buildx build \
      --build-arg IMG_TAG="${ARG_TAG}" \
      --tag "${image}:${ARG_TAG}" --push "$@" "${path}"

  fi

done
local-test-libxml2-delta-02/fuzz-tooling/infra/base-images/all.sh
ADDED
@@ -0,0 +1,28 @@
#!/bin/bash -eux
# Copyright 2016 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
################################################################################

docker build --pull -t ghcr.io/aixcc-finals/base-image "$@" infra/base-images/base-image
docker build -t ghcr.io/aixcc-finals/base-clang "$@" infra/base-images/base-clang
docker build -t ghcr.io/aixcc-finals/base-builder "$@" infra/base-images/base-builder
docker build -t ghcr.io/aixcc-finals/base-builder-go "$@" infra/base-images/base-builder-go
docker build -t ghcr.io/aixcc-finals/base-builder-jvm "$@" infra/base-images/base-builder-jvm
docker build -t ghcr.io/aixcc-finals/base-builder-python "$@" infra/base-images/base-builder-python
docker build -t ghcr.io/aixcc-finals/base-builder-rust "$@" infra/base-images/base-builder-rust
docker build -t ghcr.io/aixcc-finals/base-builder-ruby "$@" infra/base-images/base-builder-ruby
docker build -t ghcr.io/aixcc-finals/base-builder-swift "$@" infra/base-images/base-builder-swift
docker build -t ghcr.io/aixcc-finals/base-runner "$@" infra/base-images/base-runner
docker build -t ghcr.io/aixcc-finals/base-runner-debug "$@" infra/base-images/base-runner-debug
local-test-libxml2-delta-02/fuzz-tooling/infra/ci/build.py
ADDED
@@ -0,0 +1,292 @@
#!/usr/bin/env python
# Copyright 2019 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
################################################################################
"""Build modified projects."""

from __future__ import print_function

import enum
import os
import re
import sys
import subprocess
import yaml

# pylint: disable=wrong-import-position,import-error
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import constants

CANARY_PROJECT = 'skcms'

DEFAULT_ARCHITECTURES = ['x86_64']
DEFAULT_ENGINES = ['afl', 'honggfuzz', 'libfuzzer', 'centipede']
DEFAULT_SANITIZERS = ['address', 'undefined']


def get_changed_files_output():
  """Returns the output of a git command that discovers changed files."""
  branch_commit_hash = subprocess.check_output(
      ['git', 'merge-base', 'HEAD', 'origin/HEAD']).strip().decode()

  return subprocess.check_output(
      ['git', 'diff', '--name-only', branch_commit_hash + '..']).decode()


def get_modified_buildable_projects():
  """Returns a list of all the projects modified in this commit that have a
  build.sh file."""
  git_output = get_changed_files_output()
  projects_regex = '.*projects/(?P<name>.*)/.*\n'
  modified_projects = set(re.findall(projects_regex, git_output))
  projects_dir = os.path.join(get_oss_fuzz_root(), 'projects')
  # Filter out projects without Dockerfile files since new projects and reverted
  # projects frequently don't have them. In these cases we don't want Travis's
  # builds to fail.
  modified_buildable_projects = []
  for project in modified_projects:
    if not os.path.exists(os.path.join(projects_dir, project, 'Dockerfile')):
      print('Project {0} does not have Dockerfile. Skipping build.'.format(
          project))
      continue
    modified_buildable_projects.append(project)
  return modified_buildable_projects


def get_oss_fuzz_root():
  """Get the absolute path of the root of the oss-fuzz checkout."""
  script_path = os.path.realpath(__file__)
  return os.path.abspath(
      os.path.dirname(os.path.dirname(os.path.dirname(script_path))))


def execute_helper_command(helper_command):
  """Execute |helper_command| using helper.py."""
  root = get_oss_fuzz_root()
  script_path = os.path.join(root, 'infra', 'helper.py')
  command = ['python', script_path] + helper_command
  print('Running command: %s' % ' '.join(command))
  subprocess.check_call(command)


def build_fuzzers(project, engine, sanitizer, architecture):
  """Execute helper.py's build_fuzzers command on |project|. Build the fuzzers
  with |engine| and |sanitizer| for |architecture|."""
  execute_helper_command([
      'build_fuzzers', project, '--engine', engine, '--sanitizer', sanitizer,
      '--architecture', architecture
  ])


def check_build(project, engine, sanitizer, architecture):
  """Execute helper.py's check_build command on |project|, assuming it was most
  recently built with |engine| and |sanitizer| for |architecture|."""
  execute_helper_command([
      'check_build', project, '--engine', engine, '--sanitizer', sanitizer,
      '--architecture', architecture
  ])


def should_build_coverage(project_yaml):
  """Returns True if a coverage build should be done based on project.yaml
  contents."""
  # Enable coverage builds on projects that use engines. Those that don't use
  # engines shouldn't get coverage builds.
  engines = project_yaml.get('fuzzing_engines', DEFAULT_ENGINES)
  engineless = 'none' in engines
  if engineless:
    assert_message = ('Forbidden to specify multiple engines for '
                      '"fuzzing_engines" if "none" is specified.')
    assert len(engines) == 1, assert_message
    return False
  if 'wycheproof' in engines:
    return False

  language = project_yaml.get('language')
  if language not in constants.LANGUAGES_WITH_COVERAGE_SUPPORT:
    print(('Project is written in "{language}", '
           'coverage is not supported yet.').format(language=language))
    return False

  return True


def flatten_options(option_list):
  """Flattens |option_list| (a list of sanitizers, architectures or fuzzing
  engines) by returning each element in the list that isn't a dictionary. For
  elements that are dictionaries, the sole key is returned."""
  result = []
  for option in option_list:
    if isinstance(option, dict):
      keys = list(option.keys())
      assert len(keys) == 1
      result.append(keys[0])
      continue
    result.append(option)
  print(result)
  return result


def should_build(project_yaml):
  """Returns True if the build specified is enabled in the project.yaml."""

  if os.getenv('SANITIZER') == 'coverage':
    # This assumes we only do coverage builds with libFuzzer on x86_64.
    return should_build_coverage(project_yaml)

  def is_enabled(env_var, yaml_name, defaults):
    """Is the value of |env_var| enabled in |project_yaml| (in the |yaml_name|
    section)? Uses |defaults| if |yaml_name| section is unspecified."""
    return os.getenv(env_var) in flatten_options(
        project_yaml.get(yaml_name, defaults))

  return (is_enabled('ENGINE', 'fuzzing_engines', DEFAULT_ENGINES) and
          is_enabled('SANITIZER', 'sanitizers', DEFAULT_SANITIZERS) and
          is_enabled('ARCHITECTURE', 'architectures', DEFAULT_ARCHITECTURES))


def build_project(project):
  """Do the build of |project| that is specified by the environment variables -
  SANITIZER, ENGINE, and ARCHITECTURE."""
  root = get_oss_fuzz_root()
  project_yaml_path = os.path.join(root, 'projects', project, 'project.yaml')
  with open(project_yaml_path) as file_handle:
    project_yaml = yaml.safe_load(file_handle)

  if project_yaml.get('disabled', False):
    print('Project {0} is disabled, skipping build.'.format(project))
    return

  engine = os.getenv('ENGINE')
  sanitizer = os.getenv('SANITIZER')
  architecture = os.getenv('ARCHITECTURE')

  if not should_build(project_yaml):
    print(('Specified build: engine: {0}, sanitizer: {1}, architecture: {2} '
           'not enabled for this project: {3}. Skipping build.').format(
               engine, sanitizer, architecture, project))

    return

  print('Building project', project)
  build_fuzzers(project, engine, sanitizer, architecture)

  run_tests = project_yaml.get('run_tests', True)
  if engine != 'none' and sanitizer != 'coverage' and run_tests:
    check_build(project, engine, sanitizer, architecture)


class BuildModifiedProjectsResult(enum.Enum):
  """Enum containing the return values of build_modified_projects()."""
  NONE_BUILT = 0
  BUILD_SUCCESS = 1
  BUILD_FAIL = 2


def build_modified_projects():
  """Build modified projects. Returns BuildModifiedProjectsResult.NONE_BUILT if
  no builds were attempted. Returns BuildModifiedProjectsResult.BUILD_SUCCESS if
  all attempts succeed, otherwise returns
  BuildModifiedProjectsResult.BUILD_FAIL."""
  projects = get_modified_buildable_projects()
  if not projects:
    return BuildModifiedProjectsResult.NONE_BUILT

  failed_projects = []
  for project in projects:
    try:
      build_project(project)
    except subprocess.CalledProcessError:
      failed_projects.append(project)

  if failed_projects:
    print('Failed projects:', ' '.join(failed_projects))
    return BuildModifiedProjectsResult.BUILD_FAIL

  return BuildModifiedProjectsResult.BUILD_SUCCESS


def is_infra_changed():
  """Returns True if the infra directory was changed."""
  git_output = get_changed_files_output()
  infra_code_regex = '.*infra/.*\n'
  return re.search(infra_code_regex, git_output) is not None


def build_base_images():
  """Builds base images."""
  # TODO(jonathanmetzman): Investigate why caching fails so often and
  # when we improve it, build base-clang as well. Also, move this function
  # to a helper command when we can support base-clang.
  execute_helper_command(['pull_images'])
  images = [
      'base-image',
      'base-builder',
      'base-builder-go',
      'base-builder-javascript',
      'base-builder-jvm',
      'base-builder-python',
      'base-builder-rust',
      'base-builder-swift',
      'base-builder-ruby',
      'base-runner',
  ]
  for image in images:
    try:
      execute_helper_command(['build_image', image, '--no-pull', '--cache'])
    except subprocess.CalledProcessError:
      return 1

  return 0


def build_canary_project():
  """Builds a specific project when infra/ is changed to verify that infra/
  changes don't break things. Returns False if build was attempted but
  failed."""

  try:
    build_project('skcms')
  except subprocess.CalledProcessError:
    return False

  return True


def main():
  """Build modified projects or canary project."""
  os.environ['OSS_FUZZ_CI'] = '1'
  infra_changed = is_infra_changed()
  if infra_changed:
    print('Pulling and building base images first.')
    if build_base_images():
      return 1

  result = build_modified_projects()
  if result == BuildModifiedProjectsResult.BUILD_FAIL:
    return 1

  # It's unnecessary to build the canary if we've built any projects already.
  no_projects_built = result == BuildModifiedProjectsResult.NONE_BUILT
  should_build_canary = no_projects_built and infra_changed
  if should_build_canary and not build_canary_project():
    return 1

  return 0


if __name__ == '__main__':
  sys.exit(main())
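
For context, here is a hypothetical usage sketch (not part of the change above) of the enablement check in build.py: `should_build()` compares the CI's `ENGINE`, `SANITIZER`, and `ARCHITECTURE` environment variables against the project's `project.yaml`, falling back to the defaults when a section is missing. The `project_yaml` dict below is made up, and importing `ci.build` assumes `infra/` is on `sys.path`, as `build_test.py` arranges.

```python
# Hypothetical sketch of build.py's enablement check; values are illustrative.
import os

from ci import build  # assumes infra/ is on sys.path (see build_test.py)

os.environ['ENGINE'] = 'libfuzzer'
os.environ['SANITIZER'] = 'address'
os.environ['ARCHITECTURE'] = 'x86_64'

project_yaml = {
    'language': 'c++',
    'fuzzing_engines': ['libfuzzer', 'afl'],
    # Dict entries are flattened to their single key by flatten_options().
    'sanitizers': ['address', {'memory': {'experimental': True}}],
}

# True: libfuzzer + address + x86_64 are all enabled for this project.
print(build.should_build(project_yaml))

# Switching to an engine the project does not list disables the build.
os.environ['ENGINE'] = 'centipede'
print(build.should_build(project_yaml))  # False
```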
local-test-libxml2-delta-02/fuzz-tooling/infra/ci/build_test.py
ADDED
@@ -0,0 +1,124 @@
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
################################################################################
"""Tests for build.py"""

import os
import sys
import unittest
from unittest import mock

# pylint: disable=wrong-import-position
INFRA_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(INFRA_DIR)

from ci import build


def patch_environ(testcase_obj):
  """Patch environment."""
  env = {}
  patcher = mock.patch.dict(os.environ, env)
  testcase_obj.addCleanup(patcher.stop)
  patcher.start()


def _set_coverage_build():
  """Set the right environment variables for a coverage build."""
  os.environ['SANITIZER'] = 'coverage'
  os.environ['ENGINE'] = 'libfuzzer'
  os.environ['ARCHITECTURE'] = 'x86_64'


class TestShouldBuild(unittest.TestCase):
  """Tests that should_build() works as intended."""

  def setUp(self):
    patch_environ(self)

  def test_none_engine_coverage_build(self):
    """Tests that should_build returns False for a coverage build of a
    project that specifies 'none' for fuzzing_engines."""
    _set_coverage_build()
    project_yaml = {
        'language': 'c++',
        'fuzzing_engines': ['none'],
        'sanitizers': ['address']
    }
    self.assertFalse(build.should_build(project_yaml))

  def test_unspecified_engines_coverage_build(self):
    """Tests that should_build returns True for a coverage build of a
    project that doesn't specify fuzzing_engines."""
    _set_coverage_build()
    project_yaml = {'language': 'c++'}
    self.assertTrue(build.should_build(project_yaml))

  def test_libfuzzer_coverage_build(self):
    """Tests that should_build returns True for a coverage build of a project
    specifying 'libfuzzer' for fuzzing_engines."""
    _set_coverage_build()
    project_yaml = {
        'language': 'c++',
        'fuzzing_engines': ['libfuzzer'],
        'sanitizers': ['address']
    }
    self.assertTrue(build.should_build(project_yaml))

  def test_go_coverage_build(self):
    """Tests that should_build returns True for a coverage build of a Go
    project that doesn't specify fuzzing_engines."""
    _set_coverage_build()
    project_yaml = {'language': 'go'}
    self.assertTrue(build.should_build(project_yaml))

  def test_engine_project_none_build(self):
    """Tests that should_build returns False for an ENGINE=none build when
    the project only specifies 'libfuzzer' for fuzzing_engines."""
    os.environ['SANITIZER'] = 'address'
    os.environ['ENGINE'] = 'none'
    os.environ['ARCHITECTURE'] = 'x86_64'
    project_yaml = {
        'language': 'c++',
        'fuzzing_engines': ['libfuzzer'],
        'sanitizers': ['address']
    }
    self.assertFalse(build.should_build(project_yaml))

  def test_centipede_none_build(self):
    """Tests that should_build returns True for a none sanitizer build of a
    project specifying 'centipede' for fuzzing_engines."""
    os.environ['SANITIZER'] = 'none'
    os.environ['ENGINE'] = 'centipede'
    os.environ['ARCHITECTURE'] = 'x86_64'
    project_yaml = {
        'language': 'c++',
        'fuzzing_engines': ['centipede'],
        'sanitizers': ['none']
    }
    self.assertTrue(build.should_build(project_yaml))

  def test_centipede_address_build(self):
    """Tests that should_build returns True for an address sanitizer build of
    a project specifying 'centipede' for fuzzing_engines."""
    os.environ['SANITIZER'] = 'address'
    os.environ['ENGINE'] = 'centipede'
    os.environ['ARCHITECTURE'] = 'x86_64'
    project_yaml = {
        'language': 'c++',
        'fuzzing_engines': ['centipede'],
        'sanitizers': ['address']
    }
    self.assertTrue(build.should_build(project_yaml))
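
These tests can be driven with the standard `unittest` machinery; a hypothetical sketch (assuming the tests are run from `infra/` with `constants` and PyYAML importable):

```python
# Hypothetical: load and run the should_build tests programmatically.
import unittest

from ci import build_test

suite = unittest.defaultTestLoader.loadTestsFromModule(build_test)
unittest.TextTestRunner(verbosity=2).run(suite)
```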
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/affected_fuzz_targets.py
ADDED
@@ -0,0 +1,113 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Module for dealing with fuzz targets affected by the change-under-test
(CUT)."""
import logging
import os
import sys

# pylint: disable=wrong-import-position,import-error
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import utils


def remove_unaffected_fuzz_targets(clusterfuzz_deployment, out_dir,
                                   files_changed, repo_path):
  """Removes all unaffected fuzz targets in the out directory.

  Args:
    clusterfuzz_deployment: The ClusterFuzz deployment object.
    out_dir: The location of the fuzz target binaries.
    files_changed: A list of files changed compared to HEAD.
    repo_path: The location of the OSS-Fuzz repo in the docker image.

  This function will not delete fuzz targets unless it knows that the fuzz
  targets are unaffected. For example, this means that fuzz targets for which
  there is no coverage data will not be deleted.
  """
  if not files_changed:
    # Don't remove any fuzz targets if there is no difference from HEAD.
    logging.info('No files changed compared to HEAD.')
    return

  logging.info('Files changed in PR: %s', files_changed)

  fuzz_target_paths = utils.get_fuzz_targets(out_dir)
  if not fuzz_target_paths:
    # Nothing to remove.
    logging.error('No fuzz targets found in out dir.')
    return

  coverage = clusterfuzz_deployment.get_coverage(repo_path)
  if not coverage:
    # Don't remove any fuzz targets unless we have data.
    logging.error('Could not find latest coverage report.')
    return

  affected_fuzz_targets = get_affected_fuzz_targets(coverage, fuzz_target_paths,
                                                    files_changed)

  if not affected_fuzz_targets:
    logging.info('No affected fuzz targets detected, keeping all as fallback.')
    return

  logging.info('Using affected fuzz targets: %s.', affected_fuzz_targets)
  unaffected_fuzz_targets = set(fuzz_target_paths) - affected_fuzz_targets
  logging.info('Removing unaffected fuzz targets: %s.', unaffected_fuzz_targets)

  # Remove all the targets that are not affected.
  for fuzz_target_path in unaffected_fuzz_targets:
    try:
      os.remove(fuzz_target_path)
    except OSError as error:
      logging.error('%s occurred while removing file %s', error,
                    fuzz_target_path)


def is_fuzz_target_affected(coverage, fuzz_target_path, files_changed):
  """Returns True if a fuzz target (|fuzz_target_path|) is affected by
  |files_changed|."""
  fuzz_target = os.path.basename(fuzz_target_path)
  covered_files = coverage.get_files_covered_by_target(fuzz_target)
  if not covered_files:
    # Assume a fuzz target is affected if we can't get its coverage from
    # OSS-Fuzz.
    # TODO(metzman): Figure out what we should do if covered_files is [].
    # Should we act as if we couldn't get the coverage?
    logging.info('Could not get coverage for %s. Treating as affected.',
                 fuzz_target)
    return True

  covered_files = [
      os.path.normpath(covered_file) for covered_file in covered_files
  ]
  logging.info('Fuzz target %s is affected by: %s', fuzz_target, covered_files)
  for filename in files_changed:
    if filename in covered_files:
      logging.info('Fuzz target %s is affected by changed file: %s',
                   fuzz_target, filename)
      return True

  logging.info('Fuzz target %s is not affected.', fuzz_target)
  return False


def get_affected_fuzz_targets(coverage, fuzz_target_paths, files_changed):
  """Returns the set of paths of affected targets."""
  affected_fuzz_targets = set()
  for fuzz_target_path in fuzz_target_paths:
    if is_fuzz_target_affected(coverage, fuzz_target_path, files_changed):
      affected_fuzz_targets.add(fuzz_target_path)

  return affected_fuzz_targets
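
A hypothetical sketch of the filtering above (not part of the change): `get_affected_fuzz_targets()` keeps a target only if the coverage data says that target exercises one of the changed files. `FakeCoverage` stands in for the objects returned by `clusterfuzz_deployment.get_coverage()`, the paths are made up, and running it assumes `infra/cifuzz` is importable.

```python
# Hypothetical illustration of coverage-based target filtering.
import affected_fuzz_targets  # assumes infra/cifuzz is on sys.path


class FakeCoverage:
  """Stand-in coverage object mapping fuzz targets to covered source files."""

  def __init__(self, covered_files_by_target):
    self._covered = covered_files_by_target

  def get_files_covered_by_target(self, target):
    return self._covered.get(target)


coverage = FakeCoverage({
    'xml_fuzzer': ['parser.c', 'tree.c'],
    'regexp_fuzzer': ['regexp.c'],
})

affected = affected_fuzz_targets.get_affected_fuzz_targets(
    coverage, ['/out/xml_fuzzer', '/out/regexp_fuzzer'],
    files_changed=['parser.c'])
print(affected)  # {'/out/xml_fuzzer'}: only targets covering a changed file.
```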
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/build-images.sh
ADDED
@@ -0,0 +1,34 @@
#! /bin/bash -eux
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Script for building the docker images for cifuzz.

CIFUZZ_DIR=$(dirname "$0")
CIFUZZ_DIR=$(realpath $CIFUZZ_DIR)
INFRA_DIR=$(realpath $CIFUZZ_DIR/..)
OSS_FUZZ_ROOT=$(realpath $INFRA_DIR/..)

# Build cifuzz-base.
docker build --tag ghcr.io/aixcc-finals/cifuzz-base --file $CIFUZZ_DIR/cifuzz-base/Dockerfile $OSS_FUZZ_ROOT

# Build run-fuzzers and build-fuzzers images.
docker build \
  --tag ghcr.io/aixcc-finals/clusterfuzzlite-build-fuzzers-test:v1 \
  --tag ghcr.io/aixcc-finals/clusterfuzzlite-build-fuzzers:v1 \
  --file $INFRA_DIR/build_fuzzers.Dockerfile $INFRA_DIR
docker build \
  --tag ghcr.io/aixcc-finals/clusterfuzzlite-run-fuzzers:v1 \
  --tag ghcr.io/aixcc-finals/clusterfuzzlite-run-fuzzers-test:v1 \
  --file $INFRA_DIR/run_fuzzers.Dockerfile $INFRA_DIR
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/clusterfuzz_deployment.py
ADDED
@@ -0,0 +1,385 @@
| 1 |
+
# Copyright 2021 Google LLC
|
| 2 |
+
#
|
| 3 |
+
# Licensed under the Apache License, Version 2.0 (the "License");
|
| 4 |
+
# you may not use this file except in compliance with the License.
|
| 5 |
+
# You may obtain a copy of the License at
|
| 6 |
+
#
|
| 7 |
+
# http://www.apache.org/licenses/LICENSE-2.0
|
| 8 |
+
#
|
| 9 |
+
# Unless required by applicable law or agreed to in writing, software
|
| 10 |
+
# distributed under the License is distributed on an "AS IS" BASIS,
|
| 11 |
+
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
| 12 |
+
# See the License for the specific language governing permissions and
|
| 13 |
+
# limitations under the License.
|
| 14 |
+
"""Module for interacting with the ClusterFuzz deployment."""
|
| 15 |
+
import logging
|
| 16 |
+
import os
|
| 17 |
+
import sys
|
| 18 |
+
import urllib.error
|
| 19 |
+
import urllib.request
|
| 20 |
+
|
| 21 |
+
import config_utils
|
| 22 |
+
import continuous_integration
|
| 23 |
+
import filestore_utils
|
| 24 |
+
import http_utils
|
| 25 |
+
import get_coverage
|
| 26 |
+
import repo_manager
|
| 27 |
+
|
| 28 |
+
# pylint: disable=wrong-import-position,import-error
|
| 29 |
+
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
| 30 |
+
import utils
|
| 31 |
+
|
| 32 |
+
|
| 33 |
+
class BaseClusterFuzzDeployment:
|
| 34 |
+
"""Base class for ClusterFuzz deployments."""
|
| 35 |
+
|
| 36 |
+
def __init__(self, config, workspace):
|
| 37 |
+
self.config = config
|
| 38 |
+
self.workspace = workspace
|
| 39 |
+
self.ci_system = continuous_integration.get_ci(config)
|
| 40 |
+
|
| 41 |
+
def download_latest_build(self):
|
| 42 |
+
"""Downloads the latest build from ClusterFuzz.
|
| 43 |
+
|
| 44 |
+
Returns:
|
| 45 |
+
A path to where the OSS-Fuzz build was stored, or None if it wasn't.
|
| 46 |
+
"""
|
| 47 |
+
raise NotImplementedError('Child class must implement method.')
|
| 48 |
+
|
| 49 |
+
def upload_build(self, commit):
|
| 50 |
+
"""Uploads the build with the given commit sha to the filestore."""
|
| 51 |
+
raise NotImplementedError('Child class must implement method.')
|
| 52 |
+
|
| 53 |
+
def download_corpus(self, target_name, corpus_dir):
|
| 54 |
+
"""Downloads the corpus for |target_name| from ClusterFuzz to |corpus_dir|.
|
| 55 |
+
|
| 56 |
+
Returns:
|
| 57 |
+
A path to where the OSS-Fuzz build was stored, or None if it wasn't.
|
| 58 |
+
"""
|
| 59 |
+
raise NotImplementedError('Child class must implement method.')
|
| 60 |
+
|
| 61 |
+
def upload_crashes(self):
|
| 62 |
+
"""Uploads crashes in |crashes_dir| to filestore."""
|
| 63 |
+
raise NotImplementedError('Child class must implement method.')
|
| 64 |
+
|
| 65 |
+
def upload_corpus(self, target_name, corpus_dir, replace=False): # pylint: disable=no-self-use,unused-argument
|
| 66 |
+
"""Uploads the corpus for |target_name| to filestore."""
|
| 67 |
+
raise NotImplementedError('Child class must implement method.')
|
| 68 |
+
|
| 69 |
+
def upload_coverage(self):
|
| 70 |
+
"""Uploads the coverage report to the filestore."""
|
| 71 |
+
raise NotImplementedError('Child class must implement method.')
|
| 72 |
+
|
| 73 |
+
def get_coverage(self, repo_path):
|
| 74 |
+
"""Returns the project coverage object for the project."""
|
| 75 |
+
raise NotImplementedError('Child class must implement method.')
|
| 76 |
+
|
| 77 |
+
|
| 78 |
+
def _make_empty_dir_if_nonexistent(path):
|
| 79 |
+
"""Makes an empty directory at |path| if it does not exist."""
|
| 80 |
+
os.makedirs(path, exist_ok=True)
|
| 81 |
+
|
| 82 |
+
|
| 83 |
+
class ClusterFuzzLite(BaseClusterFuzzDeployment):
|
| 84 |
+
"""Class representing a deployment of ClusterFuzzLite."""
|
| 85 |
+
|
| 86 |
+
COVERAGE_NAME = 'latest'
|
| 87 |
+
LATEST_BUILD_WINDOW = 3
|
| 88 |
+
|
| 89 |
+
def __init__(self, config, workspace):
|
| 90 |
+
super().__init__(config, workspace)
|
| 91 |
+
self.filestore = filestore_utils.get_filestore(self.config)
|
| 92 |
+
|
| 93 |
+
def download_latest_build(self):
|
| 94 |
+
if os.path.exists(self.workspace.clusterfuzz_build):
|
| 95 |
+
# This path is necessary because download_latest_build can be called
|
| 96 |
+
# multiple times.That is the case because it is called only when we need
|
| 97 |
+
# to see if a bug is novel, i.e. until we want to check a bug is novel we
|
| 98 |
+
# don't want to waste time calling this, but therefore this method can be
|
| 99 |
+
# called if multiple bugs are found.
|
| 100 |
+
return self.workspace.clusterfuzz_build
|
| 101 |
+
|
| 102 |
+
repo_dir = self.ci_system.repo_dir
|
| 103 |
+
if not repo_dir:
|
| 104 |
+
raise RuntimeError('Repo checkout does not exist.')
|
| 105 |
+
|
| 106 |
+
_make_empty_dir_if_nonexistent(self.workspace.clusterfuzz_build)
|
| 107 |
+
repo = repo_manager.RepoManager(repo_dir)
|
| 108 |
+
|
| 109 |
+
diff_base = self.ci_system.get_diff_base()
|
| 110 |
+
if not diff_base:
|
| 111 |
+
diff_base = 'HEAD^'
|
| 112 |
+
|
| 113 |
+
# Builds are stored by commit, so try the latest |LATEST_BUILD_WINDOW|
|
| 114 |
+
# commits before the current diff base.
|
| 115 |
+
# TODO(ochang): If API usage becomes an issue, this can be optimized by the
|
| 116 |
+
# filestore accepting a list of filenames to try.
|
| 117 |
+
try:
|
| 118 |
+
# TODO(metzman): Why do we default to 'origin', we should avoid going down
|
| 119 |
+
# this path entirely and not need to catch an exception.
|
| 120 |
+
commit_list = repo.get_commit_list(diff_base,
|
| 121 |
+
limit=self.LATEST_BUILD_WINDOW)
|
| 122 |
+
except ValueError as error:
|
| 123 |
+
logging.error('Can\'t get commit list: %s', error)
|
| 124 |
+
return None
|
| 125 |
+
|
| 126 |
+
for old_commit in commit_list:
|
| 127 |
+
logging.info('Trying to downloading previous build %s.', old_commit)
|
| 128 |
+
build_name = self._get_build_name(old_commit)
|
| 129 |
+
try:
|
| 130 |
+
if self.filestore.download_build(build_name,
|
| 131 |
+
self.workspace.clusterfuzz_build):
|
| 132 |
+
logging.info('Done downloading previous build.')
|
| 133 |
+
return self.workspace.clusterfuzz_build
|
| 134 |
+
|
| 135 |
+
logging.info('Build for %s does not exist.', old_commit)
|
| 136 |
+
except Exception as err: # pylint: disable=broad-except
|
| 137 |
+
logging.error('Could not download build for %s because of: %s',
|
| 138 |
+
old_commit, err)
|
| 139 |
+
|
| 140 |
+
return None
|
| 141 |
+
|
| 142 |
+
def download_corpus(self, target_name, corpus_dir):
|
| 143 |
+
_make_empty_dir_if_nonexistent(corpus_dir)
|
| 144 |
+
logging.info('Downloading corpus for %s to %s.', target_name, corpus_dir)
|
| 145 |
+
corpus_name = self._get_corpus_name(target_name)
|
| 146 |
+
try:
|
| 147 |
+
self.filestore.download_corpus(corpus_name, corpus_dir)
|
| 148 |
+
logging.info('Done downloading corpus. Contains %d elements.',
|
| 149 |
+
len(os.listdir(corpus_dir)))
|
| 150 |
+
except Exception as err: # pylint: disable=broad-except
|
| 151 |
+
logging.error('Failed to download corpus for target: %s. Error: %s',
|
| 152 |
+
target_name, str(err))
|
| 153 |
+
return corpus_dir
|
| 154 |
+
|
| 155 |
+
def _get_build_name(self, name):
|
| 156 |
+
return f'{self.config.sanitizer}-{name}'
|
| 157 |
+
|
| 158 |
+
def _get_corpus_name(self, target_name): # pylint: disable=no-self-use
|
| 159 |
+
"""Returns the name of the corpus artifact."""
|
| 160 |
+
return target_name
|
| 161 |
+
|
| 162 |
+
def upload_corpus(self, target_name, corpus_dir, replace=False):
|
| 163 |
+
"""Upload the corpus produced by |target_name|."""
|
| 164 |
+
logging.info('Uploading corpus in %s for %s.', corpus_dir, target_name)
|
| 165 |
+
name = self._get_corpus_name(target_name)
|
| 166 |
+
try:
|
| 167 |
+
self.filestore.upload_corpus(name, corpus_dir, replace=replace)
|
| 168 |
+
logging.info('Done uploading corpus.')
|
| 169 |
+
except Exception as err: # pylint: disable=broad-except
|
| 170 |
+
logging.error('Failed to upload corpus for target: %s. Error: %s.',
|
| 171 |
+
target_name, err)
|
| 172 |
+
|
| 173 |
+
def upload_build(self, commit):
|
| 174 |
+
"""Upload the build produced by CIFuzz as the latest build."""
|
| 175 |
+
logging.info('Uploading latest build in %s.', self.workspace.out)
|
| 176 |
+
build_name = self._get_build_name(commit)
|
| 177 |
+
try:
|
| 178 |
+
result = self.filestore.upload_build(build_name, self.workspace.out)
|
| 179 |
+
logging.info('Done uploading latest build.')
|
| 180 |
+
return result
|
| 181 |
+
except Exception as err: # pylint: disable=broad-except
|
| 182 |
+
logging.error('Failed to upload latest build: %s. Error: %s',
|
| 183 |
+
self.workspace.out, err)
|
| 184 |
+
|
| 185 |
+
def upload_crashes(self):
|
| 186 |
+
"""Uploads crashes."""
|
| 187 |
+
artifact_dirs = os.listdir(self.workspace.artifacts)
|
| 188 |
+
if not artifact_dirs:
|
| 189 |
+
      logging.info('No crashes in %s. Not uploading.', self.workspace.artifacts)
      return

    for crash_target in artifact_dirs:
      artifact_dir = os.path.join(self.workspace.artifacts, crash_target)
      if not os.path.isdir(artifact_dir):
        logging.warning('%s is not an expected artifact directory, skipping.',
                        crash_target)
        continue

      logging.info('Uploading crashes in %s.', artifact_dir)
      try:
        self.filestore.upload_crashes(crash_target, artifact_dir)
        logging.info('Done uploading crashes.')
      except Exception as err:  # pylint: disable=broad-except
        logging.error('Failed to upload crashes. Error: %s', err)

  def upload_coverage(self):
    """Uploads the coverage report to the filestore."""
    self.filestore.upload_coverage(self.COVERAGE_NAME,
                                   self.workspace.coverage_report)

  def get_coverage(self, repo_path):
    """Returns the project coverage object for the project."""
    _make_empty_dir_if_nonexistent(self.workspace.clusterfuzz_coverage)
    try:
      if not self.filestore.download_coverage(
          self.COVERAGE_NAME, self.workspace.clusterfuzz_coverage):
        logging.error('Could not download coverage.')
        return None
      return get_coverage.FilesystemCoverage(
          repo_path, self.workspace.clusterfuzz_coverage)
    except Exception as err:  # pylint: disable=broad-except
      logging.error('Could not get coverage: %s.', err)
      return None


class OSSFuzz(BaseClusterFuzzDeployment):
  """The OSS-Fuzz ClusterFuzz deployment."""

  # Location of clusterfuzz builds on GCS.
  CLUSTERFUZZ_BUILDS = 'clusterfuzz-builds'

  # Zip file name containing the corpus.
  CORPUS_ZIP_NAME = 'public.zip'

  def get_latest_build_name(self):
    """Gets the name of the latest OSS-Fuzz build of a project.

    Returns:
      A string with the latest build version or None.
    """
    version_file = (
        f'{self.config.oss_fuzz_project_name}-{self.config.sanitizer}'
        '-latest.version')
    version_url = utils.url_join(utils.GCS_BASE_URL, self.CLUSTERFUZZ_BUILDS,
                                 self.config.oss_fuzz_project_name,
                                 version_file)
    try:
      response = urllib.request.urlopen(version_url)
    except urllib.error.HTTPError:
      logging.error('Error getting latest build version for %s from: %s.',
                    self.config.oss_fuzz_project_name, version_url)
      return None
    return response.read().decode()

  def download_latest_build(self):
    """Downloads the latest OSS-Fuzz build from GCS.

    Returns:
      A path to where the OSS-Fuzz build was stored, or None if it wasn't.
    """
    if os.path.exists(self.workspace.clusterfuzz_build):
      # This function can be called multiple times, don't download the build
      # again.
      return self.workspace.clusterfuzz_build

    _make_empty_dir_if_nonexistent(self.workspace.clusterfuzz_build)

    latest_build_name = self.get_latest_build_name()
    if not latest_build_name:
      return None

    logging.info('Downloading latest build.')
    oss_fuzz_build_url = utils.url_join(utils.GCS_BASE_URL,
                                        self.CLUSTERFUZZ_BUILDS,
                                        self.config.oss_fuzz_project_name,
                                        latest_build_name)
    if http_utils.download_and_unpack_zip(oss_fuzz_build_url,
                                          self.workspace.clusterfuzz_build):
      logging.info('Done downloading latest build.')
      return self.workspace.clusterfuzz_build

    return None

  def upload_build(self, commit):  # pylint: disable=no-self-use
    """Noop Implementation of upload_build."""
    logging.info('Not uploading latest build because on OSS-Fuzz.')

  def upload_corpus(self, target_name, corpus_dir, replace=False):  # pylint: disable=no-self-use,unused-argument
    """Noop Implementation of upload_corpus."""
    logging.info('Not uploading corpus because on OSS-Fuzz.')

  def upload_crashes(self):  # pylint: disable=no-self-use
    """Noop Implementation of upload_crashes."""
    logging.info('Not uploading crashes because on OSS-Fuzz.')

  def download_corpus(self, target_name, corpus_dir):
    """Downloads the latest OSS-Fuzz corpus for the target.

    Returns:
      The local path to the corpus or None if download failed.
    """
    _make_empty_dir_if_nonexistent(corpus_dir)
    project_qualified_fuzz_target_name = target_name
    qualified_name_prefix = self.config.oss_fuzz_project_name + '_'
    if not target_name.startswith(qualified_name_prefix):
      project_qualified_fuzz_target_name = qualified_name_prefix + target_name

    corpus_url = (f'{utils.GCS_BASE_URL}{self.config.oss_fuzz_project_name}'
                  '-backup.clusterfuzz-external.appspot.com/corpus/'
                  f'libFuzzer/{project_qualified_fuzz_target_name}/'
                  f'{self.CORPUS_ZIP_NAME}')
    logging.info('Downloading corpus from OSS-Fuzz: %s', corpus_url)

    if not http_utils.download_and_unpack_zip(corpus_url, corpus_dir):
      logging.warning('Failed to download corpus for %s.', target_name)
    return corpus_dir

  def upload_coverage(self):
    """Noop Implementation of upload_coverage_report."""
    logging.info('Not uploading coverage report because on OSS-Fuzz.')

  def get_coverage(self, repo_path):
    """Returns the project coverage object for the project."""
    try:
      return get_coverage.OSSFuzzCoverage(repo_path,
                                          self.config.oss_fuzz_project_name)
    except get_coverage.CoverageError:
      return None


class NoClusterFuzzDeployment(BaseClusterFuzzDeployment):
  """ClusterFuzzDeployment implementation used when there is no deployment of
  ClusterFuzz to use."""

  def upload_build(self, commit):  # pylint: disable=no-self-use
    """Noop Implementation of upload_build."""
    logging.info('Not uploading latest build because no ClusterFuzz '
                 'deployment.')

  def upload_corpus(self, target_name, corpus_dir, replace=False):  # pylint: disable=no-self-use,unused-argument
    """Noop Implementation of upload_corpus."""
    logging.info('Not uploading corpus because no ClusterFuzz deployment.')

  def upload_crashes(self):  # pylint: disable=no-self-use
    """Noop Implementation of upload_crashes."""
    logging.info('Not uploading crashes because no ClusterFuzz deployment.')

  def download_corpus(self, target_name, corpus_dir):
    """Noop Implementation of download_corpus."""
    logging.info('Not downloading corpus because no ClusterFuzz deployment.')
    return _make_empty_dir_if_nonexistent(corpus_dir)

  def download_latest_build(self):  # pylint: disable=no-self-use
    """Noop Implementation of download_latest_build."""
    logging.info(
        'Not downloading latest build because no ClusterFuzz deployment.')

  def upload_coverage(self):
    """Noop Implementation of upload_coverage."""
    logging.info(
        'Not uploading coverage report because no ClusterFuzz deployment.')

  def get_coverage(self, repo_path):
    """Noop Implementation of get_coverage."""
    logging.info(
        'Not getting project coverage because no ClusterFuzz deployment.')


_PLATFORM_CLUSTERFUZZ_DEPLOYMENT_MAPPING = {
    config_utils.BaseConfig.Platform.INTERNAL_GENERIC_CI: OSSFuzz,
    config_utils.BaseConfig.Platform.INTERNAL_GITHUB: OSSFuzz,
    config_utils.BaseConfig.Platform.EXTERNAL_GENERIC_CI: ClusterFuzzLite,
    config_utils.BaseConfig.Platform.EXTERNAL_GITHUB: ClusterFuzzLite,
}


def get_clusterfuzz_deployment(config, workspace):
  """Returns object representing deployment of ClusterFuzz used by |config|."""
  deployment_cls = _PLATFORM_CLUSTERFUZZ_DEPLOYMENT_MAPPING[config.platform]
  if config.no_clusterfuzz_deployment:
    logging.info('Overriding ClusterFuzzDeployment. Using None.')
    deployment_cls = NoClusterFuzzDeployment
  result = deployment_cls(config, workspace)
  logging.info('ClusterFuzzDeployment: %s.', result)
  return result
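For orientation, the following is a minimal sketch (not part of the uploaded files) of how the deployment selection above might be driven. It assumes the config_utils and workspace_utils modules added elsewhere in this change are importable and that the usual CIFuzz environment is set up; the fuzz target name is illustrative.

```python
# Illustrative sketch only -- not part of the diff. Assumes config_utils and
# workspace_utils from this change are on the path.
import clusterfuzz_deployment
import config_utils
import workspace_utils

config = config_utils.RunFuzzersConfig()  # platform decides OSSFuzz vs. ClusterFuzzLite
workspace = workspace_utils.Workspace(config)
deployment = clusterfuzz_deployment.get_clusterfuzz_deployment(config, workspace)

# For OSSFuzz this downloads the public GCS corpus zip; for
# NoClusterFuzzDeployment it only creates an empty corpus directory.
corpus_dir = deployment.download_corpus('my_fuzz_target', workspace.corpora)
```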
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/docker.py
ADDED
@@ -0,0 +1,127 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Module for dealing with docker."""
import logging
import os
import sys
import uuid

# pylint: disable=wrong-import-position,import-error
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import constants
import utils
import environment

BASE_BUILDER_TAG = 'ghcr.io/aixcc-finals/base-builder'
PROJECT_TAG_PREFIX = 'gcr.io/oss-fuzz/'

# Default fuzz configuration.
_DEFAULT_DOCKER_RUN_ARGS = [
    '-e', 'FUZZING_ENGINE=' + constants.DEFAULT_ENGINE, '-e', 'CIFUZZ=True'
]

UNIQUE_ID_SUFFIX = '-' + uuid.uuid4().hex

# TODO(metzman): Make run_fuzzers able to delete this image.
EXTERNAL_PROJECT_IMAGE = 'external-cfl-project' + UNIQUE_ID_SUFFIX

_DEFAULT_DOCKER_RUN_COMMAND = [
    'docker',
    'run',
    '--rm',
    '--privileged',
]


def get_docker_env_vars(env_mapping):
  """Returns a list of docker arguments that sets each key in |env_mapping| as
  an env var and the value of that key in |env_mapping| as the value."""
  env_var_args = []
  for env_var, env_var_val in env_mapping.items():
    env_var_args.extend(['-e', f'{env_var}={env_var_val}'])
  return env_var_args


def get_project_image_name(project):
  """Returns the name of the project builder image for |project_name|."""
  # TODO(jonathanmetzman): We may need unique names to support parallel fuzzing
  # for CIFuzz (like CFL supports). Don't do this for now because no one has
  # asked for it and build_specified_commit would need to be modified to
  # support this.
  if project:
    return PROJECT_TAG_PREFIX + project

  return EXTERNAL_PROJECT_IMAGE


def delete_images(images):
  """Deletes |images|."""
  command = ['docker', 'rmi', '-f'] + images
  utils.execute(command)
  utils.execute(['docker', 'builder', 'prune', '-f'])


def get_base_docker_run_args(workspace,
                             sanitizer=constants.DEFAULT_SANITIZER,
                             language=constants.DEFAULT_LANGUAGE,
                             architecture=constants.DEFAULT_ARCHITECTURE,
                             docker_in_docker=False):
  """Returns arguments that should be passed to every invocation of 'docker
  run'."""
  docker_args = _DEFAULT_DOCKER_RUN_ARGS.copy()
  env_mapping = {
      'SANITIZER': sanitizer,
      'ARCHITECTURE': architecture,
      'FUZZING_LANGUAGE': language,
      'OUT': workspace.out
  }
  docker_args += get_docker_env_vars(env_mapping)
  docker_container = environment.get('CFL_CONTAINER_ID',
                                     utils.get_container_name())
  logging.info('Docker container: %s.', docker_container)
  if docker_container and not docker_in_docker:
    # Don't map specific volumes if in a docker container, it breaks when
    # running a sibling container.
    docker_args += ['--volumes-from', docker_container]
  else:
    docker_args += _get_args_mapping_host_path_to_container(workspace.workspace)
  return docker_args, docker_container


def get_base_docker_run_command(workspace,
                                sanitizer=constants.DEFAULT_SANITIZER,
                                language=constants.DEFAULT_LANGUAGE,
                                architecture=constants.DEFAULT_ARCHITECTURE,
                                docker_in_docker=False):
  """Returns part of the command that should be used every time 'docker run'
  is invoked."""
  docker_args, docker_container = get_base_docker_run_args(
      workspace,
      sanitizer,
      language,
      architecture,
      docker_in_docker=docker_in_docker)
  command = _DEFAULT_DOCKER_RUN_COMMAND.copy() + docker_args
  return command, docker_container


def _get_args_mapping_host_path_to_container(host_path, container_path=None):
  """Get arguments to docker run that will map |host_path|, a path on the
  host, to a path in the container. If |container_path| is specified, that
  path is mapped to. If not, then |host_path| is mapped to itself in the
  container."""
  # WARNING: Do not use this function when running in production (and
  # --volumes-from) is used for mapping volumes. It will break production.
  container_path = host_path if container_path is None else container_path
  return ['-v', f'{host_path}:{container_path}']
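A small usage sketch (illustrative, not part of the diff) of the helpers above; FakeWorkspace is a stand-in that only provides the two Workspace attributes the function actually reads.

```python
# Illustrative sketch: assembling a 'docker run' prefix with the helpers above.
import collections

import docker

# Stand-in for workspace_utils.Workspace with only the attributes used here.
FakeWorkspace = collections.namedtuple('FakeWorkspace', ['workspace', 'out'])
workspace = FakeWorkspace(workspace='/tmp/workspace',
                          out='/tmp/workspace/build-out')

command, container = docker.get_base_docker_run_command(workspace,
                                                         sanitizer='address',
                                                         language='c++')
# command looks like ['docker', 'run', '--rm', '--privileged', '-e', ...];
# callers append the image name and the in-container command to it.
```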
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/requirements.txt
ADDED
@@ -0,0 +1,4 @@
clusterfuzz==2.5.9
requests==2.28.0
protobuf==3.20.2
gsutil==5.20
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/run_fuzzers_entrypoint.py
ADDED
@@ -0,0 +1,97 @@
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Runs a specific OSS-Fuzz project's fuzzers for CI tools."""
import logging
import sys

import config_utils
import docker
import logs
import run_fuzzers

# pylint: disable=c-extension-no-member
# pylint gets confused because of the relative import of cifuzz.

logs.init()


def delete_unneeded_docker_images(config):
  """Deletes unneeded docker images if running in an environment with low
  disk space."""
  if not config.low_disk_space:
    return
  logging.info('Deleting builder docker images to save disk space.')
  project_image = docker.get_project_image_name(config.oss_fuzz_project_name)
  images = [
      project_image,
      docker.BASE_BUILDER_TAG,
      docker.BASE_BUILDER_TAG + '-go',
      docker.BASE_BUILDER_TAG + '-javascript',
      docker.BASE_BUILDER_TAG + '-jvm',
      docker.BASE_BUILDER_TAG + '-python',
      docker.BASE_BUILDER_TAG + '-rust',
      docker.BASE_BUILDER_TAG + '-ruby',
      docker.BASE_BUILDER_TAG + '-swift',
  ]
  docker.delete_images(images)


def run_fuzzers_entrypoint():
  """This is the entrypoint for the run_fuzzers github action.
  This action can be added to any OSS-Fuzz project's workflow that uses
  Github."""
  config = config_utils.RunFuzzersConfig()
  # The default return code when an error occurs.
  returncode = 1
  if config.dry_run:
    # Sets the default return code on error to success.
    returncode = 0

  delete_unneeded_docker_images(config)
  # Run the specified project's fuzzers from the build.
  result = run_fuzzers.run_fuzzers(config)
  if result == run_fuzzers.RunFuzzersResult.ERROR:
    logging.error('Error occurred while running in workspace %s.',
                  config.workspace)
    return returncode
  if result == run_fuzzers.RunFuzzersResult.BUG_FOUND:
    logging.info('Bug found.')
    if not config.dry_run:
      # Return 2 when a bug was found by a fuzzer causing the CI to fail.
      return 2
  return 0


def main():
  """Runs project's fuzzers for CI tools.
  This is the entrypoint for the run_fuzzers github action.

  NOTE: libFuzzer binaries must be located in the $WORKSPACE/build-out
  directory in order for this action to be used. This action will only fuzz the
  binaries that are located in that directory. It is recommended that you add
  the build_fuzzers action preceding this one.

  NOTE: Any crash report will be in the filepath:
  ${GITHUB_WORKSPACE}/out/testcase
  This can be used in parallel with the upload-artifact action to surface the
  logs.

  Returns:
    0 on success or nonzero on failure.
  """
  return run_fuzzers_entrypoint()


if __name__ == '__main__':
  sys.exit(main())
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/sarif_utils.py
ADDED
@@ -0,0 +1,251 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Module for outputting SARIF data."""
import copy
import json
import logging
import os

from clusterfuzz import stacktraces

SARIF_RULES = [
    {
        'id': 'no-crashes',
        'shortDescription': {
            'text': 'Don\'t crash'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/416.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'heap-use-after-free',
        'shortDescription': {
            'text': 'Use of a heap-object after it has been freed.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/416.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'heap-buffer-overflow',
        'shortDescription': {
            'text': 'A read or write past the end of a heap buffer.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/122.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'stack-buffer-overflow',
        'shortDescription': {
            'text': 'A read or write past the end of a stack buffer.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/121.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'global-buffer-overflow',
        'shortDescription': {
            'text': 'A read or write past the end of a global buffer.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/121.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'stack-use-after-return',
        'shortDescription': {
            'text':
                'A stack-based variable has been used after the function returned.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/562.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'stack-use-after-scope',
        'shortDescription': {
            'text':
                'A stack-based variable has been used outside of the scope in which it exists.'
        },
        'helpUri': 'https://cwe.mitre.org/data/definitions/562.html',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id': 'initialization-order-fiasco',
        'shortDescription': {
            'text': 'Problem with order of initialization of global objects.'
        },
        'helpUri': 'https://isocpp.org/wiki/faq/ctors#static-init-order',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id':
            'direct-leak',
        'shortDescription': {
            'text': 'Memory is leaked.'
        },
        'helpUri':
            'https://github.com/google/sanitizers/wiki/AddressSanitizerLeakSanitizer',
        'properties': {
            'category': 'Crashes'
        }
    },
    {
        'id':
            'indirect-leak',
        'shortDescription': {
            'text': 'Memory is leaked.'
        },
        'helpUri':
            'https://github.com/google/sanitizers/wiki/AddressSanitizerLeakSanitizer',
        'properties': {
            'category': 'Crashes'
        }
    },
]
SARIF_DATA = {
    'version':
        '2.1.0',
    '$schema':
        'http://json.schemastore.org/sarif-2.1.0-rtm.4',
    'runs': [{
        'tool': {
            'driver': {
                'name': 'ClusterFuzzLite/CIFuzz',
                'informationUri': 'https://google.github.io/clusterfuzzlite/',
                'rules': SARIF_RULES,
            }
        },
        'results': []
    }]
}

SRC_ROOT = '/src/'


def redact_src_path(src_path):
  """Redact the src path so that it can be reported to users."""
  src_path = os.path.normpath(src_path)
  if src_path.startswith(SRC_ROOT):
    src_path = src_path[len(SRC_ROOT):]

  src_path = os.sep.join(src_path.split(os.sep)[1:])
  return src_path


def get_error_frame(crash_info):
  """Returns the stackframe where the error occurred."""
  if not crash_info.crash_state:
    return None
  state = crash_info.crash_state.split('\n')[0]
  logging.info('state: %s frames %s, %s', state, crash_info.frames,
               [f.function_name for f in crash_info.frames[0]])

  for crash_frames in crash_info.frames:
    for frame in crash_frames:
      # TODO(metzman): Do something less fragile here.
      if frame.function_name is None:
        continue
      if state in frame.function_name:
        return frame
  return None


def get_error_source_info(crash_info):
  """Returns the filename and the line where the bug occurred."""
  frame = get_error_frame(crash_info)
  if not frame:
    return (None, 1)
  try:
    return redact_src_path(frame.filename), int(frame.fileline or 1)
  except TypeError:
    return (None, 1)


def get_rule_index(crash_type):
  """Returns the rule index describing the rule that |crash_type| ran afoul
  of."""
  # Don't include "READ" or "WRITE" or number of bytes.
  crash_type = crash_type.replace('\n', ' ').split(' ')[0].lower()
  logging.info('crash_type: %s.', crash_type)
  for idx, rule in enumerate(SARIF_RULES):
    if rule['id'] == crash_type:
      logging.info('Rule index: %d.', idx)
      return idx

  return get_rule_index('no-crashes')


def get_sarif_data(stacktrace, target_path):
  """Returns a description of the crash in SARIF."""
  data = copy.deepcopy(SARIF_DATA)
  if stacktrace is None:
    return data

  fuzz_target = os.path.basename(target_path)
  stack_parser = stacktraces.StackParser(fuzz_target=fuzz_target,
                                         symbolized=True,
                                         detect_ooms_and_hangs=True,
                                         include_ubsan=True)
  crash_info = stack_parser.parse(stacktrace)
  error_source_info = get_error_source_info(crash_info)
  rule_idx = get_rule_index(crash_info.crash_type)
  rule_id = SARIF_RULES[rule_idx]['id']
  uri = error_source_info[0]

  result = {
      'level': 'error',
      'message': {
          'text': crash_info.crash_type
      },
      'locations': [{
          'physicalLocation': {
              'artifactLocation': {
                  'uri': uri,
                  'index': 0
              },
              'region': {
                  'startLine': error_source_info[1],
                  # We don't have this granularity when fuzzing.
                  'startColumn': 1,
              }
          }
      }],
      'ruleId': rule_id,
      'ruleIndex': rule_idx
  }
  if uri:
    data['runs'][0]['results'].append(result)
  return data


def write_stacktrace_to_sarif(stacktrace, target_path, workspace):
  """Writes a description of the crash in stacktrace to a SARIF file."""
  data = get_sarif_data(stacktrace, target_path)
  if not os.path.exists(workspace.sarif):
    os.makedirs(workspace.sarif)
  with open(os.path.join(workspace.sarif, 'results.sarif'), 'w') as file_handle:
    file_handle.write(json.dumps(data))
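As an illustration (not part of the diff), the module above can be exercised roughly like this; the crash log path and fuzz target name are made-up examples.

```python
# Illustrative sketch: converting a sanitizer stacktrace into SARIF with the
# module above. The input file and target path are hypothetical.
import json

import sarif_utils

with open('/tmp/crash_stacktrace.txt') as f:  # hypothetical crash log
  stacktrace = f.read()

data = sarif_utils.get_sarif_data(stacktrace, '/out/my_fuzz_target')
# At most one 'error' result is recorded; its ruleId/ruleIndex point into
# SARIF_RULES, falling back to 'no-crashes' for unrecognized crash types.
print(json.dumps(data, indent=2))
```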
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/test_helpers.py
ADDED
@@ -0,0 +1,117 @@
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Contains convenient helpers for writing tests."""

import contextlib
import os
import sys
import shutil
import tempfile
from unittest import mock

import config_utils
import docker
import workspace_utils

INFRA_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# pylint: disable=wrong-import-position,import-error
sys.path.append(INFRA_DIR)

import helper


# TODO(metzman): Get rid of these decorators.
@mock.patch('config_utils._is_dry_run', return_value=True)
@mock.patch('platform_config.BasePlatformConfig.project_src_path',
            return_value=None)
@mock.patch('os.path.basename', return_value=None)
def _create_config(config_cls, _, __, ___, **kwargs):
  """Creates a config object from |config_cls| and then sets every attribute
  that is a key in |kwargs| to the corresponding value. Asserts that each key
  in |kwargs| is an attribute of config."""
  with mock.patch('config_utils.BaseConfig.validate', return_value=True):
    config = config_cls()
  for key, value in kwargs.items():
    assert hasattr(config, key), 'Config doesn\'t have attribute: ' + key
    setattr(config, key, value)

  return config


def create_build_config(**kwargs):
  """Wrapper around _create_config for build configs."""
  return _create_config(config_utils.BuildFuzzersConfig, **kwargs)


def create_run_config(**kwargs):
  """Wrapper around _create_config for run configs."""
  return _create_config(config_utils.RunFuzzersConfig, **kwargs)


def create_workspace(workspace_path='/workspace'):
  """Returns a workspace located at |workspace_path| ('/workspace' by
  default)."""
  config = create_run_config(workspace=workspace_path)
  return workspace_utils.Workspace(config)


def patch_environ(testcase_obj, env=None, empty=False, runner=False):
  """Patch environment. |testcase_obj| is the unittest.TestCase that contains
  tests. |env|, if specified, is a dictionary of environment variables to start
  from. If |empty| is True then the new patched environment will be empty. If
  |runner| is True then the necessary environment variables will be set to run
  the scripts from base-runner."""
  if env is None:
    env = {}

  patcher = mock.patch.dict(os.environ, env)
  testcase_obj.addCleanup(patcher.stop)
  patcher.start()
  if empty:
    for key in os.environ.copy():
      del os.environ[key]

  if runner:
    # Add the scripts for base-runner to the path since they won't be in
    # /usr/local/bin on host machines during testing.
    base_runner_dir = os.path.join(INFRA_DIR, 'base-images', 'base-runner')
    os.environ['PATH'] = (os.environ.get('PATH', '') + os.pathsep +
                          base_runner_dir)
    if 'GOPATH' not in os.environ:
      # A GOPATH must be set or else the coverage script fails, even for
      # getting the coverage of non-Go programs.
      os.environ['GOPATH'] = '/root/go'


@contextlib.contextmanager
def temp_dir_copy(directory):
  """Context manager that yields a temporary copy of |directory|."""
  with tempfile.TemporaryDirectory() as temp_dir:
    temp_copy_path = os.path.join(temp_dir, os.path.basename(directory))
    shutil.copytree(directory, temp_copy_path)
    yield temp_copy_path


@contextlib.contextmanager
def docker_temp_dir():
  """Returns a temporary directory that is useful for use with docker. On
  cleanup this contextmanager uses docker to delete the directory's contents,
  so that files owned by root can be deleted (which
  tempfile.TemporaryDirectory() cannot do) by non-root users."""
  with tempfile.TemporaryDirectory() as temp_dir:
    yield temp_dir
    helper.docker_run([
        '-v', f'{temp_dir}:/temp_dir', '-t', docker.BASE_BUILDER_TAG,
        '/bin/bash', '-c', 'rm -rf /temp_dir/*'
    ])
local-test-libxml2-delta-02/fuzz-tooling/infra/cifuzz/workspace_utils.py
ADDED
@@ -0,0 +1,85 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Module for representing the workspace directory which CIFuzz uses."""

import os
import shutil


class Workspace:
  """Class representing the workspace directory."""

  def __init__(self, config):
    self.workspace = config.workspace

  def initialize_dir(self, directory):  # pylint: disable=no-self-use
    """Creates directory if it doesn't already exist, otherwise does nothing."""
    os.makedirs(directory, exist_ok=True)

  @property
  def repo_storage(self):
    """The parent directory for repo storage."""
    return os.path.join(self.workspace, 'storage')

  @property
  def out(self):
    """The out directory used for storing the fuzzer build built by
    build_fuzzers."""
    # Don't use 'out' because it needs to be used by artifacts.
    return os.path.join(self.workspace, 'build-out')

  @property
  def work(self):
    """The directory used as the work directory for the fuzzer build/run."""
    return os.path.join(self.workspace, 'work')

  @property
  def artifacts(self):
    """The directory used to store artifacts for download by CI-system users."""
    # This is hardcoded by a lot of clients, so we need to use this.
    return os.path.join(self.workspace, 'out', 'artifacts')

  @property
  def clusterfuzz_build(self):
    """The directory where builds from ClusterFuzz are stored."""
    return os.path.join(self.workspace, 'cifuzz-prev-build')

  @property
  def clusterfuzz_coverage(self):
    """The directory where coverage reports from ClusterFuzz are stored."""
    return os.path.join(self.workspace, 'cifuzz-prev-coverage')

  @property
  def coverage_report(self):
    """The directory where coverage reports generated by cifuzz are put."""
    return os.path.join(self.workspace, 'cifuzz-coverage')

  @property
  def corpora(self):
    """The directory where corpora from ClusterFuzz are stored."""
    return os.path.join(self.workspace, 'cifuzz-corpus')

  @property
  def pruned_corpora(self):
    """The directory where pruned corpora are stored."""
    return os.path.join(self.workspace, 'cifuzz-pruned-corpus')

  @property
  def sarif(self):
    """The directory where sarif files are stored."""
    return os.path.join(self.workspace, 'cifuzz-sarif')

  def make_repo_for_sarif(self, repo_manager):
    """Copies the repo over for the sarif upload GitHub action."""
    return shutil.copytree(repo_manager.repo_dir, self.sarif, symlinks=True)
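A quick sketch (illustrative only, not part of the diff) of the directory layout a Workspace derives from one root path; FakeConfig is a stand-in for the real config objects.

```python
# Illustrative sketch: the paths Workspace derives from a single root.
import workspace_utils


class FakeConfig:
  """Stand-in for the config_utils config classes."""
  workspace = '/tmp/workspace'


ws = workspace_utils.Workspace(FakeConfig())
print(ws.out)                # /tmp/workspace/build-out
print(ws.artifacts)          # /tmp/workspace/out/artifacts
print(ws.clusterfuzz_build)  # /tmp/workspace/cifuzz-prev-build
print(ws.corpora)            # /tmp/workspace/cifuzz-corpus
print(ws.sarif)              # /tmp/workspace/cifuzz-sarif
```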
local-test-libxml2-delta-02/fuzz-tooling/infra/repo_manager.py
ADDED
@@ -0,0 +1,272 @@
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Class to manage a git repository via python.

This class is to be used to implement git commands over
a python API and manage the current state of the git repo.

Typical usage example:

  r_man = RepoManager('https://github.com/google/oss-fuzz.git')
  r_man.checkout('5668cc422c2c92d38a370545d3591039fb5bb8d4')
"""
import datetime
import logging
import os
import shutil

import urllib.parse

import utils


class RepoManager:
  """Repo manager."""

  def __init__(self, repo_dir):
    self.repo_dir = repo_dir

  def _is_git_repo(self):
    """Test if the current repo dir is a git repo or not.

    Returns:
      True if the current repo_dir is a valid git repo.
    """
    git_path = os.path.join(self.repo_dir, '.git')
    return os.path.isdir(git_path)

  def git(self, cmd, check_result=False):
    """Run a git command.

    Args:
      cmd: The git command as a list to be run.
      check_result: Should an exception be thrown on failed command.

    Returns:
      stdout, stderr, error code.
    """
    return utils.execute(['git'] + cmd,
                         location=self.repo_dir,
                         check_result=check_result)

  def commit_exists(self, commit):
    """Checks to see if a commit exists in the project repo.

    Args:
      commit: The commit SHA you are checking.

    Returns:
      True if the commit exists in the project.
    """
    if not commit.rstrip():
      return False

    _, _, err_code = self.git(['cat-file', '-e', commit])
    return not err_code

  def commit_date(self, commit):
    """Get the date of a commit.

    Args:
      commit: The commit hash.

    Returns:
      A datetime representing the date of the commit.
    """
    out, _, _ = self.git(['show', '-s', '--format=%ct', commit],
                         check_result=True)
    return datetime.datetime.fromtimestamp(int(out), tz=datetime.timezone.utc)

  def get_git_diff(self, base='origin...'):
    """Gets a list of files that have changed from the repo head.

    Returns:
      A list of changed file paths or None on Error.
    """
    self.fetch_unshallow()
    # Add '--' so that git knows we aren't talking about files.
    command = ['diff', '--name-only', base, '--']
    out, err_msg, err_code = self.git(command)
    if err_code:
      logging.error('Git diff failed with error message %s.', err_msg)
      return None
    if not out:
      logging.error('No diff was found.')
      return None
    return [line for line in out.splitlines() if line]

  def get_current_commit(self):
    """Gets the current commit SHA of the repo.

    Returns:
      The current active commit SHA.
    """
    out, _, _ = self.git(['rev-parse', 'HEAD'], check_result=True)
    return out.strip()

  def get_parent(self, commit, count):
    """Gets the count'th parent of the given commit.

    Returns:
      The parent commit SHA.
    """
    self.fetch_unshallow()
    out, _, err_code = self.git(['rev-parse', commit + '~' + str(count)],
                                check_result=False)
    if err_code:
      return None

    return out.strip()

  def fetch_all_remotes(self):
    """Fetch all remotes for checkouts that track a single branch."""
    self.git([
        'config', 'remote.origin.fetch', '+refs/heads/*:refs/remotes/origin/*'
    ],
             check_result=True)
    self.git(['remote', 'update'], check_result=True)

  def get_commit_list(self, newest_commit, oldest_commit=None, limit=None):
    """Gets the list of commits (inclusive) between the old and new commits.

    Args:
      newest_commit: The newest commit to be in the list.
      oldest_commit: The (optional) oldest commit to be in the list.

    Returns:
      The list of commit SHAs from newest to oldest.

    Raises:
      ValueError: When either the oldest or newest commit does not exist.
      RuntimeError: When there is an error getting the commit list.
    """
    self.fetch_unshallow()
    if oldest_commit and not self.commit_exists(oldest_commit):
      raise ValueError('The oldest commit %s does not exist' % oldest_commit)
    if not self.commit_exists(newest_commit):
      raise ValueError('The newest commit %s does not exist' % newest_commit)
    if oldest_commit == newest_commit:
      return [oldest_commit]

    if oldest_commit:
      commit_range = oldest_commit + '..' + newest_commit
    else:
      commit_range = newest_commit

    limit_args = []
    if limit:
      limit_args.append(f'--max-count={limit}')

    out, _, err_code = self.git(['rev-list', commit_range] + limit_args)
    commits = out.split('\n')
    commits = [commit for commit in commits if commit]
    if err_code or not commits:
      raise RuntimeError('Error getting commit list between %s and %s ' %
                         (oldest_commit, newest_commit))

    # Make sure result is inclusive
    if oldest_commit:
      commits.append(oldest_commit)
    return commits

  def fetch_branch(self, branch):
    """Fetches a remote branch from origin."""
    return self.git(
        ['fetch', 'origin', '{branch}:{branch}'.format(branch=branch)])

  def fetch_unshallow(self):
    """Gets the current git repository history."""
    shallow_file = os.path.join(self.repo_dir, '.git', 'shallow')
    if os.path.exists(shallow_file):
      _, err, err_code = self.git(['fetch', '--unshallow'], check_result=False)
      if err_code:
        logging.error('Unshallow returned non-zero code: %s', err)

  def checkout_pr(self, pr_ref):
    """Checks out a remote pull request.

    Args:
      pr_ref: The pull request reference to be checked out.
    """
    self.fetch_unshallow()
    self.git(['fetch', 'origin', pr_ref], check_result=True)
    self.git(['checkout', '-f', 'FETCH_HEAD'], check_result=True)
    self.git(['submodule', 'update', '-f', '--init', '--recursive'],
             check_result=True)

  def checkout_commit(self, commit, clean=True):
    """Checks out a specific commit from the repo.

    Args:
      commit: The commit SHA to be checked out.

    Raises:
      RuntimeError: when checkout is not successful.
      ValueError: when commit does not exist.
    """
    self.fetch_unshallow()
    if not self.commit_exists(commit):
      raise ValueError('Commit %s does not exist in current branch' % commit)
    self.git(['checkout', '-f', commit], check_result=True)
    self.git(['submodule', 'update', '-f', '--init', '--recursive'],
             check_result=True)
    if clean:
      self.git(['clean', '-fxd'], check_result=True)
    if self.get_current_commit() != commit:
      raise RuntimeError('Error checking out commit %s' % commit)

  def remove_repo(self):
    """Removes the git repo from disk."""
    if os.path.isdir(self.repo_dir):
      shutil.rmtree(self.repo_dir)


def clone_repo_and_get_manager(repo_url,
                               base_dir,
                               repo_name=None,
                               username=None,
                               password=None):
  """Clones a repo and constructs a repo manager class.

  Args:
    repo_url: The github url needed to clone.
    base_dir: The full file-path where the git repo is located.
    repo_name: The name of the directory the repo is cloned to.
  """
  if repo_name is None:
    repo_name = os.path.basename(repo_url).replace('.git', '')
  repo_dir = os.path.join(base_dir, repo_name)
  manager = RepoManager(repo_dir)

  if not os.path.exists(repo_dir):
    _clone(repo_url, base_dir, repo_name, username=username, password=password)

  return manager


def _clone(repo_url, base_dir, repo_name, username=None, password=None):
  """Creates a clone of the repo in the specified directory.

  Raises:
    ValueError: when the repo is not able to be cloned.
  """
  if username and password:
    parsed_url = urllib.parse.urlparse(repo_url)
    new_netloc = f'{username}:{password}@{parsed_url.netloc}'
    repo_url = urllib.parse.urlunparse(parsed_url._replace(netloc=new_netloc))

  utils.execute(['git', 'clone', repo_url, repo_name],
                location=base_dir,
                check_result=True,
                log_command=not password)
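Roughly mirroring the module docstring's typical usage, here is an illustrative sketch (not part of the diff); the repository URL is just an example, and network access plus a git binary are assumed.

```python
# Illustrative sketch: cloning a repo and inspecting it with RepoManager.
import tempfile

import repo_manager

with tempfile.TemporaryDirectory() as base_dir:
  manager = repo_manager.clone_repo_and_get_manager(
      'https://github.com/google/oss-fuzz.git', base_dir)
  print(manager.get_current_commit())         # SHA of the checked-out HEAD
  print(manager.get_git_diff(base='HEAD~1'))  # files changed vs. the parent
```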
local-test-libxml2-delta-02/fuzz-tooling/infra/tools/hold_back_images.py
ADDED
@@ -0,0 +1,128 @@
#!/usr/bin/env python
# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
################################################################################
"""Script for pinning builder images for projects that break on upgrades. Works
with projects that use language builders."""
import argparse
import logging
import os
import re
import sys
import subprocess

ROOT_DIR = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
PROJECTS_DIR = os.path.join(ROOT_DIR, 'projects')

IMAGE_DIGEST_REGEX = re.compile(r'\[(.+)\]\n')
FROM_LINE_REGEX = re.compile(
    r'FROM (ghcr.io\/aixcc-finals\/base-builder[\-a-z0-9]*)(\@?.*)')


def get_latest_docker_image_digest(image):
  """Returns a pinnable version of the latest |image|. This version will have
  a SHA."""
  subprocess.run(['docker', 'pull', image], check=True)
  subprocess.run(['docker', 'pull', image], stdout=subprocess.PIPE, check=True)

  command = [
      'docker', 'image', 'inspect', '--format', '{{.RepoDigests}}', image
  ]
  output = subprocess.run(command, check=True,
                          stdout=subprocess.PIPE).stdout.decode('utf-8')
  return IMAGE_DIGEST_REGEX.match(output).groups(1)[0]


def get_args():
  """Returns parsed arguments."""
  parser = argparse.ArgumentParser(sys.argv[0],
                                   description='Hold back builder images.')
  parser.add_argument('projects', help='Projects.', nargs='+')

  parser.add_argument('--hold-image-digest',
                      required=False,
                      nargs='?',
                      default=None,
                      help='Image to hold on to.')

  parser.add_argument('--update-held',
                      action='store_true',
                      default=False,
                      help='Update held images.')

  parser.add_argument('--issue-number',
                      required=False,
                      nargs='?',
                      default=None,
                      help='Issue to reference.')

  args = parser.parse_args()
  return args


def get_hold_image_digest(line, hold_image_digest, update_held):
  """Returns the image digest for the |line| we want to pin. If the image is
  already pinned then it is only updated if |update_held|. If
  |hold_image_digest| is specified then it is returned, otherwise the latest
  pinnable version is returned."""
  matches = FROM_LINE_REGEX.match(line).groups()
  if matches[1] and not update_held:
    return None, False
  initial_image = matches[0]
  if hold_image_digest:
    return hold_image_digest, True
  return get_latest_docker_image_digest(initial_image), True


def hold_image(project, hold_image_digest, update_held, issue_number):
  """Rewrites the Dockerfile of |project| to pin the base-builder image on
  upgrade."""
  dockerfile_path = os.path.join(PROJECTS_DIR, project, 'Dockerfile')
  with open(dockerfile_path, 'r') as dockerfile_handle:
    dockerfile = dockerfile_handle.readlines()
  for idx, line in enumerate(dockerfile[:]):
    if not line.startswith('FROM ghcr.io/aixcc-finals/base-builder'):
      continue

    hold_image_digest, should_hold = get_hold_image_digest(
        line.strip(), hold_image_digest, update_held)
    if not should_hold:
      logging.error('Not holding back %s.', project)
      break
    dockerfile[idx] = f'FROM {hold_image_digest}\n'
    if issue_number:
      comment = ('# Held back because of github.com/google/oss-fuzz/pull/'
                 f'{issue_number}\n# Please fix failure and upgrade.\n')
      dockerfile.insert(idx, comment)
    break
  else:
    # This path is taken when we don't break out of the loop.
    assert None, f'Could not find FROM line in {project}'
  dockerfile = ''.join(dockerfile)
  with open(dockerfile_path, 'w') as dockerfile_handle:
    dockerfile_handle.write(dockerfile)


def main():
  """Script for pinning builder images for projects that break on upgrades."""
  args = get_args()
  for project in args.projects:
    hold_image(project, args.hold_image_digest, args.update_held,
               args.issue_number)
  return 0


if __name__ == '__main__':
  sys.exit(main())