chore: fix deps

Rim 2025-04-29 02:51:27 -04:00
parent 538d1f00aa
commit 8d07d458bd
1375 changed files with 195792 additions and 3 deletions

4
.gitignore vendored

@@ -1,3 +1 @@
-.vs
-.vscode
-build*/*
+build/


@@ -0,0 +1,29 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: 'Status: Open, Type: Bug'
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
```c++
#include <gsl>
// your repro here: ...
```
**Expected behavior**
A clear and concise description of what you expected to happen.
**Spec (please complete the following information):**
- OS: [e.g. Windows]
- Compiler: [e.g. MSVC]
- C++ Version: [e.g. C++20]
**Additional context**
Add any other context about the problem here.

59
deps/GSL/.github/workflows/android.yml vendored Normal file

@@ -0,0 +1,59 @@
name: CI_Android

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  Android:
    runs-on: macos-latest-large
    defaults:
      run:
        working-directory: build

    steps:
      - uses: actions/checkout@v4

      - name: Create build directory
        run: mkdir -p build
        working-directory: .

      - uses: actions/setup-java@v4
        with:
          java-version: 8
          distribution: zulu

      - name: Start Emulator
        run: |
          echo "y" | $ANDROID_HOME/tools/bin/sdkmanager --install 'system-images;android-24;default;x86_64'
          echo "no" | $ANDROID_HOME/tools/bin/avdmanager create avd -n xamarin_android_emulator -k 'system-images;android-24;default;x86_64' --force
          $ANDROID_HOME/emulator/emulator -list-avds
          echo "Starting emulator..."
          nohup $ANDROID_HOME/emulator/emulator -no-audio -no-snapshot -avd xamarin_android_emulator &> /dev/null &
          echo "Emulator starting in background"

      - name: Configure
        run: cmake -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK_LATEST_HOME/build/cmake/android.toolchain.cmake -DANDROID_PLATFORM=16 -DANDROID_ABI=x86_64 -DCMAKE_BUILD_TYPE=Debug ..

      - name: Build
        run: cmake --build . --parallel

      - name: Wait for emulator ready
        timeout-minutes: 2
        run: |
          $ANDROID_HOME/platform-tools/adb wait-for-device shell 'while [[ -z $(getprop sys.boot_completed | tr -d '\r') ]]; do sleep 10; done; input keyevent 82'
          $ANDROID_HOME/platform-tools/adb devices
          $ANDROID_HOME/platform-tools/adb shell getprop ro.product.cpu.abi

      - name: Deploy tests
        run: |
          adb push tests /data/local/tmp
          adb shell find /data/local/tmp/tests -maxdepth 1 -exec chmod +x {} \\\;

      - name: Test
        run: adb shell find /data/local/tmp/tests -name "*_tests" -maxdepth 1 -exec {} \\\;


@@ -0,0 +1,56 @@
name: Composite CMake

inputs:
  cmake_generator:
    required: false
    type: string
    default: 'Unix Makefiles'
  cmake_build_type:
    required: true
    type: string
    default: ''
  cmake_cxx_compiler:
    required: false
    type: string
  gsl_cxx_standard:
    required: true
    type: number
  extra_cmake_args:
    required: false
    type: string
    default: ''
  build_cmd:
    required: true
    type: string
    default: 'make'
  test_cmd:
    required: false
    type: string
    default: 'make test'
  shell:
    required: false
    type: string
    default: 'bash'

runs:
  using: composite
  steps:
    - name: Create build directory
      run: mkdir build
      shell: ${{ inputs.shell }}

    - name: Configure CMake
      working-directory: build
      run: cmake -G "${{ inputs.cmake_generator }}" -DCMAKE_BUILD_TYPE=${{ inputs.cmake_build_type }} -DCMAKE_CXX_COMPILER=${{ inputs.cmake_cxx_compiler }} -DGSL_CXX_STANDARD=${{ inputs.gsl_cxx_standard }} -DCI_TESTING:BOOL=ON -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON -Werror=dev ${{ inputs.extra_cmake_args }} ..
      shell: ${{ inputs.shell }}

    - name: Build
      working-directory: build
      run: ${{ inputs.build_cmd }}
      shell: ${{ inputs.shell }}

    - name: Test
      working-directory: build
      run: ${{ inputs.test_cmd }}
      shell: ${{ inputs.shell }}


@@ -0,0 +1,25 @@
name: cmake_find_package

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  cmake-find-package:
    name: Build ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ ubuntu-latest, macos-latest ]
    steps:
      - uses: actions/checkout@v4
      - uses: lukka/get-cmake@latest
        with:
          cmakeVersion: 3.14.0
      - name: Configure GSL
        run: cmake -S . -B build -G "Ninja" -D GSL_TEST=OFF -D CMAKE_INSTALL_PREFIX=${GITHUB_WORKSPACE}/build/install
      - name: Install GSL
        run: cmake --build build --target install
      - name: Test GSL find_package support
        run: cmake -S tests/ -B build/tests_find_package -G "Ninja" -D CMAKE_PREFIX_PATH=${GITHUB_WORKSPACE}/build/install -D CMAKE_BUILD_TYPE=Release

115
deps/GSL/.github/workflows/compilers.yml vendored Normal file

@@ -0,0 +1,115 @@
name: Compiler Integration Tests

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

# These jobs are correlated with the officially supported compilers
# and toolsets. If you change any versions, please update README.md.
jobs:
  gcc:
    strategy:
      matrix:
        gcc_version: [ 12, 13, 14 ]
        build_type: [ Debug, Release ]
        cxx_version: [ 14, 17, 20, 23 ]
        exclude:
          # https://github.com/google/googletest/issues/4232
          # Looks like GoogleTest is not interested in making version 1.14
          # work with gcc-12.
          - gcc_version: 12
            cxx_version: 20
          - gcc_version: 12
            cxx_version: 23
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run CMake (configure, build, test)
        uses: ./.github/workflows/cmake
        with:
          cmake_build_type: ${{ matrix.build_type }}
          cmake_cxx_compiler: g++-${{ matrix.gcc_version }}
          gsl_cxx_standard: ${{ matrix.cxx_version }}

  clang:
    strategy:
      matrix:
        clang_version: [ 16, 17, 18 ]
        build_type: [ Debug, Release ]
        cxx_version: [ 14, 17, 20, 23 ]
        exclude:
          # https://github.com/llvm/llvm-project/issues/93734
          # Looks like clang fixed this issue in clang-18, but won't backport
          # the fix.
          - clang_version: 17
            cxx_version: 23
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run CMake (configure, build, test)
        uses: ./.github/workflows/cmake
        with:
          cmake_build_type: ${{ matrix.build_type }}
          cmake_cxx_compiler: clang++-${{ matrix.clang_version }}
          gsl_cxx_standard: ${{ matrix.cxx_version }}

  xcode:
    strategy:
      matrix:
        xcode_version: [ '15.4' ]
        build_type: [ Debug, Release ]
        cxx_version: [ 14, 17, 20, 23 ]
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - name: select xcode version
        run: sudo xcode-select -s /Applications/Xcode_${{ matrix.xcode_version }}.app
      - name: Run CMake (configure, build, test)
        uses: ./.github/workflows/cmake
        with:
          cmake_build_type: ${{ matrix.build_type }}
          cmake_cxx_compiler: clang++
          gsl_cxx_standard: ${{ matrix.cxx_version }}

  VisualStudio:
    strategy:
      matrix:
        generator: [ 'Visual Studio 16 2019', 'Visual Studio 17 2022' ]
        image: [ windows-2019, windows-2022 ]
        build_type: [ Debug, Release ]
        extra_args: [ '', '-T ClangCL' ]
        cxx_version: [ 14, 17, 20, 23 ]
        exclude:
          - generator: 'Visual Studio 17 2022'
            image: windows-2019
          - generator: 'Visual Studio 16 2019'
            image: windows-2022
          - generator: 'Visual Studio 16 2019'
            cxx_version: 23
    runs-on: ${{ matrix.image }}
    steps:
      - uses: actions/checkout@v4
      - uses: microsoft/setup-msbuild@v2
      - name: Run CMake (configure, build, test)
        uses: ./.github/workflows/cmake
        with:
          cmake_generator: ${{ matrix.generator }}
          cmake_build_type: ${{ matrix.build_type }}
          gsl_cxx_standard: ${{ matrix.cxx_version }}
          extra_cmake_args: ${{ matrix.extra_args }}
          build_cmd: msbuild GSL.sln
          test_cmd: ctest . --output-on-failure --no-compress-output
          shell: pwsh

52
deps/GSL/.github/workflows/ios.yml vendored Normal file

@@ -0,0 +1,52 @@
name: CI_iOS

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  iOS:
    runs-on: macos-latest
    defaults:
      run:
        working-directory: build

    steps:
      - uses: actions/checkout@v4

      - name: Create build directory
        run: mkdir -p build
        working-directory: .

      - name: Configure
        run: |
          cmake \
            -Werror=dev \
            -GXcode \
            -DCMAKE_SYSTEM_NAME=iOS \
            "-DCMAKE_OSX_ARCHITECTURES=arm64;x86_64" \
            -DCMAKE_OSX_DEPLOYMENT_TARGET=9 \
            -DCMAKE_TRY_COMPILE_TARGET_TYPE=STATIC_LIBRARY \
            "-DMACOSX_BUNDLE_GUI_IDENTIFIER=GSL.\$(EXECUTABLE_NAME)" \
            -DMACOSX_BUNDLE_BUNDLE_VERSION=3.1.0 \
            -DMACOSX_BUNDLE_SHORT_VERSION_STRING=3.1.0 \
            ..

      - name: Build
        run: cmake --build . --parallel `sysctl -n hw.ncpu` --config Release -- -sdk iphonesimulator

      - name: Start simulator
        run: |
          RUNTIME=`xcrun simctl list runtimes iOS -j|jq '.runtimes|last.identifier'`
          UDID=`xcrun simctl list devices iPhone available -j|jq -r ".devices[$RUNTIME]|last.udid"`
          xcrun simctl bootstatus $UDID -b

      - name: Test
        run: |
          for TEST in `find tests/Release-iphonesimulator -depth 1 -name "*.app"`
          do
            xcrun simctl install booted $TEST
            TEST_ID=`plutil -convert json -o - $TEST/Info.plist|jq -r ".CFBundleIdentifier"`
            xcrun simctl launch --console booted $TEST_ID
            xcrun simctl uninstall booted $TEST_ID
          done

48
deps/GSL/CMakeLists.txt vendored Normal file

@@ -0,0 +1,48 @@
cmake_minimum_required(VERSION 3.14...3.16)

project(GSL VERSION 4.1.0 LANGUAGES CXX)

add_library(GSL INTERFACE)
add_library(Microsoft.GSL::GSL ALIAS GSL)

# https://cmake.org/cmake/help/latest/variable/PROJECT_IS_TOP_LEVEL.html
string(COMPARE EQUAL ${CMAKE_CURRENT_SOURCE_DIR} ${CMAKE_SOURCE_DIR} PROJECT_IS_TOP_LEVEL)

option(GSL_INSTALL "Generate and install GSL target" ${PROJECT_IS_TOP_LEVEL})
option(GSL_TEST "Build and perform GSL tests" ${PROJECT_IS_TOP_LEVEL})

# The implementation generally assumes a platform that implements C++14 support
target_compile_features(GSL INTERFACE "cxx_std_14")

# Setup include directory
add_subdirectory(include)

target_sources(GSL INTERFACE $<BUILD_INTERFACE:${GSL_SOURCE_DIR}/GSL.natvis>)

if (GSL_TEST)
    enable_testing()
    add_subdirectory(tests)
endif()

if (GSL_INSTALL)
    include(GNUInstallDirs)
    include(CMakePackageConfigHelpers)

    install(DIRECTORY "${PROJECT_SOURCE_DIR}/include/gsl" DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})

    set(export_name "Microsoft.GSLConfig")
    set(namespace "Microsoft.GSL::")
    set(cmake_files_install_dir ${CMAKE_INSTALL_DATADIR}/cmake/Microsoft.GSL)

    install(TARGETS GSL EXPORT ${export_name} INCLUDES DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})
    install(EXPORT ${export_name} NAMESPACE ${namespace} DESTINATION ${cmake_files_install_dir})
    export(TARGETS GSL NAMESPACE ${namespace} FILE ${export_name}.cmake)

    set(gls_config_version "${CMAKE_CURRENT_BINARY_DIR}/Microsoft.GSLConfigVersion.cmake")
    write_basic_package_version_file(${gls_config_version} COMPATIBILITY SameMajorVersion ARCH_INDEPENDENT)
    install(FILES ${gls_config_version} DESTINATION ${cmake_files_install_dir})

    install(FILES GSL.natvis DESTINATION ${cmake_files_install_dir})
endif()

29
deps/GSL/CONTRIBUTING.md vendored Normal file

@@ -0,0 +1,29 @@
## Contributing to the Guidelines Support Library
The Guidelines Support Library (GSL) contains functions and types that are suggested for use by the
[C++ Core Guidelines](https://github.com/isocpp/CppCoreGuidelines). GSL design changes are made only as a result of modifications to the Guidelines.
GSL is accepting contributions that improve or refine any of the types in this library as well as ports to other platforms. Changes should have an issue
tracking the suggestion that has been approved by the maintainers. Your pull request should include a link to the bug that you are fixing. If you've submitted
a PR, please post a comment in the associated issue to avoid duplication of effort.
## Legal
You will need to complete a Contributor License Agreement (CLA). Briefly, this agreement testifies that you are granting us and the community permission to
use the submitted change according to the terms of the project's license, and that the work being submitted is under appropriate copyright.
Please submit a Contributor License Agreement (CLA) before submitting a pull request. You may visit https://cla.microsoft.com to sign digitally.
## Housekeeping
Your pull request should:
* Include a description of what your change intends to do
* Be a child commit of a reasonably recent commit in the **main** branch
* Requests need not be a single commit, but should be a linear sequence of commits (i.e. no merge commits in your PR)
* It is desirable, but not necessary, for the tests to pass at each commit. Please see [README.md](./README.md) for instructions to build the test suite.
* Have clear commit messages
* e.g. "Fix issue", "Add tests for type", etc.
* Include appropriate tests
* Tests should include reasonable permutations of the target fix/change
* Include baseline changes with your change
* All changed code must have 100% code coverage
* To avoid line ending issues, set `autocrlf = input` and `whitespace = cr-at-eol` in your git configuration

220
deps/GSL/README.md vendored Normal file

@@ -0,0 +1,220 @@
# GSL: Guidelines Support Library
[![CI](https://github.com/Microsoft/GSL/actions/workflows/compilers.yml/badge.svg)](https://github.com/microsoft/GSL/actions/workflows/compilers.yml?query=branch%3Amain)
[![vcpkg](https://img.shields.io/vcpkg/v/ms-gsl)](https://vcpkg.io/en/package/ms-gsl)
The Guidelines Support Library (GSL) contains functions and types that are suggested for use by the
[C++ Core Guidelines](https://github.com/isocpp/CppCoreGuidelines) maintained by the [Standard C++ Foundation](https://isocpp.org).
This repo contains Microsoft's implementation of GSL.
The entire implementation is provided inline in the headers under the [gsl](./include/gsl) directory. The implementation generally assumes a platform that implements C++14 support.
While some types have been broken out into their own headers (e.g. [gsl/span](./include/gsl/span)),
it is simplest to just include [gsl/gsl](./include/gsl/gsl) and gain access to the entire library.
> NOTE: We encourage contributions that improve or refine any of the types in this library as well as ports to
other platforms. Please see [CONTRIBUTING.md](./CONTRIBUTING.md) for more information about contributing.
# Project Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
# Usage of Third Party Libraries
This project makes use of the [Google Test](https://github.com/google/googletest) testing library. Please see the [ThirdPartyNotices.txt](./ThirdPartyNotices.txt) file for details regarding the licensing of Google Test.
# Supported features
## Microsoft GSL implements the following from the C++ Core Guidelines:
Feature | Supported? | Description
-------------------------------------------------------------------------|:----------:|-------------
[**1. Views**][cg-views] | |
[owner](docs/headers.md#user-content-H-pointers-owner) | &#x2611; | An alias for a raw pointer
[not_null](docs/headers.md#user-content-H-pointers-not_null) | &#x2611; | Restricts a pointer/smart pointer to hold non-null values
[span](docs/headers.md#user-content-H-span-span) | &#x2611; | A view over a contiguous sequence of memory. Based on the standardized version of `std::span`, however `gsl::span` enforces bounds checking.
span_p | &#x2610; | Spans a range starting from a pointer to the first place for which the predicate is true
[basic_zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | A pointer to a C-string (zero-terminated array) with a templated char type
[zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `char`
[czstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `const char`
[wzstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `wchar_t`
[cwzstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `const wchar_t`
[u16zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `char16_t`
[cu16zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `const char16_t`
[u32zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `char32_t`
[cu32zstring](docs/headers.md#user-content-H-zstring) | &#x2611; | An alias to `basic_zstring` with dynamic extent and a char type of `const char32_t`
[**2. Owners**][cg-owners] | |
stack_array | &#x2610; | A stack-allocated array
dyn_array | &#x2610; | A heap-allocated array
[**3. Assertions**][cg-assertions] | |
[Expects](docs/headers.md#user-content-H-assert-expects) | &#x2611; | A precondition assertion; on failure it terminates
[Ensures](docs/headers.md#user-content-H-assert-ensures) | &#x2611; | A postcondition assertion; on failure it terminates
[**4. Utilities**][cg-utilities] | |
move_owner | &#x2610; | A helper function that moves one `owner` to the other
[final_action](docs/headers.md#user-content-H-util-final_action) | &#x2611; | A RAII style class that invokes a functor on its destruction
[finally](docs/headers.md#user-content-H-util-finally) | &#x2611; | A helper function instantiating [final_action](docs/headers.md#user-content-H-util-final_action)
[GSL_SUPPRESS](docs/headers.md#user-content-H-assert-gsl_suppress) | &#x2611; | A macro that takes an argument and turns it into `[[gsl::suppress(x)]]` or `[[gsl::suppress("x")]]`
[[implicit]] | &#x2610; | A "marker" to put on single-argument constructors to explicitly make them non-explicit
[index](docs/headers.md#user-content-H-util-index) | &#x2611; | A type to use for all container and array indexing (currently an alias for `std::ptrdiff_t`)
[narrow](docs/headers.md#user-content-H-narrow-narrow) | &#x2611; | A checked version of `narrow_cast`; it can throw [narrowing_error](docs/headers.md#user-content-H-narrow-narrowing_error)
[narrow_cast](docs/headers.md#user-content-H-util-narrow_cast) | &#x2611; | A narrowing cast for values and a synonym for `static_cast`
[narrowing_error](docs/headers.md#user-content-H-narrow-narrowing_error) | &#x2611; | A custom exception type thrown by [narrow](docs/headers.md#user-content-H-narrow-narrow)
[**5. Concepts**][cg-concepts] | &#x2610; |
## The following features do not exist in or have been removed from the C++ Core Guidelines:
Feature | Supported? | Description
-----------------------------------|:----------:|-------------
[strict_not_null](docs/headers.md#user-content-H-pointers-strict_not_null) | &#x2611; | A stricter version of [not_null](docs/headers.md#user-content-H-pointers-not_null) with explicit constructors
multi_span | &#x2610; | Deprecated. Multi-dimensional span.
strided_span | &#x2610; | Deprecated. Support for this type has been discontinued.
basic_string_span | &#x2610; | Deprecated. Like `span` but for strings with a templated char type
string_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `char`
cstring_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `const char`
wstring_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `wchar_t`
cwstring_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `const wchar_t`
u16string_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `char16_t`
cu16string_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `const char16_t`
u32string_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `char32_t`
cu32string_span | &#x2610; | Deprecated. An alias to `basic_string_span` with a char type of `const char32_t`
## The following features have been adopted by WG21. They are deprecated in GSL.
Feature | Deprecated Since | Notes
------------------------------------------------------------------|------------------|------
[unique_ptr](docs/headers.md#user-content-H-pointers-unique_ptr) | C++11 | Use std::unique_ptr instead.
[shared_ptr](docs/headers.md#user-content-H-pointers-shared_ptr) | C++11 | Use std::shared_ptr instead.
[byte](docs/headers.md#user-content-H-byte-byte) | C++17 | Use std::byte instead.
joining_thread | C++20 (Note: Not yet implemented in GSL) | Use std::jthread instead.
This is based on [CppCoreGuidelines semi-specification](https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gsl-guidelines-support-library).
[cg-views]: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gslview-views
[cg-owners]: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gslowner-ownership-pointers
[cg-assertions]: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gslassert-assertions
[cg-utilities]: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gslutil-utilities
[cg-concepts]: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#gslconcept-concepts
# Quick Start
## Supported Compilers / Toolsets
The GSL officially supports recent major versions of Visual Studio with both MSVC and LLVM, GCC, Clang, and XCode with Apple-Clang.
For each of these major versions, the GSL officially supports C++14, C++17, C++20, and C++23 (when supported by the compiler).
Below is a table showing the versions currently being tested (see also [the compilers.yml workflow](.github/workflows/compilers.yml)).
Compiler |Toolset Versions Currently Tested
:------- |--:
GCC | 12, 13, 14
XCode | 14.3.1, 15.4
Clang | 16, 17, 18
Visual Studio with MSVC | VS2019, VS2022
Visual Studio with LLVM | VS2019, VS2022
---
If you successfully port GSL to another platform, we would love to hear from you!
- Submit an issue specifying the platform and target.
- Consider contributing your changes by filing a pull request with any necessary changes.
- If at all possible, add a CI/CD step and add the button to the table below!
Target | CI/CD Status
:------- | -----------:
iOS | [![CI_iOS](https://github.com/microsoft/GSL/workflows/CI_iOS/badge.svg?branch=main)](https://github.com/microsoft/GSL/actions/workflows/ios.yml?query=branch%3Amain)
Android | [![CI_Android](https://github.com/microsoft/GSL/workflows/CI_Android/badge.svg?branch=main)](https://github.com/microsoft/GSL/actions/workflows/android.yml?query=branch%3Amain)
Note: These CI/CD steps are run with each pull request; however, failures in them are non-blocking.
## Building the tests
To build the tests, you will require the following:
* [CMake](http://cmake.org), version 3.14 or later to be installed and in your PATH.
These steps assume the source code of this repository has been cloned into a directory named `c:\GSL`.
1. Create a directory to contain the build outputs for a particular architecture (we name it `c:\GSL\build-x86` in this example).

        cd GSL
        md build-x86
        cd build-x86

2. Configure CMake to use the compiler of your choice (you can see a list by running `cmake --help`).

        cmake -G "Visual Studio 15 2017" c:\GSL

3. Build the test suite (in this case, in the Debug configuration, Release is another good choice).

        cmake --build . --config Debug

4. Run the test suite.

        ctest -C Debug
All tests should pass - indicating your platform is fully supported and you are ready to use the GSL types!
## Building GSL - Using vcpkg
You can download and install GSL using the [vcpkg](https://github.com/Microsoft/vcpkg) dependency manager:
    git clone https://github.com/Microsoft/vcpkg.git
    cd vcpkg
    ./bootstrap-vcpkg.sh
    ./vcpkg integrate install
    vcpkg install ms-gsl
The GSL port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please [create an issue or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
## Using the libraries
As the types are entirely implemented inline in headers, there are no linking requirements.
You can copy the [gsl](./include/gsl) directory into your source tree so it is available
to your compiler, then include the appropriate headers in your program.
Alternatively, set your compiler's *include path* flag to point to the GSL development folder (`c:\GSL\include` in the example above) or installation folder (after running the install), e.g.

MSVC++

    /I c:\GSL\include

GCC/clang

    -I$HOME/dev/GSL/include

Include the library using:

    #include <gsl/gsl>
## Usage in CMake
The library provides a config file for CMake; once installed, it can be found via `find_package`.
When found successfully, this adds a library target called `Microsoft.GSL::GSL` which you can use via the usual
`target_link_libraries` mechanism.
```cmake
find_package(Microsoft.GSL CONFIG REQUIRED)
target_link_libraries(foobar PRIVATE Microsoft.GSL::GSL)
```
### FetchContent
If you are using CMake version 3.11+ you can use the official [FetchContent module](https://cmake.org/cmake/help/latest/module/FetchContent.html).
This allows you to easily incorporate GSL into your project.
```cmake
# NOTE: This example uses CMake version 3.14 (FetchContent_MakeAvailable).
# Since it streamlines the FetchContent process
cmake_minimum_required(VERSION 3.14)
include(FetchContent)
FetchContent_Declare(GSL
GIT_REPOSITORY "https://github.com/microsoft/GSL"
GIT_TAG "v4.1.0"
GIT_SHALLOW ON
)
FetchContent_MakeAvailable(GSL)
target_link_libraries(foobar PRIVATE Microsoft.GSL::GSL)
```
## Debugging visualization support
For Visual Studio users, the file [GSL.natvis](./GSL.natvis) in the root directory of the repository can be added to your project if you would like more helpful visualization of GSL types in the Visual Studio debugger than would be offered by default.
## See Also
For information on the [Microsoft Gray Systems Lab (GSL)](https://aka.ms/gsl), which conducts applied data management and systems research, see <https://aka.ms/gsl>.

41
deps/GSL/SECURITY.md vendored Normal file

@@ -0,0 +1,41 @@
<!-- BEGIN MICROSOFT SECURITY.MD V0.0.7 BLOCK -->
## Security
Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://aka.ms/opensource/security/definition), please report it to us as described below.
## Reporting Security Issues
**Please do not report security vulnerabilities through public GitHub issues.**
Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://aka.ms/opensource/security/create-report).
If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://aka.ms/opensource/security/pgpkey).
You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://aka.ms/opensource/security/msrc).
Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
* Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
* Full paths of source file(s) related to the manifestation of the issue
* The location of the affected source code (tag/branch/commit or direct URL)
* Any special configuration required to reproduce the issue
* Step-by-step instructions to reproduce the issue
* Proof-of-concept or exploit code (if possible)
* Impact of the issue, including how an attacker might exploit the issue
This information will help us triage your report more quickly.
If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://aka.ms/opensource/security/bounty) page for more details about our active programs.
## Preferred Languages
We prefer all communications to be in English.
## Policy
Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://aka.ms/opensource/security/cvd).
<!-- END MICROSOFT SECURITY.MD BLOCK -->

41
deps/GSL/ThirdPartyNotices.txt vendored Normal file

@@ -0,0 +1,41 @@
THIRD-PARTY SOFTWARE NOTICES AND INFORMATION
Do Not Translate or Localize
GSL: Guidelines Support Library incorporates third party material from the projects listed below.
-------------------------------------------------------------------------------
Software: Google Test
Owner: Google Inc.
Source URL: github.com/google/googletest
License: BSD 3 - Clause
Text:
Copyright 2008, Google Inc.
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following disclaimer
in the documentation and/or other materials provided with the
distribution.
* Neither the name of Google Inc. nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-------------------------------------------------------------------------------

884
deps/GSL/docs/headers.md vendored Normal file

@@ -0,0 +1,884 @@
The Guidelines Support Library (GSL) interface is very lightweight and exposed via a header-only library. This document attempts to document all of the headers and their exposed classes and functions.
Types and functions are exported in the namespace `gsl`.
See [GSL: Guidelines support library](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#S-gsl)
# <a name="H" />Headers
- [`<algorithms>`](#user-content-H-algorithms)
- [`<assert>`](#user-content-H-assert)
- [`<byte>`](#user-content-H-byte)
- [`<gsl>`](#user-content-H-gsl)
- [`<narrow>`](#user-content-H-narrow)
- [`<pointers>`](#user-content-H-pointers)
- [`<span>`](#user-content-H-span)
- [`<span_ext>`](#user-content-H-span_ext)
- [`<zstring>`](#user-content-H-zstring)
- [`<util>`](#user-content-H-util)
## <a name="H-algorithms" />`<algorithms>`
This header contains some common algorithms that have been wrapped in GSL safety features.
- [`gsl::copy`](#user-content-H-algorithms-copy)
### <a name="H-algorithms-copy" />`gsl::copy`
```cpp
template <class SrcElementType, std::size_t SrcExtent, class DestElementType,
std::size_t DestExtent>
void copy(span<SrcElementType, SrcExtent> src, span<DestElementType, DestExtent> dest);
```
This function copies the content from the `src` [`span`](#user-content-H-span-span) to the `dest` [`span`](#user-content-H-span-span). It [`Expects`](#user-content-H-assert-expects)
that the destination `span` is at least as large as the source `span`.
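For illustration, a minimal sketch of how `gsl::copy` might be used (assuming the convenience header `<gsl/gsl>`; the destination is deliberately at least as large as the source, so the `Expects` check passes):
```cpp
#include <gsl/gsl>
#include <array>

int main()
{
    std::array<int, 3> src{1, 2, 3};
    std::array<int, 5> dst{};

    // dst is larger than src, so the size precondition holds and
    // the first three elements of dst become 1, 2, 3.
    gsl::copy(gsl::span<int>(src), gsl::span<int>(dst));
}
```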
## <a name="H-assert" />`<assert>`
This header contains some macros used for contract checking and suppressing code analysis warnings.
See [GSL.assert: Assertions](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-assertions)
- [`GSL_SUPPRESS`](#user-content-H-assert-gsl_suppress)
- [`Expects`](#user-content-H-assert-expects)
- [`Ensures`](#user-content-H-assert-ensures)
### <a name="H-assert-gsl_suppress" />`GSL_SUPPRESS`
This macro can be used to suppress a code analysis warning.
The core guidelines request tools that check for the rules to respect suppressing a rule by writing
`[[gsl::suppress(tag)]]` or `[[gsl::suppress(tag, justification: "message")]]`.
Clang does not use exactly that syntax, but requires `tag` to be put in double quotes `[[gsl::suppress("tag")]]`.
For portable code you can use `GSL_SUPPRESS(tag)`.
See [In.force: Enforcement](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#inforce-enforcement).
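A small sketch of how the macro might be applied (the function and the `type.1` rule tag are only examples, not taken from the GSL sources):
```cpp
#include <gsl/assert>

// Expands to [[gsl::suppress(type.1)]] on MSVC and to
// [[gsl::suppress("type.1")]] on clang.
GSL_SUPPRESS(type.1)
void convert_legacy_pointers()
{
    // ... code that would otherwise trigger the suppressed analysis warning ...
}
```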
### <a name="H-assert-expects" />`Expects`
This macro can be used for expressing a precondition. If the precondition is not held, then `std::terminate` will be called.
See [I.6: Prefer `Expects()` for expressing preconditions](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#i6-prefer-expects-for-expressing-preconditions)
### <a name="H-assert-ensures" />`Ensures`
This macro can be used for expressing a postcondition. If the postcondition is not held, then `std::terminate` will be called.
See [I.8: Prefer `Ensures()` for expressing postconditions](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#i8-prefer-ensures-for-expressing-postconditions)
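A short sketch of both macros in use (the `checked_divide` helper is hypothetical and shown only to illustrate contract checking):
```cpp
#include <gsl/assert>

int checked_divide(int numerator, int denominator)
{
    Expects(denominator != 0); // precondition: a zero denominator terminates the program

    const int quotient = numerator / denominator;

    // postcondition: the C++ division identity must hold for the result
    Ensures(quotient * denominator + numerator % denominator == numerator);
    return quotient;
}
```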
## <a name="H-byte" />`<byte>`
This header contains the definition of a byte type, implementing `std::byte` before it was standardized into C++17.
- [`gsl::byte`](#user-content-H-byte-byte)
### <a name="H-byte-byte" />`gsl::byte`
If `GSL_USE_STD_BYTE` is defined to be `1`, then `gsl::byte` will be an alias to `std::byte`.
If `GSL_USE_STD_BYTE` is defined to be `0`, then `gsl::byte` will be a distinct type that implements the concept of byte.
If `GSL_USE_STD_BYTE` is not defined, then the header file will check if `std::byte` is available (C\+\+17 or higher). If yes,
`gsl::byte` will be an alias to `std::byte`, otherwise `gsl::byte` will be a distinct type that implements the concept of byte.
&#x26a0; Take care when linking projects that were compiled with different language standards (before C\+\+17 and C\+\+17 or higher).
If you do so, you might want to `#define GSL_USE_STD_BYTE 0` to a fixed value to be sure that both projects use exactly
the same type. Otherwise you might get linker errors.
See [SL.str.5: Use `std::byte` to refer to byte values that do not necessarily represent characters](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Rstr-byte)
### Non-member functions
```cpp
template <class IntegerType, std::enable_if_t<std::is_integral<IntegerType>::value, bool> = true>
constexpr byte& operator<<=(byte& b, IntegerType shift) noexcept;
template <class IntegerType, std::enable_if_t<std::is_integral<IntegerType>::value, bool> = true>
constexpr byte operator<<(byte b, IntegerType shift) noexcept;
template <class IntegerType, std::enable_if_t<std::is_integral<IntegerType>::value, bool> = true>
constexpr byte& operator>>=(byte& b, IntegerType shift) noexcept;
template <class IntegerType, std::enable_if_t<std::is_integral<IntegerType>::value, bool> = true>
constexpr byte operator>>(byte b, IntegerType shift) noexcept;
```
Left or right shift a `byte` by a given number of bits.
```cpp
constexpr byte& operator|=(byte& l, byte r) noexcept;
constexpr byte operator|(byte l, byte r) noexcept;
```
Bitwise "or" of two `byte`s.
```cpp
constexpr byte& operator&=(byte& l, byte r) noexcept;
constexpr byte operator&(byte l, byte r) noexcept;
```
Bitwise "and" of two `byte`s.
```cpp
constexpr byte& operator^=(byte& l, byte r) noexcept;
constexpr byte operator^(byte l, byte r) noexcept;
```
Bitwise xor of two `byte`s.
```cpp
constexpr byte operator~(byte b) noexcept;
```
Bitwise negation of a `byte`. Flips all bits. Zeroes become ones, ones become zeroes.
```cpp
template <class IntegerType, std::enable_if_t<std::is_integral<IntegerType>::value, bool> = true>
constexpr IntegerType to_integer(byte b) noexcept;
```
Convert the given `byte` value to an integral type.
```cpp
template <typename T>
constexpr byte to_byte(T t) noexcept;
```
Convert the given value to a `byte`. The template requires `T` to be an `unsigned char` so that no data loss can occur.
If you want to convert an integer constant to a `byte` you probably want to call `to_byte<integer constant>()`.
```cpp
template <int I>
constexpr byte to_byte() noexcept;
```
Convert the given value `I` to a `byte`. The template requires `I` to be in the valid range 0..255 for a `gsl::byte`.
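A small sketch combining `to_byte`, the bitwise operators, and `to_integer` (assuming the `<gsl/byte>` header; the values are arbitrary):
```cpp
#include <gsl/byte>
#include <iostream>

int main()
{
    gsl::byte b = gsl::to_byte<0x2A>();           // compile-time checked: 0x2A is within 0..255
    b |= gsl::to_byte<0x01>();                    // bitwise operators are provided for gsl::byte
    std::cout << gsl::to_integer<int>(b) << '\n'; // prints 43
}
```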
## <a name="H-gsl" />`<gsl>`
This header is a convenience header that includes all other [GSL headers](#user-content-H).
Since `<narrow>` requires exceptions, it will only be included if exceptions are enabled.
## <a name="H-narrow" />`<narrow>`
This header contains utility functions and classes, for narrowing casts, which require exceptions. The narrowing-related utilities that don't require exceptions are found inside [util](#user-content-H-util).
See [GSL.util: Utilities](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-utilities)
- [`gsl::narrowing_error`](#user-content-H-narrow-narrowing_error)
- [`gsl::narrow`](#user-content-H-narrow-narrow)
### <a name="H-narrow-narrowing_error" />`gsl::narrowing_error`
`gsl::narrowing_error` is the exception thrown by [`gsl::narrow`](#user-content-H-narrow-narrow) when a narrowing conversion fails. It is derived from `std::exception`.
### <a name="H-narrow-narrow" />`gsl::narrow`
`gsl::narrow<T>(x)` is a named cast that does a `static_cast<T>(x)` for narrowing conversions with no signedness promotions.
If the argument `x` cannot be represented in the target type `T`, then the function throws a [`gsl::narrowing_error`](#user-content-H-narrow-narrowing_error) (e.g., `narrow<unsigned>(-42)` and `narrow<char>(300)` throw).
Note: compare [`gsl::narrow_cast`](#user-content-H-util-narrow_cast) in header [util](#user-content-H-util).
See [ES.46: Avoid lossy (narrowing, truncating) arithmetic conversions](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Res-narrowing) and [ES.49: If you must use a cast, use a named cast](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Res-casts-named)
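A brief sketch of the throwing behavior (assuming `<gsl/narrow>` and a platform where `char` is 8 bits wide):
```cpp
#include <gsl/narrow>
#include <iostream>

int main()
{
    try
    {
        const char c = gsl::narrow<char>(300); // 300 cannot be represented in an 8-bit char
        std::cout << static_cast<int>(c) << '\n';
    }
    catch (const gsl::narrowing_error&)
    {
        std::cout << "narrowing failed\n";     // this branch is taken
    }
}
```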
## <a name="H-pointers" />`<pointers>`
This header contains some pointer types.
See [GSL.view](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-views)
- [`gsl::unique_ptr`](#user-content-H-pointers-unique_ptr)
- [`gsl::shared_ptr`](#user-content-H-pointers-shared_ptr)
- [`gsl::owner`](#user-content-H-pointers-owner)
- [`gsl::not_null`](#user-content-H-pointers-not_null)
- [`gsl::strict_not_null`](#user-content-H-pointers-strict_not_null)
### <a name="H-pointers-unique_ptr" />`gsl::unique_ptr`
`gsl::unique_ptr` is an alias to `std::unique_ptr`.
See [GSL.owner: Ownership pointers](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-ownership)
### <a name="H-pointers-shared_ptr" />`gsl::shared_ptr`
`gsl::shared_ptr` is an alias to `std::shared_ptr`.
See [GSL.owner: Ownership pointers](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-ownership)
### <a name="H-pointers-owner" />`gsl::owner`
`gsl::owner<T>` is designed as a safety mechanism for code that must deal directly with raw pointers that own memory. Ideally such code should be restricted to the implementation of low-level abstractions. `gsl::owner` can also be used as a stepping point in converting legacy code to use more modern RAII constructs such as smart pointers.
`T` must be a pointer type (`std::is_pointer<T>`).
A `gsl::owner<T>` is a typedef to `T`. It adds no runtime overhead whatsoever, as it is purely syntactic and does not add any runtime checks. Instead, it serves as an annotation for static analysis tools which check for memory safety, and as a code comprehension guide for human readers.
See Enforcement section of [C.31: All resources acquired by a class must be released by the class's destructor](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Rc-dtor-release).
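A minimal sketch of the annotation in use (the `make_buffer` function is hypothetical; assuming `<gsl/pointers>`):
```cpp
#include <gsl/pointers>

// The return type documents that the caller receives ownership of the pointer.
gsl::owner<int*> make_buffer(int value)
{
    return new int(value);
}

int main()
{
    gsl::owner<int*> p = make_buffer(42);
    delete p; // ownership is documented for tools and readers, but nothing is enforced at runtime
}
```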
### <a name="H-pointers-not_null" />`gsl::not_null`
`gsl::not_null<T>` restricts a pointer or smart pointer to only hold non-null values. It has no size overhead over `T`.
The checks for ensuring that the pointer is not null are done in the constructor. There is no overhead when retrieving or dereferencing the checked pointer.
When a nullptr check fails, `std::terminate` is called.
See [F.23: Use a `not_null<T>` to indicate that “null” is not a valid value](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Rf-nullptr)
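A short sketch of how the type is typically used in an interface (the `print_value` function is hypothetical; assuming `<gsl/pointers>`):
```cpp
#include <gsl/pointers>
#include <iostream>

// The signature documents and enforces that callers may not pass nullptr.
void print_value(gsl::not_null<const int*> p)
{
    std::cout << *p << '\n'; // no null check needed inside the function
}

int main()
{
    int x = 7;
    print_value(&x);         // fine
    // print_value(nullptr); // would not compile: construction from nullptr_t is deleted
}
```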
#### Member Types
```cpp
using element_type = T;
```
The type of the pointer or smart pointer that is managed by this object.
#### Member functions
##### Construct/Copy
```cpp
template <typename U, typename = std::enable_if_t<std::is_convertible<U, T>::value>>
constexpr not_null(U&& u);
template <typename = std::enable_if_t<!std::is_same<std::nullptr_t, T>::value>>
constexpr not_null(T u);
```
Constructs a `gsl::not_null<T>` from a pointer that is convertible to `T` or that is a `T`. It [`Expects`](#user-content-H-assert-expects) that the provided pointer is not `== nullptr`.
```cpp
template <typename U, typename = std::enable_if_t<std::is_convertible<U, T>::value>>
constexpr not_null(const not_null<U>& other);
```
Constructs a `gsl::not_null<T>` from another `gsl::not_null` where the other pointer is convertible to `T`. It [`Expects`](#user-content-H-assert-expects) that the provided pointer is not `== nullptr`.
```cpp
not_null(const not_null& other) = default;
not_null& operator=(const not_null& other) = default;
```
Copy construction and assignment.
```cpp
not_null(std::nullptr_t) = delete;
not_null& operator=(std::nullptr_t) = delete;
```
Construction from `std::nullptr_t` and assignment of `std::nullptr_t` are explicitly deleted.
##### Modifiers
```cpp
not_null& operator++() = delete;
not_null& operator--() = delete;
not_null operator++(int) = delete;
not_null operator--(int) = delete;
not_null& operator+=(std::ptrdiff_t) = delete;
not_null& operator-=(std::ptrdiff_t) = delete;
```
Explicitly deleted operators. Pointers point to single objects ([I.13: Do not pass an array as a single pointer](http://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Ri-array)), so don't allow these operators.
##### Observers
```cpp
constexpr details::value_or_reference_return_t<T> get() const;
constexpr operator T() const { return get(); }
```
Get the underlying pointer.
```cpp
constexpr decltype(auto) operator->() const { return get(); }
constexpr decltype(auto) operator*() const { return *get(); }
```
Dereference the underlying pointer.
```cpp
void operator[](std::ptrdiff_t) const = delete;
```
Array index operator is explicitly deleted. Pointers point to single objects ([I.13: Do not pass an array as a single pointer](http://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Ri-array)), so don't allow treating them as an array.
```cpp
void swap(not_null<T>& other) { std::swap(ptr_, other.ptr_); }
```
Swaps contents with another `gsl::not_null` object.
#### Non-member functions
```cpp
template <class T>
auto make_not_null(T&& t) noexcept;
```
Creates a `gsl::not_null` object, deducing the target type from the type of the argument.
```cpp
template <typename T, std::enable_if_t<std::is_move_assignable<T>::value && std::is_move_constructible<T>::value, bool> = true>
void swap(not_null<T>& a, not_null<T>& b);
```
Swaps the contents of two `gsl::not_null` objects.
```cpp
template <class T, class U>
auto operator==(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() == rhs.get()))
-> decltype(lhs.get() == rhs.get());
template <class T, class U>
auto operator!=(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() != rhs.get()))
-> decltype(lhs.get() != rhs.get());
template <class T, class U>
auto operator<(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() < rhs.get()))
-> decltype(lhs.get() < rhs.get());
template <class T, class U>
auto operator<=(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() <= rhs.get()))
-> decltype(lhs.get() <= rhs.get());
template <class T, class U>
auto operator>(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() > rhs.get()))
-> decltype(lhs.get() > rhs.get());
template <class T, class U>
auto operator>=(const not_null<T>& lhs,
const not_null<U>& rhs) noexcept(noexcept(lhs.get() >= rhs.get()))
-> decltype(lhs.get() >= rhs.get());
```
Comparison of pointers that are convertible to each other.
##### Input/Output
```cpp
template <class T>
std::ostream& operator<<(std::ostream& os, const not_null<T>& val);
```
Performs stream output on a `not_null` pointer, invoking `os << val.get()`. This function is only available when `GSL_NO_IOSTREAMS` is not defined.
##### Modifiers
```cpp
template <class T, class U>
std::ptrdiff_t operator-(const not_null<T>&, const not_null<U>&) = delete;
template <class T>
not_null<T> operator-(const not_null<T>&, std::ptrdiff_t) = delete;
template <class T>
not_null<T> operator+(const not_null<T>&, std::ptrdiff_t) = delete;
template <class T>
not_null<T> operator+(std::ptrdiff_t, const not_null<T>&) = delete;
```
Addition and subtraction are explicitly deleted. Pointers point to single objects ([I.13: Do not pass an array as a single pointer](http://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Ri-array)), so don't allow these operators.
##### STL integration
```cpp
template <class T>
struct std::hash<gsl::not_null<T>> { ... };
```
Specialization of `std::hash` for `gsl::not_null`.
### <a name="H-pointers-strict_not_null" />`gsl::strict_not_null`
`strict_not_null` is the same as [`not_null`](#user-content-H-pointers-not_null) except that the constructors are `explicit`.
The free function that deduces the target type from the type of the argument and creates a `gsl::strict_not_null` object is `gsl::make_strict_not_null`.
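A small sketch contrasting the explicit construction with [`not_null`](#user-content-H-pointers-not_null) (assuming `<gsl/pointers>`):
```cpp
#include <gsl/pointers>

int main()
{
    int x = 1;
    // gsl::strict_not_null<int*> implicit = &x; // would not compile: the constructors are explicit
    gsl::strict_not_null<int*> s{&x};            // explicit construction is required
    auto t = gsl::make_strict_not_null(&x);      // deduces gsl::strict_not_null<int*>
    *s = *t + 1;                                 // otherwise behaves like gsl::not_null
}
```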
## <a name="H-span" />`<span>`
This header file exports the class `gsl::span`, a bounds-checked implementation of `std::span`.
- [`gsl::span`](#user-content-H-span-span)
### <a name="H-span-span" />`gsl::span`
```cpp
template <class ElementType, std::size_t Extent>
class span;
```
`gsl::span` is a view over memory. It does not own the memory and is only a way to access contiguous sequences of objects.
The extent can be either a fixed size or [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent).
The `gsl::span` is based on the standardized version of `std::span` which was added to C++20. Originally, the plan was to
deprecate `gsl::span` when `std::span` finished standardization; however, that plan changed when the runtime bounds checking
was removed from `std::span`'s design.
The only difference between `gsl::span` and `std::span` is that `gsl::span` strictly enforces runtime bounds checking.
Any violation of the bounds check results in termination of the program.
Likewise, `gsl::span`'s iterators differ from `std::span`'s iterators in that all of their access operations are bounds checked.
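A minimal sketch of the bounds-checking behavior (assuming `<gsl/span>`):
```cpp
#include <gsl/span>
#include <array>
#include <iostream>

int main()
{
    std::array<int, 3> data{10, 20, 30};
    gsl::span<int> s{data};

    std::cout << s[2] << '\n';    // in bounds: prints 30
    // std::cout << s[3] << '\n'; // out of bounds: the check fails and the program terminates
}
```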
#### Which version of span should I use?
##### Use `gsl::span` if
- you want to guarantee bounds safety in your project.
- All data accessing operations use bounds checking to ensure you are only accessing valid memory.
- your project uses C++14 or C++17.
- `std::span` is not available as it was not introduced into the STL until C++20.
##### Use `std::span` if
- your project is C++20 and you need the performance offered by `std::span`.
#### Types
```cpp
using element_type = ElementType;
using value_type = std::remove_cv_t<ElementType>;
using size_type = std::size_t;
using pointer = element_type*;
using const_pointer = const element_type*;
using reference = element_type&;
using const_reference = const element_type&;
using difference_type = std::ptrdiff_t;
using iterator = details::span_iterator<ElementType>;
using reverse_iterator = std::reverse_iterator<iterator>;
```
#### Member functions
```cpp
constexpr span() noexcept;
```
Constructs an empty `span`. This constructor is only available if `Extent` is 0 or [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent).
`span::data()` will return `nullptr`.
```cpp
constexpr explicit(Extent != gsl::dynamic_extent) span(pointer ptr, size_type count) noexcept;
```
Constructs a `span` from a pointer and a size. If `Extent` is not [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent),
then the constructor [`Expects`](#user-content-H-assert-expects) that `count == Extent`.
```cpp
constexpr explicit(Extent != gsl::dynamic_extent) span(pointer firstElem, pointer lastElem) noexcept;
```
Constructs a `span` from a pointer to the begin and the end of the data. If `Extent` is not [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent),
then the constructor [`Expects`](#user-content-H-assert-expects) that `lastElem - firstElem == Extent`.
```cpp
template <std::size_t N>
constexpr span(element_type (&arr)[N]) noexcept;
```
Constructs a `span` from a C style array. This overload is available if `Extent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)
or `N == Extent`.
```cpp
template <class T, std::size_t N>
constexpr span(std::array<T, N>& arr) noexcept;
template <class T, std::size_t N>
constexpr span(const std::array<T, N>& arr) noexcept;
```
Constructs a `span` from a `std::array`. These overloads are available if `Extent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)
or `N == Extent`, and if the array can be interpreted as an `ElementType` array.
```cpp
template <class Container>
constexpr explicit(Extent != gsl::dynamic_extent) span(Container& cont) noexcept;
template <class Container>
constexpr explicit(Extent != gsl::dynamic_extent) span(const Container& cont) noexcept;
```
Constructs a `span` from a container. These overloads are available if `Extent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)
or `N == Extent`, and if the container can be interpreted as a contiguous `ElementType` array.
```cpp
constexpr span(const span& other) noexcept = default;
```
Copy constructor.
```cpp
template <class OtherElementType, std::size_t OtherExtent>
explicit(Extent != gsl::dynamic_extent && OtherExtent == dynamic_extent)
constexpr span(const span<OtherElementType, OtherExtent>& other) noexcept;
```
Constructs a `span` from another `span`. This constructor is available if `OtherExtent == Extent || Extent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)` || OtherExtent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)
and if `ElementType` and `OtherElementType` are compatible.
If `Extent !=`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent) and `OtherExtent ==`[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent),
then the constructor [`Expects`](#user-content-H-assert-expects) that `other.size() == Extent`.
```cpp
constexpr span& operator=(const span& other) noexcept = default;
```
Copy assignment
```cpp
template <std::size_t Count>
constexpr span<element_type, Count> first() const noexcept;
constexpr span<element_type, dynamic_extent> first(size_type count) const noexcept;
template <std::size_t Count>
constexpr span<element_type, Count> last() const noexcept;
constexpr span<element_type, dynamic_extent> last(size_type count) const noexcept;
```
Return a subspan of the first/last `Count` elements. [`Expects`](#user-content-H-assert-expects) that `Count` does not exceed the `span`'s size.
```cpp
template <std::size_t offset, std::size_t count = dynamic_extent>
constexpr auto subspan() const noexcept;
constexpr span<element_type, dynamic_extent>
subspan(size_type offset, size_type count = dynamic_extent) const noexcept;
```
Return a subspan starting at `offset` and having size `count`. [`Expects`](#user-content-H-assert-expects) that `offset` does not exceed the `span`'s size,
and that `count == `[`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent) or `offset + count` does not exceed the `span`'s size.
If `count` is `gsl::dynamic_extent`, the number of elements in the subspan is `size() - offset`.
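A short sketch of these member functions (assuming `<gsl/span>`; the values are arbitrary):
```cpp
#include <gsl/span>
#include <array>

int main()
{
    std::array<int, 6> data{0, 1, 2, 3, 4, 5};
    gsl::span<int> s{data};

    auto head = s.first(2);      // {0, 1}
    auto tail = s.last<3>();     // {3, 4, 5}, a fixed-extent span<int, 3>
    auto mid  = s.subspan(2, 3); // {2, 3, 4}
    (void)head; (void)tail; (void)mid;
}
```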
```cpp
constexpr size_type size() const noexcept;
constexpr size_type size_bytes() const noexcept;
```
Returns the size, respectively the size in bytes, of the `span`.
```cpp
constexpr bool empty() const noexcept;
```
Is the `span` empty?
```cpp
constexpr reference operator[](size_type idx) const noexcept;
```
Returns a reference to the element at the given index. [`Expects`](#user-content-H-assert-expects) that `idx` is less than the `span`'s size.
```cpp
constexpr reference front() const noexcept;
constexpr reference back() const noexcept;
```
Returns a reference to the first/last element in the `span`. [`Expects`](#user-content-H-assert-expects) that the `span` is not empty.
```cpp
constexpr pointer data() const noexcept;
```
Returns a pointer to the beginning of the contained data.
```cpp
constexpr iterator begin() const noexcept;
constexpr iterator end() const noexcept;
constexpr reverse_iterator rbegin() const noexcept;
constexpr reverse_iterator rend() const noexcept;
```
Returns the begin/end iterators and the corresponding reverse iterators of the `span`.
```cpp
template <class Type, std::size_t Extent>
span(Type (&)[Extent]) -> span<Type, Extent>;
template <class Type, std::size_t Size>
span(std::array<Type, Size>&) -> span<Type, Size>;
template <class Type, std::size_t Size>
span(const std::array<Type, Size>&) -> span<const Type, Size>;
template <class Container,
class Element = std::remove_pointer_t<decltype(std::declval<Container&>().data())>>
span(Container&) -> span<Element>;
template <class Container,
class Element = std::remove_pointer_t<decltype(std::declval<const Container&>().data())>>
span(const Container&) -> span<Element>;
```
Deduction guides.
```cpp
template <class ElementType, std::size_t Extent>
span<const byte, details::calculate_byte_size<ElementType, Extent>::value>
as_bytes(span<ElementType, Extent> s) noexcept;
template <class ElementType, std::size_t Extent>
span<byte, details::calculate_byte_size<ElementType, Extent>::value>
as_writable_bytes(span<ElementType, Extent> s) noexcept;
```
Converts a `span` into a `span` of `byte`s.
`as_writable_bytes` will only be available for non-const `ElementType`s.
## <a name="H-span_ext" />`<span_ext>`
This file is a companion for and included by [`<gsl/span>`](#user-content-H-span), and should not be used on its own. It contains useful features that aren't part of the `std::span` API as found inside the STL `<span>` header (with the exception of [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent), which is included here due to implementation constraints).
- [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent)
- [`gsl::span`](#user-content-H-span_ext-span)
- [`gsl::span` comparison operators](#user-content-H-span_ext-span_comparison_operators)
- [`gsl::make_span`](#user-content-H-span_ext-make_span)
- [`gsl::at`](#user-content-H-span_ext-at)
- [`gsl::ssize`](#user-content-H-span_ext-ssize)
- [`gsl::span` iterator functions](#user-content-H-span_ext-span_iterator_functions)
### <a name="H-span_ext-dynamic_extent" />`gsl::dynamic_extent`
Defines the extent value to be used by all `gsl::span` with dynamic extent.
Note: `std::dynamic_extent` is exposed by the STL `<span>` header and so ideally `gsl::dynamic_extent` would be under [`<gsl/span>`](#user-content-H-span), but to avoid cyclic dependency issues it is under `<span_ext>` instead.
### <a name="H-span_ext-span" />`gsl::span`
```cpp
template <class ElementType, std::size_t Extent = dynamic_extent>
class span;
```
Forward declaration of `gsl::span`.
### <a name="H-span_ext-span_comparison_operators" />`gsl::span` comparison operators
```cpp
template <class ElementType, std::size_t FirstExtent, std::size_t SecondExtent>
constexpr bool operator==(span<ElementType, FirstExtent> l, span<ElementType, SecondExtent> r);
template <class ElementType, std::size_t FirstExtent, std::size_t SecondExtent>
constexpr bool operator!=(span<ElementType, FirstExtent> l, span<ElementType, SecondExtent> r);
template <class ElementType, std::size_t Extent>
constexpr bool operator<(span<ElementType, Extent> l, span<ElementType, Extent> r);
template <class ElementType, std::size_t Extent>
constexpr bool operator<=(span<ElementType, Extent> l, span<ElementType, Extent> r);
template <class ElementType, std::size_t Extent>
constexpr bool operator>(span<ElementType, Extent> l, span<ElementType, Extent> r);
template <class ElementType, std::size_t Extent>
constexpr bool operator>=(span<ElementType, Extent> l, span<ElementType, Extent> r);
```
The comparison operators for two `span`s lexicographically compare the elements in the `span`s.
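A small example (my own, assuming both operands share the same element type as the signatures above require) of the lexicographic behaviour:
```cpp
#include <gsl/span>

int main()
{
    int a[] = {1, 2, 3};
    int b[] = {1, 2, 4};

    gsl::span<int> sa{a};
    gsl::span<int> sb{b};

    bool equal = (sa == sb); // false: the spans differ in the last element
    bool less  = (sa < sb);  // true: lexicographically 3 < 4
    return (!equal && less) ? 0 : 1;
}
```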
### <a name="H-span_ext-make_span" />`gsl::make_span`
```cpp
template <class ElementType>
constexpr span<ElementType> make_span(ElementType* ptr, typename span<ElementType>::size_type count);
template <class ElementType>
constexpr span<ElementType> make_span(ElementType* firstElem, ElementType* lastElem);
template <class ElementType, std::size_t N>
constexpr span<ElementType, N> make_span(ElementType (&arr)[N]) noexcept;
template <class Container>
constexpr span<typename Container::value_type> make_span(Container& cont);
template <class Container>
constexpr span<const typename Container::value_type> make_span(const Container& cont);
```
Utility functions for creating a `span` from
- pointer and length,
- pointer to start and pointer to end,
- a C style array, or
- a container.
As the signatures above show, the pointer and container overloads yield a `span` with [`gsl::dynamic_extent`](#user-content-H-span_ext-dynamic_extent), while the C style array overload deduces the fixed extent `N`.
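The following sketch (my own example) exercises each of these overloads:
```cpp
#include <vector>
#include <gsl/span>

int main()
{
    int arr[] = {1, 2, 3};
    std::vector<int> vec = {4, 5, 6};

    auto s1 = gsl::make_span(vec.data(), vec.size()); // pointer and length
    auto s2 = gsl::make_span(arr, arr + 3);           // pointer to start and pointer to end
    auto s3 = gsl::make_span(arr);                    // C style array -> gsl::span<int, 3>
    auto s4 = gsl::make_span(vec);                    // container

    return static_cast<int>(s1.size() + s2.size() + s3.size() + s4.size()) == 12 ? 0 : 1;
}
```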
### <a name="H-span_ext-at" />`gsl::at`
```cpp
template <class ElementType, std::size_t Extent>
constexpr ElementType& at(span<ElementType, Extent> s, index i);
```
The function `gsl::at` offers a safe way to access data with index bounds checking.
This is the specialization of [`gsl::at`](#user-content-H-util-at) for [`span`](#user-content-H-span-span). It returns a reference to the `i`th element and
[`Expects`](#user-content-H-assert-expects) that the provided index is within the bounds of the `span`.
Note: `gsl::at` supports indexes up to `PTRDIFF_MAX`.
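A small sketch (my own) of bounds-checked access through this overload:
```cpp
#include <gsl/span>

int main()
{
    int arr[] = {10, 20, 30};
    gsl::span<int> s{arr};

    int second = gsl::at(s, 1); // returns a reference to s[1], i.e. 20
    // gsl::at(s, 3) would violate the Expects precondition and terminate.
    return second == 20 ? 0 : 1;
}
```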
### <a name="H-span_ext-ssize" />`gsl::ssize`
```cpp
template <class ElementType, std::size_t Extent>
constexpr std::ptrdiff_t ssize(const span<ElementType, Extent>& s) noexcept;
```
Returns the size of a [`span`](#user-content-H-span-span) as a `std::ptrdiff_t`.
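For illustration (my own sketch), the signed size can be compared against other signed quantities without conversion warnings:
```cpp
#include <cstddef>
#include <gsl/span>

int main()
{
    int arr[] = {1, 2, 3};
    gsl::span<int> s{arr};

    std::ptrdiff_t n = gsl::ssize(s); // 3, as a signed value
    return (n - 3 == 0) ? 0 : 1;
}
```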
### <a name="H-span_ext-span_iterator_functions" />`gsl::span` iterator functions
```cpp
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::iterator
begin(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent = dynamic_extent>
constexpr typename span<ElementType, Extent>::iterator
end(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::reverse_iterator
rbegin(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::reverse_iterator
rend(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::iterator
cbegin(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent = dynamic_extent>
constexpr typename span<ElementType, Extent>::iterator
cend(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::reverse_iterator
crbegin(const span<ElementType, Extent>& s) noexcept;
template <class ElementType, std::size_t Extent>
constexpr typename span<ElementType, Extent>::reverse_iterator
crend(const span<ElementType, Extent>& s) noexcept;
```
Free functions for getting a non-const/const begin/end normal/reverse iterator for a [`span`](#user-content-H-span-span).
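A short sketch (my own) using the free functions instead of the member iterators:
```cpp
#include <gsl/span>

int main()
{
    int arr[] = {1, 2, 3};
    gsl::span<int> s{arr};

    int sum = 0;
    for (auto it = gsl::begin(s); it != gsl::end(s); ++it)
        sum += *it; // 6 after the loop

    return sum == 6 ? 0 : 1;
}
```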
## <a name="H-zstring" />`<zstring>`
This header exports a family of `*zstring` types.
A `gsl::XXzstring<T>` is a typedef to `T`. It adds no checks whatsoever; it merely provides a syntax for stating
that a pointer points to a zero-terminated C style string. This helps static code analysis, and it helps human readers.
`basic_zstring` is a pointer to a C-string (zero-terminated array) with a templated char type. Used to implement the rest of the `*zstring` family.
`zstring` is a zero terminated `char` string.
`czstring` is a const zero terminated `char` string.
`wzstring` is a zero terminated `wchar_t` string.
`cwzstring` is a const zero terminated `wchar_t` string.
`u16zstring` is a zero terminated `char16_t` string.
`cu16zstring` is a const zero terminated `char16_t` string.
`u32zstring` is a zero terminated `char32_t` string.
`cu32zstring` is a const zero terminated `char32_t` string.
See [GSL.view](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-views) and [SL.str.3: Use zstring or czstring to refer to a C-style, zero-terminated, sequence of characters](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Rstr-zstring).
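As a hedged example (my own; the function name and include path are illustrative assumptions), `czstring` documents the zero-termination requirement in an interface without adding any runtime checks:
```cpp
#include <gsl/zstring>

// The parameter type states that `name` must be zero terminated; nothing is enforced at runtime.
int name_length(gsl::czstring name)
{
    int length = 0;
    while (*name++ != '\0')
        ++length;
    return length;
}

int main()
{
    return name_length("gsl") == 3 ? 0 : 1;
}
```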
## <a name="H-util" />`<util>`
This header contains utility functions and classes. It works without exceptions being available; the parts that require
exceptions live in their own header file, [narrow](#user-content-H-narrow).
See [GSL.util: Utilities](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#SS-utilities)
- [`gsl::narrow_cast`](#user-content-H-util-narrow_cast)
- [`gsl::final_action`](#user-content-H-util-final_action)
- [`gsl::at`](#user-content-H-util-at)
### <a name="H-util-index" />`gsl::index`
An alias to `std::ptrdiff_t`. It serves as the index type for all container indexes/subscripts/sizes.
### <a name="H-util-narrow_cast" />`gsl::narrow_cast`
`gsl::narrow_cast<T>(x)` is a named cast that is identical to a `static_cast<T>(x)`. It exists to make clear to static code analysis tools and to human readers that a lossy conversion is acceptable.
Note: compare the throwing version [`gsl::narrow`](#user-content-H-narrow-narrow) in header [narrow](#user-content-H-narrow).
See [ES.46: Avoid lossy (narrowing, truncating) arithmetic conversions](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Res-narrowing) and [ES.49: If you must use a cast, use a named cast](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Res-casts-named)
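A minimal sketch (my own) of an intentionally lossy conversion made explicit with `narrow_cast`:
```cpp
#include <cstdint>
#include <gsl/util>

int main()
{
    std::int64_t big = 42;

    // Identical to static_cast, but signals to readers and analyzers that narrowing is intended.
    auto small = gsl::narrow_cast<std::int32_t>(big);

    return small == 42 ? 0 : 1;
}
```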
### <a name="H-util-final_action" />`gsl::final_action`
```cpp
template <class F>
class final_action { ... };
```
`final_action` allows you to ensure something gets run at the end of a scope.
See [E.19: Use a final_action object to express cleanup if no suitable resource handle is available](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Re-finally)
#### Member functions
```cpp
explicit final_action(const F& ff) noexcept;
explicit final_action(F&& ff) noexcept;
```
Construct an object with the action to invoke in the destructor.
```cpp
~final_action() noexcept;
```
The destructor will call the action that was passed in the constructor.
```cpp
final_action(final_action&& other) noexcept;
final_action(const final_action&) = delete;
void operator=(const final_action&) = delete;
void operator=(final_action&&) = delete;
```
Move construction is allowed. Copy construction is deleted. Copy and move assignment are also explicitly deleted.
#### <a name="H-util-finally" />Non-member functions
```cpp
template <class F>
auto finally(F&& f) noexcept;
```
Creates a `gsl::final_action` object, deducing the template argument type from the type of the argument.
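A usage sketch (my own; the file name is purely illustrative) showing `gsl::finally` releasing a resource on every return path:
```cpp
#include <cstdio>
#include <gsl/util>

int main()
{
    std::FILE* f = std::fopen("example.txt", "w"); // hypothetical file name
    if (f == nullptr)
        return 1;

    // The lambda runs when `cleanup` is destroyed at the end of the scope.
    auto cleanup = gsl::finally([f] { std::fclose(f); });

    std::fputs("hello\n", f);
    return 0;
}
```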
### <a name="H-util-at" />`gsl::at`
The function `gsl::at` offers a safe way to access data with index bounds checking.
Note: `gsl::at` supports indexes up to `PTRDIFF_MAX`.
See [ES.42: Keep use of pointers simple and straightforward](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#Res-ptr)
```cpp
template <class T, std::size_t N>
constexpr T& at(T (&arr)[N], const index i);
```
This overload returns a reference to the `i`th element of the C style array `arr`. It [`Expects`](#user-content-H-assert-expects) that the provided index is within the bounds of the array.
```cpp
template <class Cont>
constexpr auto at(Cont& cont, const index i) -> decltype(cont[cont.size()]);
```
This overload returns a reference to the `i`th element of the container `cont`. It [`Expects`](#user-content-H-assert-expects) that the provided index is within the bounds of the container.
```cpp
template <class T>
constexpr T at(const std::initializer_list<T> cont, const index i);
```
This overload returns the `i`th element of the initializer list `cont` (by value, per the signature above). It [`Expects`](#user-content-H-assert-expects) that the provided index is within the bounds of the initializer list.
```cpp
template <class T, std::size_t extent = std::dynamic_extent>
constexpr auto at(std::span<T, extent> sp, const index i) -> decltype(sp[sp.size()]);
```
This overload returns a reference to the `i`th element of the `std::span` `sp`. It [`Expects`](#user-content-H-assert-expects) that the provided index is within the bounds of the span.
For [`gsl::at`](#user-content-H-span_ext-at) for [`gsl::span`](#user-content-H-span-span) see header [`span_ext`](#user-content-H-span_ext).
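A combined sketch (my own example) touching the array, container, and initializer list overloads above:
```cpp
#include <vector>
#include <gsl/util>

int main()
{
    int arr[] = {1, 2, 3};
    std::vector<int> vec = {4, 5, 6};

    int a = gsl::at(arr, 2);       // C style array overload, returns a reference to arr[2]
    int b = gsl::at(vec, 0);       // container overload
    int c = gsl::at({7, 8, 9}, 1); // initializer list overload, returns by value

    // gsl::at(arr, 3) would violate the Expects precondition and terminate.
    return (a == 3 && b == 4 && c == 8) ? 0 : 1;
}
```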
```cpp
template <class T, std::enable_if_t<std::is_move_assignable<T>::value && std::is_move_constructible<T>::value>>
void swap(T& a, T& b);
```
Swaps the contents of two objects. It exists only so that `gsl::swap` can be specialized for `gsl::not_null`, i.e. `gsl::swap<T>(gsl::not_null<T>&, gsl::not_null<T>&)`.

13
deps/GSL/include/CMakeLists.txt vendored Normal file
View File

@ -0,0 +1,13 @@
# Add include folders to the library and targets that consume it
# the SYSTEM keyword suppresses warnings for users of the library
#
# By adding this directory as an include directory the user gets a
# namespace effect.
#
# IE:
# #include <gsl/gsl>
if(PROJECT_IS_TOP_LEVEL)
target_include_directories(GSL INTERFACE $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>)
else()
target_include_directories(GSL SYSTEM INTERFACE $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>)
endif()

319
deps/GSL/tests/CMakeLists.txt vendored Normal file
View File

@ -0,0 +1,319 @@
cmake_minimum_required(VERSION 3.14...3.16)
project(GSLTests LANGUAGES CXX)
set(GSL_CXX_STANDARD "14" CACHE STRING "Use c++ standard")
set(CMAKE_CXX_STANDARD ${GSL_CXX_STANDARD})
set(CMAKE_CXX_EXTENSIONS OFF)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
include(FindPkgConfig)
include(ExternalProject)
# will make visual studio generated project group files
set_property(GLOBAL PROPERTY USE_FOLDERS ON)
if(CI_TESTING AND GSL_CXX_STANDARD EQUAL 20)
add_compile_definitions(FORCE_STD_SPAN_TESTS=1)
endif()
if(IOS)
add_compile_definitions(GTEST_HAS_DEATH_TEST=1 IOS_PROCESS_DELAY_WORKAROUND=1)
endif()
pkg_search_module(GTestMain gtest_main)
if (NOT GTestMain_FOUND)
# No pre-installed GTest is available, try to download it using Git.
find_package(Git REQUIRED QUIET)
configure_file(CMakeLists.txt.in googletest-download/CMakeLists.txt)
execute_process(
COMMAND ${CMAKE_COMMAND} -G "${CMAKE_GENERATOR}" .
RESULT_VARIABLE result
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/googletest-download
)
if(result)
message(FATAL_ERROR "CMake step for googletest failed: ${result}")
endif()
execute_process(
COMMAND ${CMAKE_COMMAND} --build .
RESULT_VARIABLE result
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/googletest-download
)
if(result)
message(FATAL_ERROR "CMake step for googletest failed: ${result}")
endif()
set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)
set(GTestMain_LIBRARIES gtest_main)
add_subdirectory(
${CMAKE_CURRENT_BINARY_DIR}/googletest-src
${CMAKE_CURRENT_BINARY_DIR}/googletest-build
EXCLUDE_FROM_ALL
)
endif()
if (CMAKE_CURRENT_SOURCE_DIR STREQUAL CMAKE_SOURCE_DIR)
find_package(Microsoft.GSL CONFIG REQUIRED)
enable_testing()
if (NOT DEFINED Microsoft.GSL_VERSION)
message(FATAL_ERROR "Microsoft.GSL_VERSION not defined!")
endif()
message(STATUS "Microsoft.GSL_VERSION = ${Microsoft.GSL_VERSION}")
endif()
if (MSVC AND (GSL_CXX_STANDARD GREATER_EQUAL 17))
set(GSL_CPLUSPLUS_OPT -Zc:__cplusplus -permissive-)
endif()
include(CheckCXXCompilerFlag)
# this interface adds compile options to how the tests are run
# please try to keep entries ordered =)
add_library(gsl_tests_config INTERFACE)
if(MSVC) # MSVC or simulating MSVC
target_compile_options(gsl_tests_config INTERFACE
${GSL_CPLUSPLUS_OPT}
/EHsc
/W4
/WX
$<$<CXX_COMPILER_ID:MSVC>:
/wd4996 # Use of function or classes marked [[deprecated]]
/wd26409 # CppCoreCheck - GTest
/wd26426 # CppCoreCheck - GTest
/wd26440 # CppCoreCheck - GTest
/wd26446 # CppCoreCheck - prefer gsl::at()
/wd26472 # CppCoreCheck - use gsl::narrow(_cast)
/wd26481 # CppCoreCheck - use span instead of pointer arithmetic
$<$<VERSION_LESS:$<CXX_COMPILER_VERSION>,1920>: # VS2015
/wd4189 # variable is initialized but not referenced
$<$<NOT:$<CONFIG:Debug>>: # Release, RelWithDebInfo
/wd4702 # Unreachable code
>
>
>
$<$<CXX_COMPILER_ID:Clang>:
-Weverything
-Wfloat-equal
-Wno-c++98-compat
-Wno-c++98-compat-pedantic
-Wno-covered-switch-default # GTest
-Wno-deprecated-declarations # Allow tests for [[deprecated]] elements
-Wno-global-constructors # GTest
-Wno-language-extension-token # GTest gtest-port.h
-Wno-missing-braces
-Wno-missing-prototypes
-Wno-shift-sign-overflow # GTest gtest-port.h
-Wno-undef # GTest
-Wno-used-but-marked-unused # GTest EXPECT_DEATH
-Wno-switch-default # GTest EXPECT_DEATH
$<$<EQUAL:${GSL_CXX_STANDARD},14>: # no support for [[maybe_unused]]
-Wno-unused-member-function
-Wno-unused-variable
$<$<VERSION_EQUAL:$<CXX_COMPILER_VERSION>,15.0.1>:
-Wno-deprecated # False positive in MSVC Clang 15.0.1 raises a C++17 warning
>
>
>
)
check_cxx_compiler_flag("-Wno-reserved-identifier" WARN_RESERVED_ID)
if (WARN_RESERVED_ID)
target_compile_options(gsl_tests_config INTERFACE "-Wno-reserved-identifier")
endif()
else()
target_compile_options(gsl_tests_config INTERFACE
-fno-strict-aliasing
-Wall
-Wcast-align
-Wconversion
-Wctor-dtor-privacy
-Werror
-Wextra
-Wpedantic
-Wshadow
-Wsign-conversion
-Wfloat-equal
-Wno-deprecated-declarations # Allow tests for [[deprecated]] elements
$<$<OR:$<CXX_COMPILER_ID:Clang>,$<CXX_COMPILER_ID:AppleClang>>:
-Weverything
-Wno-c++98-compat
-Wno-c++98-compat-pedantic
-Wno-missing-braces
-Wno-covered-switch-default # GTest
-Wno-global-constructors # GTest
-Wno-missing-prototypes
-Wno-padded
-Wno-switch-default
-Wno-unknown-attributes
-Wno-used-but-marked-unused # GTest EXPECT_DEATH
-Wno-weak-vtables
$<$<EQUAL:${GSL_CXX_STANDARD},14>: # no support for [[maybe_unused]]
-Wno-unused-member-function
-Wno-unused-variable
>
>
$<$<CXX_COMPILER_ID:Clang>:
$<$<AND:$<VERSION_GREATER:$<CXX_COMPILER_VERSION>,4.99>,$<VERSION_LESS:$<CXX_COMPILER_VERSION>,6>>:
$<$<EQUAL:${GSL_CXX_STANDARD},17>:-Wno-undefined-func-template>
>
$<$<AND:$<EQUAL:${GSL_CXX_STANDARD},20>,$<OR:$<CXX_COMPILER_VERSION:11.0.0>,$<CXX_COMPILER_VERSION:10.0.0>>>:
-Wno-zero-as-null-pointer-constant # failing Clang Ubuntu 20.04 tests, seems to be a bug with clang 10.0.0
# and clang 11.0.0. (operator< is being re-written by the compiler
# as operator<=> and raising the warning)
>
>
$<$<CXX_COMPILER_ID:AppleClang>:
$<$<AND:$<VERSION_GREATER:$<CXX_COMPILER_VERSION>,9.1>,$<VERSION_LESS:$<CXX_COMPILER_VERSION>,10>>:
$<$<EQUAL:${GSL_CXX_STANDARD},17>:-Wno-undefined-func-template>
>
>
$<$<CXX_COMPILER_ID:GNU>:
-Wdouble-promotion # float implicit to double
-Wlogical-op # suspicious uses of logical operators
$<$<NOT:$<VERSION_LESS:$<CXX_COMPILER_VERSION>,6>>:
-Wduplicated-cond # duplicated if-else conditions
-Wmisleading-indentation
-Wnull-dereference
$<$<EQUAL:${GSL_CXX_STANDARD},14>: # no support for [[maybe_unused]]
-Wno-unused-variable
>
>
$<$<NOT:$<VERSION_LESS:$<CXX_COMPILER_VERSION>,7>>:
-Wduplicated-branches # identical if-else branches
>
>
)
endif(MSVC)
check_cxx_compiler_flag("-Wno-unsafe-buffer-usage" WARN_UNSAFE_BUFFER)
if (WARN_UNSAFE_BUFFER)
# This test uses very greedy heuristics such as "no pointer arithmetic on raw buffer"
target_compile_options(gsl_tests_config INTERFACE "-Wno-unsafe-buffer-usage")
endif()
# for tests to find the gtest header
target_include_directories(gsl_tests_config SYSTEM INTERFACE
googletest/googletest/include
)
add_executable(gsl_tests
algorithm_tests.cpp
assertion_tests.cpp
at_tests.cpp
byte_tests.cpp
notnull_tests.cpp
owner_tests.cpp
pointers_tests.cpp
span_compatibility_tests.cpp
span_ext_tests.cpp
span_tests.cpp
strict_notnull_tests.cpp
utils_tests.cpp
)
target_link_libraries(gsl_tests
Microsoft.GSL::GSL
gsl_tests_config
${GTestMain_LIBRARIES}
)
add_test(gsl_tests gsl_tests)
# No exception tests
foreach(flag_var
CMAKE_CXX_FLAGS CMAKE_CXX_FLAGS_DEBUG CMAKE_CXX_FLAGS_RELEASE
CMAKE_CXX_FLAGS_MINSIZEREL CMAKE_CXX_FLAGS_RELWITHDEBINFO)
STRING (REGEX REPLACE "/EHsc" "" ${flag_var} "${${flag_var}}")
endforeach(flag_var)
# this interface adds compile options to how the tests are run
# please try to keep entries ordered =)
add_library(gsl_tests_config_noexcept INTERFACE)
if(MSVC) # MSVC or simulating MSVC
target_compile_definitions(gsl_tests_config_noexcept INTERFACE
_HAS_EXCEPTIONS=0 # disable exceptions in the Microsoft STL
)
target_compile_options(gsl_tests_config_noexcept INTERFACE
${GSL_CPLUSPLUS_OPT}
/W4
/WX
$<$<CXX_COMPILER_ID:MSVC>:
/wd4577
/wd4702
/wd26440 # CppCoreCheck - GTest
/wd26446 # CppCoreCheck - prefer gsl::at()
>
$<$<CXX_COMPILER_ID:Clang>:
-Weverything
-Wfloat-equal
-Wno-c++98-compat
-Wno-c++98-compat-pedantic
-Wno-missing-prototypes
-Wno-unknown-attributes
$<$<EQUAL:${GSL_CXX_STANDARD},14>:
$<$<VERSION_EQUAL:$<CXX_COMPILER_VERSION>,15.0.1>:
-Wno-deprecated # False positive in MSVC Clang 15.0.1 raises a C++17 warning
>
>
>
)
check_cxx_compiler_flag("-Wno-reserved-identifier" WARN_RESERVED_ID)
if (WARN_RESERVED_ID)
target_compile_options(gsl_tests_config_noexcept INTERFACE "-Wno-reserved-identifier")
endif()
else()
target_compile_options(gsl_tests_config_noexcept INTERFACE
-fno-exceptions
-fno-strict-aliasing
-Wall
-Wcast-align
-Wconversion
-Wctor-dtor-privacy
-Werror
-Wextra
-Wpedantic
-Wshadow
-Wsign-conversion
-Wfloat-equal
$<$<OR:$<CXX_COMPILER_ID:Clang>,$<CXX_COMPILER_ID:AppleClang>>:
-Weverything
-Wno-c++98-compat
-Wno-c++98-compat-pedantic
-Wno-missing-prototypes
-Wno-unknown-attributes
-Wno-weak-vtables
>
$<$<CXX_COMPILER_ID:GNU>:
-Wdouble-promotion # float implicit to double
-Wlogical-op # suspicious uses of logical operators
-Wuseless-cast # casting to its own type
$<$<NOT:$<VERSION_LESS:$<CXX_COMPILER_VERSION>,6>>:
-Wduplicated-cond # duplicated if-else conditions
-Wmisleading-indentation
-Wnull-dereference
>
$<$<NOT:$<VERSION_LESS:$<CXX_COMPILER_VERSION>,7>>:
-Wduplicated-branches # identical if-else branches
>
$<$<NOT:$<VERSION_LESS:$<CXX_COMPILER_VERSION>,8>>:
-Wcast-align=strict # increase alignment (i.e. char* to int*)
>
>
)
endif(MSVC)
check_cxx_compiler_flag("-Wno-unsafe-buffer-usage" WARN_UNSAFE_BUFFER)
if (WARN_UNSAFE_BUFFER)
# This test uses very greedy heuristics such as "no pointer arithmetic on raw buffer"
target_compile_options(gsl_tests_config_noexcept INTERFACE "-Wno-unsafe-buffer-usage")
endif()
add_executable(gsl_noexcept_tests no_exception_ensure_tests.cpp)
target_link_libraries(gsl_noexcept_tests
Microsoft.GSL::GSL
gsl_tests_config_noexcept
)
add_test(gsl_noexcept_tests gsl_noexcept_tests)

1
deps/asmjit/.github/FUNDING.yml vendored Normal file
View File

@ -0,0 +1 @@
github: kobalicek

View File

@ -0,0 +1,42 @@
name: Bug Report
description: File a bug report
body:
- type: markdown
attributes:
value: |
Before you hit the submit button:
* Please see our [Contribution Guidelines](https://github.com/asmjit/asmjit/blob/master/CONTRIBUTING.md).
* Make sure that you use a recent AsmJit (master branch) before filing a bug report.
* Make sure that you use logging and error handling features to collect as much information as possible, if applicable.
- type: textarea
id: issue-description
attributes:
label: Issue Description
description: Please share a clear and concise description of the issue and optionally provide reproducibility information and output from AsmJit's logger.
placeholder: Description
validations:
required: true
- type: dropdown
id: operating-system
attributes:
label: Operating System
multiple: true
options:
- Not specified / possibly all
- Windows
- Linux
- Mac
- Android
- Other
- type: dropdown
id: target-architecture
attributes:
label: Architecture
multiple: true
options:
- Not specified
- X86 / X86_64
- AArch32
- AArch64
- RISC-V
- Other

View File

@ -0,0 +1,18 @@
name: Feature Request
description: Request a new feature or enhancement
labels: [enhancement]
body:
- type: markdown
attributes:
value: |
Before you hit the submit button:
* Please see our [Contribution Guidelines](https://github.com/asmjit/asmjit/blob/master/CONTRIBUTING.md).
* Make sure that you use a recent AsmJit (master branch) before filing a feature request.
- type: textarea
id: issue-description
attributes:
label: Issue Description
description: Please share a clear and concise description of a new feature or enhancement.
placeholder: Description
validations:
required: true

View File

@ -0,0 +1,18 @@
name: Help & Questions
description: Ask a question or get help
labels: [question]
body:
- type: markdown
attributes:
value: |
Before you hit the submit button:
* Please see our [Contribution Guidelines](https://github.com/asmjit/asmjit/blob/master/CONTRIBUTING.md).
* [Gitter Chat](https://app.gitter.im/#/room/#asmjit:gitter.im) is usually faster to get answers.
* If you need a help, please include as much information as possible to make it clear what the intend is.
- type: textarea
id: issue-description
attributes:
label: Details
description: The description of the problem or question.
validations:
required: true

View File

@ -0,0 +1,11 @@
name: Other Issues
description: Something that doesn't fit the other categories
body:
- type: textarea
id: issue-description
attributes:
label: Issue Description
description: Please share a clear and concise description of the issue.
placeholder: Description
validations:
required: true

View File

@ -0,0 +1,25 @@
{
"diagnostics": {
"asan": { "definitions": ["ASMJIT_SANITIZE=address"] },
"ubsan": { "definitions": ["ASMJIT_SANITIZE=undefined"] },
"msan": { "definitions": ["ASMJIT_SANITIZE=memory"] }
},
"valgrind_arguments": [
"--leak-check=full",
"--show-reachable=yes",
"--track-origins=yes"
],
"tests": [
{ "optional": true, "cmd": ["asmjit_test_unit", "--quick"] },
{ "optional": true, "cmd": ["asmjit_test_assembler"] },
{ "optional": true, "cmd": ["asmjit_test_assembler", "--validate"] },
{ "optional": true, "cmd": ["asmjit_test_emitters"] },
{ "optional": true, "cmd": ["asmjit_test_execute"] },
{ "optional": true, "cmd": ["asmjit_test_compiler"] },
{ "optional": true, "cmd": ["asmjit_test_instinfo"] },
{ "optional": true, "cmd": ["asmjit_test_x86_sections"] },
{ "optional": true, "cmd": ["asmjit_test_perf", "--quick"] }
]
}

232
deps/asmjit/.github/workflows/build.yml vendored Normal file
View File

@ -0,0 +1,232 @@
name: "Build"
on:
push:
pull_request:
concurrency:
group: ${{github.ref}}
cancel-in-progress: ${{github.ref != 'refs/heads/master'}}
defaults:
run:
shell: bash
jobs:
source-check:
name: "source check"
runs-on: ubuntu-latest
steps:
- name: "Checkout"
uses: actions/checkout@v4
- name: "Setup node.js"
uses: actions/setup-node@v4
with:
node-version: "*"
- name: "Check Enumerations"
run: |
cd tools
node enumgen.js --verify
build:
strategy:
fail-fast: false
matrix:
include:
- { title: "diag-analyze" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Debug" , diagnostics: "analyze-build" }
- { title: "diag-asan" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", diagnostics: "asan", defs: "ASMJIT_TEST=1" }
- { title: "diag-msan" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", diagnostics: "msan", defs: "ASMJIT_TEST=1" }
- { title: "diag-ubsan" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", diagnostics: "ubsan", defs: "ASMJIT_TEST=1" }
- { title: "diag-hardened" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", diagnostics: "hardened", defs: "ASMJIT_TEST=1" }
- { title: "diag-valgrind" , host: "ubuntu-24.04" , arch: "x64" , cc: "clang-18", conf: "Release", diagnostics: "valgrind", defs: "ASMJIT_TEST=1" }
- { title: "no-deprecated" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_DEPRECATED=1" }
- { title: "no-intrinsics" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_INTRINSICS=1" }
- { title: "no-logging" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_LOGGING=1" }
- { title: "no-logging-text" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_LOGGING=1,ASMJIT_NO_TEXT=1" }
- { title: "no-builder" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_BUILDER=1" }
- { title: "no-compiler" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_COMPILER=1" }
- { title: "no-introspection", host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_COMPILER=1,ASMJIT_NO_INTROSPECTION=1" }
- { title: "no-jit" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_JIT=1" }
- { title: "no-validation" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_VALIDATION=1" }
- { title: "no-x86" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_X86=1" }
- { title: "no-aarch64" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1,ASMJIT_NO_AARCH64=1" }
- { title: "lang-c++17" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Debug" , defs: "ASMJIT_TEST=1,CMAKE_CXX_FLAGS=-std=c++17" }
- { title: "lang-c++20" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Debug" , defs: "ASMJIT_TEST=1,CMAKE_CXX_FLAGS=-std=c++20" }
- { title: "lang-c++23" , host: "ubuntu-latest" , arch: "x64" , cc: "clang-18", conf: "Debug" , defs: "ASMJIT_TEST=1,CMAKE_CXX_FLAGS=-std=c++23" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-9" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-9" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-9" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-9" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-10" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-10" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-10" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-10" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-11" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-11" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-11" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-11" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-12" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-12" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-12" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-12" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-13" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "gcc-13" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-13" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "gcc-13" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-11", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-11", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-11", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-11", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-12", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-12", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-12", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-12", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-13", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-13", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-13", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-13", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-14", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-14", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-14", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-14", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-15", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-15", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-15", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-15", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-16", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-16", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-16", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-16", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-17", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-17", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-17", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-17", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-18", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x86" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-18", conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "linux" , host: "ubuntu-22.04" , arch: "x64" , cc: "clang-18", conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-13" , arch: "x64" , cc: "gcc-14" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-13" , arch: "x64" , cc: "gcc-14" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-13" , arch: "x64" , cc: "clang" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-13" , arch: "x64" , cc: "clang" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-14" , arch: "arm64" , cc: "clang" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "macos" , host: "macos-14" , arch: "arm64" , cc: "clang" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2019" , arch: "x86" , cc: "vs2019" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2019" , arch: "x86" , cc: "vs2019" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2019" , arch: "x64" , cc: "vs2019" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2019" , arch: "x64" , cc: "vs2019" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2022" , arch: "x86" , cc: "vs2022" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2022" , arch: "x86" , cc: "vs2022" , conf: "Release", defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2022" , arch: "x64" , cc: "vs2022" , conf: "Debug" , defs: "ASMJIT_TEST=1" }
- { title: "windows" , host: "windows-2022" , arch: "x64" , cc: "vs2022" , conf: "Release", defs: "ASMJIT_TEST=1" }
# Cross compiled, cannot run tests (Windows/ARM64).
- { title: "windows" , host: "windows-2022" , arch: "arm64" , cc: "vs2022" , conf: "Debug" , defs: "ASMJIT_TEST=0" }
- { title: "windows" , host: "windows-2022" , arch: "arm64" , cc: "vs2022" , conf: "Release", defs: "ASMJIT_TEST=0" }
# Cross compiled, cannot run tests (Windows/UWP).
- { title: "windows/uwp" , host: "windows-2022" , arch: "x64" , cc: "vs2022" , conf: "Release", defs: "ASMJIT_TEST=0,CMAKE_SYSTEM_NAME=WindowsStore,CMAKE_SYSTEM_VERSION=10.0,CMAKE_CXX_FLAGS=-D_WIN32_WINNT=0x0A00" }
- { title: "freebsd" , host: "ubuntu-latest" , arch: "x64" , cc: "clang" , conf: "Release", vm: "freebsd", vm_ver: "14.1", defs: "ASMJIT_TEST=1" }
- { title: "freebsd" , host: "ubuntu-latest" , arch: "arm64" , cc: "clang" , conf: "Release", vm: "freebsd", vm_ver: "14.1", defs: "ASMJIT_TEST=1" }
- { title: "netbsd" , host: "ubuntu-latest" , arch: "x64" , cc: "clang" , conf: "Release", vm: "netbsd" , vm_ver: "10.0", defs: "ASMJIT_TEST=1" }
- { title: "netbsd" , host: "ubuntu-latest" , arch: "arm64" , cc: "clang" , conf: "Release", vm: "netbsd" , vm_ver: "10.0", defs: "ASMJIT_TEST=1" }
- { title: "openbsd" , host: "ubuntu-latest" , arch: "x64" , cc: "clang" , conf: "Release", vm: "openbsd", vm_ver: "7.4" , defs: "ASMJIT_TEST=1" }
- { title: "openbsd" , host: "ubuntu-latest" , arch: "arm64" , cc: "clang" , conf: "Release", vm: "openbsd", vm_ver: "7.4" , defs: "ASMJIT_TEST=1" }
- { title: "debian" , host: "ubuntu-latest" , arch: "arm/v7" , cc: "clang" , conf: "Release", vm: "debian:unstable", defs: "ASMJIT_TEST=1" }
- { title: "debian" , host: "ubuntu-latest" , arch: "arm64" , cc: "clang" , conf: "Release", vm: "debian:unstable", defs: "ASMJIT_TEST=1" }
- { title: "debian" , host: "ubuntu-latest" , arch: "riscv64", cc: "clang" , conf: "Release", vm: "debian:unstable", defs: "ASMJIT_TEST=1" }
- { title: "debian" , host: "ubuntu-latest" , arch: "ppc64le", cc: "clang" , conf: "Release", vm: "debian:unstable", defs: "ASMJIT_TEST=1" }
name: "${{matrix.title}}/${{matrix.arch}}, ${{matrix.cc}} ${{matrix.conf}}"
runs-on: "${{matrix.host}}"
steps:
- name: "Checkout"
uses: actions/checkout@v4
with:
path: "source"
- name: "Checkout Build Actions"
uses: actions/checkout@v4
with:
repository: build-actions/build-actions
path: "build-actions"
- name: "Python"
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: QEMU
if: ${{matrix.vm && !matrix.vm_ver}}
uses: docker/setup-qemu-action@v3
with:
platforms: linux/${{matrix.arch}}
- name: "Build & Test - Native"
if: ${{!matrix.vm}}
run: python build-actions/action.py
--source-dir=source
--config=source/.github/workflows/build-config.json
--compiler=${{matrix.cc}}
--diagnostics=${{matrix.diagnostics}}
--architecture=${{matrix.arch}}
--problem-matcher=auto
--build-type=${{matrix.conf}}
--build-defs=${{matrix.defs}}
- name: "Build & Test - Cross Platform Actions"
if: ${{matrix.vm && matrix.vm_ver}}
uses: cross-platform-actions/action@master
with:
operating_system: ${{matrix.vm}}
architecture: ${{matrix.arch}}
version: ${{matrix.vm_ver}}
sync_files: "runner-to-vm"
shutdown_vm: false
shell: bash
run: |
set -e
PATH="/usr/sbin:/usr/pkg/sbin:/usr/pkg/bin:$PATH:$(pwd)/build-actions"
CI_NETBSD_USE_PKGIN=1
export PATH
export CI_NETBSD_USE_PKGIN
sh ./build-actions/prepare-environment.sh
python3 build-actions/action.py \
--source-dir=source \
--config=source/.github/workflows/build-config.json \
--compiler=${{matrix.cc}} \
--diagnostics=${{matrix.diagnostics}} \
--architecture=${{matrix.arch}} \
--problem-matcher=auto \
--build-type=${{matrix.conf}} \
--build-defs=${{matrix.defs}}
- name: "Build & Test - Docker + QEMU"
if: ${{matrix.vm && !matrix.vm_ver}}
run: |
docker run \
--rm \
-v $(pwd):/${{github.workspace}} \
-w ${{github.workspace}}/build-actions \
--platform linux/${{matrix.arch}} \
${{matrix.vm}} \
bash action.sh \
--source-dir=../source \
--config=../source/.github/workflows/build-config.json \
--compiler=${{matrix.cc}} \
--diagnostics=${{matrix.diagnostics}} \
--architecture=${{matrix.arch}} \
--problem-matcher=auto \
--build-type=${{matrix.conf}} \
--build-defs=${{matrix.defs}}

687
deps/asmjit/CMakeLists.txt vendored Normal file
View File

@ -0,0 +1,687 @@
cmake_minimum_required(VERSION 3.19 FATAL_ERROR)
# Don't create a project if it was already created by another CMakeLists.txt. This makes
# it possible to support both add_subdirectory() and include() ways of using AsmJit as a
# dependency.
if (NOT CMAKE_PROJECT_NAME OR "${CMAKE_PROJECT_NAME}" STREQUAL "asmjit")
project(asmjit CXX)
endif()
include(CheckCXXCompilerFlag)
include(CheckCXXSourceCompiles)
include(GNUInstallDirs)
# AsmJit - Configuration - Build
# ==============================
if (NOT DEFINED ASMJIT_TEST)
set(ASMJIT_TEST FALSE)
endif()
if (NOT DEFINED ASMJIT_EMBED)
set(ASMJIT_EMBED FALSE)
endif()
if (NOT DEFINED ASMJIT_STATIC)
set(ASMJIT_STATIC ${ASMJIT_EMBED})
endif()
if (NOT DEFINED ASMJIT_SANITIZE)
set(ASMJIT_SANITIZE FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_CUSTOM_FLAGS)
set(ASMJIT_NO_CUSTOM_FLAGS FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_NATVIS)
set(ASMJIT_NO_NATVIS FALSE)
endif()
# EMBED implies STATIC.
if (ASMJIT_EMBED AND NOT ASMJIT_STATIC)
set(ASMJIT_STATIC TRUE)
endif()
# AsmJit - Configuration - Backend
# ================================
if (NOT DEFINED ASMJIT_NO_X86)
set(ASMJIT_NO_X86 FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_AARCH64)
set(ASMJIT_NO_AARCH64 FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_FOREIGN)
set(ASMJIT_NO_FOREIGN FALSE)
endif()
# AsmJit - Configuration - Features
# =================================
if (NOT DEFINED ASMJIT_NO_DEPRECATED)
set(ASMJIT_NO_DEPRECATED FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_SHM_OPEN)
set(ASMJIT_NO_SHM_OPEN FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_JIT)
set(ASMJIT_NO_JIT FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_TEXT)
set(ASMJIT_NO_TEXT FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_LOGGING)
set(ASMJIT_NO_LOGGING ${ASMJIT_NO_TEXT})
endif()
if (NOT DEFINED ASMJIT_NO_VALIDATION)
set(ASMJIT_NO_VALIDATION FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_INTROSPECTION)
set(ASMJIT_NO_INTROSPECTION FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_BUILDER)
set(ASMJIT_NO_BUILDER FALSE)
endif()
if (NOT DEFINED ASMJIT_NO_COMPILER)
if (ASMJIT_NO_BUILDER OR ASMJIT_NO_INTROSPECTION)
set(ASMJIT_NO_COMPILER TRUE)
else()
set(ASMJIT_NO_COMPILER FALSE)
endif()
endif()
# AsmJit - Configuration - CMake Introspection
# ============================================
set(ASMJIT_DIR "${CMAKE_CURRENT_LIST_DIR}" CACHE PATH "Location of 'asmjit'")
set(ASMJIT_TEST "${ASMJIT_TEST}" CACHE BOOL "Build 'asmjit' test applications")
set(ASMJIT_EMBED "${ASMJIT_EMBED}" CACHE BOOL "Embed 'asmjit' library (no targets)")
set(ASMJIT_STATIC "${ASMJIT_STATIC}" CACHE BOOL "Build 'asmjit' library as static")
set(ASMJIT_SANITIZE "${ASMJIT_SANITIZE}" CACHE STRING "Build with sanitizers: 'address', 'undefined', etc...")
set(ASMJIT_NO_NATVIS "${ASMJIT_NO_NATVIS}" CACHE BOOL "Disable natvis support (embedding asmjit.natvis in PDB)")
set(ASMJIT_NO_CUSTOM_FLAGS "${ASMJIT_NO_CUSTOM_FLAGS}" CACHE BOOL "Disable extra compilation flags added by AsmJit to its targets")
set(ASMJIT_NO_X86 "${ASMJIT_NO_X86}" CACHE BOOL "Disable X86/X64 backend")
set(ASMJIT_NO_AARCH64 "${ASMJIT_NO_AARCH64}" CACHE BOOL "Disable AArch64 backend")
set(ASMJIT_NO_FOREIGN "${ASMJIT_NO_FOREIGN}" CACHE BOOL "Disable all foreign architectures (enables only a target architecture)")
set(ASMJIT_NO_DEPRECATED "${ASMJIT_NO_DEPRECATED}" CACHE BOOL "Disable deprecated API at build time")
set(ASMJIT_NO_SHM_OPEN "${ASMJIT_NO_SHM_OPEN}" CACHE BOOL "Disable the use of shm_open() even on platforms where it's supported")
set(ASMJIT_NO_JIT "${ASMJIT_NO_JIT}" CACHE BOOL "Disable VirtMem, JitAllocator, and JitRuntime at build time")
set(ASMJIT_NO_TEXT "${ASMJIT_NO_TEXT}" CACHE BOOL "Disable textual representation of instructions, enums, cpu features, ...")
set(ASMJIT_NO_LOGGING "${ASMJIT_NO_LOGGING}" CACHE BOOL "Disable logging features at build time")
set(ASMJIT_NO_VALIDATION "${ASMJIT_NO_VALIDATION}" CACHE BOOL "Disable instruction validation API at build time")
set(ASMJIT_NO_INTROSPECTION "${ASMJIT_NO_INTROSPECTION}" CACHE BOOL "Disable instruction introspection API at build time")
set(ASMJIT_NO_BUILDER "${ASMJIT_NO_BUILDER}" CACHE BOOL "Disable Builder emitter at build time")
set(ASMJIT_NO_COMPILER "${ASMJIT_NO_COMPILER}" CACHE BOOL "Disable Compiler emitter at build time")
# AsmJit - Project
# ================
set(ASMJIT_INCLUDE_DIRS "${ASMJIT_DIR}/src") # Include directory is the same as source dir.
set(ASMJIT_DEPS "") # AsmJit dependencies (libraries) for the linker.
set(ASMJIT_LIBS "") # Dependencies of libs/apps that want to use AsmJit.
set(ASMJIT_CFLAGS "") # Public compiler flags.
set(ASMJIT_PRIVATE_CFLAGS "") # Private compiler flags independent of build type.
set(ASMJIT_PRIVATE_CFLAGS_DBG "") # Private compiler flags used by debug builds.
set(ASMJIT_PRIVATE_CFLAGS_REL "") # Private compiler flags used by release builds.
set(ASMJIT_SANITIZE_CFLAGS "") # Compiler flags required by currently enabled sanitizers.
set(ASMJIT_SANITIZE_LFLAGS "") # Linker flags required by currently enabled sanitizers.
# AsmJit - Utilities
# ==================
function(asmjit_detect_cflags out)
set(out_array ${${out}})
foreach(flag ${ARGN})
string(REGEX REPLACE "[+]" "x" flag_signature "${flag}")
string(REGEX REPLACE "[-=:;/.\]" "_" flag_signature "${flag_signature}")
check_cxx_compiler_flag(${flag} "__CxxFlag_${flag_signature}")
if (${__CxxFlag_${flag_signature}})
list(APPEND out_array "${flag}")
endif()
endforeach()
set(${out} "${out_array}" PARENT_SCOPE)
endfunction()
# Support for various sanitizers provided by C/C++ compilers.
function(asmjit_detect_sanitizers out)
set(_out_array ${${out}})
set(_flags "")
foreach(_arg ${ARGN})
string(REPLACE "," ";" _arg "${_arg}")
list(APPEND _flags ${_arg})
endforeach()
foreach(_flag ${_flags})
if (NOT "${_flag}" MATCHES "^-fsanitize=")
SET(_flag "-fsanitize=${_flag}")
endif()
# Sanitizers also require link flags, see CMAKE_REQUIRED_FLAGS.
set(CMAKE_REQUIRED_FLAGS "${_flag}")
asmjit_detect_cflags(_out_array ${_flag})
unset(CMAKE_REQUIRED_FLAGS)
endforeach()
set(${out} "${_out_array}" PARENT_SCOPE)
endfunction()
function(asmjit_add_target target target_type)
set(single_val "")
set(multi_val SOURCES LIBRARIES CFLAGS CFLAGS_DBG CFLAGS_REL)
cmake_parse_arguments("X" "" "${single_val}" "${multi_val}" ${ARGN})
if ("${target_type}" MATCHES "^(EXECUTABLE|TEST)$")
add_executable(${target} ${X_SOURCES})
else()
add_library(${target} ${target_type} ${X_SOURCES})
endif()
set_target_properties(${target}
PROPERTIES
DEFINE_SYMBOL ""
CXX_VISIBILITY_PRESET hidden)
target_compile_options(${target} PRIVATE ${X_CFLAGS} ${ASMJIT_SANITIZE_CFLAGS} $<$<CONFIG:Debug>:${X_CFLAGS_DBG}> $<$<NOT:$<CONFIG:Debug>>:${X_CFLAGS_REL}>)
target_compile_features(${target} PUBLIC cxx_std_11)
target_link_options(${target} PRIVATE ${ASMJIT_PRIVATE_LFLAGS})
target_link_libraries(${target} PRIVATE ${X_LIBRARIES})
if ("${target_type}" STREQUAL "TEST")
add_test(NAME ${target} COMMAND ${target})
endif()
endfunction()
# AsmJit - Compiler Support
# =========================
# We will have to keep this most likely forever as some users may still be using it.
set(ASMJIT_INCLUDE_DIR "${ASMJIT_INCLUDE_DIRS}")
if (NOT ASMJIT_NO_CUSTOM_FLAGS)
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "MSVC" OR "x${CMAKE_CXX_COMPILER_FRONTEND_VARIANT}" STREQUAL "xMSVC")
list(APPEND ASMJIT_PRIVATE_CFLAGS
-MP # [+] Multi-Process Compilation.
-GF # [+] Eliminate duplicate strings.
-Zc:__cplusplus # [+] Conforming __cplusplus definition.
-Zc:inline # [+] Remove unreferenced COMDAT.
-Zc:strictStrings # [+] Strict const qualification of string literals.
-Zc:threadSafeInit- # [-] Thread-safe statics.
-W4) # [+] Warning level 4.
list(APPEND ASMJIT_PRIVATE_CFLAGS_DBG
-GS) # [+] Buffer security-check.
list(APPEND ASMJIT_PRIVATE_CFLAGS_REL
-GS- # [-] Buffer security-check.
-O2 # [+] Favor speed over size.
-Oi) # [+] Generate intrinsic functions.
elseif ("${CMAKE_CXX_COMPILER_ID}" MATCHES "^(GNU|Clang|AppleClang)$")
list(APPEND ASMJIT_PRIVATE_CFLAGS -Wall -Wextra -Wconversion)
list(APPEND ASMJIT_PRIVATE_CFLAGS -fno-math-errno)
list(APPEND ASMJIT_PRIVATE_CFLAGS_REL -O2)
# -fno-semantic-interposition is not available on apple - the compiler issues a warning, which is not detected.
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "AppleClang")
asmjit_detect_cflags(ASMJIT_PRIVATE_CFLAGS -fno-threadsafe-statics)
else()
asmjit_detect_cflags(ASMJIT_PRIVATE_CFLAGS -fno-threadsafe-statics -fno-semantic-interposition)
endif()
# The following flags can save few bytes in the resulting binary.
asmjit_detect_cflags(ASMJIT_PRIVATE_CFLAGS_REL
-fmerge-all-constants # Merge all constants even if it violates ISO C++.
-fno-enforce-eh-specs) # Don't enforce termination if noexcept function throws.
endif()
endif()
# Support for sanitizers.
if (ASMJIT_SANITIZE)
asmjit_detect_sanitizers(ASMJIT_SANITIZE_CFLAGS ${ASMJIT_SANITIZE})
if (ASMJIT_SANITIZE_CFLAGS)
message("-- Enabling sanitizers: '${ASMJIT_SANITIZE_CFLAGS}'")
# Linker must receive the same flags as the compiler when it comes to sanitizers.
set(ASMJIT_SANITIZE_LFLAGS ${ASMJIT_SANITIZE_CFLAGS})
# Don't omit frame pointer if sanitizers are enabled.
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "MSVC" OR "x${CMAKE_CXX_COMPILER_FRONTEND_VARIANT}" STREQUAL "xMSVC")
list(APPEND ASMJIT_SANITIZE_CFLAGS -Oy-)
else()
list(APPEND ASMJIT_SANITIZE_CFLAGS -fno-omit-frame-pointer -g)
endif()
list(APPEND ASMJIT_PRIVATE_CFLAGS ${ASMJIT_SANITIZE_CFLAGS})
list(APPEND ASMJIT_PRIVATE_LFLAGS ${ASMJIT_SANITIZE_LFLAGS})
endif()
endif()
if (WIN32)
# Dependency: nothing extra at the moment.
elseif ("${CMAKE_SYSTEM_NAME}" STREQUAL "Android")
# Dependency: libc is the only required library on Android as it also provides libthread.
message("-- Dependency: adding libc (Android target detected)")
list(APPEND ASMJIT_DEPS c)
elseif ("${CMAKE_SYSTEM_NAME}" STREQUAL "Haiku")
# Dependency: libroot is used by Haiku instead of libc, so link to libroot and libpthread.
message("-- Dependency: adding libroot and libpthread (Haiku target detected)")
list(APPEND ASMJIT_DEPS root pthread)
else()
# Dependency: libc is always required.
message("-- Dependency: adding libc (Linux, BSD, or other UNIX/POSIX environment)")
list(APPEND ASMJIT_DEPS c)
# Dependency: pthread (required so AsmJit can use pthread_lock).
check_cxx_source_compiles("
#include <pthread.h>
int main() {
pthread_mutex_t m;
pthread_mutex_init(&m, nullptr);
return pthread_mutex_destroy(&m);
}
" ASMJIT_LIBC_HAS_LIBPTHREAD)
if (ASMJIT_LIBC_HAS_LIBPTHREAD)
message("-- Dependency: libpthread provided by libc (not linking to libpthread)")
else()
message("-- Dependency: libpthread not provided by libc, linking to libpthread")
list(APPEND ASMJIT_DEPS pthread)
endif()
# Dependency: shm_open (required so AsmJit can use shm_open on supported platforms).
if ("${CMAKE_SYSTEM_NAME}" MATCHES "^(Linux|NetBSD)$" AND NOT ASMJIT_NO_SHM_OPEN)
check_cxx_source_compiles("
#include <sys/mman.h>
int main() {
const char file_name[1] {};
return shm_open(file_name, 0, 0);
}
" ASMJIT_LIBC_HAS_LIBRT)
if (ASMJIT_LIBC_HAS_LIBRT)
message("-- Dependency: shm_open provided by libc (not linking to librt)")
else()
message("-- Dependency: shm_open not provided by libc, linking to librt")
list(APPEND ASMJIT_DEPS rt)
endif()
endif()
endif()
set(ASMJIT_LIBS ${ASMJIT_DEPS})
if (NOT ASMJIT_EMBED)
list(INSERT ASMJIT_LIBS 0 asmjit)
endif()
if (ASMJIT_EMBED)
set(ASMJIT_TARGET_TYPE "EMBED")
elseif (ASMJIT_STATIC)
set(ASMJIT_TARGET_TYPE "STATIC")
else()
set(ASMJIT_TARGET_TYPE "SHARED")
endif()
foreach(build_option # AsmJit build options.
ASMJIT_STATIC
ASMJIT_NO_DEPRECATED
# AsmJit backends selection.
ASMJIT_NO_X86
ASMJIT_NO_AARCH64
ASMJIT_NO_FOREIGN
# AsmJit features selection.
ASMJIT_NO_JIT
ASMJIT_NO_TEXT
ASMJIT_NO_LOGGING
ASMJIT_NO_INTROSPECTION
ASMJIT_NO_VALIDATION
ASMJIT_NO_BUILDER
ASMJIT_NO_COMPILER)
if (${build_option})
List(APPEND ASMJIT_CFLAGS "-D${build_option}")
List(APPEND ASMJIT_PRIVATE_CFLAGS "-D${build_option}")
endif()
endforeach()
# AsmJit - Linker Support
# =======================
if (WIN32)
if (CMAKE_LINKER MATCHES "link\\.exe" OR CMAKE_LINKER MATCHES "lld-link\\.exe")
set(ASMJIT_LINKER_SUPPORTS_NATVIS TRUE)
endif()
endif()
# AsmJit - Source
# ===============
set(ASMJIT_SRC_LIST
asmjit/asmjit.h
asmjit/asmjit-scope-begin.h
asmjit/asmjit-scope-end.h
asmjit/core.h
asmjit/core/api-build_p.h
asmjit/core/api-config.h
asmjit/core/archtraits.cpp
asmjit/core/archtraits.h
asmjit/core/archcommons.h
asmjit/core/assembler.cpp
asmjit/core/assembler.h
asmjit/core/builder.cpp
asmjit/core/builder.h
asmjit/core/codebuffer.h
asmjit/core/codeholder.cpp
asmjit/core/codeholder.h
asmjit/core/codewriter.cpp
asmjit/core/codewriter_p.h
asmjit/core/compiler.cpp
asmjit/core/compiler.h
asmjit/core/compilerdefs.h
asmjit/core/constpool.cpp
asmjit/core/constpool.h
asmjit/core/cpuinfo.cpp
asmjit/core/cpuinfo.h
asmjit/core/emithelper.cpp
asmjit/core/emithelper_p.h
asmjit/core/emitter.cpp
asmjit/core/emitter.h
asmjit/core/emitterutils.cpp
asmjit/core/emitterutils_p.h
asmjit/core/environment.cpp
asmjit/core/environment.h
asmjit/core/errorhandler.cpp
asmjit/core/errorhandler.h
asmjit/core/formatter.cpp
asmjit/core/formatter.h
asmjit/core/func.cpp
asmjit/core/func.h
asmjit/core/funcargscontext.cpp
asmjit/core/funcargscontext_p.h
asmjit/core/globals.cpp
asmjit/core/globals.h
asmjit/core/inst.cpp
asmjit/core/inst.h
asmjit/core/instdb.cpp
asmjit/core/instdb_p.h
asmjit/core/jitallocator.cpp
asmjit/core/jitallocator.h
asmjit/core/jitruntime.cpp
asmjit/core/jitruntime.h
asmjit/core/logger.cpp
asmjit/core/logger.h
asmjit/core/misc_p.h
asmjit/core/operand.cpp
asmjit/core/operand.h
asmjit/core/osutils.cpp
asmjit/core/osutils.h
asmjit/core/osutils_p.h
asmjit/core/raassignment_p.h
asmjit/core/rabuilders_p.h
asmjit/core/radefs_p.h
asmjit/core/ralocal.cpp
asmjit/core/ralocal_p.h
asmjit/core/rapass.cpp
asmjit/core/rapass_p.h
asmjit/core/rastack.cpp
asmjit/core/rastack_p.h
asmjit/core/string.cpp
asmjit/core/string.h
asmjit/core/support.cpp
asmjit/core/support.h
asmjit/core/target.cpp
asmjit/core/target.h
asmjit/core/type.cpp
asmjit/core/type.h
asmjit/core/virtmem.cpp
asmjit/core/virtmem.h
asmjit/core/zone.cpp
asmjit/core/zone.h
asmjit/core/zonehash.cpp
asmjit/core/zonehash.h
asmjit/core/zonelist.cpp
asmjit/core/zonelist.h
asmjit/core/zonestack.cpp
asmjit/core/zonestack.h
asmjit/core/zonestring.h
asmjit/core/zonetree.cpp
asmjit/core/zonetree.h
asmjit/core/zonevector.cpp
asmjit/core/zonevector.h
asmjit/a64.h
asmjit/arm.h
asmjit/arm/armformatter.cpp
asmjit/arm/armformatter_p.h
asmjit/arm/armglobals.h
asmjit/arm/armoperand.h
asmjit/arm/armutils.h
asmjit/arm/a64archtraits_p.h
asmjit/arm/a64assembler.cpp
asmjit/arm/a64assembler.h
asmjit/arm/a64builder.cpp
asmjit/arm/a64builder.h
asmjit/arm/a64compiler.cpp
asmjit/arm/a64compiler.h
asmjit/arm/a64emithelper.cpp
asmjit/arm/a64emithelper_p.h
asmjit/arm/a64emitter.h
asmjit/arm/a64formatter.cpp
asmjit/arm/a64formatter_p.h
asmjit/arm/a64func.cpp
asmjit/arm/a64func_p.h
asmjit/arm/a64globals.h
asmjit/arm/a64instapi.cpp
asmjit/arm/a64instapi_p.h
asmjit/arm/a64instdb.cpp
asmjit/arm/a64instdb.h
asmjit/arm/a64operand.cpp
asmjit/arm/a64operand.h
asmjit/arm/a64rapass.cpp
asmjit/arm/a64rapass_p.h
asmjit/x86.h
asmjit/x86/x86archtraits_p.h
asmjit/x86/x86assembler.cpp
asmjit/x86/x86assembler.h
asmjit/x86/x86builder.cpp
asmjit/x86/x86builder.h
asmjit/x86/x86compiler.cpp
asmjit/x86/x86compiler.h
asmjit/x86/x86emithelper.cpp
asmjit/x86/x86emithelper_p.h
asmjit/x86/x86emitter.h
asmjit/x86/x86formatter.cpp
asmjit/x86/x86formatter_p.h
asmjit/x86/x86func.cpp
asmjit/x86/x86func_p.h
asmjit/x86/x86globals.h
asmjit/x86/x86instdb.cpp
asmjit/x86/x86instdb.h
asmjit/x86/x86instdb_p.h
asmjit/x86/x86instapi.cpp
asmjit/x86/x86instapi_p.h
asmjit/x86/x86operand.cpp
asmjit/x86/x86operand.h
asmjit/x86/x86rapass.cpp
asmjit/x86/x86rapass_p.h
)
if (MSVC AND NOT ASMJIT_NO_NATVIS)
list(APPEND ASMJIT_SRC_LIST asmjit.natvis)
endif()
set(ASMJIT_SRC "")
foreach(src_file ${ASMJIT_SRC_LIST})
set(src_file "${ASMJIT_DIR}/src/${src_file}")
list(APPEND ASMJIT_SRC ${src_file})
if ("${src_file}" MATCHES "\\.natvis")
if (ASMJIT_LINKER_SUPPORTS_NATVIS)
list(APPEND ASMJIT_PRIVATE_LFLAGS "-natvis:${src_file}")
endif()
endif()
endforeach()
source_group(TREE "${ASMJIT_DIR}" FILES ${ASMJIT_SRC})
# AsmJit - Summary
# ================
message("** AsmJit Summary **")
message(" ASMJIT_DIR=${ASMJIT_DIR}")
message(" ASMJIT_TEST=${ASMJIT_TEST}")
message(" ASMJIT_TARGET_TYPE=${ASMJIT_TARGET_TYPE}")
message(" ASMJIT_DEPS=${ASMJIT_DEPS}")
message(" ASMJIT_LIBS=${ASMJIT_LIBS}")
message(" ASMJIT_CFLAGS=${ASMJIT_CFLAGS}")
message(" ASMJIT_PRIVATE_CFLAGS=${ASMJIT_PRIVATE_CFLAGS}")
message(" ASMJIT_PRIVATE_CFLAGS_DBG=${ASMJIT_PRIVATE_CFLAGS_DBG}")
message(" ASMJIT_PRIVATE_CFLAGS_REL=${ASMJIT_PRIVATE_CFLAGS_REL}")
# AsmJit - Targets
# ================
if (NOT ASMJIT_EMBED)
# Add AsmJit target.
asmjit_add_target(asmjit "${ASMJIT_TARGET_TYPE}"
SOURCES ${ASMJIT_SRC}
LIBRARIES ${ASMJIT_DEPS}
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
target_compile_options(asmjit INTERFACE ${ASMJIT_CFLAGS})
target_include_directories(asmjit BEFORE INTERFACE
$<BUILD_INTERFACE:${ASMJIT_INCLUDE_DIRS}>
$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>)
# Create an asmjit::asmjit alias.
add_library(asmjit::asmjit ALIAS asmjit)
# Add AsmJit install instructions (library and public headers).
if (NOT ASMJIT_NO_INSTALL)
install(TARGETS asmjit
EXPORT asmjit-config
RUNTIME DESTINATION "${CMAKE_INSTALL_BINDIR}"
ARCHIVE DESTINATION "${CMAKE_INSTALL_LIBDIR}"
LIBRARY DESTINATION "${CMAKE_INSTALL_LIBDIR}"
INCLUDES DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}")
install(EXPORT asmjit-config
NAMESPACE asmjit::
DESTINATION "${CMAKE_INSTALL_LIBDIR}/cmake/asmjit")
foreach(_src_file ${ASMJIT_SRC_LIST})
if ("${_src_file}" MATCHES "\\.h$" AND NOT "${_src_file}" MATCHES "_p\\.h$")
get_filename_component(_src_dir ${_src_file} PATH)
install(FILES "${ASMJIT_DIR}/src/${_src_file}" DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}/${_src_dir}")
endif()
endforeach()
endif()
# Add AsmJit tests.
if (ASMJIT_TEST)
enable_testing()
# Special target that always uses embedded AsmJit.
asmjit_add_target(asmjit_test_unit TEST
SOURCES ${ASMJIT_SRC}
test/asmjit_test_unit.cpp
test/broken.cpp
test/broken.h
LIBRARIES ${ASMJIT_DEPS}
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
-DASMJIT_TEST
-DASMJIT_STATIC
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
target_include_directories(asmjit_test_unit BEFORE PRIVATE ${ASMJIT_INCLUDE_DIRS})
asmjit_add_target(asmjit_test_assembler TEST
SOURCES test/asmjit_test_assembler.cpp
test/asmjit_test_assembler.h
test/asmjit_test_assembler_a64.cpp
test/asmjit_test_assembler_x64.cpp
test/asmjit_test_assembler_x86.cpp
LIBRARIES asmjit::asmjit
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
asmjit_add_target(asmjit_test_perf EXECUTABLE
SOURCES test/asmjit_test_perf.cpp
test/asmjit_test_perf_a64.cpp
test/asmjit_test_perf_x86.cpp
SOURCES test/asmjit_test_perf.h
LIBRARIES asmjit::asmjit
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
foreach(_target asmjit_test_emitters
asmjit_test_execute
asmjit_test_x86_sections)
asmjit_add_target(${_target} TEST
SOURCES test/${_target}.cpp
LIBRARIES asmjit::asmjit
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
endforeach()
if (NOT ASMJIT_NO_INTROSPECTION)
asmjit_add_target(asmjit_test_instinfo TEST
SOURCES test/asmjit_test_instinfo.cpp
LIBRARIES asmjit::asmjit
CFLAGS ${ASMJIT_PRIVATE_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
endif()
if (NOT (ASMJIT_NO_BUILDER OR ASMJIT_NO_COMPILER))
# Vectorcall tests and XMM tests require at least SSE2 in 32-bit mode (in 64-bit mode it's implicit).
# Some compilers don't like being passed -msse2 for 64-bit targets, and some compilers targeting non-x86
# would pass the "-msse2" compile flag check, but with a warning not detected by CMake. Thus, verify that
# our target is really 32-bit X86 and only use -msse2 or -arch:SSE2 flags when necessary.
set(ASMJIT_SSE2_CFLAGS "")
check_cxx_source_compiles("
#if defined(_M_IX86) || defined(__X86__) || defined(__i386__)
int target_is_32_bit_x86() { return 1; }
#else
// Compile error...
#endif
int main() {
return target_is_32_bit_x86();
}
" ASMJIT_TARGET_IS_32_BIT_X86)
if (ASMJIT_TARGET_IS_32_BIT_X86)
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "MSVC" OR "x${CMAKE_CXX_COMPILER_FRONTEND_VARIANT}" STREQUAL "xMSVC")
asmjit_detect_cflags(ASMJIT_SSE2_CFLAGS "-arch:SSE2")
else()
asmjit_detect_cflags(ASMJIT_SSE2_CFLAGS "-msse2")
endif()
endif()
asmjit_add_target(asmjit_test_compiler TEST
SOURCES test/asmjit_test_compiler.cpp
test/asmjit_test_compiler.h
test/asmjit_test_compiler_a64.cpp
test/asmjit_test_compiler_x86.cpp
LIBRARIES asmjit::asmjit
CFLAGS ${ASMJIT_PRIVATE_CFLAGS} ${ASMJIT_SSE2_CFLAGS}
CFLAGS_DBG ${ASMJIT_PRIVATE_CFLAGS_DBG}
CFLAGS_REL ${ASMJIT_PRIVATE_CFLAGS_REL})
endif()
endif()
endif()

102
deps/asmjit/CONTRIBUTING.md vendored Normal file
View File

@ -0,0 +1,102 @@
## How to Contribute to AsmJit
### Did you find a bug or something isn't working as expected?
* Please use [Issues](https://github.com/asmjit/asmjit/issues) page to report bugs or create a [pull request](https://github.com/asmjit/asmjit/pulls) if you have already fixed it.
* Make sure a bug report provides as much information as possible, so it is easy to reproduce the issue locally, or at least to guess where the problem could be. AsmJit is a low-level tool, which makes it very easy to emit code that crashes or misbehaves when executed. Always use AsmJit's [Logging](https://asmjit.com/doc/group__asmjit__logging.html) and [Error Handling](https://asmjit.com/doc/group__asmjit__error__handling.html) features first to check whether the problem is simply an easy-to-catch bug in your own code (a short setup sketch follows this list).
* Don't be afraid to ask for help if you don't know how to solve a particular problem or if it's unclear how to do it. The community will help when the problem is well described and has a solution. In general, we always try to at least improve the documentation when it doesn't provide enough information and users have to ask for help.
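
For illustration, here is a minimal sketch of wiring up logging and error handling before emitting anything. It follows the public AsmJit API (`JitRuntime`, `CodeHolder`, `FileLogger`, `ErrorHandler`), but treat it as a rough sketch rather than a canonical example and check the documentation for your AsmJit version:

```c++
#include <asmjit/x86.h>
#include <cstdio>

using namespace asmjit;

// Print every emitter error instead of silently producing broken code.
class PrintingErrorHandler : public ErrorHandler {
public:
  void handleError(Error err, const char* message, BaseEmitter* /*origin*/) override {
    std::fprintf(stderr, "AsmJit error 0x%08X: %s\n", unsigned(err), message);
  }
};

int main() {
  JitRuntime rt;
  CodeHolder code;
  code.init(rt.environment());

  FileLogger logger(stdout);            // Logs each emitted instruction to stdout.
  PrintingErrorHandler eh;
  code.setLogger(&logger);
  code.setErrorHandler(&eh);

  x86::Assembler a(&code);
  a.mov(x86::eax, 1);                   // Trivial function that returns 1.
  a.ret();

  int (*fn)(void) = nullptr;
  if (rt.add(&fn, &code) != kErrorOk)   // Any error was already reported above.
    return 1;
  return fn() == 1 ? 0 : 1;
}
```

With a logger and an error handler attached, most "emitted code crashes" reports turn out to be either an invalid instruction/operand combination (reported as an error) or unexpected emitted code (visible in the log).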
### Asking questions
* We prefer GitHub issues to be used for reporting bugs or feature requests, but it's still okay to ask questions there as well. However, please consider joining our [Gitter Chat](https://app.gitter.im/#/room/#asmjit:gitter.im) to ask questions; it has an active community that can quickly respond.
### Suggesting feature requests
* It's very likely that when using AsmJit you will find something that AsmJit doesn't provide which would be handy to have built in. The [Issues](https://github.com/asmjit/asmjit/issues) page can be used to submit feature requests, but please keep in mind that AsmJit is a relatively small project and not all requested features will be accepted, especially if they are non-trivial or time-consuming to implement, or if their scope doesn't match AsmJit's goals.
* If you have already implemented the feature you are suggesting, please open a [pull request](https://github.com/asmjit/asmjit/pulls).
* Ports (requesting new AsmJit backends) can be reported as feature requests, but only by people who are willing to work on them, as creating a new port takes a lot of time.
### Suggesting a documentation enhancement
* [AsmJit's documentation](https://asmjit.com/doc/index.html) is auto-generated from source code, so if you would like to improve it just open a [pull request](https://github.com/asmjit/asmjit/pulls) with your changes. The documentation uses [Doxygen](https://www.doxygen.nl/) as a front-end, so you can use the `\ref` keyword to create links and other Doxygen keywords to enhance the documentation.
### Suggesting a website content enhancement
* [AsmJit's website](https://asmjit.com) is also generated, but not from public sources at the moment. If you find an issue on the website, you can either use the contact information on the [support page](https://asmjit.com/support.html) or discuss the change on our [Gitter Chat](https://app.gitter.im/#/room/#asmjit:gitter.im). Alternatively, opening a regular issue is also okay.
## Coding Style & Consistency
* If you decide to open a pull request, make sure that the code you submit uses the same convention as the rest of the code. We prefer keeping the code consistent.
* [.editorconfig](./.editorconfig) should help with basic settings.
* Initially, AsmJit's coding style was based on the Google C++ Style Guide, but it has diverged from it.
* Include guards use `<PATH_TO_SRC>_H_INCLUDED` format.
* The `asmjit` namespace must be opened by `ASMJIT_BEGIN_NAMESPACE` and closed by `ASMJIT_END_NAMESPACE`.
* An `asmjit::xxx` (backend specific) nested namespace must be opened by `ASMJIT_BEGIN_SUB_NAMESPACE(xxx)` and closed by `ASMJIT_END_SUB_NAMESPACE` (see the sketch after this list).
* The opening brace goes on the same line, like `struct Something {`, `if (condition) {`, etc...
* The code uses a soft limit of 120 characters per line (including documentation), but it's not enforced and it's okay to use more when it makes sense (for example defining tables, etc...).
* Since AsmJit doesn't use exceptions or RTTI, the code cannot use containers provided by the C++ standard library. In general, we try to use only a bare minimum of the C++ standard library to make it viable to use AsmJit even in C code bases where the JIT compiler is implemented in C++ ([Erlang](https://www.erlang.org/) can be seen as a great example).
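
For illustration only, here is a skeletal header that follows the conventions above. The file name and its contents are hypothetical; only the include-guard format and the namespace macros mirror the rules listed here:

```c++
// Hypothetical file: src/asmjit/core/example.h
#ifndef ASMJIT_CORE_EXAMPLE_H_INCLUDED
#define ASMJIT_CORE_EXAMPLE_H_INCLUDED

#include "../core/api-config.h"

ASMJIT_BEGIN_NAMESPACE

// Opening brace on the same line, per the style rules above.
struct Example {
  int value = 0;
};

ASMJIT_END_NAMESPACE

#endif // ASMJIT_CORE_EXAMPLE_H_INCLUDED
```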
## Testing
* AsmJit uses a minimalist unit testing framework to avoid third-party dependencies.
* At the moment tests live in the same file as the implementation and are only compiled when the `ASMJIT_TEST` macro is defined (see the sketch after this list).
* Use `-DASMJIT_TEST=1` when invoking [CMake](https://cmake.org/) to compile AsmJit tests.
* Unit tests are compiled to a single `asmjit_test_unit[.exe]` executable.
* Other tests have their own executables based on what is tested.
* Always add assembler tests when adding new instructions, see [asmjit_test_assembler_x64.cpp](./test/asmjit_test_assembler_x64.cpp) and [asmjit_test_assembler_a64.cpp](./test/asmjit_test_assembler_a64.cpp) for more details.
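
As a rough illustration of that layout, the implementation and its unit test can live in one translation unit, with the test body compiled only when `ASMJIT_TEST` is defined. This is a schematic sketch only; the real tests use AsmJit's own minimalist test framework rather than plain asserts:

```c++
// some_feature.cpp - implementation first, its unit test below (schematic only).
static int addTwo(int x) { return x + 2; }

#if defined(ASMJIT_TEST)
#include <cassert>

// Compiled into asmjit_test_unit only when -DASMJIT_TEST=1 is passed to CMake.
static void testAddTwo() {
  assert(addTwo(2) == 4);
}
#endif
```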
## Pull Request Messages
* If a change fixes a bug the message should start with `[bug]`.
* If a change fixes or enhances documentation it should start with `[doc]`.
* If a change fixes or enhances our CI it should start with `[ci]`.
* If a change breaks ABI it must start with `[abi]`.
* Otherwise there is no suggested prefix.
## ABI Changes
* ABI changes happen, but they are usually accumulated and committed within a short time window so the ABI doesn't break often. In general we prefer to break ABI at most once a year, or once every 6 months if something has a high priority. There are no hard rules though.
* AsmJit uses an `inline namespace`, which should make it impossible to link against an AsmJit library that is ABI-incompatible. When an ABI break happens, both the AsmJit version and the ABI namespace are changed, see [asmjit/core/api-config.h](./src/asmjit/core/api-config.h) for more details.
* What is an ABI break?
  * Modifying a public struct/class in a way that alters its functionality and/or changes its size
  * Adding virtual functions to or removing them from classes
  * Changing the signature of a public function or a class member function (for example, adding a parameter)
  * Changing the value of an enum or global constant (for example, instructions are sorted by name, so adding a new instruction breaks ABI)
  * Possibly more, but these were the most common (a small code illustration follows at the end of this section)...
* What is not an ABI break?
  * Extending the functionality by using reserved members of a struct/class
  * Adding new API including new structs and classes
  * Changing anything that is internal and that doesn't leak to public headers
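
A tiny illustration of the distinction, using a made-up class that does not exist in AsmJit:

```c++
// Hypothetical public class shipped in an installed header.
class Thing {
public:
  int count() const { return _count; }

  // ABI break: adding a virtual function introduces/changes the vtable layout.
  // virtual void reset();

  // ABI break: adding a data member changes sizeof(Thing).
  // int _extra;

  // Not an ABI break: giving _reserved a real meaning keeps the size and
  // layout of the class exactly the same.

private:
  int _count = 0;
  int _reserved = 0;
};
```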

17
deps/asmjit/LICENSE.md vendored Normal file
View File

@ -0,0 +1,17 @@
Copyright (c) 2008-2024 The AsmJit Authors
This software is provided 'as-is', without any express or implied
warranty. In no event will the authors be held liable for any damages
arising from the use of this software.
Permission is granted to anyone to use this software for any purpose,
including commercial applications, and to alter it and redistribute it
freely, subject to the following restrictions:
1. The origin of this software must not be misrepresented; you must not
claim that you wrote the original software. If you use this software
in a product, an acknowledgment in the product documentation would be
appreciated but is not required.
2. Altered source versions must be plainly marked as such, and must not be
misrepresented as being the original software.
3. This notice may not be removed or altered from any source distribution.

70
deps/asmjit/README.md vendored Normal file
View File

@ -0,0 +1,70 @@
AsmJit
------
AsmJit is a lightweight library for machine code generation written in C++.
* [Official Home Page (asmjit.com)](https://asmjit.com)
* [Official Repository (asmjit/asmjit)](https://github.com/asmjit/asmjit)
* [Public Chat Channel](https://app.gitter.im/#/room/#asmjit:gitter.im)
* [Zlib License](./LICENSE.md)
See [asmjit.com](https://asmjit.com) page for more details, examples, and documentation.
Documentation
-------------
* [Documentation Index](https://asmjit.com/doc/index.html)
* [Build Instructions](https://asmjit.com/doc/group__asmjit__build.html)
Contributing
------------
* See [CONTRIBUTING](./CONTRIBUTING.md) page for more details
Breaking Changes
----------------
Breaking the API is sometimes inevitable. What can you do?
* See [Breaking Changes Guide](https://asmjit.com/doc/group__asmjit__breaking__changes.html), which is now part of AsmJit documentation
* See asmjit tests, they always compile and provide implementation of many use-cases:
* [asmjit_test_emitters.cpp](./test/asmjit_test_emitters.cpp) - Tests that demonstrate the purpose of emitters
* [asmjit_test_assembler_x86.cpp](./test/asmjit_test_assembler_x86.cpp) - Tests targeting AsmJit's Assembler (x86/x64)
* [asmjit_test_compiler_x86.cpp](./test/asmjit_test_compiler_x86.cpp) - Tests targeting AsmJit's Compiler (x86/x64)
* [asmjit_test_instinfo.cpp](./test/asmjit_test_instinfo.cpp) - Tests that query instruction information
* [asmjit_test_x86_sections.cpp](./test/asmjit_test_x86_sections.cpp) - Multiple sections test.
* Visit our [Gitter Chat](https://app.gitter.im/#/room/#asmjit:gitter.im) if you need quick help
Project Organization
--------------------
* **`/`** - Project root
  * **src** - Source code
    * **asmjit** - Source code and headers (always point the include path here)
      * **core** - Core API, backend independent except relocations
      * **arm** - ARM specific API, used only by ARM and AArch64 backends
      * **x86** - X86 specific API, used only by X86 and X64 backends
  * **test** - Unit and integration tests (don't embed in your project)
  * **tools** - Tools used for configuring, documenting, and generating files
Ports
-----
* [ ] 32-bit ARM/Thumb port (work in progress)
* [ ] RISC-V port (not in progress, help welcome)
Support
-------
* AsmJit project has both community and commercial support, see [AsmJit's Support Page](https://asmjit.com/support.html)
* You can help the development and maintenance through Petr Kobalicek's [GitHub sponsors Profile](https://github.com/sponsors/kobalicek)
Notable Donors List:
* [ZehMatt](https://github.com/ZehMatt)
Authors & Maintainers
---------------------
* Petr Kobalicek <kobalicek.petr@gmail.com>

26
deps/asmjit/db/LICENSE.md vendored Normal file
View File

@ -0,0 +1,26 @@
AsmJit database is dual licensed under Zlib and Unlicense (public domain)
This is free and unencumbered software released into the public domain.
Anyone is free to copy, modify, publish, use, compile, sell, or
distribute this software, either in source code form or as a compiled
binary, for any purpose, commercial or non-commercial, and by any
means.
In jurisdictions that recognize copyright laws, the author or authors
of this software dedicate any and all copyright interest in the
software to the public domain. We make this dedication for the benefit
of the public at large and to the detriment of our heirs and
successors. We intend this dedication to be an overt act of
relinquishment in perpetuity of all present and future rights to this
software under copyright law.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
For more information, please refer to <http://unlicense.org>

21
deps/asmjit/db/README.md vendored Normal file
View File

@ -0,0 +1,21 @@
AsmJit Instruction Database
---------------------------
This is a database of instructions that AsmJit uses to generate its internal database and also assembler implementations. The project initially started as AsmDB, but was later merged into AsmJit to make maintenance easier. The database was designed so that each instruction definition needs only a single line in the JSON data file. The data is then processed by architecture-specific readers that make it canonical and ready for further processing.
AsmJit database provides the following ISAs:
* `isa_x86.json` - provides X86 instruction data (both 32-bit and 64-bit)
* `isa_aarch32.json` - provides AArch32 instruction data (A32/T16/T32 encoding)
* `isa_aarch64.json` - provides AArch64 instruction data (A64 encoding)
* `isa_aarch64_sme.json` - provides AArch64 SME instruction data (work-in-progress)
To Be Documented
----------------
This project will be refactored and documented in the future.
License
-------
AsmJit database is dual licensed under Zlib (AsmJit license) or public domain. The database can be used for any purpose, not just by AsmJit.

3
deps/curl/.github/CODEOWNERS vendored Normal file
View File

@ -0,0 +1,3 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl

29
deps/curl/.github/CONTRIBUTING.md vendored Normal file
View File

@ -0,0 +1,29 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
How to contribute to curl
=========================
Join the community
------------------
1. Click 'watch' on the GitHub repo
2. Subscribe to the suitable [mailing lists](https://curl.se/mail/)
Read [CONTRIBUTE](../docs/CONTRIBUTE.md)
---------------------------------------
Send your suggestions using one of these methods:
-------------------------------------------------
1. in a mail to the mailing list
2. as a [pull request](https://github.com/curl/curl/pulls)
3. as an [issue](https://github.com/curl/curl/issues)
/ The curl team

6
deps/curl/.github/FUNDING.yml vendored Normal file
View File

@ -0,0 +1,6 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
github: curl
open_collective: curl

View File

@ -0,0 +1,55 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Bug Report on code
description: Tell us about your problem with curl or libcurl
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
Only file bugs here! Ask questions on the mailing lists https://curl.se/mail/
**SECURITY RELATED?** Post it here: https://hackerone.com/curl
There are collections of known issues to be aware of:
- https://curl.se/docs/knownbugs.html
- https://curl.se/docs/todo.html
- type: textarea
id: reproducer
attributes:
label: I did this
validations:
required: false
- type: textarea
id: expected-behaviour
attributes:
label: I expected the following
validations:
required: false
- type: textarea
id: version
attributes:
label: curl/libcurl version
description: |
Please paste the output of `curl -V` here.
placeholder: 'curl 8.2.0'
validations:
required: true
- type: textarea
id: os
attributes:
label: operating system
description: |
On Unix please post the output of `uname -a` here.
placeholder: 'Fedora Linux 38'
validations:
required: true

View File

@ -0,0 +1,18 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
blank_issues_enabled: false
contact_links:
- name: Ask a question (without email)
url: https://github.com/curl/curl/discussions
about: Use the Discussion forum here on GitHub
- name: Ask a question (using email)
url: https://curl.se/mail/
about: Send question to the suitable mailing list
- name: Commercial support
url: https://curl.se/support.html
about: Pay for fast quality support for and help with curl/libcurl
- name: Feature request
url: https://curl.se/mail/
about: To propose new features or enhancements, please bring that discussion to a suitable curl mailing list.

View File

@ -0,0 +1,32 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Bug Report on documentation
description: Problems, errors, mistakes or typos in documentation.
labels: documentation
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
Only file documentation bugs here! Ask questions on the mailing lists https://curl.se/mail/
- type: textarea
id: source
attributes:
label: Specify which documentation you found a problem with
description: |
Include function name, URL, tarball version and all other relevant
details that identify the documentation source.
validations:
required: true
- type: textarea
id: problem
attributes:
label: The problem
validations:
required: true

10
deps/curl/.github/dependabot.yml vendored Normal file
View File

@ -0,0 +1,10 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
version: 2
updates:
- package-ecosystem: 'github-actions'
directory: '/'
schedule:
interval: 'weekly'

531
deps/curl/.github/labeler.yml vendored Normal file
View File

@ -0,0 +1,531 @@
# Copyright (C) Daniel Fandrich, <dan@coneharvesters.com>, et al.
#
# SPDX-License-Identifier: curl
# The workflow configures the .github/workflows/label.yml action
# to add labels to pull requests. This is not (yet?) a replacement for human
# triaging, but is intended to add labels to the easy cases. If the matching
# language becomes more powerful, more cases should be able to be handled.
#
# Labels are added in two ways: the any-glob-to-all-files ones are added if all
# the files fit into the category, and the any-glob-to-any-file ones are added
# as long as any file matches. The first ones are for "major" categories (the
# PR is all about that one topic, like HTTP/3), while the second ones are
# "addendums" that give useful information about a PR that's really mostly
# something else (e.g. CI if the PR also touches CI jobs).
#
# N.B. any-glob-to-all-files is misnamed; it acts like one-glob-to-all-files.
# Therefore, to get any-glob-to-all-files semantics with multiple matching
# patterns, they must be joined with commas to a single string surrounded by
# braces. For example: '{lib/**,src/**}'.
#
# See https://github.com/actions/labeler/ for documentation on this file.
---
appleOS:
- all:
- changed-files:
- any-glob-to-all-files: "{\
.github/workflows/macos.yml,\
lib/config-mac.h,\
lib/macos*,\
lib/vtls/sectransp*,\
m4/curl-sectransp.m4\
}"
authentication:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/mk-ca-bundle.1,\
docs/libcurl/opts/CURLINFO_HTTPAUTH*,\
docs/libcurl/opts/CURLINFO_PROXYAUTH*,\
docs/libcurl/opts/CURLOPT_KRB*,\
docs/libcurl/opts/CURLOPT_SASL*,\
docs/libcurl/opts/CURLOPT_SERVICE_NAME*,\
docs/libcurl/opts/CURLOPT_USERNAME*,\
docs/libcurl/opts/CURLOPT_USERPWD*,\
docs/libcurl/opts/CURLOPT_XOAUTH*,\
lib/*gssapi*,\
lib/*krb5*,\
lib/*ntlm*,\
lib/curl_sasl.*,\
lib/http_aws*,\
lib/http_digest.*,\
lib/http_negotiate.*,\
lib/vauth/**\
}"
build:
- all:
- changed-files:
- any-glob-to-all-files: "{\
**/CMakeLists.txt,\
**/Makefile.am,\
**/Makefile.inc,\
**/*.m4,\
**/*.mk,\
*.m4,\
docs/INSTALL-CMAKE.md,\
lib/curl_config.h.cmake,\
lib/libcurl*.in,\
CMake/**,\
CMakeLists.txt,\
configure.ac,\
m4/**,\
Makefile.*,\
packages/**,\
plan9/**,\
projects/**,\
winbuild/**,\
lib/libcurl.def,\
tests/cmake/**\
}"
CI:
- all:
- changed-files:
- any-glob-to-any-file:
- '.circleci/**'
- '.github/**'
- 'appveyor.*'
- 'scripts/ci*'
- 'tests/azure.pm'
- 'tests/appveyor.pm'
- 'tests/CI.md'
cmake:
- all:
- changed-files:
- any-glob-to-all-files: "{\
**/CMakeLists.txt,\
CMake/**,\
docs/INSTALL-CMAKE.md,\
lib/curl_config.h.cmake,\
tests/cmake/**\
}"
cmdline tool:
- all:
- changed-files:
- any-glob-to-any-file:
- 'docs/cmdline-opts/**'
- 'src/**'
connecting & proxies:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/internals/CONNECTION-FILTERS.md,\
docs/examples/ipv6.c,\
docs/libcurl/opts/CURLINFO_CONNECT*,\
docs/libcurl/opts/CURLINFO_PROXY*,\
docs/libcurl/opts/CURLOPT_ADDRESS*,\
docs/libcurl/opts/CURLOPT_CONNECT*,\
docs/libcurl/opts/CURLOPT_HAPROXY*,\
docs/libcurl/opts/CURLOPT_OPENSOCKET*,\
docs/libcurl/opts/CURLOPT_PRE_PROXY*,\
docs/libcurl/opts/CURLOPT_PROXY*,\
docs/libcurl/opts/CURLOPT_SOCKOPT*,\
docs/libcurl/opts/CURLOPT_SOCKS*,\
docs/libcurl/opts/CURLOPT_TCP*,\
docs/libcurl/opts/CURLOPT_TIMEOUT*,\
lib/cf-*proxy.*,\
lib/cf-socket.*,\
lib/cfilters.*,\
lib/conncache.*,\
lib/connect.*,\
lib/http_proxy.*,\
lib/if2ip.*,\
lib/noproxy.*,\
lib/socks.*,\
tests/server/socksd.c\
}"
cookies:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/HTTP-COOKIES.md,\
docs/cmdline-opts/cookie*,\
docs/cmdline-opts/junk-session-cookies.md,\
docs/libcurl/opts/CURLINFO_COOKIE*,\
docs/libcurl/opts/CURLOPT_COOKIE*,\
docs/examples/cookie_interface.c,\
lib/cookie.*,\
lib/psl.*\
}"
cryptography:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/CIPHERS.md,\
docs/RUSTLS.md,\
docs/libcurl/opts/CURLOPT_EGDSOCKET*,\
lib/*sha256*,\
lib/*sha512*,\
lib/curl_des.*,\
lib/curl_hmac.*,\
lib/curl_md?.*,\
lib/md?.*,\
lib/rand.*\
}"
DICT:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/dict.*,\
tests/dictserver.py\
}"
documentation:
- all:
- changed-files:
- any-glob-to-all-files: "{\
.github/workflows/checkdocs.yml,\
.github/scripts/badwords.*,\
.github/scripts/cd2cd,\
.github/scripts/cd2nroff,\
.github/scripts/cdall.pl,\
.github/scripts/nroff2cd,\
.github/scripts/verify-examples.pl,\
.github/scripts/verify-synopsis.pl,\
**/*.md,\
**/*.txt,\
**/*.1,\
CHANGES.md,\
docs/**,\
LICENSES/**,\
README,\
RELEASE-NOTES,\
scripts/cd*\
}"
- all-globs-to-all-files:
# negative matches
- '!**/CMakeLists.txt'
- '!**/Makefile.am'
FTP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/libcurl/opts/CURLINFO_FTP*,\
docs/libcurl/opts/CURLOPT_FTP*,\
docs/libcurl/opts/CURLOPT_WILDCARDMATCH*,\
docs/examples/ftp*,\
lib/curl_fnmatch.*,\
lib/curl_range.*,\
lib/ftp*,\
tests/ftp*\
}"
GOPHER:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/gopher*\
}"
HTTP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/examples/hsts*,\
docs/examples/http-*,\
docs/examples/httpput*,\
docs/examples/https*,\
docs/examples/*post*,\
docs/HTTP-COOKIES.md,\
docs/libcurl/opts/CURLINFO_COOKIE*,\
docs/libcurl/opts/CURLOPT_COOKIE*,\
docs/libcurl/opts/CURLINFO_HTTP_**,\
docs/libcurl/opts/CURLINFO_REDIRECT*,\
docs/libcurl/opts/CURLINFO_REFER*,\
docs/libcurl/opts/CURLOPT_FOLLOWLOCATION*,\
docs/libcurl/opts/CURLOPT_HSTS*,\
docs/libcurl/opts/CURLOPT_HTTP*,\
docs/libcurl/opts/CURLOPT_POST.*,\
docs/libcurl/opts/CURLOPT_POSTFIELD*,\
docs/libcurl/opts/CURLOPT_POSTREDIR*,\
docs/libcurl/opts/CURLOPT_REDIR*,\
docs/libcurl/opts/CURLOPT_REFER*,\
docs/libcurl/opts/CURLOPT_TRAILER*,\
docs/libcurl/opts/CURLOPT_TRANSFER_ENCODING*,\
lib/cf-https*,\
lib/cf-h1*,\
lib/cf-h2*,\
lib/cookie.*,\
lib/hsts.*,\
lib/http*,\
tests/http*,\
tests/http-server.pl,\
tests/http/*,\
tests/nghttp*\
}"
HTTP/2:
- all:
- changed-files:
- any-glob-to-all-files: "{\
CMake/FindNGHTTP2.cmake,\
CMake/FindQuiche.cmake,\
docs/libcurl/opts/CURLOPT_STREAM*,\
docs/examples/http2*,\
lib/http2*,\
tests/http2-server.pl\
}"
HTTP/3:
- all:
- changed-files:
- any-glob-to-all-files: "{\
.github/workflows/ngtcp2*,\
.github/workflows/quiche*,\
.github/workflows/osslq*,\
CMake/FindMSH3.cmake,\
CMake/FindNGHTTP3.cmake,\
CMake/FindNGTCP2.cmake,\
docs/HTTP3.md,\
docs/examples/http3*,\
lib/vquic/**,\
tests/http3-server.pl,\
tests/nghttpx.conf\
}"
IMAP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/imap*,\
docs/examples/imap*\
}"
LDAP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/*ldap*\
}"
libcurl API:
- all:
- changed-files:
- any-glob-to-any-file:
- 'docs/libcurl/ABI.md'
- 'docs/libcurl/curl_*.md'
- 'include/curl/**'
logging:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/cmdline-opts/trace*,\
docs/libcurl/curl_global_trace*,\
lib/curl_trc*,\
tests/http/test_15_tracing.py\
}"
MIME:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/libcurl/curl_form*,\
docs/libcurl/curl_mime_*,\
docs/libcurl/opts/CURLOPT_MIME*,\
docs/libcurl/opts/CURLOPT_HTTPPOST*,\
lib/formdata*,\
lib/mime*,\
src/tool_formparse.*\
}"
MQTT:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/mqtt*,\
tests/server/mqttd.c\
}"
name lookup:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/examples/resolve.c,\
docs/libcurl/opts/CURLINFO_NAMELOOKUP*,\
docs/libcurl/opts/CURLOPT_DNS*,\
docs/libcurl/opts/CURLOPT_DOH*,\
docs/libcurl/opts/CURLOPT_RESOLVE*,\
lib/asyn*,\
lib/curl_gethostname.*,\
lib/doh*,\
lib/host*,\
lib/idn*,\
lib/inet_pton.*,\
lib/socketpair*,\
tests/server/resolve.c\
}"
POP3:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/examples/pop3*,\
lib/pop3.*\
}"
RTMP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
CMake/FindLibrtmp.cmake,\
lib/curl_rtmp.*\
}"
RTSP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/libcurl/opts/CURLINFO_RTSP*,\
docs/libcurl/opts/CURLOPT_RTSP*,\
lib/rtsp.*,\
tests/rtspserver.pl,\
tests/server/rtspd.c\
}"
SCP/SFTP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
CMake/FindLibssh2.cmake,\
docs/libcurl/opts/CURLOPT_SSH*,\
docs/examples/sftp*,\
lib/vssh/**,\
tests/sshhelp.pm,\
tests/sshserver.pl\
}"
script:
- all:
- changed-files:
- any-glob-to-all-files: "{\
**/*.pl,\
**/*.sh,\
curl-config.in,\
docs/curl-config.1,\
docs/mk-ca-bundle.1,\
docs/THANKS-filter,\
scripts/**\
}"
SMB:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/smb.*,\
tests/smbserver.py\
}"
SMTP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/examples/smtp-*,\
docs/libcurl/opts/CURLOPT_MAIL*,\
lib/smtp.*\
}"
tests:
- all:
- changed-files:
- any-glob-to-any-file:
- 'tests/**'
TFTP:
- all:
- changed-files:
- any-glob-to-all-files: "{\
lib/tftp.*,\
tests/tftpserver.pl,\
tests/server/tftp*\
}"
TLS:
- all:
- changed-files:
- any-glob-to-all-files: "{\
CMake/FindBearSSL.cmake,\
CMake/FindMbedTLS.cmake,\
CMake/FindWolfSSL.cmake,\
CMake/FindRustls.cmake,\
docs/examples/ssl*,\
docs/examples/*ssl.*,\
docs/examples/*tls.*,\
docs/SSL*,\
docs/libcurl/curl_global_sslset*,\
docs/libcurl/opts/CURLINFO_CA*,\
docs/libcurl/opts/CURLINFO_CERT*,\
docs/libcurl/opts/CURLINFO_SSL*,\
docs/libcurl/opts/CURLINFO_TLS*,\
docs/libcurl/opts/CURLOPT_CA*,\
docs/libcurl/opts/CURLOPT_CERT*,\
docs/libcurl/opts/CURLOPT_PINNEDPUBLICKEY*,\
docs/libcurl/opts/CURLOPT_SSL*,\
docs/libcurl/opts/CURLOPT_TLS*,\
docs/libcurl/opts/CURLOPT_USE_SSL*,\
lib/vtls/**,\
m4/curl-bearssl.m4,\
m4/curl-gnutls.m4,\
m4/curl-mbedtls.m4,\
m4/curl-openssl.m4,\
m4/curl-rustls.m4,\
m4/curl-schannel.m4,\
m4/curl-sectransp.m4,\
m4/curl-wolfssl.m4\
}"
URL:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/libcurl/curl_url*,\
docs/URL-SYNTAX.md,\
docs/examples/parseurl*,\
include/curl/urlapi.h,\
lib/urlapi*\
}"
WebSocket:
- all:
- changed-files:
- any-glob-to-all-files: "{\
docs/internals/WEBSOCKET.md*,\
docs/examples/websocket*,\
docs/libcurl/curl_ws_*,\
docs/libcurl/libcurl-ws*,\
docs/libcurl/opts/CURLOPT_WS_*,\
include/curl/websockets.h,\
lib/ws.*,\
tests/http/clients/ws*,\
tests/http/test_20_websockets.py,\
tests/http/testenv/ws*\
}"
Windows:
- all:
- changed-files:
- any-glob-to-all-files: "{\
appveyor.*,\
.github/workflows/windows.yml,\
CMake/win32-cache.cmake,\
lib/*win32*,\
lib/curl_multibyte.*,\
lib/rename.*,\
lib/vtls/schannel*,\
m4/curl-schannel.m4,\
projects/**,\
src/tool_doswin.c,\
winbuild/**,\
lib/libcurl.def\
}"

12
deps/curl/.github/lock.yml vendored Normal file
View File

@ -0,0 +1,12 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
# Configuration for lock-threads - https://github.com/dessant/lock-threads
# Number of days of inactivity before a closed issue or pull request is locked
daysUntilLock: 90
# Comment to post before locking. Set to `false` to disable
lockComment: false
# Limit to only `issues` or `pulls`
# only: issues

79
deps/curl/.github/scripts/badwords.pl vendored Normal file
View File

@ -0,0 +1,79 @@
#!/usr/bin/env perl
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
# bad[:=]correct
#
# If separator is '=', the string will be compared case sensitively.
# If separator is ':', the check is done case insensitively.
#
# To add white listed uses of bad words that are removed before checking for
# the bad ones:
#
# ---(accepted word)
#
my $w;
while(<STDIN>) {
chomp;
if($_ =~ /^#/) {
next;
}
if($_ =~ /^---(.*)/) {
push @whitelist, $1;
}
elsif($_ =~ /^([^:=]*)([:=])(.*)/) {
my ($bad, $sep, $better)=($1, $2, $3);
push @w, $bad;
$alt{$bad} = $better;
if($sep eq "=") {
$exactcase{$bad} = 1;
}
}
}
my $errors;
sub file {
my ($f) = @_;
my $l = 0;
open(F, "<$f");
while(<F>) {
my $in = $_;
$l++;
chomp $in;
if($in =~ /^ /) {
next;
}
# remove the link part
$in =~ s/(\[.*\])\(.*\)/$1/g;
# remove backticked texts
$in =~ s/\`.*\`//g;
# remove whitelisted patterns
for my $p (@whitelist) {
$in =~ s/$p//g;
}
foreach my $w (@w) {
my $case = $exactcase{$w};
if(($in =~ /^(.*)$w/i && !$case) ||
($in =~ /^(.*)$w/ && $case) ) {
my $p = $1;
my $c = length($p)+1;
print STDERR "$f:$l:$c: error: found bad word \"$w\"\n";
printf STDERR " %4d | $in\n", $l;
printf STDERR " | %*s^%s\n", length($p), " ",
"~" x (length($w)-1);
printf STDERR " maybe use \"%s\" instead?\n", $alt{$w};
$errors++;
}
}
}
close(F);
}
my @files = @ARGV;
foreach my $each (@files) {
file($each);
}
exit $errors;

73
deps/curl/.github/scripts/badwords.txt vendored Normal file
View File

@ -0,0 +1,73 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
back-end:backend
e-mail:email
run-time:runtime
set-up:setup
tool chain:toolchain
tool-chain:toolchain
wild-card:wildcard
wild card:wildcard
i'm:I am
you've:You have
we've:we have
we're:we are
we'll:we will
we'd:we would
they've:They have
they're:They are
they'll:They will
they'd:They would
you've:you have
you'd:you would
you'll:you will
you're:you are
should've:should have
don't=do not
could've:could have
doesn't:does not
isn't:is not
aren't:are not
a html: an html
a http: an http
a ftp: an ftp
url =URL
internet\b=Internet
isation:ization
\bit's:it is
it'd:it would
there's:there is
[^.]\. And: Rewrite it somehow?
^(And|So|But) = Rewrite it somehow?
\. But: Rewrite it somehow?
\. So : Rewrite without "so" ?
dir :directory
can't:cannot
that's:that is
web page:webpage
host name\b:hostname
host names\b:hostnames
file name\b:filename
file names\b:filenames
\buser name\b:username
\buser names\b:usernames
\bpass phrase:passphrase
didn't:did not
doesn't:does not
won't:will not
couldn't:could not
\bwill\b:rewrite to present tense
\b32bit=32-bit
\b64bit=64-bit
32 bit\b=32-bit
64 bit\b=64-bit
64-bits:64 bits or 64-bit
32-bits:32 bits or 32-bit
\bvery\b:rephrase using an alternative word
\bCurl\b=curl
\bLibcurl\b=libcurl
---WWW::Curl
---NET::Curl
---Curl Corporation

115
deps/curl/.github/scripts/binarycheck.pl vendored Normal file
View File

@ -0,0 +1,115 @@
#!/usr/bin/env perl
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
# This script scans the entire git repository for binary files.
#
# All files in the git repo that contain signs of being binary are then
# collected and a sha256sum is generated for all of them. That summary is then
# compared to the list of pre-vetted files so that only the exact copies of
# already scrutinized files are deemed okay to "appear binary".
#
use strict;
use warnings;
my $root = ".";
my $sumsfile = ".github/scripts/binarycheck.sums";
if($ARGV[0]) {
$root = $ARGV[0];
}
my @bin;
my %known;
my $error = 0;
sub knownbins {
open(my $mh, "<", "$sumsfile") ||
die "can't read known binaries";
while(<$mh>) {
my $l = $_;
chomp $l;
if($l =~ /^([a-f0-9]+) (.*)/) {
my ($sum, $file) = ($1, $2);
$known{$file} = 1;
}
elsif($l =~ /^#/) {
# skip comments
}
else {
print STDERR "suspicious line in $sumsfile\n";
$error++;
}
}
close($mh);
}
sub checkfile {
my ($file) = @_;
open(my $mh, "<", "$file") || die "can't read $file";
my $line = 0;
while(<$mh>) {
my $l = $_;
$line++;
if($l =~ /([\x00-\x08\x0b\x0c\x0e-\x1f\x7f])/) {
push @bin, $file;
if(!$known{$file}) {
printf STDERR "$file:$line has unknown binary contents\n";
$error++;
}
last;
}
}
close($mh);
}
my @files = `git ls-files -- $root`;
if(scalar(@files) < 3000) {
# this means this is not the git source code repository or that git does
# not work, error out!
print STDERR "too few files in the git repository!\n";
exit 1;
}
knownbins();
if(scalar(keys %known) < 10) {
print STDERR "too few known binaries in $sumsfile\n";
exit 2;
}
for my $f (@files) {
chomp $f;
checkfile("$root/$f");
}
my $check=system("sha256sum -c $sumsfile");
if($check) {
print STDERR "sha256sum detected a problem\n";
$error++;
}
exit $error;

View File

@ -0,0 +1,16 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
# SPDX-License-Identifier: curl
c68161dba1c0166e4ab8d8ce00f57326db25d29fdd52c33d9974d0972ec60990 ./tests/certs/test-localhost-san-first.pub.der
82430be03ec1783e2f9fad6e07a6f42cce62f8a23d87ea81a95977b47110c200 ./tests/certs/test-localhost-san-last.pub.der
47233a0092db614f53e96a4df83ddeaa7e5242899ede1c1a90c53423a0b13bba ./tests/certs/test-localhost.nn.pub.der
63898448aa199675a30fb6722046a665a7c1a5c24453e63d8c37397482a7dc52 ./tests/certs/test-localhost.pub.der
f78c61bb06a71d1bf9b034ecfcb7fe35ae85b6a3b87bf3a73c085dc062747dc1 ./tests/certs/test-localhost0h.pub.der
9e38c1fb0a151c4e23c8abddc44711c12afb3161c6b2a1c68e1bb2b0a4484e3b ./tests/data/test1425
26ee981dcb84b6a2adce601084b78e6b787b54a2a997549582a8bd42087ab51b ./tests/data/test1426
d640923e45809a3fe277e0af90459d82d32603aacc7b8db88754fcb335bf98df ./tests/data/test1531
6f51bc318104fb5fe4b6013fc4e8e1c3c8dec1819202e8ea025bdbc4bbc8c02d ./tests/data/test1938
33809cab2442488e5985b4939727bc4ead9fc65150f53008e3e4c93140675a94 ./tests/data/test262
2d073a52984bab1f196d80464ea8ab6dafd887bd5fee9ed58603f8510df0c6a5 ./tests/data/test35
4cc9fd6f31d0bb4dcb38e1565796e7ec5e48ea5ac9d3c1101de576be618786ba ./tests/data/test463
d655a29dcf2423b420b508c9e381b0fad0b88feb74caa8978725e22c9f7c374d ./tests/data/test467
8644ccf85e552755bf65faf2991d84f19523919379ec2cf195841a4cabe1507b ./tests/data/test545

119
deps/curl/.github/scripts/cleancmd.pl vendored Normal file
View File

@ -0,0 +1,119 @@
#!/usr/bin/env perl
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
# Input: cmdline docs markdown files, they get modified *in place*
#
# Strip off the leading meta-data/header part, remove all known curl symbols
# and long command line options. Also clean up whatever else the spell checker
# might have a problem with but that we still deem fine.
#
open(S, "<./docs/libcurl/symbols-in-versions")
|| die "can't find symbols-in-versions";
while(<S>) {
if(/^([^ ]*) /) {
push @asyms, $1;
}
}
close(S);
# init the opts table with "special" options not easy to figure out
my @aopts = (
'--ftp-ssl-reqd', # old alias
);
open(O, "<./docs/options-in-versions")
|| die "can't find options-in-versions";
while(<O>) {
chomp;
if(/^([^ ]+)/) {
my $o = $1;
push @aopts, $o;
if($o =~ /^--no-(.*)/) {
# for the --no options, also make one without it
push @aopts, "--$1";
}
elsif($o =~ /^--disable-(.*)/) {
# for the --disable options, also make the special ones
push @aopts, "--$1";
push @aopts, "--no-$1";
}
}
}
close(O);
open(C, "<./.github/scripts/spellcheck.curl")
|| die "can't find spellcheck.curl";
while(<C>) {
if(/^\#/) {
next;
}
chomp;
if(/^([^ ]+)/) {
push @asyms, $1;
}
}
close(C);
# longest symbols first
my @syms = sort { length($b) <=> length($a) } @asyms;
# longest cmdline options first
my @opts = sort { length($b) <=> length($a) } @aopts;
sub process {
my ($f) = @_;
my $ignore = 0;
my $sepcount = 0;
my $out;
my $line = 0;
open(F, "<$f") or die;
while(<F>) {
$line++;
if(/^---/ && ($line == 1)) {
$ignore = 1;
next;
}
elsif(/^---/ && $ignore) {
$ignore = 0;
next;
}
next if($ignore);
my $l = $_;
# strip out backticked words
$l =~ s/`[^`]+`//g;
# **bold**
$l =~ s/\*\*(\S.*?)\*\*//g;
# *italics*
$l =~ s/\*(\S.*?)\*//g;
# strip out https URLs, we don't want them spellchecked
$l =~ s!https://[a-z0-9\#_/.-]+!!gi;
$out .= $l;
}
close(F);
# cut out all known curl cmdline options
map { $out =~ s/$_//g; } (@opts);
# cut out all known curl symbols
map { $out =~ s/\b$_\b//g; } (@syms);
if(!$ignore) {
open(O, ">$f") or die;
print O $out;
close(O);
}
}
for my $f (@ARGV) {
process($f);
}

136
deps/curl/.github/scripts/cmp-config.pl vendored Normal file
View File

@ -0,0 +1,136 @@
#!/usr/bin/env perl
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
my $autotools = $ARGV[0];
my $cmake = $ARGV[1];
if(!$cmake) {
print "Usage: cmp-config <config1> <config2.h>\n";
exit;
}
# this lists complete lines that will be removed from the output if
# matching
my %remove = (
'#define CURL_EXTERN_SYMBOL' => 1,
'#define CURL_OS "Linux"' => 1,
'#define CURL_OS "x86_64-pc-linux-gnu"' => 1,
'#define GETHOSTNAME_TYPE_ARG2 int' => 1,
'#define GETHOSTNAME_TYPE_ARG2 size_t' => 1,
'#define HAVE_BROTLI 1' => 1,
'#define HAVE_BROTLI_DECODE_H 1' => 1,
'#define HAVE_DLFCN_H 1' => 1,
'#define HAVE_GSSAPI_GSSAPI_KRB5_H 1' => 1,
'#define HAVE_INTTYPES_H 1' => 1,
'#define HAVE_LDAP_H 1' => 1,
'#define HAVE_LDAP_SSL 1' => 1,
'#define HAVE_LIBBROTLIDEC 1' => 1,
'#define HAVE_LIBPSL_H 1' => 1,
'#define HAVE_LIBRTMP_RTMP_H 1' => 1,
'#define HAVE_LIBSOCKET 1' => 1,
'#define HAVE_LIBSSH' => 1,
'#define HAVE_LIBSSH2 1' => 1,
'#define HAVE_LIBSSL 1' => 1,
'#define HAVE_LIBWOLFSSH' => 1,
'#define HAVE_LIBZSTD 1' => 1,
'#define HAVE_MSH3_H 1' => 1,
'#define HAVE_NGHTTP2_NGHTTP2_H 1' => 1,
'#define HAVE_NGHTTP3_NGHTTP3_H 1' => 1,
'#define HAVE_NGTCP2_NGTCP2_CRYPTO_H 1' => 1,
'#define HAVE_NGTCP2_NGTCP2_H 1' => 1,
'#define HAVE_OPENSSL_CRYPTO_H 1' => 1,
'#define HAVE_OPENSSL_ERR_H 1' => 1,
'#define HAVE_OPENSSL_PEM_H 1' => 1,
'#define HAVE_OPENSSL_RSA_H 1' => 1,
'#define HAVE_OPENSSL_SSL_H 1' => 1,
'#define HAVE_OPENSSL_X509_H 1' => 1,
'#define HAVE_QUICHE_H 1' => 1,
'#define HAVE_SSL_SET_QUIC_USE_LEGACY_CODEPOINT 1' => 1,
'#define HAVE_STDINT_H 1' => 1,
'#define HAVE_STDIO_H 1' => 1,
'#define HAVE_STDLIB_H 1' => 1,
'#define HAVE_STRING_H 1' => 1,
'#define HAVE_SYS_XATTR_H 1' => 1,
'#define HAVE_UNICODE_UIDNA_H 1' => 1,
'#define HAVE_WOLFSSH_SSH_H 1' => 1,
'#define HAVE_ZSTD 1' => 1,
'#define HAVE_ZSTD_H 1' => 1,
'#define LT_OBJDIR ".libs/"' => 1,
'#define NEED_LBER_H 1' => 1,
'#define PACKAGE "curl"' => 1,
'#define PACKAGE_BUGREPORT "a suitable curl mailing list: https://curl.se/mail/"' => 1,
'#define PACKAGE_NAME "curl"' => 1,
'#define PACKAGE_STRING "curl -"' => 1,
'#define PACKAGE_TARNAME "curl"' => 1,
'#define PACKAGE_URL ""' => 1,
'#define PACKAGE_VERSION "-"' => 1,
'#define SIZEOF_LONG_LONG 8' => 1,
'#define VERSION "-"' => 1,
'#define _FILE_OFFSET_BITS 64' => 1,
);
sub filter {
my ($line) = @_;
if(!$remove{$line}) {
return "$line\n";
}
$remove{$line}++;
return "";
}
sub grepit {
my ($input, $output) = @_;
my @defines;
# first get all the #define lines
open(F, "<$input");
while(<F>) {
if($_ =~ /^#def/) {
chomp;
push @defines, $_;
}
}
close(F);
open(O, ">$output");
# output the sorted list through the filter
foreach my $d(sort @defines) {
print O filter($d);
}
close(O);
}
grepit($autotools, "/tmp/autotools");
grepit($cmake, "/tmp/cmake");
foreach my $v (keys %remove) {
if($remove{$v} == 1) {
print "Ignored, never matched line: $v\n";
}
}
# return the exit code from diff
exit system("diff -u /tmp/autotools /tmp/cmake") >> 8;

View File

@ -0,0 +1,49 @@
#!/usr/bin/env bash
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
# Sort list of libs, libpaths, cflags found in libcurl.pc and curl-config files,
# then diff the autotools and cmake generated ones.
sort_lists() {
prevline=''
section=''
while IFS= read -r l; do
if [[ "${prevline}" =~ (--cc|--configure) ]]; then # curl-config
echo "<IGNORED>"
else
# libcurl.pc
if [[ "${l}" =~ ^(Requires|Libs|Cflags)(\.private)?:\ (.+)$ ]]; then
if [ "${BASH_REMATCH[1]}" = 'Requires' ]; then
# Spec does not allow duplicates here:
# https://manpages.debian.org/unstable/pkg-config/pkg-config.1.en.html#Requires:
# "You may only mention the same package one time on the Requires: line"
val="$(printf '%s' "${BASH_REMATCH[3]}" | tr ',' '\n' | sort | tr '\n' ' ')"
else
val="$(printf '%s' "${BASH_REMATCH[3]}" | tr ' ' '\n' | sort -u | tr '\n' ' ')"
fi
l="${BASH_REMATCH[1]}${BASH_REMATCH[2]}: ${val}"
# curl-config
elif [[ "${section}" =~ (--libs|--static-libs) && "${l}" =~ ^( *echo\ \")(.+)(\")$ ]]; then
val="$(printf '%s' "${BASH_REMATCH[2]}" | tr ' ' '\n' | sort -u | tr '\n' ' ')"
l="${BASH_REMATCH[1]}${val}${BASH_REMATCH[3]}"
section=''
fi
echo "${l}"
fi
# curl-config
prevline="${l}"
if [[ "${l}" =~ --[a-z-]+\) ]]; then
section="${BASH_REMATCH[0]}"
fi
done < "$1"
}
am=$(mktemp -t autotools.XXX); sort_lists "$1" > "${am}"
cm=$(mktemp -t cmake.XXX) ; sort_lists "$2" > "${cm}"
diff -u "${am}" "${cm}"
res="$?"
rm -r -f "${am}" "${cm}"
exit "${res}"

View File

@ -0,0 +1,16 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
clen
te
wont
statics
nome
wast
numer
anull
inout
msdos
ba
fo
ede

57
deps/curl/.github/scripts/distfiles.sh vendored Normal file
View File

@ -0,0 +1,57 @@
#!/usr/bin/env bash
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
# Compare git repo files with tarball files and report a mismatch
# after excluding exceptions.
set -eu
gitonly=".git*
^.*
^appveyor.*
^buildconf
^GIT-INFO.md
^README.md
^renovate.json
^REUSE.toml
^SECURITY.md
^LICENSES/*
^docs/examples/adddocsref.pl
^docs/THANKS-filter
^projects/Windows/*
^scripts/ciconfig.pl
^scripts/cijobs.pl
^scripts/contributors.sh
^scripts/contrithanks.sh
^scripts/delta
^scripts/installcheck.sh
^scripts/release-notes.pl
^scripts/singleuse.pl
^tests/CI.md"
tarfiles="$(mktemp)"
gitfiles="$(mktemp)"
tar -tf "$1" \
| sed -E 's|^[^/]+/||g' \
| grep -v -E '(/|^)$' \
| sort > "${tarfiles}"
git -C "${2:-.}" ls-files \
| grep -v -E "($(printf '%s' "${gitonly}" | tr $'\n' '|' | sed -e 's|\.|\\.|g' -e 's|\*|.+|g'))$" \
| sort > "${gitfiles}"
dif="$(diff -u "${tarfiles}" "${gitfiles}" | tail -n +3 || true)"
rm -rf "${tarfiles:?}" "${gitfiles:?}"
echo 'Only in tarball:'
echo "${dif}" | grep '^-' || true
echo
echo 'Missing from tarball:'
if echo "${dif}" | grep '^+'; then
exit 1
fi

10
deps/curl/.github/scripts/shellcheck.sh vendored Normal file
View File

@ -0,0 +1,10 @@
#!/bin/sh
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
shellcheck --version
# shellcheck disable=SC2046
shellcheck --exclude=1091 \
--enable=avoid-nullary-conditions,deprecate-which \
$(grep -l -E '^#!(/usr/bin/env bash|/bin/sh|/bin/bash)' $(git ls-files))

153
deps/curl/.github/scripts/spacecheck.pl vendored Normal file
View File

@ -0,0 +1,153 @@
#!/usr/bin/env perl
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Viktor Szakats
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
use strict;
use warnings;
my @tabs = (
"^m4/zz40-xc-ovr.m4",
"Makefile\\.[a-z]+\$",
"/mkfile",
"\\.(bat|sln|vc)\$",
"^tests/certs/.+\\.der\$",
"^tests/data/test",
);
my @mixed_eol = (
"^tests/certs/.+\\.(crt|der)\$",
"^tests/certs/Server-localhost0h-sv.pem",
"^tests/data/test",
);
my @need_crlf = (
"\\.(bat|sln)\$",
"^winbuild/.+\\.md\$",
);
my @space_at_eol = (
"^tests/.+\\.(cacert|crt|pem)\$",
"^tests/data/test",
);
my @eol_at_eof = (
"^tests/certs/.+\\.der\$",
);
sub fn_match {
my ($filename, @masklist) = @_;
foreach my $mask (@masklist) {
if ($filename =~ $mask) {
return 1;
}
}
return 0;
}
sub eol_detect {
my ($content) = @_;
my $cr = () = $content =~ /\r/g;
my $lf = () = $content =~ /\n/g;
if ($cr > 0 && $lf == 0) {
return "cr"
}
elsif ($cr == 0 && $lf > 0) {
return "lf"
}
elsif ($cr == 0 && $lf == 0) {
return "bin"
}
elsif ($cr == $lf) {
return "crlf"
}
return ""
}
my $issues = 0;
open my $git_ls_files, '-|', 'git ls-files' or die "Failed running git ls-files: $!";
while (my $filename = <$git_ls_files>) {
chomp $filename;
open my $fh, '<', $filename or die "Cannot open '$filename': $!";
my $content = do { local $/; <$fh> };
close $fh;
my @err = ();
if (!fn_match($filename, @tabs) &&
$content =~ /\t/) {
push @err, "content: has tab";
}
my $eol = eol_detect($content);
if ($eol eq "" &&
!fn_match($filename, @mixed_eol)) {
push @err, "content: has mixed EOL types";
}
if ($eol ne "crlf" &&
fn_match($filename, @need_crlf)) {
push @err, "content: must use CRLF EOL for this file type";
}
if ($eol ne "lf" && $content ne "" &&
!fn_match($filename, @need_crlf) &&
!fn_match($filename, @mixed_eol)) {
push @err, "content: must use LF EOL for this file type";
}
if (!fn_match($filename, @space_at_eol) &&
$content =~ /[ \t]\n/) {
push @err, "content: has line-ending whitespace";
}
if ($content ne "" &&
!fn_match($filename, @eol_at_eof) &&
$content !~ /\n\z/) {
push @err, "content: has no EOL at EOF";
}
if ($content =~ /\n\n\z/ ||
$content =~ /\r\n\r\n\z/) {
push @err, "content: has multiple EOL at EOF";
}
if (@err) {
$issues++;
foreach my $err (@err) {
print "$filename: $err\n";
}
}
}
close $git_ls_files;
if ($issues) {
exit 1;
}

View File

@ -0,0 +1,151 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
# common variable types + structs
# callback typedefs
# public functions names
# some man page names
curl_fileinfo
curl_forms
curl_hstsentry
curl_httppost
curl_index
curl_khkey
curl_pushheaders
curl_waitfd
CURLcode
CURLformoption
CURLHcode
CURLMcode
CURLMsg
CURLSHcode
CURLUcode
curl_calloc_callback
curl_chunk_bgn_callback
curl_chunk_end_callback
curl_conv_callback
curl_debug_callback
curl_fnmatch_callback
curl_formget_callback
curl_free_callback
curl_hstsread_callback
curl_hstswrite_callback
curl_ioctl_callback
curl_malloc_callback
curl_multi_timer_callback
curl_opensocket_callback
curl_prereq_callback
curl_progress_callback
curl_push_callback
curl_read_callback
curl_realloc_callback
curl_resolver_start_callback
curl_seek_callback
curl_socket_callback
curl_sockopt_callback
curl_ssl_ctx_callback
curl_strdup_callback
curl_trailer_callback
curl_write_callback
curl_xferinfo_callback
curl_strequal
curl_strnequal
curl_mime_init
curl_mime_free
curl_mime_addpart
curl_mime_name
curl_mime_filename
curl_mime_type
curl_mime_encoder
curl_mime_data
curl_mime_filedata
curl_mime_data_cb
curl_mime_subparts
curl_mime_headers
curl_formadd
curl_formget
curl_formfree
curl_getdate
curl_getenv
curl_version
curl_easy_escape
curl_escape
curl_easy_unescape
curl_unescape
curl_free
curl_global_init
curl_global_init_mem
curl_global_cleanup
curl_global_trace
curl_global_sslset
curl_slist_append
curl_slist_free_all
curl_getdate
curl_share_init
curl_share_setopt
curl_share_cleanup
curl_version_info
curl_easy_strerror
curl_share_strerror
curl_easy_pause
curl_easy_ssls_import
curl_easy_ssls_export
curl_easy_init
curl_easy_setopt
curl_easy_perform
curl_easy_cleanup
curl_easy_getinfo
curl_easy_duphandle
curl_easy_reset
curl_easy_recv
curl_easy_send
curl_easy_upkeep
curl_easy_header
curl_easy_nextheader
curl_mprintf
curl_mfprintf
curl_msprintf
curl_msnprintf
curl_mvprintf
curl_mvfprintf
curl_mvsprintf
curl_mvsnprintf
curl_maprintf
curl_mvaprintf
curl_multi_init
curl_multi_add_handle
curl_multi_remove_handle
curl_multi_fdset
curl_multi_waitfds
curl_multi_wait
curl_multi_poll
curl_multi_wakeup
curl_multi_perform
curl_multi_cleanup
curl_multi_info_read
curl_multi_strerror
curl_multi_socket
curl_multi_socket_action
curl_multi_socket_all
curl_multi_timeout
curl_multi_setopt
curl_multi_assign
curl_multi_get_handles
curl_pushheader_bynum
curl_pushheader_byname
curl_multi_waitfds
curl_easy_option_by_name
curl_easy_option_by_id
curl_easy_option_next
curl_url
curl_url_cleanup
curl_url_dup
curl_url_get
curl_url_set
curl_url_strerror
curl_ws_recv
curl_ws_send
curl_ws_meta
libcurl-env
libcurl-ws

View File

@ -0,0 +1,979 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
AAAA
ABI
accessor
ACK
AES
AIA
AIX
al
Alessandro
aliasMode
allocator
alnum
ALPN
Altera
AltSvc
ALTSVC
amiga
AmigaOS
AmiSSL
anyauth
anycast
apache
Apache
API
APIs
APOP
AppVeyor
archivers
Archos
Arntsen
Aros
asynch
AsynchDNS
atime
auth
autobuild
autobuilds
Autoconf
autoconf
Automake
automake
autoreconf
Autotools
autotools
AVR
AWS
AWS-LC
axTLS
backend
backends
backoff
backticks
balancers
Baratov
basename
bashrc
BDFL
BearSSL
Benoit
BeOS
bitmask
bitwise
Björn
Bjørn
bool
boolean
BoringSSL
Boukris
Broadcom
brotli
bufq
bufref
bugfix
bugfixes
buildable
buildbot
Caddy
calloc
CAPA
capath
CCC
CDN
CentOS
CFLAGS
cflags
CGI's
CHACHA
chacha
Chaffraix
changelog
changeset
CharConv
charset
charsets
checksrc
checksums
chgrp
chmod
chown
ChromeOS
CI's
CIDR
CIFS
CLA
CLAs
cleartext
CLI
ClientHello
clientp
cliget
closesocket
CMake
cmake
CMake's
cmake's
CMakeLists
CNA
CNAME
CNAMEs
CODESET
codeset
CodeSonar
Comcast
commit's
Config
config
conncache
connectdata
CookieInfo
Coverity
CPUs
CR
CRL
CRLF
crontab
crt
crypto
cryptographic
cryptographically
CSEQ
CSeq
csh
cshrc
CTRL
cURL
CURLcode
curldown
CURLE
CURLECH
CURLH
curlimages
CURLINFO
curlrc
curltest
customizable
CVE
CVSS
CWD
CWE
cyassl
Cygwin
daniel
datatracker
dbg
Debian
DEBUGBUILD
decrypt
decrypting
deepcode
DELE
DER
dereference
dereferences
deselectable
deserialization
Deserialized
destructor
detections
dev
devcpp
DevOps
devtools
DHCP
DHE
dir
distro
distro's
distros
DJGPP
dlist
DLL
dll
DLLs
DNS
dns
dnsop
DoH
DoT
doxygen
drftpd
dsa
Dudka
Dymond
dynbuf
EAGAIN
EBCDIC
ECC
ECDHE
ECH
ECHConfig
ECHConfigList
ecl
ECONNREFUSED
eCOS
ECT
EF
EFnet
EGD
EHLO
EINTR
else's
encodings
enctype
endianness
Engler
enum
epoll
EPRT
EPSV
ERRNO
errno
ESNI
et
etag
ETag
ETags
exe
executables
EXPN
extensibility
failsafe
Falkeborn
Fandrich
Fastly
fcpp
Fedora
Feltzing
ffi
filesize
filesystem
FindCURL
FLOSS
fnmatch
footguns
formpost
formposts
Fortnite
FOSS
FPL
fread
FreeBSD
FreeDOS
FreeRTOS
freshmeat
Frexx
FS
fseek
FTPing
fuzzer
fwrite
Garmin
gcc
GCM
gdb
Genode
Gentoo
Gergely
getaddrinfo
getenv
gethostbyname
gethostname
Getinfo
getinfo
GETing
getpwuid
ggcov
Ghedini
Gisle
Glesys
globbed
globbing
gmail
GnuTLS
Golemon
GOST
GPG
GPL
GPLed
GREASE
GREASEing
Greear
groff
gsasl
GSKit
gskit
GSS
GSSAPI
GTFO
Guenter
GUIs
Gunderson
Gustafsson
gzip
Gzipped
gzipped
HackerOne
HackerOne's
HAProxy
HardenedBSD
Hards
Haxx
haxx
Heimdal
HelloRetryRequest
HELO
HH
HMAC
Hoersken
Holme
homebrew
hostname
hostnames
Housley
HRR
Hruska
HSTS
hsts
HTC
html
http
HTTPAUTH
httpd
HTTPD
httpget
HttpGet
HTTPS
https
HTTPSRR
hyper's
Högskolan
IANA
Icecast
ICONV
iconv
IDN
IDNA
IETF
ietf
ifdef
ifdefed
Ifdefs
ifdefs
ifhost
IIS
ILE
illumos
IMAP
imap
IMAPS
imaps
impacket
init
initializer
inlined
interop
interoperable
interoperates
IoT
ipadOS
IPCXN
IPFS
ipld
IPNS
IPv
IPv4
IPv4/6
IPv6
IRIs
IRIX
Itanium
iX
Jakub
Jiri
jo
jpeg
jq
JSON
json
Julien
Kamil
Kaufmann
kB
KDE
keepalive
Keil
kerberos
Keychain
keychain
KiB
kickstart
Kirei
Knauf
kqueue
Krb
krb
Kubernetes
Kuhrt
Kungliga
Largefile
LDAP
ldap
LDAPS
ldaps
LF
LGTM
libbrotlidec
libc
libcurl
libcurl's
libcurls
libera
libev
libevent
libgsasl
libidn
libnssckbi
libnsspem
libpsl
Libre
libre
LibreSSL
librtmp
libs
libssh
libssh2
Libtool
libtool
libuv
libWebSocket
libz
libzstd
LineageOS
linux
lldb
ln
localhost
LOGDIR
logfile
lookups
loopback
LOWCOST
LOWDELAY
LPRT
LSB
lseek
Lua
lwIP
macdef
macOS
macos
Makefile
makefiles
malloc
mallocs
manpage
manpages
maprintf
Marek
Mavrogiannopoulos
Mbed
mbedTLS
md
Meglio
memdebug
MesaLink
mesalink
Metalink
mfprintf
Michal
Micrium
MicroBlaze
MicroOS
middlebox
MINCOST
mingw
MinGW
MINIX
misconfigured
Mishyn
mitigations
MITM
mk
mkdir
mktime
Monnerat
monospace
MorphOS
MPE
MPL
mprintf
MPTCP
MQTT
mqtt
mqtts
MSB
MSGSENT
msh
MSIE
msnprintf
msprintf
msquic
mstate
MSVC
MSYS
msys
mtime
mTLS
MUA
multicwd
multiparts
multipath
MultiSSL
mumbo
musedev
mutex
mvaprintf
mvfprintf
mvprintf
mvsnprintf
mvsprintf
MX
Nagel
Nagle
NAMELOOKUP
Natively
NATs
nc
NCR
NDK
NEC
Necko
NetBSD
netrc
netstat
Netware
NFS
nghttp
nghttpx
ngtcp
Nikos
Nios
nitems
NixOS
NLST
nmake
nmemb
nocwd
NODELAY
NonStop
NOOP
Novell
NPN
nroff
nslookup
NSS
nss
NTLM
NTLMUSER
NTLMv
NUM
NuttX
OAuth
objcopy
OCSP
Ok
OpenBSD
OpenLDAP
OpenRISC
OpenSSF
OpenSSF's
OpenSSH
OpenSSL
OpenStep
openSUSE
openwall
Orbis
ORing
Osipov
OSS
PaaS
pac
pacman
parser's
parsers
PASE
PASV
PEM
pem
perl
permafailing
PINGs
pipelining
PKCS
pkcs
PKGBUILD
PKI
pluggable
pn
PolarSSL
Polhem
pollset
POSIX
Postfix
POSTing
POSTs
PowerShell
pre
prebuilt
precompiled
prepend
prepended
prepending
prepends
preprocess
preprocessed
Preprocessing
preprocessor
Prereq
PRET
pretransfer
printf
printf's
PSL
pthreads
PTR
ptr
punycode
PWD
pwd
py
pycurl
pytest
Pytest
qname
QNX
QoS
Qubes
QUIC
quictls
quicwg
Raad
radix
RAS
RBS
ReactOS
README
realloc
Realtime
rebalances
rebase
RECV
recv
Redhat
redirections
redirs
redistributable
Redox
reentrant
Referer
referer
reinitializes
Relatedly
repo
reprioritized
resending
resends
RETR
retransmit
retrigger
RHEL
Rikard
rmdir
ROADMAP
Roadmap
Rockbox
roffit
RPG
RR
RRs
RRtype
RSA
RTMP
rtmp
rtmpdump
RTMPE
RTMPS
RTMPT
RTMPTE
RTMPTS
RTOS
RTP
RTSP
rtsp
RTT
runtests
runtime
Ruslan
rustc
Rustls
rustls
Sagula
SanDisk
SAS
SASL
Satiro
Schannel
Schindelin
SCO
SCP
scp
SDK
se
SEB
SEK
selectable
Serv
setopt
setsockopt
setuid
SFTP
sftp
sha
SHOUTcast
SIGALRM
SIGCHLD
SIGPIPE
singlecwd
SINIX
Sintonen
sizeof
SLE
slist
sln
SMB
smb
SMBS
smbs
SMBv
SMTP
smtp
smtps
SMTPS
SNI
socketopen
socketpair
sockopt
SOCKOPT
SOCKSv
Solaris
SONAME
Soref
SOVERSION
SPARC
SPDX
SPNEGO
Spotify
sprintf
src
SRP
SRWLOCK
SSL
ssl
SSLeay
SSLKEYLOGFILE
SSLS
sslv
SSLv
SSLVERSION
SSPI
stackoverflow
STARTTLS
STARTTRANSFER
stateful
statvfs
stderr
stdin
stdout
Steinar
Stenberg
STLS
STOR
strcat
strcpy
strdup
strerror
strlen
strncat
struct
structs
Structs
stunnel
subdirectories
subdirectory
submitters
substring
substrings
SunOS
SunSSH
superset
svc
svcb
SVCB
Svyatoslav
Swisscom
sws
Symbian
symlink
symlinks
syntaxes
Szakats
TABs
Tatsuhiro
TBD
TCP
tcpdump
Tekniska
testability
testcurl
TFTP
tftp
threadsafe
Tizen
TLS
tlsv
TLSv
TODO
Tomtom
toolchain
toolchains
toolset
toplevel
TOS
TPF
TrackMemory
transcode
Tru
trurl
trustless
Tse
Tsujikawa
TTL
tvOS
txt
typedef
typedefed
Ubuntu
ucLinux
UCRT
UDP
UI
UID
UIDL
Ultrix
Unary
unassign
UNC
uncompress
unencoded
unencrypted
unescape
Unglobbed
Unicode
UNICOS
UnixSockets
UnixWare
unlink
unpause
unpaused
unpauses
unpausing
unsanitized
Unshare
unsharing
untrusted
unwrite
UPN
upstreaming
URI
URIs
url
URL's
urlencoded
urlget
USD
userdata
Userinfo
userinfo
USERPROFILE
UTF
UX
valgrind
Vanem
vararg
VC
vcpkg
vexxhost
Viktor
Virtuozzo
VLAN
VM
VMS
VMware
vnd
VRF
VRFY
VSE
vsftpd
vsprintf
vt
vtls
vxWorks
wakeup
Warta
watchOS
WAV
WB
wcurl
web page
WebDAV
WebOS
webpage
WebSocket
WEBSOCKET
WHATWG
whitespace
Whitespaces
winbind
winbuild
WinIDN
WinLDAP
winsock
Wireshark
wolfSSH
wolfSSL
ws
WS
WSS
www
Xbox
XDG
xdigit
Xilinx
XP
Xtensa
XYZ
Youtube
YYYY
YYYYMMDD
Zakrzewski
Zitzmann
zlib
zsh
zstd
Zuul
zuul


@ -0,0 +1,32 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
# Docs: https://github.com/UnicornGlobal/spellcheck-github-actions
matrix:
- name: Markdown
expect_match: false
aspell:
mode: en
dictionary:
wordlists:
- wordlist.txt
output: wordlist.dic
encoding: utf-8
pipeline:
- pyspelling.filters.markdown:
markdown_extensions:
- markdown.extensions.extra:
- pyspelling.filters.html:
comments: true
attributes:
- title
- alt
ignores:
- ':matches(code, pre)'
- 'code'
- 'pre'
- 'strong'
- 'em'
sources:
- '**/*.md|!docs/BINDINGS.md|!docs/DISTROS.md|!docs/CIPHERS-TLS12.md'
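As an aside for local use: the same check can be reproduced outside CI with a short Python sketch. This is not part of the curl tree; it assumes pyspelling (the tool the GitHub action wraps) and aspell are installed, that it is run from the repository root, and that pyspelling's `--config` option is available. It mirrors what the checkdocs workflow further down in this commit does: comment lines are dropped from `.github/scripts/spellcheck.words` to produce the `wordlist.txt` referenced above, then the checker is run against this configuration.

```python
# Hypothetical local reproduction of the CI spell check (not part of the
# curl tree). Assumes `pip install pyspelling` plus an aspell installation.
import pathlib
import subprocess

words = pathlib.Path(".github/scripts/spellcheck.words").read_text().splitlines()
# The CI job strips comment lines before handing the list to the checker.
pathlib.Path("wordlist.txt").write_text(
    "\n".join(w for w in words if not w.startswith("#")) + "\n")
# Run pyspelling against the configuration shown above (--config assumed).
subprocess.run(["pyspelling", "--config", ".github/scripts/spellcheck.yaml"],
               check=True)
```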


@ -0,0 +1,41 @@
#!/usr/bin/env perl
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
#
# Given: a libcurl curldown man page
# Outputs: the same file, minus the header
#
my $f = $ARGV[0];
open(F, "<$f") or die;
my @out;
my $line = 0;
my $hideheader = 0;
while(<F>) {
if($hideheader) {
if(/^---/) {
# end of hiding
$hideheader = 0;
}
push @out, "\n"; # replace with blank
next;
}
elsif(!$line++ && /^---/) {
# starts with a header, strip off the header
$hideheader = 1;
push @out, "\n"; # replace with blank
next;
}
push @out, $_;
}
close(F);
open(O, ">$f") or die;
for my $l (@out) {
print O $l;
}
close(O);
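For reference, a minimal Python sketch, not part of the curl tree, of what the script above does: the leading `---` front-matter block of a curldown page is replaced line-for-line with blank lines, which keeps the line numbering of the remaining text stable for whatever check runs on the result. The sample input in the sketch is hypothetical.

```python
# Sketch of cleancmd.pl's behavior (hypothetical, not part of the curl tree).
def strip_curldown_header(text: str) -> str:
    out = []
    hiding = False
    for i, line in enumerate(text.splitlines()):
        if i == 0 and line.startswith("---"):
            hiding = True              # page starts with a header: hide it
            out.append("")             # replace with blank
        elif hiding:
            if line.startswith("---"):
                hiding = False         # end of hiding
            out.append("")             # replace with blank
        else:
            out.append(line)
    return "\n".join(out) + "\n"

# A hypothetical front-matter header followed by the page body:
print(strip_curldown_header("---\nTitle: demo\n---\n# NAME\ndemo page\n"))
```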


@ -0,0 +1,110 @@
#!/usr/bin/env perl
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
my @files = @ARGV;
my $cfile = "test.c";
my $check = "./scripts/checksrc.pl";
my $error;
if($files[0] eq "-h") {
print "Usage: verify-synopsis [man pages]\n";
exit;
}
sub testcompile {
my $rc = system("gcc -c test.c -DCURL_DISABLE_TYPECHECK -DCURL_ALLOW_OLD_MULTI_SOCKET -DCURL_DISABLE_DEPRECATION -Wunused -Werror -Wno-unused-but-set-variable -I include") >> 8;
return $rc;
}
sub checksrc {
my $rc = system("$check test.c") >> 8;
return $rc;
}
sub extract {
my($f) = @_;
my $syn = 0;
my $l = 0;
my $iline = 0;
my $fail = 0;
open(F, "<$f") or die "failed opening input file $f : $!";
open(O, ">$cfile") or die "failed opening output file $cfile : $!";
print O "#include <curl/curl.h>\n";
while(<F>) {
$iline++;
if(/^.SH EXAMPLE/) {
$syn = 1
}
elsif($syn == 1) {
if(/^.nf/) {
$syn++;
print O "/* !checksrc! disable UNUSEDIGNORE all */\n";
print O "/* !checksrc! disable COPYRIGHT all */\n";
print O "/* !checksrc! disable FOPENMODE all */\n";
printf O "#line %d \"$f\"\n", $iline+1;
}
}
elsif($syn == 2) {
if(/^.fi/) {
last;
}
if(/(?<!\\)(?:\\{2})*\\(?!\\)/) {
print STDERR
"Error while processing file $f line $iline:\n$_" .
"Error: Single backslashes \\ are not properly shown in " .
"manpage EXAMPLE output unless they are escaped \\\\.\n";
$fail = 1;
$error = 1;
last;
}
# two backslashes become one
$_ =~ s/\\\\/\\/g;
print O $_;
$l++;
}
}
close(F);
close(O);
return ($fail ? 0 : $l);
}
my $count;
for my $m (@files) {
#print "Verify $m\n";
my $out = extract($m);
if($out) {
$error |= testcompile($m);
$error |= checksrc($m);
}
$count++;
}
if(!$error) {
print "Verified $count man pages ok\n";
}
else {
print "Detected problems\n";
}
exit $error;
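The backslash rule enforced above can be illustrated with a small Python sketch (not part of the curl tree; the printf samples are made up): the regular expression flags a run of backslashes of odd length, i.e. a backslash that is not part of an escaped `\\` pair and therefore would not render correctly in the manpage, while accepted pairs are collapsed back to single backslashes before the snippet is compiled.

```python
# Sketch of the backslash check in verify-examples.pl (hypothetical, not
# part of the curl tree): an odd-length run of backslashes means some
# backslash in the EXAMPLE section is unescaped.
import re

ODD_BACKSLASH_RUN = re.compile(r'(?<!\\)(?:\\{2})*\\(?!\\)')

samples = [r'printf("tab:\\t");',   # escaped pair: accepted
           r'printf("tab:\t");']    # lone backslash: rejected
for code in samples:
    verdict = "rejected" if ODD_BACKSLASH_RUN.search(code) else "accepted"
    print(f"{verdict}: {code}")
    if verdict == "accepted":
        # As in the script, collapse each escaped pair to a single
        # backslash before the snippet is handed to the compiler.
        print("  compiled as:", code.replace("\\\\", "\\"))
```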


@ -0,0 +1,84 @@
#!/usr/bin/env perl
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
my @files = @ARGV;
my $cfile = "test.c";
if($files[0] eq "-h") {
print "Usage: verify-synopsis [man pages]\n";
exit;
}
sub testcompile {
my $rc = system("gcc -c test.c -DCURL_DISABLE_TYPECHECK -DCURL_ALLOW_OLD_MULTI_SOCKET -I include") >> 8;
return $rc;
}
sub extract {
my($f) = @_;
my $syn = 0;
my $l = 0;
my $iline = 0;
open(F, "<$f");
open(O, ">$cfile");
while(<F>) {
$iline++;
if(/^# SYNOPSIS/) {
$syn = 1
}
elsif($syn == 1) {
if(/^\~\~\~/) {
$syn++;
print O "#line $iline \"$f\"\n";
}
}
elsif($syn == 2) {
if(/^\~\~\~/) {
last;
}
# turn the vararg argument into vararg
$_ =~ s/, parameter\)\;/, ...);/;
print O $_;
$l++;
}
}
close(F);
close(O);
if($syn < 2) {
print STDERR "Found no synopsis in $f\n";
return 1;
}
return 0;
}
my $error;
for my $m (@files) {
$error |= extract($m);
$error |= testcompile($m);
}
exit $error;
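A compact Python sketch, not part of the curl tree, of the extraction done above: the lines between the two `~~~` fences that follow `# SYNOPSIS` are collected, and a documented trailing `, parameter);` is rewritten to the real C vararg `, ...);` so the snippet can compile. The sample synopsis in the sketch is illustrative only.

```python
# Sketch of verify-synopsis.pl's extraction step (hypothetical, not part
# of the curl tree).
import re

def extract_synopsis(markdown: str) -> str:
    state, lines = 0, []
    for line in markdown.splitlines():
        if state == 0 and line.startswith("# SYNOPSIS"):
            state = 1                          # found the section heading
        elif state == 1 and line.startswith("~~~"):
            state = 2                          # opening fence
        elif state == 2:
            if line.startswith("~~~"):
                break                          # closing fence: done
            # turn the documented vararg placeholder into a real vararg
            lines.append(re.sub(r", parameter\);", ", ...);", line))
    return "\n".join(lines)

sample = ("# SYNOPSIS\n~~~c\n"
          "CURLcode curl_easy_setopt(CURL *handle, CURLoption option, parameter);\n"
          "~~~\n")
print(extract_synopsis(sample))
```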

13
deps/curl/.github/scripts/yamlcheck.sh vendored Normal file

@ -0,0 +1,13 @@
#!/bin/sh
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
set -eu
# shellcheck disable=SC2046
yamllint \
--format standard \
--strict \
--config-data "$(dirname "$0")/yamlcheck.yaml" \
$(git ls-files '*.yaml' '*.yml')


@ -0,0 +1,17 @@
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
#
# Docs: https://yamllint.readthedocs.io/en/stable/configuration.html
extends: default
rules:
line-length:
max: 500
level: warning
braces: disable
commas: disable
comments: disable
document-start: disable

21
deps/curl/.github/stale.yml vendored Normal file

@ -0,0 +1,21 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 180
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 14
# Issues with these labels will never be considered stale
exemptLabels:
- pinned
- security
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false


@ -0,0 +1,41 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: AppVeyor Status Report
'on':
status
concurrency:
group: ${{ github.workflow }}-${{ github.event.sha }}-${{ github.event.target_url }}
cancel-in-progress: true
permissions: {}
jobs:
split:
runs-on: ubuntu-latest
if: ${{ github.event.sender.login == 'appveyor[bot]' }}
permissions:
statuses: write
steps:
- name: Create individual AppVeyor build statuses
if: ${{ github.event.sha && github.event.target_url }}
env:
APPVEYOR_COMMIT_SHA: ${{ github.event.sha }}
APPVEYOR_TARGET_URL: ${{ github.event.target_url }}
APPVEYOR_REPOSITORY: ${{ github.event.repository.full_name }}
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
echo ${APPVEYOR_TARGET_URL} | sed 's/\/project\//\/api\/projects\//' | xargs -t -n1 curl -s | \
jq -c '.build.jobs[] | {target_url: ($target_url + "/job/" + .jobId),
context: (.name | sub("^(Environment: )?"; "AppVeyor / ")),
state: (.status | sub("queued"; "pending")
| sub("starting"; "pending")
| sub("running"; "pending")
| sub("failed"; "failure")
| sub("cancelled"; "error")),
description: .status}' \
--arg target_url ${APPVEYOR_TARGET_URL} | tee /dev/stderr | parallel --pipe -j 1 -N 1 \
gh api --silent --input - repos/${APPVEYOR_REPOSITORY}/statuses/${APPVEYOR_COMMIT_SHA}
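The `jq` expression above amounts to a per-job status translation before each result is posted as an individual GitHub commit status; a Python sketch of that mapping (not part of the curl tree, and the pass-through behavior for unmapped statuses is inferred from the `sub()` chain above):

```python
# Sketch of the AppVeyor -> GitHub status translation performed by the
# jq pipeline above (hypothetical, not part of the curl tree).
APPVEYOR_TO_GITHUB = {
    "queued": "pending",
    "starting": "pending",
    "running": "pending",
    "failed": "failure",
    "cancelled": "error",
}

def github_state(appveyor_status: str) -> str:
    # Statuses without an explicit mapping (e.g. "success") pass through.
    return APPVEYOR_TO_GITHUB.get(appveyor_status, appveyor_status)

assert github_state("running") == "pending"
assert github_state("success") == "success"
```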


@ -0,0 +1,159 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
# This workflow contains tests that operate on documentation files only. Some
# checks modify the source so they cannot be combined into a single job.
name: Docs
'on':
push:
branches:
- master
- '*/ci'
paths:
- '.github/workflows/checkdocs.yml'
- '.github/scripts/mdlinkcheck'
- '/scripts/**'
- '**.md'
- 'docs/*'
pull_request:
branches:
- master
paths:
- '.github/workflows/checkdocs.yml'
- '.github/scripts/**'
- '.github/scripts/mdlinkcheck'
- '**.md'
- 'docs/*'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
jobs:
# proselint:
# runs-on: ubuntu-latest
# steps:
# - uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
# with:
# persist-credentials: false
# name: checkout
#
# - name: install prereqs
# run: |
# sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
# sudo apt-get -o Dpkg::Use-Pty=0 update
# sudo apt-get -o Dpkg::Use-Pty=0 install python3-proselint
#
# # config file help: https://github.com/amperser/proselint/
# - name: create proselint config
# run: |
# cat <<JSON > $HOME/.proselintrc.json
# {
# "checks": {
# "typography.diacritical_marks": false,
# "typography.symbols": false,
# "annotations.misc": false,
# "security.password": false,
# "misc.annotations": false
# }
# }
# JSON
#
# - name: trim headers off all *.md files
# run: git ls-files -z '*.md' | xargs -0 -n1 .github/scripts/trimmarkdownheader.pl
#
# - name: check prose
# run: git ls-files -z '*.md' | grep -Evz 'CHECKSRC.md|DISTROS.md|curl_mprintf.md|CURLOPT_INTERFACE.md|interface.md' | xargs -0 proselint README
#
# # This is for CHECKSRC and files with aggressive exclamation mark needs
# - name: create second proselint config
# run: |
# cat <<JSON > $HOME/.proselintrc.json
# {
# "checks": {
# "typography.diacritical_marks": false,
# "typography.symbols": false,
# "typography.exclamation": false,
# "lexical_illusions.misc": false,
# "annotations.misc": false
# }
# }
# JSON
#
# - name: check special prose
# run: proselint docs/internals/CHECKSRC.md docs/libcurl/curl_mprintf.md docs/libcurl/opts/CURLOPT_INTERFACE.md docs/cmdline-opts/interface.md
linkcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: Run mdlinkcheck
run: ./scripts/mdlinkcheck
spellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: trim all *.md files in docs/
run: .github/scripts/cleancmd.pl $(find docs -name "*.md")
- name: setup the custom wordlist
run: grep -v '^#' .github/scripts/spellcheck.words > wordlist.txt
- name: Check Spelling
uses: rojopolis/spellcheck-github-actions@ed0756273a1658136c36d26e3d0353de35b98c8b # v0
with:
config_path: .github/scripts/spellcheck.yaml
badwords-synopsis:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: badwords
run: .github/scripts/badwords.pl < .github/scripts/badwords.txt `git ls-files '**.md'` docs/TODO docs/KNOWN_BUGS packages/OS400/README.OS400
- name: verify-synopsis
run: .github/scripts/verify-synopsis.pl docs/libcurl/curl*.md
man-examples:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: render nroff versions
run: autoreconf -fi && ./configure --without-ssl --without-libpsl && make -C docs
- name: verify examples
run: .github/scripts/verify-examples.pl docs/libcurl/curl*.3 docs/libcurl/opts/*.3
miscchecks:
runs-on: ubuntu-24.04-arm
timeout-minutes: 5
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: spacecheck
run: .github/scripts/spacecheck.pl

120
deps/curl/.github/workflows/checksrc.yml vendored Normal file

@ -0,0 +1,120 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
# This workflow contains checks at the source code level only.
name: Source
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'plan9/**'
- 'tests/data/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'plan9/**'
- 'tests/data/**'
- 'winbuild/**'
permissions: {}
jobs:
checksrc:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: check
run: git ls-files -z "*.[ch]" | xargs -0 -n1 ./scripts/checksrc.pl
codespell-cmakelint-pytype-ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: install
env:
DEBIAN_FRONTEND: noninteractive
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install \
codespell python3-pip python3-networkx python3-pydot python3-yaml \
python3-toml python3-markupsafe python3-jinja2 python3-tabulate \
python3-typing-extensions python3-libcst python3-impacket \
python3-websockets python3-pytest
python3 -m pip install --break-system-packages cmakelint==1.4.3 pytype==2024.9.13 ruff==0.6.8
- name: spellcheck
run: |
codespell \
--skip scripts/mk-ca-bundle.pl \
--skip src/tool_hugehelp.c \
-I .github/scripts/codespell-ignore.txt \
CMake include m4 scripts src lib
- name: cmakelint
run: scripts/cmakelint.sh
- name: pytype
run: find . -name '*.py' -exec pytype -j auto -k {} +
- name: ruff
run: ruff check --extend-select=B007,B016,C405,C416,COM818,D200,D213,D204,D401,D415,FURB129,N818,PERF401,PERF403,PIE790,PIE808,PLW0127,Q004,RUF010,SIM101,SIM117,SIM118,TRY400,TRY401
reuse:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: REUSE Compliance Check
uses: fsfe/reuse-action@bb774aa972c2a89ff34781233d275075cbddf542 # v5
miscchecks:
runs-on: ubuntu-latest
timeout-minutes: 5
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
name: checkout
- name: shellcheck
run: .github/scripts/shellcheck.sh
- name: spacecheck
run: .github/scripts/spacecheck.pl
- name: yamlcheck
run: .github/scripts/yamlcheck.sh
- name: binarycheck
run: .github/scripts/binarycheck.pl
# we allow some extra in source code
- name: badwords
run: |
grep -Ev '(\\bwill| url | dir )' .github/scripts/badwords.txt | \
.github/scripts/badwords.pl $(git ls-files -- src lib include)


@ -0,0 +1,149 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: configure-vs-cmake
'on':
push:
branches:
- master
paths:
- '*.ac'
- '**/*.m4'
- '**/CMakeLists.txt'
- 'CMake/**'
- 'lib/curl_config.h.cmake'
- 'tests/cmake/**'
- '.github/scripts/cmp-config.pl'
- '.github/workflows/configure-vs-cmake.yml'
pull_request:
branches:
- master
paths:
- '*.ac'
- '**/*.m4'
- '**/CMakeLists.txt'
- 'CMake/**'
- 'lib/curl_config.h.cmake'
- 'tests/cmake/**'
- '.github/scripts/cmp-config.pl'
- '.github/workflows/configure-vs-cmake.yml'
permissions: {}
jobs:
check-linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'run configure --with-openssl'
run: |
autoreconf -fi
export PKG_CONFIG_DEBUG_SPEW=1
mkdir bld-am && cd bld-am && ../configure --enable-static=no --with-openssl --without-libpsl --without-brotli
- name: 'run cmake'
run: |
cmake -B bld-cm -DCURL_WERROR=ON -DCURL_USE_LIBPSL=OFF -DCURL_BROTLI=OFF
- name: 'configure log'
run: cat bld-am/config.log 2>/dev/null || true
- name: 'cmake log'
run: cat bld-cm/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'compare generated curl_config.h files'
run: ./.github/scripts/cmp-config.pl bld-am/lib/curl_config.h bld-cm/lib/curl_config.h
- name: 'compare generated libcurl.pc files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/libcurl.pc bld-cm/libcurl.pc
- name: 'compare generated curl-config files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/curl-config bld-cm/curl-config
check-macos:
runs-on: macos-latest
steps:
- name: 'install packages'
run: |
while [[ $? == 0 ]]; do for i in 1 2 3; do brew update && brew install libtool autoconf automake && break 2 || { echo Error: wait to try again; sleep 10; } done; false Too many retries; done
- name: 'toolchain versions'
run: |
echo '::group::brew packages installed'; ls -l "$(brew --prefix)/opt"; echo '::endgroup::'
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'run configure --with-openssl'
run: |
autoreconf -fi
export PKG_CONFIG_DEBUG_SPEW=1
mkdir bld-am && cd bld-am && ../configure --enable-static=no --with-openssl --without-libpsl --disable-ldap --with-zstd
- name: 'run cmake'
run: |
cmake -B bld-cm -DCURL_WERROR=ON -DCURL_USE_LIBPSL=OFF -DCURL_DISABLE_LDAP=ON \
-DCMAKE_C_COMPILER_TARGET="$(uname -m | sed 's/arm64/aarch64/')-apple-darwin$(uname -r)" \
-DCURL_BROTLI=OFF \
-DCURL_USE_LIBSSH2=OFF
- name: 'configure log'
run: cat bld-am/config.log 2>/dev/null || true
- name: 'cmake log'
run: cat bld-cm/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'compare generated curl_config.h files'
run: ./.github/scripts/cmp-config.pl bld-am/lib/curl_config.h bld-cm/lib/curl_config.h
- name: 'compare generated libcurl.pc files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/libcurl.pc bld-cm/libcurl.pc
- name: 'compare generated curl-config files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/curl-config bld-cm/curl-config
check-windows:
runs-on: ubuntu-latest
env:
TRIPLET: 'x86_64-w64-mingw32'
steps:
- name: 'install packages'
run: sudo apt-get -o Dpkg::Use-Pty=0 install mingw-w64
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'run configure --with-schannel'
run: |
autoreconf -fi
export PKG_CONFIG_DEBUG_SPEW=1
mkdir bld-am && cd bld-am && ../configure --enable-static=no --with-schannel --without-libpsl --host="${TRIPLET}"
- name: 'run cmake'
run: |
cmake -B bld-cm -DCURL_WERROR=ON -DCURL_USE_SCHANNEL=ON -DCURL_USE_LIBPSL=OFF \
-DCMAKE_SYSTEM_NAME=Windows \
-DCMAKE_C_COMPILER_TARGET="${TRIPLET}" \
-DCMAKE_C_COMPILER="${TRIPLET}-gcc"
- name: 'configure log'
run: cat bld-am/config.log 2>/dev/null || true
- name: 'cmake log'
run: cat bld-cm/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'compare generated curl_config.h files'
run: ./.github/scripts/cmp-config.pl bld-am/lib/curl_config.h bld-cm/lib/curl_config.h
- name: 'compare generated libcurl.pc files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/libcurl.pc bld-cm/libcurl.pc
- name: 'compare generated curl-config files'
run: ./.github/scripts/cmp-pkg-config.sh bld-am/curl-config bld-cm/curl-config


@ -0,0 +1,163 @@
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
---
name: curl-for-win
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
env:
CW_NOGET: 'curl trurl'
CW_MAP: '0'
CW_JOBS: '5'
CW_NOPKG: '1'
jobs:
linux-glibc-gcc:
runs-on: ubuntu-latest
timeout-minutes: 30
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
path: 'curl'
fetch-depth: 8
- name: 'build'
run: |
git clone --depth 1 https://github.com/curl/curl-for-win
mv curl-for-win/* .
export CW_CONFIG='-main-werror-linux-a64-x64-gcc'
export CW_REVISION='${{ github.sha }}'
DOCKER_IMAGE='debian:bookworm-slim'
export DOCKER_CONTENT_TRUST=1
export CW_CCSUFFIX='-15'
export CW_GCCSUFFIX='-12'
docker trust inspect --pretty "${DOCKER_IMAGE}"
time docker pull "${DOCKER_IMAGE}"
docker images --digests
time docker run --volume "$(pwd):$(pwd)" --workdir "$(pwd)" \
--env-file <(env | grep -a -E \
'^(CW_|GITHUB_)') \
"${DOCKER_IMAGE}" \
sh -c ./_ci-linux-debian.sh
linux-musl-llvm:
runs-on: ubuntu-latest
timeout-minutes: 30
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
path: 'curl'
fetch-depth: 8
- name: 'build'
run: |
git clone --depth 1 https://github.com/curl/curl-for-win
mv curl-for-win/* .
export CW_CONFIG='-main-werror-linux-musl-r64-x64'
export CW_REVISION='${{ github.sha }}'
. ./_versions.sh
docker trust inspect --pretty "${DOCKER_IMAGE}"
time docker pull "${DOCKER_IMAGE}"
docker images --digests
time docker run --volume "$(pwd):$(pwd)" --workdir "$(pwd)" \
--env-file <(env | grep -a -E \
'^(CW_|GITHUB_)') \
"${DOCKER_IMAGE}" \
sh -c ./_ci-linux-debian.sh
mac-clang:
runs-on: macos-latest
timeout-minutes: 30
env:
CW_JOBS: '4'
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
path: 'curl'
fetch-depth: 8
- name: 'build'
run: |
git clone --depth 1 https://github.com/curl/curl-for-win
mv curl-for-win/* .
export CW_CONFIG='-main-werror-mac-x64'
export CW_REVISION='${{ github.sha }}'
sh -c ./_ci-mac-homebrew.sh
win-llvm:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
path: 'curl'
fetch-depth: 8
- name: 'build'
run: |
git clone --depth 1 https://github.com/curl/curl-for-win
mv curl-for-win/* .
export CW_CONFIG='-main-werror-win-x64'
export CW_REVISION='${{ github.sha }}'
. ./_versions.sh
docker trust inspect --pretty "${DOCKER_IMAGE}"
time docker pull "${DOCKER_IMAGE}"
docker images --digests
time docker run --volume "$(pwd):$(pwd)" --workdir "$(pwd)" \
--env-file <(env | grep -a -E \
'^(CW_|GITHUB_)') \
"${DOCKER_IMAGE}" \
sh -c ./_ci-linux-debian.sh
win-gcc-libssh-zlibng-x86:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
path: 'curl'
fetch-depth: 8
- name: 'build'
run: |
git clone --depth 1 https://github.com/curl/curl-for-win
mv curl-for-win/* .
export CW_CONFIG='-main-werror-win-x86-gcc-libssh1-zlibng'
export CW_REVISION='${{ github.sha }}'
. ./_versions.sh
docker trust inspect --pretty "${DOCKER_IMAGE}"
time docker pull "${DOCKER_IMAGE}"
docker images --digests
time docker run --volume "$(pwd):$(pwd)" --workdir "$(pwd)" \
--env-file <(env | grep -a -E \
'^(CW_|GITHUB_)') \
"${DOCKER_IMAGE}" \
sh -c ./_ci-linux-debian.sh
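The repeated `--env-file <(env | grep -a -E '^(CW_|GITHUB_)')` construct above forwards only a filtered slice of the runner environment into the build container; a Python sketch of the same selection, not part of the curl tree:

```python
# Sketch of the environment filtering used by the docker runs above
# (hypothetical, not part of the curl tree): only variables named CW_* or
# GITHUB_* are passed into the container.
import os
import re

forwarded = {name: value for name, value in os.environ.items()
             if re.match(r"^(CW_|GITHUB_)", name)}
for name in sorted(forwarded):
    print(f"{name}={forwarded[name]}")
```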


@ -0,0 +1,194 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: dist
'on':
push:
branches:
- master
- '*/ci'
pull_request:
branches:
- master
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
env:
MAKEFLAGS: -j 5
jobs:
maketgz-and-verify-in-tree:
runs-on: ubuntu-latest
timeout-minutes: 15
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'remove preinstalled curl libcurl4{-doc}'
run: sudo apt-get -o Dpkg::Use-Pty=0 purge curl libcurl4 libcurl4-doc
- name: 'autoreconf'
run: autoreconf -fi
- name: 'configure'
run: ./configure --without-ssl --without-libpsl
- name: 'make'
run: make V=1
- name: 'maketgz'
run: SOURCE_DATE_EPOCH=1711526400 ./scripts/maketgz 99.98.97
- uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
with:
name: 'release-tgz'
path: 'curl-99.98.97.tar.gz'
retention-days: 1
- name: 'verify in-tree configure build including install'
run: |
echo "::stop-commands::$(uuidgen)"
tar xvf curl-99.98.97.tar.gz
pushd curl-99.98.97
./configure --prefix=$HOME/temp --without-ssl --without-libpsl
make
make test-ci
make install
popd
# basic check of the installed files
bash scripts/installcheck.sh $HOME/temp
rm -rf curl-99.98.97
verify-out-of-tree-docs:
runs-on: ubuntu-latest
timeout-minutes: 15
needs: maketgz-and-verify-in-tree
steps:
- uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
with:
name: 'release-tgz'
- name: 'verify out-of-tree configure build including docs'
run: |
echo "::stop-commands::$(uuidgen)"
tar xvf curl-99.98.97.tar.gz
touch curl-99.98.97/docs/{cmdline-opts,libcurl}/Makefile.inc
mkdir build
pushd build
../curl-99.98.97/configure --without-ssl --without-libpsl
make
make test-ci
popd
rm -rf build
rm -rf curl-99.98.97
verify-out-of-tree-autotools-debug:
runs-on: ubuntu-latest
timeout-minutes: 15
needs: maketgz-and-verify-in-tree
steps:
- uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
with:
name: 'release-tgz'
- name: 'verify out-of-tree autotools debug build'
run: |
echo "::stop-commands::$(uuidgen)"
tar xvf curl-99.98.97.tar.gz
pushd curl-99.98.97
mkdir build
pushd build
../configure --without-ssl --enable-debug "--prefix=${PWD}/pkg" --without-libpsl
make
make test-ci
make install
verify-out-of-tree-cmake:
runs-on: ubuntu-latest
timeout-minutes: 15
needs: maketgz-and-verify-in-tree
steps:
- uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
with:
name: 'release-tgz'
- name: 'verify out-of-tree cmake build'
run: |
echo "::stop-commands::$(uuidgen)"
tar xvf curl-99.98.97.tar.gz
pushd curl-99.98.97
cmake -B build -DCURL_WERROR=ON -DCURL_USE_LIBPSL=OFF
make -C build
missing-files:
runs-on: ubuntu-latest
timeout-minutes: 5
needs: maketgz-and-verify-in-tree
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
with:
name: 'release-tgz'
- name: 'detect files missing from release tarball'
run: .github/scripts/distfiles.sh curl-99.98.97.tar.gz
reproducible-releases:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'remove preinstalled curl libcurl4{-doc}'
run: sudo apt-get -o Dpkg::Use-Pty=0 purge curl libcurl4 libcurl4-doc
- name: 'generate release tarballs'
run: ./scripts/dmaketgz 9.10.11
- name: 'verify release tarballs'
run: |
mkdir _verify
mv curl-9.10.11.tar.gz _verify
cd _verify
../scripts/verify-release curl-9.10.11.tar.gz
cmake-integration:
name: 'cmake-integration-on-${{ matrix.image }}'
runs-on: ${{ matrix.image }}
timeout-minutes: 10
env:
CC: clang
CMAKE_GENERATOR: Ninja
strategy:
fail-fast: false
matrix:
image: [ubuntu-latest, macos-latest]
steps:
- name: 'install prereqs'
run: |
if [[ '${{ matrix.image }}' = *'ubuntu'* ]]; then
sudo apt-get -o Dpkg::Use-Pty=0 install ninja-build libpsl-dev libssl-dev
else
brew install ninja libpsl openssl
fi
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'via FetchContent'
run: ./tests/cmake/test.sh FetchContent
- name: 'via add_subdirectory'
run: ./tests/cmake/test.sh add_subdirectory
- name: 'via find_package'
run: ./tests/cmake/test.sh find_package

46
deps/curl/.github/workflows/fuzz.yml vendored Normal file

@ -0,0 +1,46 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Fuzzer
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '**/CMakeLists.txt'
- '.circleci/**'
- 'appveyor.*'
- 'CMake/**'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'tests/data/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '**/CMakeLists.txt'
- '.circleci/**'
- 'appveyor.*'
- 'CMake/**'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'tests/data/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
jobs:
Fuzzing:
uses: curl/curl-fuzzer/.github/workflows/ci.yml@master


@ -0,0 +1,68 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Hacktoberfest
'on':
# this must not ever run on any other branch than master
push:
branches:
- master
concurrency:
# this should not run in parallel, so just run one at a time
group: ${{ github.workflow }}
permissions: {}
jobs:
# add hacktoberfest-accepted label to PRs opened starting from September 30th
# till November 1st which are closed via commit reference from master branch.
merged:
runs-on: ubuntu-latest
permissions:
# requires issues AND pull-requests write permissions to edit labels on PRs!
issues: write
pull-requests: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
fetch-depth: 100
- name: Check whether repo participates in Hacktoberfest
run: |
gh config set prompt disabled && echo "label=$(
gh repo view --json repositoryTopics --jq '.repositoryTopics[].name' | grep '^hacktoberfest$')" >> $GITHUB_OUTPUT
id: check
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Search relevant commit message lines starting with Closes/Merges
run: |
git log --format=email '${{ github.event.before }}..${{ github.event.after }}' | \
grep -Ei '^Close[sd]? ' | sort | uniq | tee log
if: steps.check.outputs.label == 'hacktoberfest'
- name: Search for Number-based PR references
run: |
grep -Eo '#([0-9]+)' log | cut -d# -f2 | sort | uniq | xargs -t -n1 -I{} \
gh pr view {} --json number,createdAt \
--jq '{number, opened: .createdAt} | [.number, .opened] | join(":")' | tee /dev/stderr | \
grep -Eo '^([0-9]+):[0-9]{4}-(09-30T|10-|11-01T)' | cut -d: -f1 | sort | uniq | xargs -t -n1 -I {} \
gh pr edit {} --add-label 'hacktoberfest-accepted'
if: steps.check.outputs.label == 'hacktoberfest'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Search for URL-based PR references
run: |
grep -Eo 'github.com/(.+)/(.+)/pull/([0-9]+)' log | sort | uniq | xargs -t -n1 -I{} \
gh pr view 'https://{}' --json number,createdAt \
--jq '{number, opened: .createdAt} | [.number, .opened] | join(":")' | tee /dev/stderr | \
grep -Eo '^([0-9]+):[0-9]{4}-(09-30T|10-|11-01T)' | cut -d: -f1 | sort | uniq | xargs -t -n1 -I {} \
gh pr edit {} --add-label 'hacktoberfest-accepted'
if: steps.check.outputs.label == 'hacktoberfest'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
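Both search steps above apply the same date filter to the `number:createdAt` pairs produced by `gh pr view`; a Python sketch of that window check (not part of the curl tree; the PR numbers and dates are made up):

```python
# Sketch of the Hacktoberfest date window used above (hypothetical, not
# part of the curl tree): a PR qualifies when it was opened between
# September 30th and November 1st, in any year.
import re

WINDOW = re.compile(r'^([0-9]+):[0-9]{4}-(09-30T|10-|11-01T)')

for pair in ("12345:2024-10-05T12:00:00Z",    # inside the window
             "12346:2024-09-29T12:00:00Z"):   # outside the window
    m = WINDOW.match(pair)
    print(pair, "->", f"label PR #{m.group(1)}" if m else "ignore")
```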


@ -0,0 +1,519 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Linux HTTP/3
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
# Hardcoded workflow filename as workflow name above is just Linux again
group: http3-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
env:
MAKEFLAGS: -j 5
# handled in renovate.json
openssl-version: 3.4.1
# handled in renovate.json
quictls-version: 3.3.0
# renovate: datasource=github-tags depName=gnutls/gnutls versioning=semver registryUrl=https://github.com
gnutls-version: 3.8.9
# renovate: datasource=github-tags depName=wolfSSL/wolfssl versioning=semver extractVersion=^v?(?<version>.+)-stable$ registryUrl=https://github.com
wolfssl-version: 5.7.6
# renovate: datasource=github-tags depName=ngtcp2/nghttp3 versioning=semver registryUrl=https://github.com
nghttp3-version: 1.8.0
# renovate: datasource=github-tags depName=ngtcp2/ngtcp2 versioning=semver registryUrl=https://github.com
ngtcp2-version: 1.11.0
# renovate: datasource=github-tags depName=nghttp2/nghttp2 versioning=semver registryUrl=https://github.com
nghttp2-version: 1.65.0
# renovate: datasource=github-tags depName=cloudflare/quiche versioning=semver registryUrl=https://github.com
quiche-version: 0.23.4
jobs:
build-cache:
runs-on: ubuntu-latest
steps:
- name: 'cache quictls'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-quictls-no-deprecated
env:
cache-name: cache-quictls-no-deprecated
with:
path: ~/quictls/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.quictls-version }}-quic1
- name: 'cache gnutls'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-gnutls
env:
cache-name: cache-gnutls
with:
path: ~/gnutls/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.gnutls-version }}
- name: 'cache wolfssl'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-wolfssl
env:
cache-name: cache-wolfssl
with:
path: ~/wolfssl/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.wolfssl-version }}
- name: 'cache nghttp3'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-nghttp3
env:
cache-name: cache-nghttp3
with:
path: ~/nghttp3/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.nghttp3-version }}
- name: 'cache ngtcp2'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-ngtcp2
env:
cache-name: cache-ngtcp2
with:
path: ~/ngtcp2/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.ngtcp2-version }}-${{ env.quictls-version }}-${{ env.gnutls-version }}-${{ env.wolfssl-version }}
- name: 'cache nghttp2'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-nghttp2
env:
cache-name: cache-nghttp2
with:
path: ~/nghttp2/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.nghttp2-version }}-${{ env.quictls-version }}-${{ env.ngtcp2-version }}-${{ env.nghttp3-version }}
- id: settings
if: |
steps.cache-quictls-no-deprecated.outputs.cache-hit != 'true' ||
steps.cache-gnutls.outputs.cache-hit != 'true' ||
steps.cache-wolfssl.outputs.cache-hit != 'true' ||
steps.cache-nghttp3.outputs.cache-hit != 'true' ||
steps.cache-ngtcp2.outputs.cache-hit != 'true' ||
steps.cache-nghttp2.outputs.cache-hit != 'true'
run: |
echo 'needs-build=true' >> $GITHUB_OUTPUT
- name: 'install build prereqs'
if: steps.settings.outputs.needs-build == 'true'
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install \
libtool autoconf automake pkgconf stunnel4 \
libpsl-dev libbrotli-dev libzstd-dev zlib1g-dev libev-dev libc-ares-dev \
nettle-dev libp11-kit-dev libtspi-dev libunistring-dev guile-2.2-dev libtasn1-bin \
libtasn1-6-dev libidn2-0-dev gawk gperf libtss2-dev dns-root-data bison gtk-doc-tools \
texinfo texlive texlive-extra-utils autopoint libev-dev \
apache2 apache2-dev libnghttp2-dev
echo 'CC=gcc-12' >> $GITHUB_ENV
echo 'CXX=g++-12' >> $GITHUB_ENV
- name: 'build quictls'
if: steps.cache-quictls-no-deprecated.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b openssl-${{ env.quictls-version }}-quic1 https://github.com/quictls/openssl quictls
cd quictls
./config no-deprecated --prefix=$PWD/build --libdir=lib no-makedepend no-apps no-docs no-tests
make
make -j1 install_sw
- name: 'build gnutls'
if: steps.cache-gnutls.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b ${{ env.gnutls-version }} https://github.com/gnutls/gnutls.git
cd gnutls
./bootstrap
./configure --disable-dependency-tracking --prefix=$PWD/build \
LDFLAGS="-Wl,-rpath,$PWD/build/lib -L$PWD/build/lib" \
--with-included-libtasn1 --with-included-unistring \
--disable-guile --disable-doc --disable-tests --disable-tools
make
make install
- name: 'build wolfssl'
if: steps.cache-wolfssl.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b v${{ env.wolfssl-version }}-stable https://github.com/wolfSSL/wolfssl.git
cd wolfssl
./autogen.sh
./configure --disable-dependency-tracking --enable-all --enable-quic \
--disable-benchmark --disable-crypttests --disable-examples --prefix=$PWD/build
make
make install
- name: 'build nghttp3'
if: steps.cache-nghttp3.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b v${{ env.nghttp3-version }} https://github.com/ngtcp2/nghttp3
cd nghttp3
git submodule update --init --depth=1
autoreconf -fi
./configure --disable-dependency-tracking --prefix=$PWD/build \
PKG_CONFIG_PATH="$PWD/build/lib/pkgconfig" \
--enable-lib-only
make
make install
- name: 'build ngtcp2'
if: steps.cache-ngtcp2.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b v${{ env.ngtcp2-version }} https://github.com/ngtcp2/ngtcp2
cd ngtcp2
autoreconf -fi
./configure --disable-dependency-tracking --prefix=$PWD/build \
PKG_CONFIG_PATH="$HOME/quictls/build/lib/pkgconfig:$HOME/gnutls/build/lib/pkgconfig:$HOME/wolfssl/build/lib/pkgconfig" \
--enable-lib-only --with-openssl --with-gnutls --with-wolfssl
make install
- name: 'build nghttp2'
if: steps.cache-nghttp2.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b v${{ env.nghttp2-version }} https://github.com/nghttp2/nghttp2
cd nghttp2
git submodule update --init --depth=1
autoreconf -fi
./configure --disable-dependency-tracking --prefix=$PWD/build \
PKG_CONFIG_PATH="$HOME/quictls/build/lib/pkgconfig:$HOME/nghttp3/build/lib/pkgconfig:$HOME/ngtcp2/build/lib/pkgconfig" \
LDFLAGS="-Wl,-rpath,$HOME/quictls/build/lib" \
--enable-http3
make install
linux:
name: ${{ matrix.build.generate && 'CM' || 'AM' }} ${{ matrix.build.name }}
needs:
- build-cache
runs-on: 'ubuntu-latest'
timeout-minutes: 45
strategy:
fail-fast: false
matrix:
build:
- name: quictls
PKG_CONFIG_PATH: '$HOME/quictls/build/lib/pkgconfig:$HOME/nghttp3/build/lib/pkgconfig:$HOME/ngtcp2/build/lib/pkgconfig:$HOME/nghttp2/build/lib/pkgconfig'
configure: >-
LDFLAGS="-Wl,-rpath,$HOME/quictls/build/lib"
--with-ngtcp2=$HOME/ngtcp2/build --enable-warnings --enable-werror --enable-debug --disable-ntlm
--with-test-nghttpx="$HOME/nghttp2/build/bin/nghttpx"
--with-openssl=$HOME/quictls/build --enable-ssls-export
--with-libuv
- name: gnutls
PKG_CONFIG_PATH: '$HOME/gnutls/build/lib/pkgconfig:$HOME/nghttp3/build/lib/pkgconfig:$HOME/ngtcp2/build/lib/pkgconfig:$HOME/nghttp2/build/lib/pkgconfig'
configure: >-
LDFLAGS="-Wl,-rpath,$HOME/gnutls/build/lib"
--with-ngtcp2=$HOME/ngtcp2/build --enable-warnings --enable-werror --enable-debug
--with-test-nghttpx="$HOME/nghttp2/build/bin/nghttpx"
--with-gnutls=$HOME/gnutls/build --enable-ssls-export
--with-libuv
- name: wolfssl
PKG_CONFIG_PATH: '$HOME/wolfssl/build/lib/pkgconfig:$HOME/nghttp3/build/lib/pkgconfig:$HOME/ngtcp2/build/lib/pkgconfig:$HOME/nghttp2/build/lib/pkgconfig'
configure: >-
LDFLAGS="-Wl,-rpath,$HOME/wolfssl/build/lib"
--with-ngtcp2=$HOME/ngtcp2/build --enable-warnings --enable-werror --enable-debug
--with-test-nghttpx="$HOME/nghttp2/build/bin/nghttpx"
--with-wolfssl=$HOME/wolfssl/build
--enable-ech --enable-ssls-export
--with-libuv
- name: wolfssl
PKG_CONFIG_PATH: '$HOME/wolfssl/build/lib/pkgconfig:$HOME/nghttp3/build/lib/pkgconfig:$HOME/ngtcp2/build/lib/pkgconfig:$HOME/nghttp2/build/lib/pkgconfig'
generate: >-
-DCURL_USE_WOLFSSL=ON -DUSE_NGTCP2=ON -DENABLE_DEBUG=ON
-DTEST_NGHTTPX="$HOME/nghttp2/build/bin/nghttpx"
-DHTTPD_NGHTTPX="$HOME/nghttp2/build/bin/nghttpx"
-DUSE_ECH=ON
-DCURL_USE_LIBUV=ON
- name: openssl-quic
PKG_CONFIG_PATH: '$HOME/openssl/build/lib64/pkgconfig'
configure: >-
LDFLAGS="-Wl,-rpath,$HOME/openssl/build/lib64"
--enable-warnings --enable-werror --enable-debug --disable-ntlm
--with-test-nghttpx="$HOME/nghttp2/build/bin/nghttpx"
--with-openssl=$HOME/openssl/build --with-openssl-quic
--with-nghttp3=$HOME/nghttp3/build
--with-libuv
- name: quiche
configure: >-
LDFLAGS="-Wl,-rpath,$HOME/quiche/target/release"
--with-openssl=$HOME/quiche/quiche/deps/boringssl/src
--enable-warnings --enable-werror --enable-debug
--with-quiche=$HOME/quiche/target/release
--with-test-nghttpx="$HOME/nghttp2/build/bin/nghttpx"
--with-ca-fallback
--with-libuv
- name: quiche
PKG_CONFIG_PATH: '$HOME/quiche/target/release'
generate: >-
-DOPENSSL_ROOT_DIR=$HOME/quiche/quiche/deps/boringssl/src -DENABLE_DEBUG=ON
-DUSE_QUICHE=ON
-DTEST_NGHTTPX="$HOME/nghttp2/build/bin/nghttpx"
-DHTTPD_NGHTTPX="$HOME/nghttp2/build/bin/nghttpx"
-DCURL_CA_FALLBACK=ON
-DCURL_USE_LIBUV=ON
steps:
- name: 'install prereqs'
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install \
libtool autoconf automake ninja-build pkgconf stunnel4 \
libpsl-dev libbrotli-dev libzstd-dev zlib1g-dev libev-dev libc-ares-dev \
nettle-dev libp11-kit-dev libtspi-dev libunistring-dev guile-2.2-dev libtasn1-bin \
libtasn1-6-dev libidn2-0-dev gawk gperf libtss2-dev dns-root-data bison gtk-doc-tools \
texinfo texlive texlive-extra-utils autopoint libev-dev libuv1-dev \
apache2 apache2-dev libnghttp2-dev vsftpd
python3 -m venv $HOME/venv
echo 'CC=gcc-12' >> $GITHUB_ENV
echo 'CXX=g++-12' >> $GITHUB_ENV
- name: 'cache quictls'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-quictls-no-deprecated
env:
cache-name: cache-quictls-no-deprecated
with:
path: ~/quictls/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.quictls-version }}-quic1
fail-on-cache-miss: true
- name: 'cache gnutls'
if: matrix.build.name == 'gnutls'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-gnutls
env:
cache-name: cache-gnutls
with:
path: ~/gnutls/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.gnutls-version }}
fail-on-cache-miss: true
- name: 'cache wolfssl'
if: matrix.build.name == 'wolfssl'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-wolfssl
env:
cache-name: cache-wolfssl
with:
path: ~/wolfssl/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.wolfssl-version }}
fail-on-cache-miss: true
- name: 'cache nghttp3'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-nghttp3
env:
cache-name: cache-nghttp3
with:
path: ~/nghttp3/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.nghttp3-version }}
fail-on-cache-miss: true
- name: 'cache ngtcp2'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-ngtcp2
env:
cache-name: cache-ngtcp2
with:
path: ~/ngtcp2/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.ngtcp2-version }}-${{ env.quictls-version }}-${{ env.gnutls-version }}-${{ env.wolfssl-version }}
fail-on-cache-miss: true
- name: 'cache nghttp2'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-nghttp2
env:
cache-name: cache-nghttp2
with:
path: ~/nghttp2/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.nghttp2-version }}-${{ env.quictls-version }}-${{ env.ngtcp2-version }}-${{ env.nghttp3-version }}
fail-on-cache-miss: true
- name: 'cache openssl'
if: matrix.build.name == 'openssl-quic'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-openssl
env:
cache-name: cache-openssl
with:
path: ~/openssl/build
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.openssl-version }}
- name: 'install openssl'
if: matrix.build.name == 'openssl-quic' && steps.cache-openssl.outputs.cache-hit != 'true'
run: |
git clone --quiet --depth=1 -b openssl-${{ env.openssl-version }} https://github.com/openssl/openssl
cd openssl
./config --prefix=$HOME/openssl/build no-makedepend no-apps no-docs no-tests
make
make -j1 install_sw
cat exporters/openssl.pc
- name: 'cache quiche'
if: matrix.build.name == 'quiche'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-quiche
env:
cache-name: cache-quiche
with:
path: ~/quiche
key: ${{ runner.os }}-http3-build-${{ env.cache-name }}-${{ env.quiche-version }}
- name: 'build quiche and boringssl'
if: matrix.build.name == 'quiche' && steps.cache-quiche.outputs.cache-hit != 'true'
run: |
cd $HOME
git clone --quiet --depth=1 -b ${{ env.quiche-version }} --recursive https://github.com/cloudflare/quiche.git
cd quiche
#### Work-around https://github.com/curl/curl/issues/7927 #######
#### See https://github.com/alexcrichton/cmake-rs/issues/131 ####
sed -i -e 's/cmake = "0.1"/cmake = "=0.1.45"/' quiche/Cargo.toml
cargo build -v --package quiche --release --features ffi,pkg-config-meta,qlog --verbose
ln -s libquiche.so target/release/libquiche.so.0
mkdir -v quiche/deps/boringssl/src/lib
ln -vnf $(find target/release -name libcrypto.a -o -name libssl.a) quiche/deps/boringssl/src/lib/
# include dir
# $HOME/quiche/quiche/deps/boringssl/src/include
# lib dir
# $HOME/quiche/quiche/deps/boringssl/src/lib
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build.configure }}
run: autoreconf -fi
- name: 'configure'
run: |
if [ -n '${{ matrix.build.PKG_CONFIG_PATH }}' ]; then
export PKG_CONFIG_PATH="${{ matrix.build.PKG_CONFIG_PATH }}"
fi
if [ -n '${{ matrix.build.generate }}' ]; then
cmake -B bld -G Ninja \
-DCMAKE_C_COMPILER_TARGET=$(uname -m)-pc-linux-gnu -DBUILD_STATIC_LIBS=ON \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON \
${{ matrix.build.generate }}
else
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--disable-dependency-tracking \
${{ matrix.build.configure }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'test configs'
run: grep -H -v '^#' bld/tests/config bld/tests/http/config.ini || true
- name: 'build'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose
else
make -C bld V=1
fi
- name: 'check curl -V output'
run: bld/src/curl -V
- name: 'build tests'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target testdeps
else
make -C bld V=1 -C tests
fi
- name: 'install test prereqs'
run: |
source $HOME/venv/bin/activate
python3 -m pip install -r tests/requirements.txt
- name: 'run tests'
env:
TFLAGS: '${{ matrix.build.tflags }}'
run: |
source $HOME/venv/bin/activate
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target test-ci
else
make -C bld V=1 test-ci
fi
- name: 'install pytest prereqs'
run: |
source $HOME/venv/bin/activate
python3 -m pip install -r tests/http/requirements.txt
- name: 'run pytest event based'
env:
CURL_TEST_EVENT: 1
CURL_CI: github
PYTEST_ADDOPTS: '--color=yes'
run: |
source $HOME/venv/bin/activate
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target curl-pytest-ci
else
make -C bld V=1 pytest-ci
fi
- name: 'build examples'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target curl-examples
else
make -C bld V=1 examples
fi

26
deps/curl/.github/workflows/label.yml vendored Normal file

@ -0,0 +1,26 @@
# Copyright (C) Daniel Fandrich, <dan@coneharvesters.com>, et al.
#
# SPDX-License-Identifier: curl
# This workflow will triage pull requests and apply a label based on the
# paths that are modified in the pull request.
#
# To use this workflow, you will need to set up a .github/labeler.yml
# file with configuration. For more information, see:
# https://github.com/actions/labeler
name: Labeler
'on': [pull_request_target]
jobs:
label:
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
steps:
- uses: actions/labeler@8558fd74291d67161a8a78ce36a881fa63b766a9 # v5
with:
repo-token: '${{ secrets.GITHUB_TOKEN }}'


@ -0,0 +1,148 @@
# Copyright (C) Daniel Fandrich, <dan@coneharvesters.com>, et al.
#
# SPDX-License-Identifier: curl
#
# Compile on an old version of Linux that has barely the minimal build
# requirements for CMake. This tests that curl is still usable on really
# outdated systems.
#
# Debian stretch is chosen as it closely matches some of the oldest major
# versions we support (especially cmake); see docs/INTERNALS.md and it
# is still supported (as of this writing).
# stretch has ELTS support from Freexian until 2027-06-30
# For ELTS info see https://www.freexian.com/lts/extended/docs/how-to-use-extended-lts/
# The Debian key will expire 2025-05-20, after which package signature
# verification may need to be disabled.
# httrack is one of the smallest downloaders, needed to bootstrap ELTS,
# and won't conflict with the curl we're building.
name: Old Linux
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
permissions: {}
env:
MAKEFLAGS: -j 5
DEBIAN_FRONTEND: noninteractive
jobs:
cmake:
name: linux (cmake & autoconf)
runs-on: 'ubuntu-latest'
container: 'debian:stretch'
steps:
- name: 'install prereqs'
# Remember, this shell is dash, not bash
run: |
sed -E -i -e s@[a-z]+\.debian\.org/@archive.debian.org/debian-archive/@ -e '/ stretch-updates /d' /etc/apt/sources.list
apt-get -o Dpkg::Use-Pty=0 update
# See comment above if this fails after 2025-05-20
apt-get -o Dpkg::Use-Pty=0 install -y --no-install-suggests --no-install-recommends httrack
httrack --get https://deb.freexian.com/extended-lts/pool/main/f/freexian-archive-keyring/freexian-archive-keyring_2022.06.08_all.deb
dpkg -i freexian-archive-keyring_2022.06.08_all.deb
echo 'deb http://deb.freexian.com/extended-lts stretch-lts main contrib non-free' | tee /etc/apt/sources.list.d/extended-lts.list
apt-get -o Dpkg::Use-Pty=0 update
apt-get -o Dpkg::Use-Pty=0 install -y --no-install-suggests --no-install-recommends cmake make automake autoconf libtool gcc pkg-config libpsl-dev libzstd-dev zlib1g-dev libssl1.0-dev libssh-dev libssh2-1-dev libc-ares-dev heimdal-dev libldap2-dev librtmp-dev stunnel4 groff
# GitHub's actions/checkout needs a newer glibc. This one is the
# latest available for buster, the next stable release after stretch.
httrack --get https://security.debian.org/debian-security/pool/updates/main/g/glibc/libc6_2.28-10+deb10u4_amd64.deb
dpkg -i libc6_*_amd64.deb
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'cmake build-only (out-of-tree, libssh2)'
run: |
mkdir bld-1
cd bld-1
cmake .. -DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON -DBUILD_SHARED_LIBS=ON \
-DENABLE_ARES=OFF -DCURL_ZSTD=OFF -DCURL_USE_GSSAPI=OFF -DCURL_USE_LIBSSH2=ON -DCURL_USE_LIBSSH=OFF -DUSE_LIBRTMP=ON
make install
src/curl --disable --version
- name: 'cmake build-only curl_config.h'
run: |
echo '::group::raw'; cat bld-1/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld-1/lib/curl_config.h | sort || true
# when this job can get a libssh version 0.9.0 or later, this should get
# that enabled again
- name: 'cmake generate (out-of-tree, c-ares, zstd, gssapi)'
run: |
mkdir bld-cares
cd bld-cares
cmake .. -DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON -DBUILD_SHARED_LIBS=ON \
-DENABLE_ARES=ON -DCURL_USE_GSSAPI=ON -DCURL_USE_LIBSSH2=OFF -DCURL_USE_LIBSSH=OFF -DUSE_LIBRTMP=ON \
-DCURL_LIBCURL_VERSIONED_SYMBOLS=ON
- name: 'cmake curl_config.h'
run: |
echo '::group::raw'; cat bld-cares/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld-cares/lib/curl_config.h | sort || true
- name: 'cmake build'
run: |
make -C bld-cares
bld-cares/src/curl --disable --version
- name: 'cmake install'
run: make -C bld-cares install
- name: 'cmake build tests'
run: make -C bld-cares testdeps
- name: 'cmake run tests'
run: make -C bld-cares test-ci
- name: 'autoreconf'
run: autoreconf -if
- name: 'configure (out-of-tree, c-ares, libssh2, zstd, gssapi)'
run: |
mkdir bld-am
cd bld-am
../configure --disable-dependency-tracking --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--with-openssl --enable-ares --with-libssh2 --with-zstd --with-gssapi --with-librtmp \
--prefix="$PWD"/../install-am
- name: 'autoconf curl_config.h'
run: |
echo '::group::raw'; cat bld-am/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld-am/lib/curl_config.h | sort || true
- name: 'autoconf build'
run: |
make -C bld-am
bld-am/src/curl --disable --version
- name: 'autoconf install'
run: make -C bld-am install
- name: 'autoconf build tests'
run: make -C bld-am/tests all

709
deps/curl/.github/workflows/linux.yml vendored Normal file

@ -0,0 +1,709 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: Linux
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
env:
MAKEFLAGS: -j 5
CURL_CLANG_TIDYFLAGS: '-checks=-clang-analyzer-security.insecureAPI.strcpy,-clang-analyzer-optin.performance.Padding,-clang-analyzer-security.insecureAPI.DeprecatedOrUnsafeBufferHandling,-clang-analyzer-valist.Uninitialized'
# unhandled
bearssl-version: 0.6
# renovate: datasource=github-tags depName=libressl-portable/portable versioning=semver registryUrl=https://github.com
libressl-version: 4.0.0
# renovate: datasource=github-tags depName=wolfSSL/wolfssl versioning=semver extractVersion=^v?(?<version>.+)-stable$ registryUrl=https://github.com
wolfssl-version: 5.7.6
# renovate: datasource=github-tags depName=wolfSSL/wolfssh versioning=semver extractVersion=^v?(?<version>.+)-stable$ registryUrl=https://github.com
wolfssh-version: 1.4.19
# renovate: datasource=github-tags depName=Mbed-TLS/mbedtls versioning=semver registryUrl=https://github.com
mbedtls-version: 3.6.2
# renovate: datasource=github-tags depName=nibanks/msh3 versioning=semver registryUrl=https://github.com
msh3-version: 0.6.0
# renovate: datasource=github-tags depName=awslabs/aws-lc versioning=semver registryUrl=https://github.com
awslc-version: 1.47.0
# handled in renovate.json
openssl-version: 3.4.1
# handled in renovate.json
quictls-version: 3.3.0
# renovate: datasource=github-tags depName=rustls/rustls-ffi versioning=semver registryUrl=https://github.com
rustls-version: 0.14.1
jobs:
linux:
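# Job names are prefixed 'CM' for CMake-driven builds (matrix.build.generate set) and 'AM' for autotools builds (matrix.build.configure set).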
name: ${{ matrix.build.generate && 'CM' || 'AM' }} ${{ matrix.build.name }}
runs-on: ${{ matrix.build.image || 'ubuntu-latest' }}
container: ${{ matrix.build.container }}
timeout-minutes: 45
strategy:
fail-fast: false
matrix:
build:
- name: bearssl
install_packages: zlib1g-dev
install_steps: bearssl pytest
configure: LDFLAGS="-Wl,-rpath,$HOME/bearssl/lib" --with-bearssl=$HOME/bearssl --enable-debug
- name: bearssl clang
install_packages: zlib1g-dev clang
install_steps: bearssl
configure: CC=clang LDFLAGS="-Wl,-rpath,$HOME/bearssl/lib" --with-bearssl=$HOME/bearssl --enable-debug
- name: libressl heimdal
install_packages: zlib1g-dev heimdal-dev
install_steps: libressl pytest
configure: LDFLAGS="-Wl,-rpath,$HOME/libressl/lib" --with-openssl=$HOME/libressl --with-gssapi --enable-debug
- name: libressl heimdal valgrind
install_packages: zlib1g-dev heimdal-dev valgrind
install_steps: libressl pytest
generate: -DOPENSSL_ROOT_DIR=$HOME/libressl -DCURL_USE_GSSAPI=ON -DENABLE_DEBUG=ON -DCURL_LIBCURL_VERSIONED_SYMBOLS=ON
- name: libressl clang
install_packages: zlib1g-dev clang
install_steps: libressl
configure: CC=clang LDFLAGS="-Wl,-rpath,$HOME/libressl/lib" --with-openssl=$HOME/libressl --enable-debug
- name: wolfssl-all
install_packages: zlib1g-dev
install_steps: wolfssl-all wolfssh
configure: LDFLAGS="-Wl,-rpath,$HOME/wolfssl-all/lib" --with-wolfssl=$HOME/wolfssl-all --with-wolfssh=$HOME/wolfssh --enable-ech --enable-debug
- name: wolfssl-opensslextra valgrind
install_packages: zlib1g-dev valgrind
install_steps: wolfssl-opensslextra
configure: LDFLAGS="-Wl,-rpath,$HOME/wolfssl-opensslextra/lib" --with-wolfssl=$HOME/wolfssl-opensslextra --enable-debug
- name: mbedtls valgrind
install_packages: libnghttp2-dev valgrind
install_steps: mbedtls pytest
configure: LDFLAGS="-Wl,-rpath,$HOME/mbedtls/lib" --with-mbedtls=$HOME/mbedtls --enable-debug
- name: mbedtls clang
install_packages: libnghttp2-dev clang
install_steps: mbedtls
configure: CC=clang LDFLAGS="-Wl,-rpath,$HOME/mbedtls/lib" --with-mbedtls=$HOME/mbedtls --enable-debug
- name: mbedtls
install_packages: libnghttp2-dev
install_steps: mbedtls
PKG_CONFIG_PATH: '$HOME/mbedtls/lib/pkgconfig' # Requires v3.6.0 or v2.28.8
generate: -DCURL_USE_MBEDTLS=ON -DENABLE_DEBUG=ON
- name: mbedtls-pkg
install_packages: libnghttp2-dev libmbedtls-dev
generate: -DCURL_USE_MBEDTLS=ON -DENABLE_DEBUG=ON -DBUILD_LIBCURL_DOCS=OFF -DBUILD_MISC_DOCS=OFF -DENABLE_CURL_MANUAL=OFF
- name: mbedtls-pkg !pc
install_packages: libnghttp2-dev libmbedtls-dev
install_steps: skipall
generate: -DCURL_USE_MBEDTLS=ON -DENABLE_DEBUG=ON -DCURL_USE_PKGCONFIG=OFF
- name: msh3
install_packages: zlib1g-dev
install_steps: quictls msh3
configure: LDFLAGS="-Wl,-rpath,$HOME/msh3/lib -Wl,-rpath,$HOME/quictls/lib" --with-msh3=$HOME/msh3 --with-openssl=$HOME/quictls --enable-debug
- name: msh3
install_packages: zlib1g-dev
install_steps: quictls msh3 skipall
PKG_CONFIG_PATH: '$HOME/msh3/lib/pkgconfig' # Broken as of v0.6.0
generate: -DOPENSSL_ROOT_DIR=$HOME/quictls -DUSE_MSH3=ON -DMSH3_INCLUDE_DIR=$HOME/msh3/include -DMSH3_LIBRARY=$HOME/msh3/lib/libmsh3.so -DENABLE_DEBUG=ON
- name: awslc
install_packages: zlib1g-dev
install_steps: awslc
configure: LDFLAGS="-Wl,-rpath,$HOME/awslc/lib" --with-openssl=$HOME/awslc --enable-ech
- name: awslc
install_packages: zlib1g-dev
install_steps: awslc
generate: -DOPENSSL_ROOT_DIR=$HOME/awslc -DUSE_ECH=ON -DCMAKE_UNITY_BUILD=OFF
- name: openssl default
install_steps: pytest
configure: --with-openssl --enable-debug --disable-unity
- name: openssl libssh2 sync-resolver valgrind
install_packages: zlib1g-dev libssh2-1-dev valgrind
install_steps: pytest
configure: --with-openssl --enable-debug --disable-threaded-resolver --with-libssh2
- name: openssl
install_packages: zlib1g-dev
install_steps: pytest
configure: CFLAGS=-std=gnu89 --with-openssl --enable-debug
- name: openssl -O3 valgrind
install_packages: zlib1g-dev valgrind
configure: CFLAGS=-O3 --with-openssl --enable-debug
- name: openssl clang krb5
install_packages: zlib1g-dev libkrb5-dev clang
configure: CC=clang --with-openssl --with-gssapi --enable-debug --disable-docs --disable-manual
- name: openssl clang krb5 LTO
install_packages: zlib1g-dev libkrb5-dev clang
install_steps: skipall
generate: -DCURL_USE_OPENSSL=ON -DCURL_USE_GSSAPI=ON -DENABLE_DEBUG=ON -DCURL_LTO=ON
- name: openssl !ipv6 !--libcurl
configure: --with-openssl --disable-ipv6 --enable-debug --disable-unity --disable-libcurl-option
- name: openssl https-only
configure: >-
--with-openssl --enable-debug --disable-unity
--disable-dict --disable-gopher --disable-ldap --disable-telnet
--disable-imap --disable-pop3 --disable-smtp
--without-librtmp --disable-rtsp
--without-libssh2 --without-libssh --without-wolfssh
--disable-tftp --disable-ftp --disable-file --disable-smb
- name: openssl torture !FTP
install_packages: zlib1g-dev libnghttp2-dev libssh2-1-dev libc-ares-dev
generate: -DCURL_USE_OPENSSL=ON -DENABLE_DEBUG=ON -DENABLE_ARES=ON
tflags: -t --shallow=25 !FTP
torture: true
- name: openssl torture FTP
install_packages: zlib1g-dev libnghttp2-dev libssh2-1-dev libc-ares-dev
generate: -DCURL_USE_OPENSSL=ON -DENABLE_DEBUG=ON -DENABLE_ARES=ON
tflags: -t --shallow=20 FTP
torture: true
- name: openssl i686
install_packages: gcc-14-i686-linux-gnu libssl-dev:i386 librtmp-dev:i386 libssh2-1-dev:i386 libidn2-0-dev:i386 libc-ares-dev:i386 zlib1g-dev:i386
configure: >-
PKG_CONFIG_PATH=/usr/lib/i386-linux-gnu/pkgconfig
CC=i686-linux-gnu-gcc-14
CPPFLAGS=-I/usr/include/i386-linux-gnu
LDFLAGS=-L/usr/lib/i386-linux-gnu
--host=i686-linux-gnu
--with-openssl --with-librtmp --with-libssh2 --with-libidn2 --enable-ares --enable-debug
- name: '!ssl !http !smtp !imap'
configure: --without-ssl --enable-debug --disable-http --disable-smtp --disable-imap --disable-unity
- name: clang-tidy
install_packages: clang-tidy libssl-dev libssh2-1-dev
install_steps: skipall
configure: --with-openssl --with-libssh2
make-custom-target: tidy
- name: scanbuild
install_packages: clang-tools clang libssl-dev libssh2-1-dev
install_steps: skipall
configure: --with-openssl --enable-debug --with-libssh2 --disable-unity
configure-prefix: CC=clang scan-build
make-prefix: scan-build --status-bugs
- name: address-sanitizer
install_packages: zlib1g-dev libssh2-1-dev clang libssl-dev libubsan1 libasan8 libtsan2
install_steps: pytest
configure: >-
CC=clang
CFLAGS="-fsanitize=address,undefined,signed-integer-overflow -fno-sanitize-recover=undefined,integer -Wformat -Werror=format-security -Werror=array-bounds -g"
LDFLAGS="-fsanitize=address,undefined -fno-sanitize-recover=undefined,integer"
LIBS="-ldl -lubsan"
--with-openssl --enable-debug
- name: thread-sanitizer
install_packages: zlib1g-dev clang libtsan2
install_steps: pytest openssl-tsan
configure: >-
CC=clang
CFLAGS="-fsanitize=thread -g"
LDFLAGS="-fsanitize=thread -Wl,-rpath,$HOME/openssl/lib"
--with-openssl=$HOME/openssl --enable-debug
- name: memory-sanitizer
install_packages: clang
configure: >-
CC=clang
CFLAGS="-fsanitize=memory -Wformat -Werror=format-security -Werror=array-bounds -g"
LDFLAGS="-fsanitize=memory"
LIBS="-ldl"
--without-ssl --without-zlib --without-brotli --without-zstd --without-libpsl --without-nghttp2 --enable-debug
- name: event-based
install_packages: libssh-dev
configure: --enable-debug --disable-shared --disable-threaded-resolver --with-libssh --with-openssl
tflags: -n --test-event '!TLS-SRP'
- name: duphandle
install_packages: libssh-dev
configure: --enable-debug --disable-shared --disable-threaded-resolver --with-libssh --with-openssl
tflags: -n --test-duphandle '!TLS-SRP'
- name: rustls valgrind
install_packages: valgrind
install_steps: rust rustls pytest
configure: --with-rustls=$HOME/rustls --enable-debug
- name: rustls
install_steps: rust rustls skipall
PKG_CONFIG_PATH: '$HOME/rustls/lib/pkgconfig' # Not built as of v0.14.0
generate: -DCURL_USE_RUSTLS=ON -DRUSTLS_INCLUDE_DIR=$HOME/rustls/include -DRUSTLS_LIBRARY=$HOME/rustls/lib/librustls.a -DENABLE_DEBUG=ON
- name: IntelC openssl
install_packages: zlib1g-dev libssl-dev
install_steps: intel
configure: CC=icc --enable-debug --with-openssl
- name: Slackware openssl gssapi gcc
# These are essentially the same flags used to build the curl Slackware package
# https://ftpmirror.infania.net/slackware/slackware64-current/source/n/curl/curl.SlackBuild
configure: --with-openssl --with-libssh2 --with-gssapi --enable-ares --enable-static=no --without-ca-bundle --with-ca-path=/etc/ssl/certs
# Docker Hub image that `container-job` executes in
container: 'andy5995/slackware-build-essential:15.0'
- name: Alpine MUSL https-rr
configure: --enable-debug --with-ssl --with-libssh2 --with-libidn2 --with-gssapi --enable-ldap --with-libpsl --enable-threaded-resolver --enable-ares --enable-httpsrr
container: 'alpine:3.20'
- name: Alpine MUSL c-ares https-rr
configure: --enable-debug --with-ssl --with-libssh2 --with-libidn2 --with-gssapi --enable-ldap --with-libpsl --disable-threaded-resolver --enable-ares --enable-httpsrr --disable-unity
container: 'alpine:3.20'
steps:
- name: 'install prereqs'
if: matrix.build.container == null && !contains(matrix.build.name, 'i686')
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install \
libtool autoconf automake pkgconf ninja-build \
${{ matrix.build.install_steps != 'skipall' && matrix.build.install_steps != 'skiprun' && 'stunnel4' || '' }} \
libpsl-dev libbrotli-dev libzstd-dev \
${{ matrix.build.install_packages }} \
${{ contains(matrix.build.install_steps, 'pytest') && 'apache2 apache2-dev libnghttp2-dev vsftpd' || '' }}
python3 -m venv $HOME/venv
- name: 'install prereqs'
if: contains(matrix.build.name, 'i686')
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo dpkg --add-architecture i386
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install \
libtool autoconf automake pkgconf stunnel4 \
libpsl-dev:i386 libbrotli-dev:i386 libzstd-dev:i386 \
${{ matrix.build.install_packages }}
python3 -m venv $HOME/venv
- name: 'install dependencies'
if: startsWith(matrix.build.container, 'alpine')
run: |
apk add --no-cache build-base autoconf automake libtool perl openssl-dev \
libssh2-dev zlib-dev brotli-dev zstd-dev libidn2-dev openldap-dev \
heimdal-dev libpsl-dev c-ares-dev \
py3-impacket py3-asn1 py3-six py3-pycryptodomex \
perl-time-hires openssh stunnel sudo git
- name: 'cache bearssl'
if: contains(matrix.build.install_steps, 'bearssl')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-bearssl
env:
cache-name: cache-bearssl
with:
path: ~/bearssl
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.bearssl-version }}
- name: 'build bearssl'
if: contains(matrix.build.install_steps, 'bearssl') && steps.cache-bearssl.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://bearssl.org/bearssl-${{ env.bearssl-version }}.tar.gz
tar -xzf bearssl-${{ env.bearssl-version }}.tar.gz
cd bearssl-${{ env.bearssl-version }}
make
mkdir -p $HOME/bearssl/lib $HOME/bearssl/include
cp inc/*.h $HOME/bearssl/include
cp build/libbearssl.* $HOME/bearssl/lib
- name: 'cache libressl'
if: contains(matrix.build.install_steps, 'libressl')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-libressl
env:
cache-name: cache-libressl
with:
path: ~/libressl
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.libressl-version }}
- name: 'build libressl'
if: contains(matrix.build.install_steps, 'libressl') && steps.cache-libressl.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/libressl/portable/releases/download/v${{ env.libressl-version }}/libressl-${{ env.libressl-version }}.tar.gz
tar -xzf libressl-${{ env.libressl-version }}.tar.gz
cd libressl-${{ env.libressl-version }}
./configure --disable-dependency-tracking --prefix=$HOME/libressl
make install
- name: 'cache wolfssl (all)'
if: contains(matrix.build.install_steps, 'wolfssl-all')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-wolfssl-all
env:
cache-name: cache-wolfssl-all
with:
path: ~/wolfssl-all
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.wolfssl-version }}
- name: 'build wolfssl (all)'
if: contains(matrix.build.install_steps, 'wolfssl-all') && steps.cache-wolfssl-all.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/wolfSSL/wolfssl/archive/v${{ env.wolfssl-version }}-stable.tar.gz
tar -xzf v${{ env.wolfssl-version }}-stable.tar.gz
cd wolfssl-${{ env.wolfssl-version }}-stable
./autogen.sh
./configure --disable-dependency-tracking --enable-tls13 --enable-harden --enable-all \
--disable-benchmark --disable-crypttests --disable-examples --prefix=$HOME/wolfssl-all
make install
- name: 'cache wolfssl (opensslextra)'
if: contains(matrix.build.install_steps, 'wolfssl-opensslextra')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-wolfssl-opensslextra
env:
cache-name: cache-wolfssl-opensslextra
with:
path: ~/wolfssl-opensslextra
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.wolfssl-version }}
- name: 'build wolfssl (opensslextra)'
if: contains(matrix.build.install_steps, 'wolfssl-opensslextra') && steps.cache-wolfssl-opensslextra.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/wolfSSL/wolfssl/archive/v${{ env.wolfssl-version }}-stable.tar.gz
tar -xzf v${{ env.wolfssl-version }}-stable.tar.gz
cd wolfssl-${{ env.wolfssl-version }}-stable
./autogen.sh
./configure --disable-dependency-tracking --enable-tls13 --enable-harden --enable-opensslextra \
--disable-benchmark --disable-crypttests --disable-examples --prefix=$HOME/wolfssl-opensslextra
make install
- name: 'cache wolfssh'
if: contains(matrix.build.install_steps, 'wolfssl')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-wolfssh
env:
cache-name: cache-wolfssh
with:
path: ~/wolfssh
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.wolfssh-version }}-${{ env.wolfssl-version }}
- name: 'build wolfssh'
if: contains(matrix.build.install_steps, 'wolfssh') && steps.cache-wolfssh.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/wolfSSL/wolfssh/archive/v${{ env.wolfssh-version }}-stable.tar.gz
tar -xzf v${{ env.wolfssh-version }}-stable.tar.gz
cd wolfssh-${{ env.wolfssh-version }}-stable
./autogen.sh
./configure --disable-dependency-tracking --with-wolfssl=$HOME/wolfssl-all --enable-scp --enable-sftp --disable-term \
--disable-examples --prefix=$HOME/wolfssh
make install
- name: 'cache mbedtls'
if: contains(matrix.build.install_steps, 'mbedtls')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-mbedtls
env:
cache-name: cache-mbedtls-threadsafe
with:
path: ~/mbedtls
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.mbedtls-version }}
- name: 'build mbedtls'
if: contains(matrix.build.install_steps, 'mbedtls') && steps.cache-mbedtls.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/Mbed-TLS/mbedtls/releases/download/mbedtls-${{ env.mbedtls-version }}/mbedtls-${{ env.mbedtls-version }}.tar.bz2
tar -xjf mbedtls-${{ env.mbedtls-version }}.tar.bz2
cd mbedtls-${{ env.mbedtls-version }}
./scripts/config.py set MBEDTLS_THREADING_C
./scripts/config.py set MBEDTLS_THREADING_PTHREAD
cmake -B . -G Ninja -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_POSITION_INDEPENDENT_CODE=ON -DCMAKE_INSTALL_PREFIX=$HOME/mbedtls \
-DENABLE_PROGRAMS=OFF -DENABLE_TESTING=OFF
cmake --build .
cmake --install .
- name: 'cache openssl (thread sanitizer)'
if: contains(matrix.build.install_steps, 'openssl-tsan')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-openssl-tsan
env:
cache-name: cache-openssl-tsan
with:
path: ~/openssl
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.openssl-version }}
- name: 'build openssl (thread sanitizer)'
if: contains(matrix.build.install_steps, 'openssl-tsan') && steps.cache-openssl-tsan.outputs.cache-hit != 'true'
run: |
git clone --quiet --depth=1 -b openssl-${{ env.openssl-version }} https://github.com/openssl/openssl
cd openssl
CC="clang" CFLAGS="-fsanitize=thread" LDFLAGS="-fsanitize=thread" ./config --prefix=$HOME/openssl --libdir=lib no-makedepend no-apps no-docs no-tests
make
make -j1 install_sw
- name: 'cache quictls'
if: contains(matrix.build.install_steps, 'quictls')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-quictls
env:
cache-name: cache-quictls
with:
path: ~/quictls
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.quictls-version }}-quic1
- name: 'build quictls'
if: contains(matrix.build.install_steps, 'quictls') && steps.cache-quictls.outputs.cache-hit != 'true'
run: |
git clone --quiet --depth=1 -b openssl-${{ env.quictls-version }}-quic1 https://github.com/quictls/openssl
cd openssl
./config --prefix=$HOME/quictls --libdir=lib no-makedepend no-apps no-docs no-tests
make
make -j1 install_sw
- name: 'cache msh3'
if: contains(matrix.build.install_steps, 'msh3')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-msh3
env:
cache-name: cache-msh3
with:
path: ~/msh3
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.msh3-version }}
- name: 'build msh3'
if: contains(matrix.build.install_steps, 'msh3') && steps.cache-msh3.outputs.cache-hit != 'true'
run: |
git clone --quiet --depth=1 -b v${{ env.msh3-version }} --recursive https://github.com/nibanks/msh3
cd msh3
cmake -B . -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_INSTALL_PREFIX=$HOME/msh3
cmake --build .
cmake --install .
- name: 'cache awslc'
if: contains(matrix.build.install_steps, 'awslc')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-awslc
env:
cache-name: cache-awslc
with:
path: ~/awslc
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.awslc-version }}
- name: 'build awslc'
if: contains(matrix.build.install_steps, 'awslc') && steps.cache-awslc.outputs.cache-hit != 'true'
run: |
curl -LOsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/awslabs/aws-lc/archive/refs/tags/v${{ env.awslc-version }}.tar.gz
tar xzf v${{ env.awslc-version }}.tar.gz
mkdir aws-lc-${{ env.awslc-version }}-build
cd aws-lc-${{ env.awslc-version }}-build
cmake -G Ninja -DCMAKE_INSTALL_PREFIX=$HOME/awslc ../aws-lc-${{ env.awslc-version }} -DBUILD_TOOL=OFF -DBUILD_TESTING=OFF
cmake --build .
cmake --install .
- name: 'cache rustls'
if: contains(matrix.build.install_steps, 'rustls')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-rustls
env:
cache-name: cache-rustls
with:
path: ~/rustls
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ env.rustls-version }}
- name: 'install rust'
if: contains(matrix.build.install_steps, 'rust') && steps.cache-rustls.outputs.cache-hit != 'true'
run: |
cd $HOME
curl -sSf --compressed https://sh.rustup.rs/ | sh -s -- -y
source $HOME/.cargo/env
rustup toolchain install stable --profile minimal
- name: 'build rustls'
if: contains(matrix.build.install_steps, 'rustls') && steps.cache-rustls.outputs.cache-hit != 'true'
run: |
git clone --quiet --depth=1 -b v${{ env.rustls-version }} --recursive https://github.com/rustls/rustls-ffi.git
cd rustls-ffi
make DESTDIR=$HOME/rustls install
- name: 'install Intel compilers'
if: contains(matrix.build.install_steps, 'intel')
run: |
curl -sSf --compressed https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB | sudo tee /etc/apt/trusted.gpg.d/intel-sw.asc >/dev/null
sudo add-apt-repository "deb https://apt.repos.intel.com/oneapi all main"
sudo apt-get -o Dpkg::Use-Pty=0 install intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic
source /opt/intel/oneapi/setvars.sh
printenv >> $GITHUB_ENV
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build.configure }}
run: autoreconf -fi
- name: 'configure'
run: |
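# aws-lc provides its own OpenSSL-compatible headers and libraries; remove the system libssl-dev so the build does not pick it up by accident.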
[[ '${{ matrix.build.install_steps }}' = *'awslc'* ]] && sudo apt-get -o Dpkg::Use-Pty=0 purge libssl-dev
if [ -n '${{ matrix.build.PKG_CONFIG_PATH }}' ]; then
export PKG_CONFIG_PATH="${{ matrix.build.PKG_CONFIG_PATH }}"
fi
if [ -n '${{ matrix.build.generate }}' ]; then
cmake -B bld -G Ninja \
-DCMAKE_INSTALL_PREFIX="$HOME/curl" \
-DCMAKE_C_COMPILER_TARGET=$(uname -m)-pc-linux-gnu -DBUILD_STATIC_LIBS=ON \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON \
${{ matrix.build.generate }}
else
mkdir bld && cd bld && \
${{ matrix.build.configure-prefix }} \
../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--disable-dependency-tracking \
${{ matrix.build.configure }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'test configs'
run: grep -H -v '^#' bld/tests/config bld/tests/http/config.ini || true
- name: 'build'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
${{ matrix.build.make-prefix }} cmake --build bld --verbose
else
${{ matrix.build.make-prefix }} make -C bld V=1 ${{ matrix.build.make-custom-target }}
fi
- name: 'single-use function check'
if: ${{ contains(matrix.build.configure, '--disable-unity') || contains(matrix.build.generate, '-DCMAKE_UNITY_BUILD=OFF') }}
run: |
git config --global --add safe.directory "*"
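# libcurl.a is produced in different locations by CMake (bld/lib) and autotools (bld/lib/.libs) builds.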
if [ -n '${{ matrix.build.generate }}' ]; then
libcurla=bld/lib/libcurl.a
else
libcurla=bld/lib/.libs/libcurl.a
fi
./scripts/singleuse.pl --unit ${libcurla}
- name: 'check curl -V output'
if: ${{ matrix.build.make-custom-target != 'tidy' }}
run: bld/src/curl -V
- name: 'cmake install'
if: ${{ matrix.build.generate }}
run: cmake --install bld --strip
- name: 'build tests'
if: ${{ matrix.build.install_steps != 'skipall' }}
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target testdeps
else
make -C bld V=1 -C tests
fi
- name: 'install test prereqs'
if: ${{ matrix.build.install_steps != 'skipall' && matrix.build.container == null }}
run: |
[ -x "$HOME/venv/bin/activate" ] && source $HOME/venv/bin/activate
python3 -m pip install -r tests/requirements.txt
- name: 'run tests'
if: ${{ matrix.build.install_steps != 'skipall' && matrix.build.install_steps != 'skiprun' }}
timeout-minutes: ${{ contains(matrix.build.install_packages, 'valgrind') && 30 || 15 }}
run: |
export TFLAGS='${{ matrix.build.tflags }}'
if [ -z '${{ matrix.build.torture }}' ]; then
if [[ '${{ matrix.build.install_steps }}' = *'wolfssh'* ]]; then
TFLAGS+=' ~SFTP'
fi
if [[ '${{ matrix.build.install_packages }}' = *'valgrind'* ]]; then
TFLAGS+=' -j6'
fi
if [[ '${{ matrix.build.install_packages }}' = *'heimdal-dev'* ]]; then
TFLAGS+=' ~2077 ~2078' # valgrind errors
fi
fi
[ -x "$HOME/venv/bin/activate" ] && source $HOME/venv/bin/activate
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target ${{ matrix.build.torture && 'test-torture' || 'test-ci' }}
else
make -C bld V=1 ${{ matrix.build.torture && 'test-torture' || 'test-ci' }}
fi
- name: 'install pytest prereqs'
if: contains(matrix.build.install_steps, 'pytest')
run: |
[ -x "$HOME/venv/bin/activate" ] && source $HOME/venv/bin/activate
python3 -m pip install -r tests/http/requirements.txt
- name: 'run pytest'
if: contains(matrix.build.install_steps, 'pytest')
env:
CURL_CI: github
PYTEST_ADDOPTS: '--color=yes'
run: |
[ -x "$HOME/venv/bin/activate" ] && source $HOME/venv/bin/activate
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target curl-pytest-ci
else
make -C bld V=1 pytest-ci
fi
- name: 'build examples'
if: ${{ matrix.build.make-custom-target != 'tidy' }}
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
${{ matrix.build.make-prefix }} cmake --build bld --verbose --target curl-examples
else
${{ matrix.build.make-prefix }} make -C bld V=1 examples
fi

508
deps/curl/.github/workflows/macos.yml vendored Normal file
View File

@@ -0,0 +1,508 @@
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# SPDX-License-Identifier: curl
name: macOS
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
# Deprecated Apple APIs and the macos-version-min value required to avoid
# deprecation warnings with llvm/clang:
#
# - 10.7 Lion (2011) - GSS
# - 10.8 Mountain Lion (2012) - CFURLCreateDataAndPropertiesFromResource (used by curl Secure Transport code)
# - 10.9 Mavericks (2013) - LDAP
# - 10.14 Mojave (2018) - Secure Transport
#
# For Secure Transport, curl implements features that require a target
# newer than the 10.8 required by `CFURLCreateDataAndPropertiesFromResource`.
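# The per-job 'macos-version-min' value below is applied as CFLAGS+=' -mmacosx-version-min=<value>' for autotools builds and as -DCMAKE_OSX_DEPLOYMENT_TARGET for CMake builds (see the configure step).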
env:
MAKEFLAGS: -j 4
LDFLAGS: -w # suppress 'object file was built for newer macOS version than being linked' warnings
jobs:
macos:
name: "${{ matrix.build.generate && 'CM' || 'AM' }} ${{ matrix.compiler }} ${{ matrix.build.name }}"
runs-on: 'macos-latest'
timeout-minutes: 45
env:
DEVELOPER_DIR: "/Applications/Xcode${{ matrix.build.xcode && format('_{0}', matrix.build.xcode) || '' }}.app/Contents/Developer"
CC: ${{ matrix.compiler }}
CFLAGS: ''
strategy:
fail-fast: false
matrix:
compiler: [clang, llvm@15, gcc-12]
build:
# automake
- name: '!ssl !debug brotli zstd'
compiler: clang
install: brotli zstd
configure: --without-ssl --with-brotli --with-zstd
- name: '!ssl !debug'
compiler: gcc-12
configure: --without-ssl
- name: '!ssl'
compiler: clang
configure: --enable-debug --without-ssl
- name: '!ssl libssh2 AppleIDN'
compiler: clang
configure: --enable-debug --with-libssh2=$(brew --prefix libssh2) --without-ssl --with-apple-idn
- name: 'OpenSSL libssh c-ares'
compiler: clang
install: libssh
configure: --enable-debug --with-libssh --with-openssl=$(brew --prefix openssl) --enable-ares
- name: 'OpenSSL libssh'
compiler: llvm@15
install: libssh libnghttp3
configure: --enable-debug --with-libssh --with-openssl=$(brew --prefix openssl) --with-openssl-quic
- name: '!ssl c-ares'
compiler: clang
configure: --enable-debug --enable-ares --without-ssl
- name: '!ssl HTTP-only'
compiler: clang
configure: >-
--enable-debug
--disable-alt-svc --disable-dict --disable-file --disable-ftp --disable-gopher --disable-imap
--disable-ldap --disable-pop3 --without-librtmp --disable-rtsp
--disable-shared --disable-smb --disable-smtp --disable-telnet --disable-tftp --disable-unix-sockets
--without-brotli --without-gssapi --without-libidn2 --without-libpsl --without-librtmp
--without-libssh2 --without-libssh --without-wolfssh
--without-nghttp2 --disable-ntlm --without-ssl --without-zlib --without-zstd
macos-version-min: '10.15' # Catalina (2019)
- name: 'SecureTransport libssh2'
compiler: clang
configure: --enable-debug --with-secure-transport --with-libssh2=$(brew --prefix libssh2)
macos-version-min: '10.8'
- name: 'SecureTransport libssh2 10.12'
compiler: clang
configure: --enable-debug --with-secure-transport --with-libssh2=$(brew --prefix libssh2)
macos-version-min: '10.12' # for monotonic timers
- name: 'SecureTransport libssh2'
compiler: gcc-12
configure: --enable-debug --with-secure-transport --with-libssh2=$(brew --prefix libssh2)
macos-version-min: '10.8'
- name: 'LibreSSL +examples'
compiler: clang
install: libressl
configure: --enable-debug --with-openssl=$(brew --prefix libressl)
- name: 'OpenSSL'
compiler: clang
configure: --enable-debug --with-openssl=$(brew --prefix openssl)
- name: 'OpenSSL event-based'
compiler: clang
configure: --enable-debug --with-openssl=$(brew --prefix openssl)
tflags: --test-event
- name: 'quictls libssh2 !ldap 10.15'
compiler: clang
install: quictls
configure: --enable-debug --disable-ldap --with-openssl=$(brew --prefix quictls) LDFLAGS="${LDFLAGS} -L$(brew --prefix quictls)/lib"
macos-version-min: '10.15'
# cmake
- name: 'OpenSSL gsasl rtmp AppleIDN'
install: gsasl rtmpdump
generate: -DOPENSSL_ROOT_DIR=$(brew --prefix openssl) -DCURL_USE_GSASL=ON -DUSE_LIBRTMP=ON -DUSE_APPLE_IDN=ON
- name: 'OpenSSL AppleIDN clang-tidy +examples'
install: llvm
generate: -DOPENSSL_ROOT_DIR=$(brew --prefix openssl) -DUSE_APPLE_IDN=ON -DCURL_CLANG_TIDY=ON -DCLANG_TIDY=$(brew --prefix llvm)/bin/clang-tidy
clang-tidy: true
chkprefill: _chkprefill
- name: 'quictls +static libssh +examples'
install: quictls libssh
generate: -DOPENSSL_ROOT_DIR=$(brew --prefix quictls) -DBUILD_STATIC_LIBS=ON -DCURL_USE_LIBSSH2=OFF -DCURL_USE_LIBSSH=ON
- name: 'SecureTransport debug'
generate: -DCURL_USE_SECTRANSP=ON -DENABLE_DEBUG=ON
macos-version-min: '10.8'
- name: 'LibreSSL !ldap heimdal c-ares +examples'
install: libressl heimdal
generate: -DOPENSSL_ROOT_DIR=$(brew --prefix libressl) -DENABLE_ARES=ON -DCURL_USE_GSSAPI=ON -DGSS_ROOT_DIR=$(brew --prefix heimdal) -DCURL_DISABLE_LDAP=ON
- name: 'wolfSSL !ldap brotli zstd'
install: brotli wolfssl zstd
generate: -DCURL_USE_WOLFSSL=ON -DCURL_DISABLE_LDAP=ON -DUSE_ECH=ON
- name: 'mbedTLS openldap brotli zstd'
install: brotli mbedtls zstd openldap
generate: -DCURL_USE_MBEDTLS=ON -DLDAP_INCLUDE_DIR="$(brew --prefix openldap)/include" -DLDAP_LIBRARY="$(brew --prefix openldap)/lib/libldap.dylib" -DLDAP_LBER_LIBRARY="$(brew --prefix openldap)/lib/liblber.dylib"
- name: 'GnuTLS !ldap krb5'
install: gnutls nettle krb5
generate: -DCURL_USE_GNUTLS=ON -DCURL_USE_OPENSSL=OFF -DCURL_USE_GSSAPI=ON -DGSS_ROOT_DIR=$(brew --prefix krb5) -DCURL_DISABLE_LDAP=ON -DUSE_SSLS_EXPORT=ON
- name: 'OpenSSL torture !FTP'
generate: -DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DENABLE_THREADED_RESOLVER=OFF -DOPENSSL_ROOT_DIR=$(brew --prefix openssl)
tflags: -t --shallow=25 !FTP
torture: true
- name: 'OpenSSL torture FTP'
generate: -DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DENABLE_THREADED_RESOLVER=OFF -DOPENSSL_ROOT_DIR=$(brew --prefix openssl)
tflags: -t --shallow=20 FTP
torture: true
exclude:
- { compiler: llvm@15, build: { macos-version-min: '10.15' } }
- { compiler: llvm@15, build: { torture: true } }
- { compiler: gcc-12, build: { torture: true } }
- { compiler: llvm@15, build: { clang-tidy: true } }
- { compiler: gcc-12, build: { clang-tidy: true } }
# opt out jobs from combinations that have the compiler set manually
- { compiler: llvm@15, build: { compiler: 'clang' } }
- { compiler: llvm@15, build: { compiler: 'gcc-12' } }
- { compiler: gcc-12, build: { compiler: 'clang' } }
- { compiler: gcc-12, build: { compiler: 'llvm@15' } }
- { compiler: clang, build: { compiler: 'gcc-12' } }
- { compiler: clang, build: { compiler: 'llvm@15' } }
steps:
- name: 'brew install'
# Run this command with retries because of spurious failures seen
# while running the tests, for example
# https://github.com/curl/curl/runs/4095721123?check_suite_focus=true
run: |
echo ${{ matrix.build.generate && 'ninja' || 'automake libtool' }} \
pkgconf libpsl libssh2 \
${{ !matrix.build.clang-tidy && 'nghttp2 stunnel' || '' }} \
${{ contains(matrix.build.install_steps, 'pytest') && 'caddy httpd vsftpd' || '' }} \
${{ matrix.build.install }} | xargs -Ix -n1 echo brew '"x"' > /tmp/Brewfile
while [[ $? == 0 ]]; do for i in 1 2 3; do brew update && brew bundle install --file /tmp/Brewfile && break 2 || { echo Error: wait to try again; sleep 10; } done; false Too many retries; done
- name: 'brew unlink openssl'
if: ${{ contains(matrix.build.install, 'libressl') || contains(matrix.build.install, 'quictls') }}
run: |
if test -d $(brew --prefix)/include/openssl; then
brew unlink openssl
fi
- name: 'toolchain versions'
run: |
[[ '${{ matrix.compiler }}' = 'llvm'* ]] && CC="$(brew --prefix ${{ matrix.compiler }})/bin/clang"
[[ '${{ matrix.compiler }}' = 'gcc'* ]] && "${CC}" --print-sysroot
which "${CC}"; "${CC}" --version || true
xcodebuild -version || true
xcrun --sdk macosx --show-sdk-path 2>/dev/null || true
xcrun --sdk macosx --show-sdk-version || true
ls -l /Library/Developer/CommandLineTools/SDKs || true
echo '::group::macros predefined'; "${CC}" -dM -E - < /dev/null | sort || true; echo '::endgroup::'
echo '::group::brew packages installed'; ls -l "$(brew --prefix)/opt"; echo '::endgroup::'
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build.configure }}
run: autoreconf -fi
- name: 'configure'
run: |
if [[ '${{ matrix.compiler }}' = 'gcc'* ]]; then
sysroot="$("${CC}" --print-sysroot)" # Must match the SDK gcc was built for
else
sysroot="$(xcrun --sdk macosx --show-sdk-path 2>/dev/null)"
fi
if [[ '${{ matrix.compiler }}' = 'llvm'* ]]; then
CC="$(brew --prefix ${{ matrix.compiler }})/bin/clang"
CC+=" --sysroot=${sysroot}"
CC+=" --target=$(uname -m)-apple-darwin"
fi
if [ -n '${{ matrix.build.generate }}' ]; then
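# When matrix.build.chkprefill is set, configure twice: 'bld' with _CURL_PREFILL=ON and 'bld_chkprefill' with it forced OFF; the two curl_config.h files are diffed below to catch prefill mismatches.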
for _chkprefill in '' ${{ matrix.build.chkprefill }}; do
options=''
[ -n '${{ matrix.build.macos-version-min }}' ] && options+=' -DCMAKE_OSX_DEPLOYMENT_TARGET=${{ matrix.build.macos-version-min }}'
[[ '${{ matrix.build.install_steps }}' = *'pytest'* ]] && options+=' -DTEST_NGHTTPX= -DHTTPD_NGHTTPX='
[ "${_chkprefill}" = '_chkprefill' ] && options+=' -D_CURL_PREFILL=OFF'
cmake -B "bld${_chkprefill}" -G Ninja -D_CURL_PREFILL=ON \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON \
-DCMAKE_OSX_SYSROOT="${sysroot}" \
-DCMAKE_C_COMPILER_TARGET="$(uname -m | sed 's/arm64/aarch64/')-apple-darwin$(uname -r)" \
${{ matrix.build.generate }} ${options}
done
if [ -d bld_chkprefill ] && ! diff -u bld/lib/curl_config.h bld_chkprefill/lib/curl_config.h; then
echo '::group::reference configure log'; cat bld_chkprefill/CMakeFiles/CMake*.yaml 2>/dev/null || true; echo '::endgroup::'
false
fi
else
export CFLAGS
if [[ '${{ matrix.compiler }}' = 'llvm'* ]]; then
options+=" --target=$(uname -m)-apple-darwin"
fi
if [ '${{ matrix.compiler }}' != 'clang' ]; then
options+=" --with-sysroot=${sysroot}"
CFLAGS+=" --sysroot=${sysroot}"
fi
[ -n '${{ matrix.build.macos-version-min }}' ] && CFLAGS+=' -mmacosx-version-min=${{ matrix.build.macos-version-min }}'
[[ '${{ matrix.build.install_steps }}' = *'pytest'* ]] && options+=' --with-test-nghttpx= ac_cv_path_HTTPD_NGHTTPX='
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--disable-dependency-tracking \
--with-libpsl=$(brew --prefix libpsl) \
${{ matrix.build.configure }} ${options}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'test configs'
run: grep -H -v '^#' bld/tests/config bld/tests/http/config.ini || true
- name: 'build-cert'
if: contains(matrix.build.generate, '-DCURL_USE_SECTRANSP=ON') || contains(matrix.build.configure, '--with-secure-transport')
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --target clean-certs
cmake --build bld --target build-certs --parallel 1
else
make -C bld/tests/certs clean-certs
make -C bld/tests/certs build-certs -j1
fi
- name: 'build'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose
else
make -C bld V=1
fi
- name: 'curl version'
run: bld/src/curl --disable --version
- name: 'build tests'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --target testdeps
else
make -C bld V=1 -C tests
fi
- name: 'install test prereqs'
if: ${{ !matrix.build.clang-tidy }}
run: |
python3 -m venv $HOME/venv
source $HOME/venv/bin/activate
python3 -m pip install -r tests/requirements.txt
- name: 'run tests'
if: ${{ !matrix.build.clang-tidy }}
timeout-minutes: ${{ matrix.build.torture && 20 || 10 }}
run: |
export TFLAGS='-j20 ${{ matrix.build.tflags }}'
if [ -z '${{ matrix.build.torture }}' ]; then
TFLAGS+=' ~2037 ~2041' # flaky
fi
source $HOME/venv/bin/activate
rm -f $HOME/.curlrc
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --target ${{ matrix.build.torture && 'test-torture' || 'test-ci' }}
else
make -C bld V=1 ${{ matrix.build.torture && 'test-torture' || 'test-ci' }}
fi
- name: 'install pytest prereqs'
if: ${{ !matrix.build.clang-tidy && contains(matrix.build.install_steps, 'pytest') }}
run: |
source $HOME/venv/bin/activate
python3 -m pip install -r tests/http/requirements.txt
- name: 'run pytest'
if: ${{ !matrix.build.clang-tidy && contains(matrix.build.install_steps, 'pytest') }}
env:
CURL_CI: github
PYTEST_ADDOPTS: '--color=yes'
run: |
source $HOME/venv/bin/activate
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target curl-pytest-ci
else
make -C bld V=1 pytest-ci
fi
- name: 'build examples'
if: ${{ contains(matrix.build.name, '+examples') }}
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld --verbose --target curl-examples
else
make -C bld examples V=1
fi
combinations: # Test buildability across combinations of host OS, Xcode / SDK, compiler, target OS, Secure Transport or not, and build tool
if: true # Set to `true` to enable this test matrix. It runs quickly.
name: "${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.compiler }} ${{ matrix.image }} ${{ matrix.xcode }} ${{ matrix.config }}"
runs-on: ${{ matrix.image }}
timeout-minutes: 10
env:
DEVELOPER_DIR: "/Applications/Xcode${{ matrix.xcode && format('_{0}', matrix.xcode) || '' }}.app/Contents/Developer"
CC: ${{ matrix.compiler }}
strategy:
fail-fast: false
matrix:
config: [SecureTransport] # also: OpenSSL
compiler: [gcc-12, gcc-13, gcc-14, llvm@15, llvm@18, clang]
# Xcode support matrix as of 2024-07, with default macOS SDK versions and OS names, years:
# * = default Xcode on the runner.
# macos-13: 14.1, 14.2, 14.3.1, 15.0.1, 15.1,*15.2
# macos-14: 15.0.1, 15.1, 15.2, 15.3,*15.4
# macos-15: *16.0, 16.1
# macOSSDK: 13.0, 13.1, 13.3, 14.0, 14.2, 14.2, 14.4, 14.5, 15.0, 15.1
# Ventura (2022) Sonoma (2023) Sequoia (2024)
# https://github.com/actions/runner-images/tree/main/images/macos
# https://en.wikipedia.org/wiki/MacOS_version_history
image: [macos-13, macos-14, macos-15]
# Can skip these to reduce jobs:
# 15.1 has the same default macOS SDK as 15.2 and identical test results.
# 14.1, 15.4 not revealing new fallouts.
#xcode: ['14.1', '14.2', '14.3.1', '15.0.1', '15.1', '15.2', '15.3', '15.4', '16.0', '16.1'] # all Xcode
#xcode: ['14.1', '14.2', '14.3.1', '15.0.1' , '15.2', '15.3', '15.4', '16.0', '16.1'] # all SDK
#xcode: [ '14.2', '14.3.1', '15.0.1' , '15.2', '15.3' , '16.0' ] # coverage
xcode: [''] # default Xcodes
macos-version-min: ['']
build: [autotools, cmake]
exclude:
# Combinations not covered by runner images:
- { image: macos-13, xcode: '15.3' }
- { image: macos-13, xcode: '15.4' }
- { image: macos-13, xcode: '16.0' }
- { image: macos-13, xcode: '16.1' }
- { image: macos-14, xcode: '14.1' }
- { image: macos-14, xcode: '14.2' }
- { image: macos-14, xcode: '14.3.1' }
- { image: macos-14, xcode: '16.0' }
- { image: macos-14, xcode: '16.1' }
- { image: macos-15, xcode: '14.1' }
- { image: macos-15, xcode: '14.2' }
- { image: macos-15, xcode: '14.3.1' }
- { image: macos-15, xcode: '15.0.1' }
- { image: macos-15, xcode: '15.1' }
- { image: macos-15, xcode: '15.2' }
- { image: macos-15, xcode: '15.3' }
- { image: macos-15, xcode: '15.4' }
- { image: macos-13, compiler: 'llvm@18' }
- { image: macos-14, compiler: 'llvm@18' }
- { image: macos-15, compiler: 'llvm@15' }
# Reduce build combinations, by dropping less interesting ones
- { compiler: gcc-12, config: SecureTransport }
- { compiler: gcc-13, build: cmake }
- { compiler: gcc-14, build: autotools }
steps:
- name: 'install autotools'
if: ${{ matrix.build == 'autotools' }}
run: |
echo automake libtool | xargs -Ix -n1 echo brew '"x"' > /tmp/Brewfile
while [[ $? == 0 ]]; do for i in 1 2 3; do brew update && brew bundle install --file /tmp/Brewfile && break 2 || { echo Error: wait to try again; sleep 10; } done; false Too many retries; done
- name: 'toolchain versions'
run: |
[[ '${{ matrix.compiler }}' = 'llvm'* ]] && CC="$(brew --prefix ${{ matrix.compiler }})/bin/clang"
[[ '${{ matrix.compiler }}' = 'gcc'* ]] && "${CC}" --print-sysroot
which "${CC}"; "${CC}" --version || true
xcodebuild -version || true
xcrun --sdk macosx --show-sdk-path 2>/dev/null || true
xcrun --sdk macosx --show-sdk-version || true
ls -l /Library/Developer/CommandLineTools/SDKs || true
echo '::group::macros predefined'; "${CC}" -dM -E - < /dev/null | sort || true; echo '::endgroup::'
echo '::group::brew packages preinstalled'; ls -l "$(brew --prefix)/opt"; echo '::endgroup::'
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build == 'autotools' }}
run: autoreconf -fi
- name: 'configure / ${{ matrix.build }}'
run: |
if [[ '${{ matrix.compiler }}' = 'gcc'* ]]; then
sysroot="$("${CC}" --print-sysroot)" # Must match the SDK gcc was built for
else
sysroot="$(xcrun --sdk macosx --show-sdk-path 2>/dev/null)"
fi
if [[ '${{ matrix.compiler }}' = 'llvm'* ]]; then
CC="$(brew --prefix ${{ matrix.compiler }})/bin/clang"
CC+=" --sysroot=${sysroot}"
CC+=" --target=$(uname -m)-apple-darwin"
fi
if [ '${{ matrix.build }}' = 'cmake' ]; then
[ '${{ matrix.config }}' = 'OpenSSL' ] && options+=' -DCURL_USE_OPENSSL=ON'
[ '${{ matrix.config }}' = 'SecureTransport' ] && options+=' -DCURL_USE_SECTRANSP=ON'
[ -n '${{ matrix.macos-version-min }}' ] && options+=' -DCMAKE_OSX_DEPLOYMENT_TARGET=${{ matrix.macos-version-min }}'
# without the options below, CMake would pick up nghttp2, libidn2, and libssh2 from the preinstalled Homebrew packages
cmake -B bld -D_CURL_PREFILL=ON \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON \
-DCMAKE_OSX_SYSROOT="${sysroot}" \
-DCMAKE_C_COMPILER_TARGET="$(uname -m | sed 's/arm64/aarch64/')-apple-darwin$(uname -r)" \
-DCMAKE_IGNORE_PREFIX_PATH="$(brew --prefix)" \
-DBUILD_LIBCURL_DOCS=OFF -DBUILD_MISC_DOCS=OFF -DENABLE_CURL_MANUAL=OFF \
-DUSE_NGHTTP2=OFF -DUSE_LIBIDN2=OFF \
-DCURL_USE_LIBPSL=OFF -DCURL_USE_LIBSSH2=OFF \
${options}
else
export CFLAGS
if [[ '${{ matrix.compiler }}' = 'llvm'* ]]; then
options+=" --target=$(uname -m)-apple-darwin"
fi
if [ '${{ matrix.compiler }}' != 'clang' ]; then
options+=" --with-sysroot=${sysroot}"
CFLAGS+=" --sysroot=${sysroot}"
fi
[ '${{ matrix.config }}' = 'OpenSSL' ] && options+=" --with-openssl=$(brew --prefix openssl)"
[ '${{ matrix.config }}' = 'SecureTransport' ] && options+=' --with-secure-transport'
[ -n '${{ matrix.macos-version-min }}' ] && CFLAGS+=' -mmacosx-version-min=${{ matrix.macos-version-min }}'
# without the options below, configure would pick up nghttp2 and libidn2 from the preinstalled Homebrew packages; libssh2 is disabled by default
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--disable-dependency-tracking \
--disable-docs --disable-manual \
--without-nghttp2 --without-libidn2 \
--without-libpsl \
${options}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build / ${{ matrix.build }}'
run: make -C bld V=1 VERBOSE=1
- name: 'curl version'
run: bld/src/curl --disable --version

729
deps/curl/.github/workflows/non-native.yml vendored Normal file
View File

@@ -0,0 +1,729 @@
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
name: non-native
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
jobs:
netbsd:
name: 'NetBSD, CM clang openssl ${{ matrix.arch }}'
runs-on: ubuntu-latest
timeout-minutes: 10
strategy:
matrix:
arch: ['x86_64']
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'cmake'
uses: cross-platform-actions/action@fe0167d8082ac584754ef3ffb567fded22642c7d # v0.27.0
with:
operating_system: 'netbsd'
version: '10.1'
architecture: ${{ matrix.arch }}
run: |
# https://pkgsrc.se/
time sudo pkgin -y install cmake ninja-build pkg-config perl brotli heimdal openldap-client libssh2 libidn2 libpsl nghttp2 py311-impacket
time cmake -B bld -G Ninja \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DENABLE_DEBUG=ON -DCMAKE_BUILD_TYPE=Debug \
-DCURL_USE_OPENSSL=ON \
-DCURL_USE_GSSAPI=ON \
|| { cat bld/CMakeFiles/CMake*.yaml; false; }
echo '::group::curl_config.h (raw)'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
echo '::group::curl_config.h'; grep -F '#define' bld/lib/curl_config.h | sort || true; echo '::endgroup::'
time cmake --build bld
bld/src/curl --disable --version
if [ '${{ matrix.arch }}' = 'x86_64' ]; then # Slow on emulated CPU
time cmake --build bld --target testdeps
export TFLAGS='-j4'
time cmake --build bld --target test-ci
fi
echo '::group::build examples'
time cmake --build bld --target curl-examples
echo '::endgroup::'
openbsd:
name: 'OpenBSD, CM clang libressl ${{ matrix.arch }}'
runs-on: ubuntu-latest
timeout-minutes: 10
strategy:
matrix:
arch: ['x86_64']
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'cmake'
uses: cross-platform-actions/action@fe0167d8082ac584754ef3ffb567fded22642c7d # v0.27.0
with:
operating_system: 'openbsd'
version: '7.5'
architecture: ${{ matrix.arch }}
run: |
# https://openbsd.app/
# https://www.openbsd.org/faq/faq15.html
time sudo pkg_add cmake ninja brotli openldap-client-- libssh2 libidn2 libpsl nghttp2 python3 py3-impacket
time cmake -B bld -G Ninja \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DENABLE_DEBUG=ON -DCMAKE_BUILD_TYPE=Debug \
-DCURL_USE_OPENSSL=ON \
|| { cat bld/CMakeFiles/CMake*.yaml; false; }
echo '::group::curl_config.h (raw)'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
echo '::group::curl_config.h'; grep -F '#define' bld/lib/curl_config.h | sort || true; echo '::endgroup::'
time cmake --build bld
bld/src/curl --disable --version
if [ '${{ matrix.arch }}' = 'x86_64' ]; then # Slow on emulated CPU
time cmake --build bld --target testdeps
export TFLAGS='-j8 ~3017 ~TFTP ~FTP' # FIXME: TFTP requests executed twice? Related: `curl: (69) TFTP: Access Violation`?
time cmake --build bld --target test-ci
fi
echo '::group::build examples'
time cmake --build bld --target curl-examples
echo '::endgroup::'
freebsd:
name: "FreeBSD, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.compiler }} openssl${{ matrix.desc }} ${{ matrix.arch }}"
runs-on: ubuntu-latest
timeout-minutes: 20
strategy:
matrix:
include:
- { build: 'autotools', arch: 'x86_64', compiler: 'clang' }
- { build: 'cmake' , arch: 'x86_64', compiler: 'clang', options: '-DCMAKE_UNITY_BUILD=OFF -DCURL_TEST_BUNDLES=OFF', desc: ' !unity !bundle !runtests !examples' }
- { build: 'autotools', arch: 'arm64', compiler: 'clang' }
- { build: 'cmake' , arch: 'arm64', compiler: 'clang' }
fail-fast: false
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autotools'
if: ${{ matrix.build == 'autotools' }}
uses: cross-platform-actions/action@fe0167d8082ac584754ef3ffb567fded22642c7d # v0.27.0
with:
operating_system: 'freebsd'
version: '14.1'
architecture: ${{ matrix.arch }}
run: |
export MAKEFLAGS=-j3
# https://ports.freebsd.org/
time sudo pkg install -y autoconf automake libtool \
pkgconf brotli openldap26-client libidn2 libnghttp2 stunnel py311-impacket
time autoreconf -fi
export CC='${{ matrix.compiler }}'
if [ '${{ matrix.arch }}' != 'x86_64' ]; then
options='--disable-manual --disable-docs' # Slow with autotools, skip on emulated CPU
fi
mkdir bld && cd bld && time ../configure --enable-unity --enable-test-bundles --enable-debug --enable-warnings --enable-werror \
--prefix="${HOME}"/install \
--with-openssl \
--with-brotli --enable-ldap --enable-ldaps --with-libidn2 --with-libssh2 --with-nghttp2 --with-gssapi \
--disable-dependency-tracking \
${options} \
${{ matrix.options }} \
|| { tail -n 1000 config.log; false; }
echo '::group::curl_config.h (raw)'; cat lib/curl_config.h || true; echo '::endgroup::'
echo '::group::curl_config.h'; grep -F '#define' lib/curl_config.h | sort || true; echo '::endgroup::'
time make install
src/curl --disable --version
desc='${{ matrix.desc }}'
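# Run the test suite and examples only when 'desc' does not contain '!runtests' / '!examples' (plain shell substring checks below).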
if [ '${{ matrix.arch }}' = 'x86_64' ]; then # Slow on emulated CPU
time make -C tests
if [ "${desc#*!runtests*}" = "${desc}" ]; then
time make test-ci V=1 TFLAGS='-j4'
fi
fi
if [ "${desc#*!examples*}" = "${desc}" ]; then
echo '::group::build examples'
time make examples
echo '::endgroup::'
fi
- name: 'cmake'
if: ${{ matrix.build == 'cmake' }}
uses: cross-platform-actions/action@fe0167d8082ac584754ef3ffb567fded22642c7d # v0.27.0
with:
operating_system: 'freebsd'
version: '14.1'
architecture: ${{ matrix.arch }}
run: |
# https://ports.freebsd.org/
time sudo pkg install -y cmake-core ninja perl5 \
pkgconf brotli openldap26-client libidn2 libnghttp2 stunnel py311-impacket
time cmake -B bld -G Ninja \
-DCMAKE_C_COMPILER='${{ matrix.compiler }}' \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DENABLE_DEBUG=ON -DCMAKE_BUILD_TYPE=Debug \
-DCURL_USE_OPENSSL=ON \
-DCURL_USE_GSSAPI=ON \
${{ matrix.options }} \
|| { cat bld/CMakeFiles/CMake*.yaml; false; }
echo '::group::curl_config.h (raw)'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
echo '::group::curl_config.h'; grep -F '#define' bld/lib/curl_config.h | sort || true; echo '::endgroup::'
time cmake --build bld
bld/src/curl --disable --version
desc='${{ matrix.desc }}'
if [ '${{ matrix.arch }}' = 'x86_64' ]; then # Slow on emulated CPU
time cmake --build bld --target testdeps
if [ "${desc#*!runtests*}" = "${desc}" ]; then
time cmake --build bld --target test-ci
fi
fi
if [ "${desc#*!examples*}" = "${desc}" ]; then
echo '::group::build examples'
time cmake --build bld --target curl-examples
echo '::endgroup::'
fi
omnios:
name: 'OmniOS, AM gcc openssl amd64'
runs-on: ubuntu-latest
timeout-minutes: 15
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autotools'
uses: vmactions/omnios-vm@8eba2a9217262f275d4566751a92d6ef2f433d00 # v1
with:
usesh: true
# https://pkg.omnios.org/r151052/core/en/index.shtml
prepare: pkg install build-essential libtool nghttp2
run: |
set -e
ln -s /usr/bin/gcpp /usr/bin/cpp # Some tests expect `cpp`, which is named `gcpp` in this env.
export MAKEFLAGS=-j3
time autoreconf -fi
mkdir bld && cd bld && time ../configure --enable-unity --enable-test-bundles --enable-debug --enable-warnings --enable-werror \
--prefix="${HOME}"/install \
--with-openssl \
--disable-dependency-tracking \
|| { tail -n 1000 config.log; false; }
echo '::group::curl_config.h (raw)'; cat lib/curl_config.h || true; echo '::endgroup::'
echo '::group::curl_config.h'; grep -F '#define' lib/curl_config.h | sort || true; echo '::endgroup::'
time gmake install
src/curl --disable --version
time gmake -C tests
time gmake test-ci V=1
echo '::group::build examples'
time gmake examples
echo '::endgroup::'
ios:
name: "iOS, ${{ (matrix.build.generator && format('CM-{0}', matrix.build.generator)) || (matrix.build.generate && 'CM' || 'AM' )}} ${{ matrix.build.name }} arm64"
runs-on: 'macos-latest'
timeout-minutes: 10
env:
MAKEFLAGS: -j 4
DEVELOPER_DIR: "/Applications/Xcode${{ matrix.build.xcode && format('_{0}', matrix.build.xcode) || '' }}.app/Contents/Developer"
CC: ${{ matrix.build.compiler || 'clang' }}
# renovate: datasource=github-tags depName=libressl-portable/portable versioning=semver registryUrl=https://github.com
libressl-version: 4.0.0
strategy:
fail-fast: false
matrix:
build:
- name: 'libressl'
install_steps: libressl
configure: --with-openssl="$HOME/libressl" --without-libpsl
- name: 'libressl'
install_steps: libressl
# FIXME: Could not make OPENSSL_ROOT_DIR work. CMake seems to prepend sysroot to it.
generate: >-
-DCMAKE_BUILD_TYPE=Release -DCMAKE_UNITY_BUILD_BATCH_SIZE=50
-DOPENSSL_INCLUDE_DIR="$HOME/libressl/include"
-DOPENSSL_SSL_LIBRARY="$HOME/libressl/lib/libssl.a"
-DOPENSSL_CRYPTO_LIBRARY="$HOME/libressl/lib/libcrypto.a"
-DCURL_USE_LIBPSL=OFF
- name: 'libressl'
install_steps: libressl
generator: Xcode
options: --config Debug
generate: >-
-DCMAKE_XCODE_ATTRIBUTE_CODE_SIGNING_ALLOWED=OFF
-DMACOSX_BUNDLE_GUI_IDENTIFIER=se.curl
-DOPENSSL_INCLUDE_DIR="$HOME/libressl/include"
-DOPENSSL_SSL_LIBRARY="$HOME/libressl/lib/libssl.a"
-DOPENSSL_CRYPTO_LIBRARY="$HOME/libressl/lib/libcrypto.a"
-DCURL_USE_LIBPSL=OFF
steps:
- name: 'brew install'
if: ${{ matrix.build.configure }}
run: |
echo automake libtool | xargs -Ix -n1 echo brew '"x"' > /tmp/Brewfile
while [[ $? == 0 ]]; do for i in 1 2 3; do brew update && brew bundle install --file /tmp/Brewfile && break 2 || { echo Error: wait to try again; sleep 10; } done; false Too many retries; done
- name: 'toolchain versions'
run: |
which "${CC}"; "${CC}" --version || true
xcodebuild -version || true
xcodebuild -sdk -version | grep '^Path:' || true
xcrun --sdk iphoneos --show-sdk-path 2>/dev/null || true
xcrun --sdk iphoneos --show-sdk-version || true
echo '::group::macros predefined'; "${CC}" -dM -E - < /dev/null | sort || true; echo '::endgroup::'
echo '::group::brew packages installed'; ls -l "$(brew --prefix)/opt"; echo '::endgroup::'
- name: 'cache libressl'
if: contains(matrix.build.install_steps, 'libressl')
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-libressl
env:
cache-name: cache-libressl
with:
path: ~/libressl
key: iOS-${{ env.cache-name }}-${{ env.libressl-version }}
- name: 'build libressl'
if: contains(matrix.build.install_steps, 'libressl') && steps.cache-libressl.outputs.cache-hit != 'true'
run: |
curl -LsSf --retry 6 --retry-connrefused --max-time 999 \
https://github.com/libressl/portable/releases/download/v${{ env.libressl-version }}/libressl-${{ env.libressl-version }}.tar.gz | tar -x
cd libressl-${{ env.libressl-version }}
# FIXME: on the 4.0.1 release, delete '-DHAVE_ENDIAN_H=0'
cmake -B . \
-DHAVE_ENDIAN_H=0 \
-DCMAKE_INSTALL_PREFIX="$HOME/libressl" \
-DCMAKE_SYSTEM_NAME=iOS \
-DCMAKE_SYSTEM_PROCESSOR=aarch64 \
-DBUILD_SHARED_LIBS=OFF \
-DLIBRESSL_APPS=OFF \
-DLIBRESSL_TESTS=OFF
cmake --build .
cmake --install . --verbose
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build.configure }}
run: autoreconf -fi
- name: 'configure'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
# https://cmake.org/cmake/help/latest/manual/cmake-toolchains.7.html#cross-compiling-for-ios-tvos-visionos-or-watchos
[ -n '${{ matrix.build.generator }}' ] && options='-G ${{ matrix.build.generator }}'
cmake -B bld -D_CURL_PREFILL=ON \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON -DCURL_WERROR=ON \
-DCMAKE_SYSTEM_NAME=iOS \
-DUSE_APPLE_IDN=ON \
${{ matrix.build.generate }} ${options}
else
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--disable-dependency-tracking \
CFLAGS="-isysroot $(xcrun --sdk iphoneos --show-sdk-path 2>/dev/null)" \
--host=aarch64-apple-darwin \
--with-apple-idn \
${{ matrix.build.configure }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMakeConfigureLog.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld ${{ matrix.build.options }} --parallel 4 --verbose
else
make -C bld V=1
fi
- name: 'curl info'
run: find . -type f \( -name curl -o -name '*.dylib' -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld ${{ matrix.build.options }} --parallel 4 --target testdeps --verbose
else
make -C bld V=1 -C tests
fi
- name: 'build examples'
run: |
if [ -n '${{ matrix.build.generate }}' ]; then
cmake --build bld ${{ matrix.build.options }} --parallel 4 --target curl-examples --verbose
else
make -C bld examples V=1
fi
android:
name: "Android ${{ matrix.platform }}, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.name }} arm64"
runs-on: 'ubuntu-latest'
timeout-minutes: 25
env:
MAKEFLAGS: -j 5
VCPKG_BINARY_SOURCES: 'clear;x-gha,readwrite'
VCPKG_DISABLE_METRICS: '1'
strategy:
matrix:
include:
- { build: 'autotools', platform: '21', name: "openssl", install: 'brotli zstd libpsl nghttp2 openssl libssh2',
options: '--with-openssl --with-brotli' }
- { build: 'cmake' , platform: '21', name: "openssl", install: 'brotli zstd libpsl nghttp2 openssl libssh2',
options: '-DCURL_USE_OPENSSL=ON' }
- { build: 'autotools', platform: '35', name: "openssl", install: 'brotli zstd libpsl nghttp2 openssl libssh2',
options: '--with-openssl --with-brotli' }
- { build: 'cmake' , platform: '35', name: "openssl", install: 'brotli zstd libpsl nghttp2 openssl libssh2',
options: '-DCURL_USE_OPENSSL=ON' }
# FIXME: Must disable zstd explicitly, otherwise cmake/pkg-config finds it in /usr/include
# and the build fails. I had found no option to disable this behavior. Other default
# dependencies not offered via vcpkg may also need this.
- { build: 'cmake' , platform: '35', name: "boringssl !zstd", install: 'libpsl boringssl',
options: '-DCURL_USE_OPENSSL=ON -DOPENSSL_USE_STATIC_LIBS=ON -DCURL_ZSTD=OFF' }
fail-fast: false
steps:
- name: 'vcpkg cache setup'
if: ${{ matrix.install }}
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
with:
script: |
core.exportVariable('ACTIONS_CACHE_URL', process.env.ACTIONS_CACHE_URL || '');
core.exportVariable('ACTIONS_RUNTIME_TOKEN', process.env.ACTIONS_RUNTIME_TOKEN || '');
- name: 'vcpkg versions'
if: ${{ matrix.install }}
timeout-minutes: 1
run: |
git -C "$VCPKG_INSTALLATION_ROOT" show --no-patch --format='%H %ai'
vcpkg version
- name: 'install prereqs for vcpkg'
if: contains(matrix.install, 'boringssl')
timeout-minutes: 5
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get -o Dpkg::Use-Pty=0 update
sudo apt-get -o Dpkg::Use-Pty=0 install nasm
- name: 'vcpkg build'
if: ${{ matrix.install }}
timeout-minutes: 20
run: vcpkg x-set-installed ${{ matrix.install }} '--triplet=arm64-android'
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build == 'autotools' }}
run: autoreconf -fi
- name: 'configure'
run: |
export PKG_CONFIG_PATH="$VCPKG_INSTALLATION_ROOT/installed/arm64-android/lib/pkgconfig"
if [ '${{ matrix.build }}' = 'cmake' ]; then # https://developer.android.com/ndk/guides/cmake
cmake -B bld \
-DANDROID_ABI=arm64-v8a \
-DANDROID_PLATFORM='android-${{ matrix.platform }}' \
-DCMAKE_TOOLCHAIN_FILE="$VCPKG_INSTALLATION_ROOT/scripts/buildsystems/vcpkg.cmake" \
-DVCPKG_CHAINLOAD_TOOLCHAIN_FILE=${ANDROID_NDK_HOME}/build/cmake/android.toolchain.cmake -DCMAKE_WARN_DEPRECATED=OFF \
-DVCPKG_INSTALLED_DIR="$VCPKG_INSTALLATION_ROOT/installed" \
-DVCPKG_TARGET_TRIPLET=arm64-android \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
${{ matrix.options }}
else
TOOLCHAIN="${ANDROID_NDK_HOME}/toolchains/llvm/prebuilt/linux-x86_64"
mkdir bld && cd bld && ../configure --disable-dependency-tracking --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
CC="$TOOLCHAIN/bin/aarch64-linux-android${{ matrix.platform }}-clang" \
AR="$TOOLCHAIN/bin/llvm-ar" \
RANLIB="$TOOLCHAIN/bin/llvm-ranlib" \
--host=aarch64-linux-android${{ matrix.platform }} \
${{ matrix.options }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --verbose
else
make -C bld V=1
fi
- name: 'curl info'
run: find . -type f \( -name curl -o -name '*.so' -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld -C tests
fi
- name: 'build examples'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld examples
fi
amiga:
name: "AmigaOS, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} gcc AmiSSL m68k"
runs-on: 'ubuntu-latest'
timeout-minutes: 5
env:
MAKEFLAGS: -j 5
amissl-version: 5.18
strategy:
matrix:
build: [autotools, cmake]
fail-fast: false
steps:
- name: 'install compiler'
if: ${{ steps.cache-compiler.outputs.cache-hit != 'true' }}
run: |
cd "${HOME}" || exit 1
curl --disable --fail --silent --show-error --connect-timeout 15 --max-time 120 --retry 3 \
--location https://github.com/bebbo/amiga-gcc/releases/download/Mechen/amiga-gcc.tgz | tar -xz
cd opt/appveyor || exit 1
curl --disable --fail --silent --show-error --connect-timeout 15 --max-time 60 --retry 3 \
--location https://github.com/jens-maus/amissl/releases/download/${{ env.amissl-version }}/AmiSSL-${{ env.amissl-version }}-SDK.lha --output bin.lha
7z x -bd -y bin.lha
rm -f bin.lha
mv "$HOME/opt/appveyor" /opt
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'configure'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake -B bld \
-DAMIGA=1 \
-DCMAKE_SYSTEM_NAME=Generic \
-DCMAKE_SYSTEM_PROCESSOR=m68k \
-DCMAKE_C_COMPILER_TARGET=m68k-unknown-amigaos \
-DCMAKE_C_COMPILER=/opt/appveyor/build-agent/opt/amiga/bin/m68k-amigaos-gcc \
-DCMAKE_C_FLAGS='-O0 -msoft-float -mcrt=clib2' \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DCURL_USE_LIBPSL=OFF \
-DAMISSL_INCLUDE_DIR=/opt/appveyor/AmiSSL/Developer/include \
-DAMISSL_STUBS_LIBRARY=/opt/appveyor/AmiSSL/Developer/lib/AmigaOS3/libamisslstubs.a \
-DAMISSL_AUTO_LIBRARY=/opt/appveyor/AmiSSL/Developer/lib/AmigaOS3/libamisslauto.a
else
autoreconf -fi
mkdir bld && cd bld && ../configure --disable-dependency-tracking --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
CC=/opt/appveyor/build-agent/opt/amiga/bin/m68k-amigaos-gcc \
AR=/opt/appveyor/build-agent/opt/amiga/bin/m68k-amigaos-ar \
RANLIB=/opt/appveyor/build-agent/opt/amiga/bin/m68k-amigaos-ranlib \
--host=m68k-amigaos \
--disable-shared \
--without-libpsl \
--with-amissl \
LDFLAGS=-L/opt/appveyor/AmiSSL/Developer/lib/AmigaOS3 \
CPPFLAGS=-I/opt/appveyor/AmiSSL/Developer/include \
CFLAGS='-O0 -msoft-float -mcrt=clib2' \
LIBS='-lnet -lm -latomic'
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld
fi
- name: 'curl info'
run: find . -type f \( -name curl -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld -C tests
fi
- name: 'build examples'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld examples
fi
msdos:
name: "MS-DOS, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} djgpp openssl i586"
runs-on: 'ubuntu-latest'
timeout-minutes: 5
env:
MAKEFLAGS: -j 5
toolchain-version: '3.4'
strategy:
matrix:
build: [autotools, cmake]
fail-fast: false
steps:
- name: 'install packages'
run: sudo apt-get -o Dpkg::Use-Pty=0 install libfl2 ${{ matrix.build == 'cmake' && 'ninja-build' || '' }}
- name: 'cache compiler (djgpp)'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-compiler
with:
path: ~/djgpp
key: ${{ runner.os }}-djgpp-${{ env.toolchain-version }}-amd64
- name: 'install compiler (djgpp)'
if: ${{ steps.cache-compiler.outputs.cache-hit != 'true' }}
run: |
cd "${HOME}" || exit 1
curl --disable --fail --silent --show-error --connect-timeout 15 --max-time 120 --retry 3 \
--location 'https://github.com/andrewwutw/build-djgpp/releases/download/v${{ env.toolchain-version }}/djgpp-linux64-gcc1220.tar.bz2' | tar -xj
cd djgpp || exit 1
for f in wat3211b.zip zlb13b.zip ssl102ub.zip; do
curl --disable --fail --silent --show-error --connect-timeout 15 --max-time 60 --retry 3 \
"https://www.delorie.com/pub/djgpp/current/v2tk/$f" --output bin.zip
unzip -q bin.zip
rm -f bin.zip
done
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'configure'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake -B bld -G Ninja \
-DCMAKE_SYSTEM_NAME=DOS \
-DCMAKE_SYSTEM_PROCESSOR=x86 \
-DCMAKE_C_COMPILER_TARGET=i586-pc-msdosdjgpp \
-DCMAKE_C_COMPILER="$HOME/djgpp/bin/i586-pc-msdosdjgpp-gcc" \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DCURL_USE_LIBPSL=OFF \
-DOPENSSL_INCLUDE_DIR="$HOME/djgpp/include" \
-DOPENSSL_SSL_LIBRARY="$HOME/djgpp/lib/libssl.a" \
-DOPENSSL_CRYPTO_LIBRARY="$HOME/djgpp/lib/libcrypto.a" \
-DZLIB_INCLUDE_DIR="$HOME/djgpp/include" \
-DZLIB_LIBRARY="$HOME/djgpp/lib/libz.a" \
-DWATT_ROOT="$HOME/djgpp/net/watt"
else
autoreconf -fi
mkdir bld && cd bld && ../configure --disable-dependency-tracking --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
CC="$HOME/djgpp/bin/i586-pc-msdosdjgpp-gcc" \
AR="$HOME/djgpp/bin/i586-pc-msdosdjgpp-ar" \
RANLIB="$HOME/djgpp/bin/i586-pc-msdosdjgpp-ranlib" \
WATT_ROOT="$HOME/djgpp/net/watt" \
--host=i586-pc-msdosdjgpp \
--with-openssl="$HOME/djgpp" \
--with-zlib="$HOME/djgpp" \
--without-libpsl \
--disable-shared
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld
fi
- name: 'curl info'
run: find . \( -name '*.exe' -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld -C tests
fi
- name: 'build examples'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld examples
fi

956
deps/curl/.github/workflows/windows.yml vendored Normal file
View File

@ -0,0 +1,956 @@
# Copyright (C) Viktor Szakats
#
# SPDX-License-Identifier: curl
name: Windows
'on':
push:
branches:
- master
- '*/ci'
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
pull_request:
branches:
- master
paths-ignore:
- '**/*.md'
- '.circleci/**'
- 'appveyor.*'
- 'packages/**'
- 'plan9/**'
- 'projects/**'
- 'winbuild/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
permissions: {}
jobs:
cygwin:
name: "cygwin, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.platform }} ${{ matrix.name }}"
runs-on: windows-latest
timeout-minutes: 25
defaults:
run:
shell: C:\cygwin\bin\bash.exe '{0}'
env:
MAKEFLAGS: -j 5
SHELLOPTS: 'igncr'
strategy:
matrix:
include:
- { build: 'automake', platform: 'x86_64', tflags: 'skiprun', config: '--with-openssl', install: 'libssl-devel', name: 'openssl R' }
- { build: 'cmake' , platform: 'x86_64', tflags: '' , config: '-DENABLE_DEBUG=ON -DCURL_USE_OPENSSL=ON -DENABLE_THREADED_RESOLVER=OFF', install: 'libssl-devel', name: 'openssl' }
fail-fast: false
steps:
- run: git config --global core.autocrlf input
shell: pwsh
- uses: cygwin/cygwin-install-action@f61179d72284ceddc397ed07ddb444d82bf9e559 # v5
with:
platform: ${{ matrix.platform }}
site: https://mirrors.kernel.org/sourceware/cygwin/
# https://cygwin.com/cgi-bin2/package-grep.cgi
packages: >-
autoconf libtool gcc-core gcc-g++ binutils
${{ matrix.build }} make ninja
openssh
libssh2-devel
libpsl-devel
zlib-devel
libbrotli-devel
libzstd-devel
libnghttp2-devel
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build == 'automake' }}
timeout-minutes: 2
run: autoreconf -fi
- name: 'configure'
timeout-minutes: 5
run: |
PATH=/usr/bin
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake -B bld -G Ninja -D_CURL_PREFILL=ON ${options} \
-DCMAKE_UNITY_BUILD=ON -DCMAKE_UNITY_BUILD_BATCH_SIZE=30 -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
${{ matrix.config }}
else
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--prefix="${HOME}"/install \
--with-libssh2 \
--disable-dependency-tracking \
${{ matrix.config }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
timeout-minutes: 10
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld V=1 install
fi
- name: 'curl version'
timeout-minutes: 1
run: |
find . \( -name '*.exe' -o -name '*.dll' -o -name '*.a' \) -exec file '{}' \;
if [ '${{ matrix.build }}' = 'cmake' ]; then
PATH="$PWD/bld/lib:$PATH"
fi
bld/src/curl.exe --disable --version
- name: 'build tests'
if: ${{ matrix.tflags != 'skipall' }}
timeout-minutes: 15
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld V=1 -C tests
fi
- name: 'run tests'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 15
run: |
export TFLAGS='-j8 ${{ matrix.tflags }} ~615'
if [ -x "$(cygpath "${SYSTEMROOT}/System32/curl.exe")" ]; then
TFLAGS+=" -ac $(cygpath "${SYSTEMROOT}/System32/curl.exe")"
fi
if [ '${{ matrix.build }}' = 'cmake' ]; then
PATH="$PWD/bld/lib:$PATH"
cmake --build bld --target test-ci
else
make -C bld V=1 test-ci
fi
- name: 'build examples'
if: ${{ matrix.build == 'cmake' }}
timeout-minutes: 5
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld V=1 examples
fi
msys2: # both msys and mingw-w64
name: "${{ matrix.sys == 'msys' && 'msys2' || 'mingw' }}, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.env }} ${{ matrix.name }} ${{ matrix.test }}"
runs-on: windows-latest
timeout-minutes: 20
defaults:
run:
shell: msys2 {0}
env:
MAKEFLAGS: -j 5
strategy:
matrix:
include:
# MSYS
- { build: 'autotools', sys: 'msys' , env: 'x86_64' , tflags: '' , config: '--enable-debug --with-openssl --disable-threaded-resolver --disable-proxy', install: 'openssl-devel', name: '!proxy' }
- { build: 'autotools', sys: 'msys' , env: 'x86_64' , tflags: 'skiprun', config: '--enable-debug --with-openssl --disable-threaded-resolver', install: 'openssl-devel', name: 'default' }
- { build: 'cmake' , sys: 'msys' , env: 'x86_64' , tflags: '' , config: '-DENABLE_DEBUG=ON -DENABLE_THREADED_RESOLVER=OFF', install: 'openssl-devel', name: 'default' }
- { build: 'autotools', sys: 'msys' , env: 'x86_64' , tflags: '' , config: '--with-openssl', install: 'openssl-devel', name: 'default R' }
# MinGW
- { build: 'autotools', sys: 'mingw64', env: 'x86_64' , tflags: 'skiprun', config: '--enable-debug --with-openssl --disable-threaded-resolver --disable-curldebug --enable-static=no --without-zlib', install: 'mingw-w64-x86_64-openssl', name: 'default' }
- { build: 'autotools', sys: 'mingw64', env: 'x86_64' , tflags: '' , config: '--enable-debug --with-openssl --enable-windows-unicode --enable-ares --with-openssl-quic', install: 'mingw-w64-x86_64-openssl mingw-w64-x86_64-nghttp3', name: 'c-ares U' }
- { build: 'cmake' , sys: 'mingw64', env: 'x86_64' , tflags: '' , config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=ON -DENABLE_ARES=ON', install: '', type: 'Debug', name: 'schannel c-ares U' }
- { build: 'cmake' , sys: 'clang64', env: 'clang-x86_64', tflags: '' , config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_GNUTLS=ON -DENABLE_UNICODE=OFF -DUSE_NGTCP2=ON', install: 'mingw-w64-clang-x86_64-gnutls mingw-w64-clang-x86_64-nghttp3 mingw-w64-clang-x86_64-ngtcp2', type: 'Debug', name: 'gnutls' }
- { build: 'cmake' , sys: 'ucrt64' , env: 'ucrt-x86_64' , tflags: 'skiprun', config: '-DENABLE_DEBUG=OFF -DBUILD_SHARED_LIBS=ON -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=ON -DENABLE_CURLDEBUG=ON', install: '', type: 'Release', name: 'schannel R TrackMemory' }
- { build: 'cmake' , sys: 'clang64', env: 'clang-x86_64', tflags: 'skiprun', config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_OPENSSL=ON -DENABLE_UNICODE=OFF', install: 'mingw-w64-clang-x86_64-openssl', type: 'Release', name: 'openssl', chkprefill: '_chkprefill' }
- { build: 'cmake' , sys: 'ucrt64' , env: 'ucrt-x86_64' , tflags: 'skiprun', config: '-DENABLE_DEBUG=OFF -DBUILD_SHARED_LIBS=ON -DCURL_USE_SCHANNEL=ON', install: '', type: 'Release', test: 'uwp', name: 'schannel' }
# { build: 'autotools', sys: 'ucrt64' , env: 'ucrt-x86_64' , tflags: 'skiprun', config: '--without-debug --with-schannel --enable-shared', install: '', type: 'Release', test: 'uwp', name: 'schannel' }
- { build: 'cmake' , sys: 'mingw64', env: 'x86_64' , tflags: 'skiprun', config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=ON -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=ON -DCMAKE_VERBOSE_MAKEFILE=ON', install: '', type: 'Debug', cflags: '-DCURL_SCHANNEL_DEV_DEBUG', name: 'schannel dev debug' }
- { build: 'cmake' , sys: 'mingw32', env: 'i686' , tflags: 'skiprun', config: '-DENABLE_DEBUG=OFF -DBUILD_SHARED_LIBS=ON -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=ON', install: '', type: 'Release', name: 'schannel R' }
fail-fast: false
steps:
- run: git config --global core.autocrlf input
shell: pwsh
- uses: msys2/setup-msys2@61f9e5e925871ba6c9e3e8da24ede83ea27fa91f # v2
if: ${{ matrix.sys == 'msys' }}
with:
msystem: ${{ matrix.sys }}
# https://packages.msys2.org/search
install: >-
gcc
${{ matrix.build }} ${{ matrix.build == 'autotools' && 'make' || 'ninja' }}
diffutils
zlib-devel
brotli-devel
libzstd-devel
libnghttp2-devel
libpsl-devel
libssh2-devel
${{ matrix.install }}
- uses: msys2/setup-msys2@61f9e5e925871ba6c9e3e8da24ede83ea27fa91f # v2
if: ${{ matrix.sys != 'msys' }}
with:
msystem: ${{ matrix.sys }}
install: >-
mingw-w64-${{ matrix.env }}-cc
mingw-w64-${{ matrix.env }}-${{ matrix.build }} ${{ matrix.build == 'autotools' && 'make' || '' }}
mingw-w64-${{ matrix.env }}-diffutils
mingw-w64-${{ matrix.env }}-libssh2
mingw-w64-${{ matrix.env }}-libpsl
mingw-w64-${{ matrix.env }}-c-ares
${{ matrix.install }}
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build == 'autotools' }}
timeout-minutes: 2
run: autoreconf -fi
- name: 'configure'
timeout-minutes: 5
run: |
if [ '${{ matrix.test }}' = 'uwp' ]; then
CPPFLAGS='-DWINSTORECOMPAT -DWINAPI_FAMILY=WINAPI_FAMILY_APP'
if [[ '${{ matrix.env }}' != 'clang'* ]]; then
specs="$(realpath gcc-specs-uwp)"
gcc -dumpspecs | sed -e 's/-lmingwex/-lwindowsapp -lmingwex -lwindowsapp/' -e 's/-lmsvcrt/-lucrtapp/' > "${specs}"
CFLAGS="-specs=${specs}"
CFLAGS_CMAKE="-specs=$(cygpath -w "${specs}")"
fi
fi
if [ '${{ matrix.build }}' = 'cmake' ]; then
for _chkprefill in '' ${{ matrix.chkprefill }}; do
if [[ '${{ matrix.env }}' = 'clang'* ]]; then
options='-DCMAKE_C_COMPILER=clang'
else
options='-DCMAKE_C_COMPILER=gcc'
fi
[ '${{ matrix.sys }}' = 'msys' ] && options+=' -D_CURL_PREFILL=ON'
[ '${{ matrix.test }}' = 'uwp' ] && options+=' -DCMAKE_SYSTEM_NAME=WindowsStore -DCMAKE_SYSTEM_VERSION=10.0'
[ "${_chkprefill}" = '_chkprefill' ] && options+=' -D_CURL_PREFILL=OFF'
cmake -B "bld${_chkprefill}" -G Ninja ${options} \
-DCMAKE_C_FLAGS="${{ matrix.cflags }} ${CFLAGS_CMAKE} ${CPPFLAGS}" \
-DCMAKE_BUILD_TYPE='${{ matrix.type }}' \
-DCMAKE_UNITY_BUILD=ON -DCMAKE_UNITY_BUILD_BATCH_SIZE=30 -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
${{ matrix.config }}
done
if [ -d bld_chkprefill ] && ! diff -u bld/lib/curl_config.h bld_chkprefill/lib/curl_config.h; then
echo '::group::reference configure log'; cat bld_chkprefill/CMakeFiles/CMake*.yaml 2>/dev/null || true; echo '::endgroup::'
false
fi
else
export CFLAGS CPPFLAGS
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--prefix="${HOME}"/install \
--with-libssh2 \
--disable-dependency-tracking \
${{ matrix.config }}
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
timeout-minutes: 10
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld V=1 install
fi
- name: 'curl version'
timeout-minutes: 1
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
PATH="$PWD/bld/lib:$PATH"
else
PATH="$PWD/bld/lib/.libs:$PATH"
# avoid libtool's curl.exe wrapper
mv bld/src/.libs/curl.exe bld/src/curl.exe
fi
find . \( -name '*.exe' -o -name '*.dll' -o -name '*.a' \) -exec file '{}' \;
if [ '${{ matrix.test }}' != 'uwp' ]; then # curl: error initializing curl library
bld/src/curl.exe --disable --version
fi
- name: 'build tests'
if: ${{ matrix.tflags != 'skipall' }} # Save time by skipping this for autotools
timeout-minutes: 10
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld V=1 -C tests
fi
if [ '${{ matrix.build }}' != 'cmake' ]; then
# avoid libtool's .exe wrappers
mv bld/tests/http/clients/.libs/*.exe bld/tests/http/clients
mv bld/tests/libtest/.libs/*.exe bld/tests/libtest
mv bld/tests/server/.libs/*.exe bld/tests/server
mv bld/tests/unit/.libs/*.exe bld/tests/unit || true
fi
- name: 'install test prereqs'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 5
run: |
/usr/bin/sed -i 's/^CheckSpace/#CheckSpace/' /etc/pacman.conf
/usr/bin/pacman --noconfirm --noprogressbar --sync --needed openssh
/c/ProgramData/chocolatey/choco.exe install --yes --no-progress --limit-output --timeout 180 --force stunnel || true
- name: 'downgrade msys2-runtime'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' && matrix.sys != 'msys' }}
timeout-minutes: 2
# Downgrade to a known good MSYS2 runtime version to avoid the performance regression
# causing runtests.pl to run at 2-3x reduced speed.
run: exec /usr/bin/pacman --noconfirm --noprogressbar --upgrade https://mirror.msys2.org/msys/x86_64/msys2-runtime-3.5.4-8-x86_64.pkg.tar.zst
- name: 'run tests'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 10
run: |
export TFLAGS='-j8 ${{ matrix.tflags }}'
if [ '${{ matrix.sys }}' != 'msys' ]; then
TFLAGS+=' ~612' # 'SFTP post-quote remove file' SFTP, post-quote
fi
TFLAGS+=' ~613' # 'SFTP directory retrieval' SFTP, directory
if [ -x "$(cygpath "${SYSTEMROOT}/System32/curl.exe")" ]; then
TFLAGS+=" -ac $(cygpath "${SYSTEMROOT}/System32/curl.exe")"
fi
PATH="$PATH:/c/Program Files (x86)/stunnel/bin"
if [ '${{ matrix.build }}' = 'cmake' ]; then
PATH="$PWD/bld/lib:$PATH"
cmake --build bld --target test-ci
else
PATH="$PWD/bld/lib/.libs:$PATH"
make -C bld V=1 test-ci
fi
- name: 'build examples'
if: ${{ matrix.build == 'cmake' || (matrix.tflags == 'skipall' || matrix.tflags == 'skiprun') }} # Save time by skipping this for autotools running tests
timeout-minutes: 5
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld V=1 examples
fi
mingw-w64-standalone-downloads:
name: 'dl-mingw, CM ${{ matrix.ver }}-${{ matrix.env }} ${{ matrix.name }}'
runs-on: windows-latest
timeout-minutes: 20
defaults:
run:
shell: msys2 {0}
env:
MAKEFLAGS: -j 5
strategy:
matrix:
include:
- name: 'schannel'
dir: 'mingw64'
env: 'x86_64'
ver: '9.5.0'
url: 'https://github.com/brechtsanders/winlibs_mingw/releases/download/9.5.0-10.0.0-msvcrt-r1/winlibs-x86_64-posix-seh-gcc-9.5.0-mingw-w64msvcrt-10.0.0-r1.7z'
config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=OFF'
type: 'Release'
- name: 'schannel mbedtls U'
dir: 'mingw64'
env: 'x86_64'
ver: '7.3.0'
url: 'https://downloads.sourceforge.net/mingw-w64/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-win32/seh/x86_64-7.3.0-release-win32-seh-rt_v5-rev0.7z'
config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=ON -DCURL_USE_MBEDTLS=ON'
install: mingw-w64-x86_64-mbedtls
type: 'Release'
tflags: 'skiprun'
- name: 'schannel !unity'
dir: 'mingw32'
env: 'i686'
ver: '6.4.0'
url: 'https://downloads.sourceforge.net/mingw-w64/Toolchains%20targetting%20Win32/Personal%20Builds/mingw-builds/6.4.0/threads-win32/dwarf/i686-6.4.0-release-win32-dwarf-rt_v5-rev0.7z'
config: '-DENABLE_DEBUG=ON -DBUILD_SHARED_LIBS=OFF -DCURL_USE_SCHANNEL=ON -DENABLE_UNICODE=OFF -DCMAKE_UNITY_BUILD=OFF'
type: 'Debug'
tflags: 'skiprun'
fail-fast: false
steps:
- uses: msys2/setup-msys2@d44ca8e88d8b43d56cf5670f91747359d5537f97 # v2
with:
msystem: ${{ matrix.dir }}
release: false
update: false
cache: false
path-type: inherit
install: >-
mingw-w64-${{ matrix.env }}-libpsl
${{ matrix.install }}
- name: 'cache compiler (gcc ${{ matrix.ver }}-${{ matrix.env }})'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-compiler
with:
path: ~\my-cache
key: ${{ runner.os }}-mingw-w64-${{ matrix.ver }}-${{ matrix.env }}
- name: 'install compiler (gcc ${{ matrix.ver }}-${{ matrix.env }})'
if: ${{ steps.cache-compiler.outputs.cache-hit != 'true' }}
timeout-minutes: 5
run: |
cd "${USERPROFILE}" || exit 1
mkdir my-cache
cd my-cache || exit 1
curl --fail --silent --show-error --retry 3 --retry-connrefused --output pack.bin --location --proto-redir =https '${{ matrix.url }}'
pwd
7z x -y pack.bin >/dev/null
rm -r -f pack.bin
ls -l
- run: git config --global core.autocrlf input
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'configure'
timeout-minutes: 5
run: |
PATH="$(cygpath "${USERPROFILE}")/my-cache/${{ matrix.dir }}/bin:$PATH"
for _chkprefill in '' ${{ matrix.chkprefill }}; do
options=''
[ "${_chkprefill}" = '_chkprefill' ] && options+=' -D_CURL_PREFILL=OFF'
cmake -B "bld${_chkprefill}" -G "${{ contains(matrix.url, '/winlibs_mingw/') && 'Ninja' || 'MSYS Makefiles' }}" ${options} \
-DCMAKE_C_COMPILER=gcc \
-DCMAKE_BUILD_TYPE='${{ matrix.type }}' \
-DCMAKE_UNITY_BUILD=ON -DCMAKE_UNITY_BUILD_BATCH_SIZE=30 -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DUSE_LIBIDN2=OFF \
${{ matrix.config }}
done
if [ -d bld_chkprefill ] && ! diff -u bld/lib/curl_config.h bld_chkprefill/lib/curl_config.h; then
echo '::group::reference configure log'; cat bld_chkprefill/CMakeFiles/CMake*.yaml 2>/dev/null || true; echo '::endgroup::'
false
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
timeout-minutes: 5
run: |
PATH="$(cygpath "${USERPROFILE}")/my-cache/${{ matrix.dir }}/bin:$PATH"
cmake --build bld
- name: 'curl version'
timeout-minutes: 1
run: |
PATH=/usr/bin find . \( -name '*.exe' -o -name '*.dll' -o -name '*.a' \) -exec file '{}' \;
PATH="$PWD/bld/lib:$PATH"
bld/src/curl.exe --disable --version
- name: 'build tests'
if: ${{ matrix.tflags != 'skipall' }}
timeout-minutes: 10
run: |
PATH="$(cygpath "${USERPROFILE}")/my-cache/${{ matrix.dir }}/bin:$PATH"
cmake --build bld --target testdeps
- name: 'install test prereqs'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 5
run: |
/usr/bin/sed -i 's/^CheckSpace/#CheckSpace/' /etc/pacman.conf
/c/ProgramData/chocolatey/choco.exe install --yes --no-progress --limit-output --timeout 180 --force stunnel || true
python3 -m pip --disable-pip-version-check --no-input --no-cache-dir install --progress-bar off --prefer-binary impacket
- name: 'downgrade msys2-runtime'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 2
# Downgrade to a known good MSYS2 runtime version to avoid the performance regression
# causing runtests.pl to run at 2-3x reduced speed.
run: exec /usr/bin/pacman --noconfirm --noprogressbar --upgrade https://mirror.msys2.org/msys/x86_64/msys2-runtime-3.5.4-8-x86_64.pkg.tar.zst
- name: 'run tests'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 10
run: |
PATH="$(cygpath "${USERPROFILE}")/my-cache/${{ matrix.dir }}/bin:$PATH"
export TFLAGS='-j8 ${{ matrix.tflags }}'
if [ -x "$(cygpath "${SYSTEMROOT}/System32/curl.exe")" ]; then
TFLAGS+=" -ac $(cygpath "${SYSTEMROOT}/System32/curl.exe")"
fi
PATH="$PWD/bld/lib:$PATH:/c/Program Files (x86)/stunnel/bin"
cmake --build bld --target test-ci
- name: 'build examples'
timeout-minutes: 5
run: |
PATH="$(cygpath "${USERPROFILE}")/my-cache/${{ matrix.dir }}/bin:$PATH"
cmake --build bld --target curl-examples
linux-cross-mingw-w64:
name: "linux-mingw, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} ${{ matrix.compiler }}"
runs-on: ubuntu-latest
timeout-minutes: 15
env:
MAKEFLAGS: -j 5
TRIPLET: 'x86_64-w64-mingw32'
strategy:
fail-fast: false
matrix:
build: [autotools, cmake]
compiler: [gcc]
steps:
- name: 'install packages'
timeout-minutes: 5
run: sudo apt-get -o Dpkg::Use-Pty=0 install mingw-w64 ${{ matrix.build == 'cmake' && 'ninja-build' || '' }}
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'autoreconf'
if: ${{ matrix.build == 'autotools' }}
run: autoreconf -fi
- name: 'configure'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake -B bld -G Ninja \
-DCMAKE_SYSTEM_NAME=Windows \
-DCMAKE_C_COMPILER_TARGET="${TRIPLET}" \
-DCMAKE_C_COMPILER="${TRIPLET}-gcc" \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DCURL_USE_SCHANNEL=ON -DUSE_WIN32_IDN=ON \
-DCURL_USE_LIBPSL=OFF
else
mkdir bld && cd bld && ../configure --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--host="${TRIPLET}" \
--with-schannel --with-winidn \
--without-libpsl \
--disable-dependency-tracking
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld
fi
- name: 'curl info'
run: find . \( -name '*.exe' -o -name '*.dll' -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
if: ${{ matrix.build == 'cmake' }} # Save time by skipping this for autotools
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld -C tests
fi
- name: 'build examples'
run: |
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld examples
fi
wince:
name: "mingw32ce, ${{ matrix.build == 'cmake' && 'CM' || 'AM' }} 4.4.0-arm schannel"
runs-on: 'macos-latest'
timeout-minutes: 10
env:
MAKEFLAGS: -j 4
toolchain-version: '0.59.1'
strategy:
matrix:
build: [autotools, cmake]
fail-fast: false
steps:
- name: 'install packages'
if: ${{ matrix.build == 'autotools' }}
timeout-minutes: 5
run: |
echo automake libtool | xargs -Ix -n1 echo brew '"x"' > /tmp/Brewfile
while [[ $? == 0 ]]; do for i in 1 2 3; do brew update && brew bundle install --file /tmp/Brewfile && break 2 || { echo Error: wait to try again; sleep 10; } done; false Too many retries; done
- name: 'cache compiler (mingw32ce)'
uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
id: cache-compiler
with:
path: ~/opt/mingw32ce
key: ${{ runner.os }}-mingw32ce-${{ env.toolchain-version }}-amd64
- name: 'install compiler (mingw32ce)'
if: ${{ steps.cache-compiler.outputs.cache-hit != 'true' }}
timeout-minutes: 5
run: |
cd "${HOME}" || exit 1
curl --disable --fail --silent --show-error --connect-timeout 15 --max-time 120 --retry 3 --retry-connrefused --proto-redir =https \
--location 'https://downloads.sourceforge.net/cegcc/cegcc/${{ env.toolchain-version }}/cegcc_mingw32ce_snowleopard_r1397.tar.bz2' | tar -x
ls -l
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: 'configure'
run: |
PATH="$HOME/opt/mingw32ce/bin:$PATH"
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake -B bld \
-DCMAKE_SYSTEM_NAME=WindowsCE \
-DCMAKE_SYSTEM_VERSION=8.0 \
-DCMAKE_SYSTEM_PROCESSOR=arm \
-DCMAKE_C_FLAGS='-O3 -DNDEBUG' \
-DCMAKE_C_COMPILER_TARGET=arm-mingw32ce \
-DCMAKE_C_COMPILER=arm-mingw32ce-gcc \
-DCMAKE_RC_COMPILER=arm-mingw32ce-windres \
-DMINGW32CE_LIBRARY_DIR="$HOME/opt/mingw32ce/arm-mingw32ce/lib" \
-DCMAKE_IGNORE_PREFIX_PATH="$(brew --prefix)" \
-DCMAKE_UNITY_BUILD=ON -DCMAKE_UNITY_BUILD_BATCH_SIZE=50 -DCURL_TEST_BUNDLES=ON \
-DBUILD_SHARED_LIBS=ON -DBUILD_STATIC_LIBS=ON -DBUILD_STATIC_CURL=OFF \
-DCURL_WERROR=ON \
-DCURL_USE_SCHANNEL=ON \
-DCURL_USE_LIBPSL=OFF
else
autoreconf -fi
mkdir bld && cd bld && ../configure --disable-dependency-tracking --enable-unity --enable-test-bundles --enable-warnings --enable-werror \
--host=arm-mingw32ce \
--with-schannel \
--without-libpsl \
--disable-shared
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/config.log bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
run: |
PATH="$HOME/opt/mingw32ce/bin:$PATH"
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld
else
make -C bld
fi
- name: 'curl info'
run: |
find . \( -name '*.exe' -o -name '*.dll' -o -name '*.a' \) -exec file '{}' \;
- name: 'build tests'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
PATH="$HOME/opt/mingw32ce/bin:$PATH"
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target testdeps
else
make -C bld -C tests
fi
- name: 'build examples'
if: ${{ matrix.build == 'cmake' }} # skip for autotools to save time
run: |
PATH="$HOME/opt/mingw32ce/bin:$PATH"
if [ '${{ matrix.build }}' = 'cmake' ]; then
cmake --build bld --target curl-examples
else
make -C bld examples
fi
msvc:
name: 'msvc, CM ${{ matrix.arch }}-${{ matrix.plat }} ${{ matrix.name }}'
runs-on: windows-latest
timeout-minutes: 55
defaults:
run:
shell: msys2 {0}
env:
VCPKG_BINARY_SOURCES: 'clear;x-gha,readwrite'
VCPKG_DISABLE_METRICS: '1'
strategy:
matrix:
include:
- name: 'openssl +examples'
install: 'brotli zlib zstd nghttp2 nghttp3 openssl libssh2'
arch: 'x64'
plat: 'uwp'
type: 'Debug'
tflags: 'skiprun'
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_OPENSSL=ON -DUSE_OPENSSL_QUIC=ON
-DCURL_USE_LIBPSL=OFF
- name: 'openssl'
install: 'brotli zlib zstd libpsl nghttp2 nghttp3 openssl libssh2 pkgconf gsasl c-ares libuv krb5'
arch: 'x64'
plat: 'windows'
type: 'Debug'
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_OPENSSL=ON -DUSE_OPENSSL_QUIC=ON
-DCURL_USE_GSASL=ON -DENABLE_ARES=ON -DCURL_USE_LIBUV=ON -DCURL_USE_GSSAPI=ON
- name: 'schannel MultiSSL U'
# GnuTLS is not fully functional with MSVC, build-test only
# https://github.com/ShiftMediaProject/gnutls/issues/23
install: 'brotli zlib zstd libpsl nghttp2 libssh2[core,zlib] pkgconf gsasl shiftmedia-libgnutls openssl mbedtls wolfssl'
arch: 'x64'
plat: 'windows'
type: 'Debug'
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=ON -DCURL_USE_GNUTLS=ON -DCURL_USE_OPENSSL=ON -DCURL_USE_MBEDTLS=ON -DCURL_USE_WOLFSSL=ON -DCURL_DEFAULT_SSL_BACKEND=schannel
-DCURL_USE_GSASL=ON -DUSE_WIN32_IDN=ON -DENABLE_UNICODE=ON -DUSE_SSLS_EXPORT=ON
- name: 'libressl'
install: 'brotli zlib zstd libpsl nghttp2 libressl libssh2[core,zlib] pkgconf ngtcp2[libressl] nghttp3'
arch: 'x64'
plat: 'windows'
type: 'Debug'
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_OPENSSL=ON -DUSE_NGTCP2=ON
-DCURL_CA_SEARCH_SAFE=ON -DUSE_SSLS_EXPORT=ON
- name: 'boringssl'
install: 'brotli zlib zstd libpsl nghttp2 boringssl libssh2[core,zlib]'
arch: 'x64'
plat: 'windows'
type: 'Debug'
config: >-
-DENABLE_DEBUG=OFF
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_OPENSSL=ON
-DUSE_ECH=ON
- name: 'wolfssl +examples'
install: 'brotli zlib zstd libpsl nghttp2 wolfssl libssh2 pkgconf gsasl ngtcp2[wolfssl] nghttp3'
arch: 'x64'
plat: 'windows'
type: 'Debug'
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_WOLFSSL=ON -DUSE_NGTCP2=ON
-DCURL_USE_GSASL=ON
-DUSE_ECH=ON
- name: 'mbedtls libssh'
install: 'brotli zlib zstd libpsl nghttp2 mbedtls libssh pkgconf gsasl'
arch: 'x64'
plat: 'windows'
type: 'Debug'
chkprefill: '_chkprefill'
# WARNING: libssh uses hard-coded world-writable paths (/etc/..., ~/.ssh/) to
# read its configuration from, making it vulnerable to attacks on
# Windows. Do not use this component till there is a fix for these.
# https://github.com/curl/curl-for-win/blob/3951808deb04df9489ee17430f236ed54436f81a/libssh.sh#L6-L8
config: >-
-DENABLE_DEBUG=ON
-DCURL_USE_LIBSSH2=OFF -DCURL_USE_LIBSSH=ON
-DCURL_USE_SCHANNEL=OFF -DCURL_USE_MBEDTLS=ON
-DCURL_USE_GSASL=ON
fail-fast: false
steps:
- uses: msys2/setup-msys2@d44ca8e88d8b43d56cf5670f91747359d5537f97 # v2
with:
msystem: ucrt64
release: false
update: false
cache: false
path-type: inherit
- name: 'vcpkg cache setup'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
with:
script: |
core.exportVariable('ACTIONS_CACHE_URL', process.env.ACTIONS_CACHE_URL || '');
core.exportVariable('ACTIONS_RUNTIME_TOKEN', process.env.ACTIONS_RUNTIME_TOKEN || '');
- name: 'vcpkg versions'
timeout-minutes: 1
run: |
git -C "$VCPKG_INSTALLATION_ROOT" show --no-patch --format='%H %ai'
vcpkg version
- name: 'vcpkg build'
timeout-minutes: 35
run: vcpkg x-set-installed ${{ matrix.install }} '--triplet=${{ matrix.arch }}-${{ matrix.plat }}'
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: 'configure'
timeout-minutes: 5
run: |
for _chkprefill in '' ${{ matrix.chkprefill }}; do
options=''
if [ '${{ matrix.plat }}' = 'uwp' ]; then
options+=' -DCMAKE_SYSTEM_NAME=WindowsStore -DCMAKE_SYSTEM_VERSION=10.0'
cflags='-DWINAPI_FAMILY=WINAPI_FAMILY_PC_APP'
ldflags='-OPT:NOREF -OPT:NOICF -APPCONTAINER:NO'
vsglobals=';AppxPackage=false;WindowsAppContainer=false'
fi
[ '${{ matrix.arch }}' = 'arm64' ] && options+=' -A ARM64'
[ '${{ matrix.arch }}' = 'x64' ] && options+=' -A x64'
[ '${{ matrix.arch }}' = 'x86' ] && options+=' -A Win32'
[ "${_chkprefill}" = '_chkprefill' ] && options+=' -D_CURL_PREFILL=OFF'
cmake -B "bld${_chkprefill}" ${options} \
-DCMAKE_TOOLCHAIN_FILE="$VCPKG_INSTALLATION_ROOT/scripts/buildsystems/vcpkg.cmake" \
-DVCPKG_INSTALLED_DIR="$VCPKG_INSTALLATION_ROOT/installed" \
-DVCPKG_TARGET_TRIPLET='${{ matrix.arch }}-${{ matrix.plat }}' \
-DCMAKE_C_FLAGS="${cflags}" \
-DCMAKE_EXE_LINKER_FLAGS="-INCREMENTAL:NO ${ldflags}" \
-DCMAKE_SHARED_LINKER_FLAGS="-INCREMENTAL:NO ${ldflags}" \
-DCMAKE_VS_GLOBALS="TrackFileAccess=false${vsglobals}" \
-DCMAKE_UNITY_BUILD=ON -DCURL_TEST_BUNDLES=ON \
-DCURL_WERROR=ON \
-DBUILD_SHARED_LIBS=OFF \
${{ matrix.config }}
done
if [ -d bld_chkprefill ] && ! diff -u bld/lib/curl_config.h bld_chkprefill/lib/curl_config.h; then
echo '::group::reference configure log'; cat bld_chkprefill/CMakeFiles/CMake*.yaml 2>/dev/null || true; echo '::endgroup::'
false
fi
- name: 'configure log'
if: ${{ !cancelled() }}
run: cat bld/CMakeFiles/CMake*.yaml 2>/dev/null || true
- name: 'curl_config.h'
run: |
echo '::group::raw'; cat bld/lib/curl_config.h || true; echo '::endgroup::'
grep -F '#define' bld/lib/curl_config.h | sort || true
- name: 'build'
timeout-minutes: 5
run: cmake --build bld --config '${{ matrix.type }}' --parallel 5
- name: 'curl version'
timeout-minutes: 1
run: |
PATH=/usr/bin find . \( -name '*.exe' -o -name '*.dll' -o -name '*.lib' -o -name '*.pdb' \) -exec file '{}' \;
if [ '${{ matrix.plat }}' != 'uwp' ]; then # Missing: ucrtbased.dll, VCRUNTIME140D.dll, VCRUNTIME140D_APP.dll
PATH="$PWD/bld/lib/${{ matrix.type }}:$PATH"
'bld/src/${{ matrix.type }}/curl.exe' --disable --version
fi
- name: 'build tests'
if: ${{ matrix.tflags != 'skipall' }}
timeout-minutes: 10
run: cmake --build bld --config '${{ matrix.type }}' --parallel 5 --target testdeps
- name: 'install test prereqs'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 5
run: |
/usr/bin/sed -i 's/^CheckSpace/#CheckSpace/' /etc/pacman.conf
if [ '${{ matrix.openssh }}' = '' ]; then # MSYS2 openssh
/usr/bin/pacman --noconfirm --noprogressbar --sync --needed openssh
else # OpenSSH-Windows
[ '${{ matrix.openssh }}' = 'OpenSSH-Windows-Pre' ] && chocopkg+=' --prerelease'
chocopkg+=' openssh'
fi
/c/ProgramData/chocolatey/choco.exe install --yes --no-progress --limit-output --timeout 180 --force ${chocopkg} stunnel || true
python3 -m pip --disable-pip-version-check --no-input --no-cache-dir install --progress-bar off --prefer-binary impacket
- name: 'downgrade msys2-runtime'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 2
# Downgrade to a known good MSYS2 runtime version to avoid the performance regression
# causing runtests.pl to run at 2-3x reduced speed.
run: exec /usr/bin/pacman --noconfirm --noprogressbar --upgrade https://mirror.msys2.org/msys/x86_64/msys2-runtime-3.5.4-8-x86_64.pkg.tar.zst
- name: 'run tests'
if: ${{ matrix.tflags != 'skipall' && matrix.tflags != 'skiprun' }}
timeout-minutes: 10
run: |
export CURL_DIRSUFFIX='${{ matrix.type }}'
export TFLAGS='-j8 ${{ matrix.tflags }}'
if [[ '${{ matrix.install }}' = *'libssh2[core,zlib]'* ]]; then
TFLAGS+=' !SCP !SFTP' # Fail with all tested openssh servers: curl: (67) Authentication failure
fi
TFLAGS+=' ~612' # 'SFTP post-quote remove file' SFTP, post-quote
if [ '${{ matrix.openssh }}' = '' ]; then # MSYS2 openssh
TFLAGS+=' ~613' # 'SFTP directory retrieval' SFTP, directory
else # OpenSSH-Windows
TFLAGS+=' ~601 ~603 ~617 ~619 ~621 ~641 ~665 ~2004' # SCP
if [[ '${{ matrix.install }}' = *'libssh '* ]]; then
TFLAGS+=' ~614' # 'SFTP pre-quote chmod' SFTP, pre-quote, directory
else
TFLAGS+=' ~3022' # 'SCP correct sha256 host key' SCP, server sha256 key check
fi
PATH="$PATH:/c/Program Files/OpenSSH-Win64"
fi
PATH="$PWD/bld/lib/${{ matrix.type }}:$PATH:/c/Program Files (x86)/stunnel/bin"
cmake --build bld --config '${{ matrix.type }}' --target test-ci
- name: 'build examples'
timeout-minutes: 5
if: ${{ contains(matrix.name, '+examples') }}
run: cmake --build bld --config '${{ matrix.type }}' --parallel 5 --target curl-examples

12
deps/curl/CHANGES.md vendored Normal file
View File

@ -0,0 +1,12 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
In a release tarball, check the RELEASES-NOTES file for what was done in the
most recent release. In a git check-out, that file mentions changes that have
been done since the previous release.
See the online [changelog](https://curl.se/changes.html) for the edited and
human readable version of what has changed in different curl releases.

2512
deps/curl/CMakeLists.txt vendored Normal file

File diff suppressed because it is too large Load Diff

28
deps/curl/GIT-INFO.md vendored Normal file
View File

@ -0,0 +1,28 @@
                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|
# GIT-INFO
This file is only present in git - never in release archives. It contains
information about other files and things that the git repository keeps in its
inner sanctum.
To build in environments that support configure, after having extracted
everything from git, do this:
autoreconf -fi
./configure --with-openssl
make
Daniel uses a configure line similar to this for easier development:
./configure --disable-shared --enable-debug --enable-maintainer-mode
## REQUIREMENTS
See [docs/INTERNALS.md][0] for requirement details.
[0]: docs/INTERNALS.md

11
deps/curl/LICENSES/BSD-3-Clause.txt vendored Normal file
View File

@ -0,0 +1,11 @@
Copyright (c) <year> <owner>.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

15
deps/curl/LICENSES/BSD-4-Clause-UC.txt vendored Normal file
View File

@ -0,0 +1,15 @@
BSD-4-Clause (University of California-Specific)
Copyright [various years] The Regents of the University of California. All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. All advertising materials mentioning features or use of this software must display the following acknowledgement: This product includes software developed by the University of California, Berkeley and its contributors.
4. Neither the name of the University nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

12
deps/curl/LICENSES/ISC.txt vendored Normal file
View File

@ -0,0 +1,12 @@
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND INTERNET SOFTWARE CONSORTIUM
DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL
INTERNET SOFTWARE CONSORTIUM BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING
FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT,
NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION
WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

22
deps/curl/LICENSES/curl.txt vendored Normal file
View File

@ -0,0 +1,22 @@
COPYRIGHT AND PERMISSION NOTICE
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, and many
contributors, see the THANKS file.
All rights reserved.
Permission to use, copy, modify, and distribute this software for any purpose
with or without fee is hereby granted, provided that the above copyright
notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. IN
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name of a copyright holder shall not
be used in advertising or otherwise to promote the sale, use or other dealings
in this Software without prior written authorization of the copyright holder.

68
deps/curl/README.md vendored Normal file
View File

@ -0,0 +1,68 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# [![curl logo](https://curl.se/logo/curl-logo.svg)](https://curl.se/)
curl is a command-line tool for transferring data specified with URL syntax.
Learn how to use curl by reading [the
manpage](https://curl.se/docs/manpage.html) or [everything
curl](https://everything.curl.dev/).
Find out how to install curl by reading [the INSTALL
document](https://curl.se/docs/install.html).
libcurl is the library curl is using to do its job. It is readily available to
be used by your software. Read [the libcurl
manpage](https://curl.se/libcurl/c/libcurl.html) to learn how.
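A minimal sketch of what using libcurl from your own software can look like (the URL is only a placeholder and error handling is trimmed to the basics):

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();              /* create an easy handle */
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    CURLcode res = curl_easy_perform(curl);   /* run the transfer */
    if(res != CURLE_OK)
      fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));
    curl_easy_cleanup(curl);
  }
  return 0;
}
```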
## Open Source
curl is Open Source and is distributed under an MIT-like
[license](https://curl.se/docs/copyright.html).
## Contact
Contact us on a suitable [mailing list](https://curl.se/mail/) or
use GitHub [issues](https://github.com/curl/curl/issues)/
[pull requests](https://github.com/curl/curl/pulls)/
[discussions](https://github.com/curl/curl/discussions).
All contributors to the project are listed in [the THANKS
document](https://curl.se/docs/thanks.html).
## Commercial support
For commercial support, or for private and dedicated help with your problems or
applications using (lib)curl, visit [the support page](https://curl.se/support.html).
## Website
Visit the [curl website](https://curl.se/) for the latest news and downloads.
## Source code
Download the latest source from the Git server:
git clone https://github.com/curl/curl.git
## Security problems
Report suspected security problems via [our HackerOne
page](https://hackerone.com/curl) and not in public.
## Notice
curl contains pieces of source code that are Copyright (c) 1998, 1999 Kungliga
Tekniska Högskolan. This notice is included here to comply with the
distribution terms.
## Backers
Thank you to all our backers 🙏 [Become a backer](https://opencollective.com/curl#section-contribute).
## Sponsors
Support this project by becoming a [sponsor](https://curl.se/sponsors.html).

29
deps/curl/SECURITY.md vendored Normal file
View File

@ -0,0 +1,29 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Security Policy
Read our [Vulnerability Disclosure Policy](docs/VULN-DISCLOSURE-POLICY.md).
## Reporting a Vulnerability
If you have found or just suspect a security problem somewhere in curl or
libcurl, report it on [HackerOne](https://hackerone.com/curl).
We treat security issues with confidentiality until controlled and disclosed responsibly.
## OpenSSF Best Practices
curl has achieved Gold status on the Open Source Security Foundation (OpenSSF)
[Best Practices](https://bestpractices.dev/) (formerly Core Infrastructure
Initiative Best Practices), reflecting its adherence to rigorous
security and best practice standards. This achievement highlights curl's
comprehensive documentation, secure development processes, effective change
control mechanisms, and strong maintenance routines. Meeting these criteria
demonstrates curl's commitment to security and reliability, ensuring the
project's sustainability and trustworthiness. This underscores curl's role as
a leader in open-source software practices. More information can be found on
[curl's OpenSSF Best Practices project page](https://www.bestpractices.dev/projects/63).

43
deps/curl/docs/ALTSVC.md vendored Normal file
View File

@ -0,0 +1,43 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Alt-Svc
curl features support for the Alt-Svc: HTTP header.
## Enable Alt-Svc in build
`./configure --enable-alt-svc`
(enabled by default since 7.73.0)
## Standard
[RFC 7838](https://datatracker.ietf.org/doc/html/rfc7838)
# Alt-Svc cache file format
This is a text-based file with one line per entry, and each line consists of nine
space-separated fields.
## Example
h2 quic.tech 8443 h3-22 quic.tech 8443 "20190808 06:18:37" 0 0
## Fields
1. The ALPN id for the source origin
2. The hostname for the source origin
3. The port number for the source origin
4. The ALPN id for the destination host
5. The hostname for the destination host
6. The port number for the destination host
7. The expiration date and time of this entry within double quotes. The date format is "YYYYMMDD HH:MM:SS" and the time zone is GMT.
8. Boolean (1 or 0) if "persist" was set for this entry
9. Integer priority value (not currently used)
If the hostname is an IPv6 numerical address, it is stored with brackets such
as `[::1]`.
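As a usage sketch (not part of the cache file format itself), a libcurl application can point the Alt-Svc cache at a file in this format with `CURLOPT_ALTSVC` and select which ALPN ids it accepts with `CURLOPT_ALTSVC_CTRL`; the filename below is an arbitrary example:

```c
#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    /* read and persist Alt-Svc entries in this cache file */
    curl_easy_setopt(curl, CURLOPT_ALTSVC, "altsvc-cache.txt");
    /* allow Alt-Svc switches to these ALPN protocols */
    curl_easy_setopt(curl, CURLOPT_ALTSVC_CTRL,
                     (long)(CURLALTSVC_H1 | CURLALTSVC_H2 | CURLALTSVC_H3));
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}
```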

148
deps/curl/docs/BINDINGS.md vendored Normal file
View File

@ -0,0 +1,148 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
libcurl bindings
================
Creative people have written bindings or interfaces for various environments
and programming languages. Using one of these allows you to take advantage of
curl powers from within your favourite language or system.
This is a list of all known interfaces as of this writing.
The bindings listed below are not part of the curl/libcurl distribution
archives, but must be downloaded and installed separately.
<!-- markdown-link-check-disable -->
[Ada95](https://web.archive.org/web/20070403105909/www.almroth.com/adacurl/index.html) Written by Andreas Almroth
[Basic](https://scriptbasic.com/) ScriptBasic bindings written by Peter Verhas
C++: [curlpp](https://github.com/jpbarrette/curlpp/) Written by Jean-Philippe Barrette-LaPierre,
[curlcpp](https://github.com/JosephP91/curlcpp) by Giuseppe Persico and [C++
Requests](https://github.com/libcpr/cpr) by Huu Nguyen
[Ch](https://chcurl.sourceforge.net/) Written by Stephen Nestinger and Jonathan Rogado
Cocoa: [BBHTTP](https://github.com/biasedbit/BBHTTP) written by Bruno de Carvalho
[curlhandle](https://github.com/karelia/curlhandle) Written by Dan Wood
Clojure: [clj-curl](https://github.com/lsevero/clj-curl) by Lucas Severo
[D](https://dlang.org/library/std/net/curl.html) Written by Kenneth Bogert
[Delphi](https://github.com/Mercury13/curl4delphi) Written by Mikhail Merkuryev
[Dylan](https://dylanlibs.sourceforge.net/) Written by Chris Double
[Eiffel](https://iron.eiffel.com/repository/20.11/package/ABEF6975-37AC-45FD-9C67-52D10BA0669B) Written by Eiffel Software
[Euphoria](https://web.archive.org/web/20050204080544/rays-web.com/eulibcurl.htm) Written by Ray Smith
[Falcon](http://www.falconpl.org/project_docs/curl/)
[Ferite](https://web.archive.org/web/20150102192018/ferite.org/) Written by Paul Querna
[Fortran](https://github.com/interkosmos/fortran-curl) Written by Philipp Engel
[Gambas](https://gambas.sourceforge.net/)
[glib/GTK+](https://web.archive.org/web/20100526203452/atterer.net/glibcurl) Written by Richard Atterer
Go: [go-curl](https://github.com/andelf/go-curl) by ShuYu Wang
[Guile](https://github.com/spk121/guile-curl) Written by Michael L. Gran
[Harbour](https://github.com/vszakats/hb/tree/main/contrib/hbcurl) Written by Viktor Szakats
[Haskell](https://hackage.haskell.org/package/curl) Written by Galois, Inc
[Hollywood](https://www.hollywood-mal.com/download.html) hURL by Andreas Falkenhahn
[Java](https://github.com/covers1624/curl4j)
[Julia](https://github.com/JuliaWeb/LibCURL.jl) Written by Amit Murthy
[Katipo](https://github.com/puzza007/katipo) is an Erlang HTTP library around libcurl.
[Lisp](https://common-lisp.net/project/cl-curl/) Written by Liam Healy
Lua: [luacurl](https://web.archive.org/web/20201205052437/luacurl.luaforge.net/) by Alexander Marinov, [Lua-cURL](https://github.com/Lua-cURL) by Jürgen Hötzel
[Mono](https://web.archive.org/web/20070606064500/https://forge.novell.com/modules/xfmod/project/?libcurl-mono) Written by Jeffrey Phillips
[.NET](https://sourceforge.net/projects/libcurl-net/) libcurl-net by Jeffrey Phillips
[Nim](https://nimble.directory/pkg/libcurl) wrapper for libcurl
[node.js](https://github.com/JCMais/node-libcurl) node-libcurl by Jonathan Cardoso Machado
[Object-Pascal](https://web.archive.org/web/20020610214926/www.tekool.com/opcurl) Free Pascal, Delphi and Kylix binding written by Christophe Espern.
[OCaml](https://opam.ocaml.org/packages/ocurl/) Written by Lars Nilsson and ygrek
[Pascal](https://web.archive.org/web/20030804091414/houston.quik.com/jkp/curlpas/) Free Pascal, Delphi and Kylix binding written by Jeffrey Pohlmeyer.
Perl: [WWW::Curl](https://github.com/szbalint/WWW--Curl) Maintained by Cris
Bailiff and Bálint Szilakszi,
[perl6-net-curl](https://github.com/azawawi/perl6-net-curl) by Ahmad M. Zawawi
[Net::Curl](https://metacpan.org/pod/Net::Curl) by Przemyslaw Iskra
[PHP](https://php.net/curl) Originally written by Sterling Hughes
[PostgreSQL](https://github.com/pramsey/pgsql-http) - HTTP client for PostgreSQL
[PostgreSQL](https://github.com/RekGRpth/pg_curl) - cURL client for PostgreSQL
[PureBasic](https://www.purebasic.com/documentation/http/index.html) uses libcurl in its "native" HTTP subsystem
[Python](http://pycurl.io/) PycURL by Kjetil Jacobsen
[Python](https://pypi.org/project/pymcurl/) mcurl by Ganesh Viswanathan
[Q](https://q-lang.sourceforge.net/) The libcurl module is part of the default install
[R](https://cran.r-project.org/package=curl)
[Rexx](https://rexxcurl.sourceforge.net/) Written by Mark Hessling
[Ring](https://ring-lang.sourceforge.io/doc1.3/libcurl.html) RingLibCurl by Mahmoud Fayed
RPG, support for ILE/RPG on OS/400 is included in source distribution
Ruby: [curb](https://github.com/taf2/curb) written by Ross Bamford,
[ruby-curl-multi](https://github.com/kball/curl_multi.rb) by Kristjan Petursson and Keith Rarick
[Rust](https://github.com/alexcrichton/curl-rust) curl-rust - by Carl Lerche
[Scheme](https://www.metapaper.net/lisovsky/web/curl/) Bigloo binding by Kirill Lisovsky
[Scilab](https://help.scilab.org/docs/current/fr_FR/getURL.html) binding by Sylvestre Ledru
[S-Lang](https://www.jedsoft.org/slang/modules/curl.html) by John E Davis
[Smalltalk](https://www.squeaksource.com/CurlPlugin/) Written by Danil Osipchuk
[SP-Forth](https://sourceforge.net/p/spf/spf/ci/master/tree/devel/~ac/lib/lin/curl/) Written by Andrey Cherezov
[SPL](https://web.archive.org/web/20210203022158/www.clifford.at/spl/spldoc/curl.html) Written by Clifford Wolf
[Tcl](https://web.archive.org/web/20160826011806/mirror.yellow5.com/tclcurl/) Tclcurl by Andrés García
[Vibe](https://github.com/ttytm/vibe) HTTP requests through libcurl in V
[Visual Basic](https://sourceforge.net/projects/libcurl-vb/) libcurl-vb by Jeffrey Phillips
[Visual Foxpro](https://web.archive.org/web/20130730181523/www.ctl32.com.ar/libcurl.asp) by Carlos Alloatti
[wxWidgets](https://wxcode.sourceforge.net/components/wxcurl/) Written by Casey O'Donnell
[XBLite](https://web.archive.org/web/20060426150418/perso.wanadoo.fr/xblite/libraries.html) Written by David Szafranski
[Xojo](https://github.com/charonn0/RB-libcURL) Written by Andrew Lambert
[Zig](https://github.com/jiacai2050/zig-curl) Written by Jiacai Liu, both easy and multi API are supported.

94
deps/curl/docs/BUG-BOUNTY.md vendored Normal file
View File

@ -0,0 +1,94 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# The curl bug bounty
The curl project runs a bug bounty program in association with
[HackerOne](https://www.hackerone.com) and the [Internet Bug
Bounty](https://internetbugbounty.org).
## How does it work?
Start out by posting your suspected security vulnerability directly to [curl's
HackerOne program](https://hackerone.com/curl).
After you have reported a security issue, it has been deemed credible, and a
patch and advisory have been made public, you may be eligible for a bounty from
this program. See the [Security Process](https://curl.se/dev/secprocess.html)
document for how we work with security issues.
## What are the reward amounts?
The curl project offers monetary compensation for reported and published
security vulnerabilities. The amount of money that is rewarded depends on how
serious the flaw is determined to be.
Since 2021, the Bug Bounty is managed in association with the Internet Bug
Bounty and they set the reward amounts. If it would turn out that they set
amounts that are way lower than we can accept, the curl project intends to
"top up" rewards.
In 2022, typical "Medium" rated vulnerabilities have been rewarded 2,400 USD
each.
## Who is eligible for a reward?
Everyone and anyone who reports a security problem in a released curl version
that has not already been reported can ask for a bounty.
Dedicated - paid for - security audits that are performed in collaboration
with curl developers are not eligible for bounties.
Vulnerabilities in features that are off by default and documented as
experimental are not eligible for a reward.
The vulnerability has to be fixed and publicly announced (by the curl project)
before a bug bounty is considered.
Once the vulnerability has been published by curl, the researcher can request
their bounty from the [Internet Bug Bounty](https://hackerone.com/ibb).
Bounties need to be requested within twelve months from the publication of the
vulnerability.
The curl security team reserves the right to deny or allow bug bounty payouts
at its own discretion. There is no appeals process.
## Product vulnerabilities only
This bug bounty only concerns the curl and libcurl products and thus their
respective source codes - when running on existing hardware. It does not
include curl documentation, curl websites, or other curl related
infrastructure.
The curl security team is the sole arbiter if a reported flaw is subject to a
bounty or not.
## Third parties
The curl bug bounty does not cover flaws in third party dependencies
(libraries) used by curl or libcurl. If the bug triggers because of curl
behaving wrongly or abusing a third party dependency, the problem is rather in
curl and not in the dependency and then the bounty might cover the problem.
## How are vulnerabilities graded?
The grading of each reported vulnerability that makes a reward claim is
performed by the curl security team. The grading is based on the CVSS (Common
Vulnerability Scoring System) 3.0.
## How are reward amounts determined?
The curl security team gives the vulnerability a score or severity level, as
mentioned above. The actual monetary reward amount is decided and paid by the
Internet Bug Bounty.
## Regarding taxes, etc. on the bounties
In the event that the individual receiving a bug bounty needs to pay taxes on
the reward money, the responsibility lies with the receiver. The curl project
or its security team never actually receive any of this money, hold the money,
or pay out the money.

270
deps/curl/docs/BUGS.md vendored Normal file
View File

@ -0,0 +1,270 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# BUGS
## There are still bugs
curl and libcurl keep being developed. Adding features and changing code
means that bugs sneak in, no matter how hard we try to keep them out.
Of course there are lots of bugs left. Not to mention misfeatures.
To help us make curl the stable and solid product we want it to be, we need
bug reports and bug fixes.
## Where to report
If you cannot fix a bug yourself and submit a fix for it, try to write up as
detailed a report as possible to a curl mailing list to allow one of us to have
a go at a solution. You can optionally also submit your problem in [curl's
bug tracking system](https://github.com/curl/curl/issues).
Please read the rest of this document below first before doing that.
If you feel you need to ask around first, find a suitable [mailing
list](https://curl.se/mail/) and post your questions there.
## Security bugs
If you find a bug or problem in curl or libcurl that you think has a security
impact, for example a bug that can put users in danger or make them
vulnerable if the bug becomes public knowledge, then please report that bug
using our security development process.
Security related bugs or bugs that are suspected to have a security impact,
should be reported on the [curl security tracker at
HackerOne](https://hackerone.com/curl).
This ensures that the report reaches the curl security team so that they
first can deal with the report away from the public to minimize the harm and
impact it has on existing users out there who might be using the vulnerable
versions.
The curl project's process for handling security related issues is
[documented separately](https://curl.se/dev/secprocess.html).
## What to report
When reporting a bug, you should include all information to help us
understand what is wrong, what you expected to happen and how to repeat the
bad behavior. You therefore need to tell us:
- your operating system's name and version number
- what version of curl you are using (`curl -V` is fine)
- versions of the used libraries that libcurl is built to use
- what URL you were working with (if possible), at least which protocol
and anything and everything else you think matters. Tell us what you expected
to happen, tell us what did happen, tell us how you could make it work
another way. Dig around, try out, test. Then include all the tiny bits and
pieces in your report. You benefit from this yourself, as it enables us to
help you quicker and more accurately.
Since curl deals with networks, it often helps us if you include a protocol
debug dump with your bug report. The output you get by using the `-v` or
`--trace` options.
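A hedged sketch of commands that collect this information (the URL and the
trace filename are placeholders):
```sh
curl -V                                                  # curl version and the libraries it uses
curl -v https://example.com/ -o /dev/null                # verbose protocol dump on stderr
curl --trace dump.txt https://example.com/ -o /dev/null  # full trace written to dump.txt
```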
If curl crashed, causing a core dump (in Unix), there is hardly any use in
sending that huge file to any of us. Unless we have the same system setup as
you, we cannot do much with it. Instead, we ask you to get a stack trace and
send that (much smaller) output to us instead.
The address and how to subscribe to the mailing lists are detailed in the
`MANUAL.md` file.
## libcurl problems
When you have written your own application with libcurl to perform transfers,
it is even more important to be specific and detailed when reporting bugs.
Tell us the libcurl version and your operating system. Tell us the name and
version of all relevant sub-components like for example the SSL library
you are using and what name resolving your libcurl uses. If you use SFTP or
SCP, the libssh2 version is relevant etc.
Showing us a real source code example repeating your problem is the best way
to get our attention and it greatly increases our chances to understand your
problem and to work on a fix (if we agree it truly is a problem).
Lots of problems that appear to be libcurl problems are actually just abuses
of the libcurl API or other malfunctions in your applications. It is advised
that you run your problematic program using a memory debug tool like valgrind
or similar before you post memory-related or "crashing" problems to us.
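For example, a minimal valgrind invocation might look like this (the program
name is illustrative):
```sh
# run the libcurl-using application under valgrind before reporting
valgrind --leak-check=full ./my-libcurl-app
```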
## Who fixes the problems
If the problems or bugs you describe are considered to be bugs, we want to
have the problems fixed.
There are no developers in the curl project that are paid to work on bugs.
All developers that take on reported bugs do this on a voluntary basis. We do
it out of an ambition to keep curl and libcurl excellent products and out of
pride.
Please do not assume that you can just lump over something to us and it then
magically gets fixed after some given time. Most often we need feedback and
help to understand what you have experienced and how to repeat a problem.
Then we may only be able to assist YOU to debug the problem and to track down
the proper fix.
We get reports from many people every month and each report can take a
considerable amount of time to really go to the bottom with.
## How to get a stack trace
First, you must make sure that you compile all sources with `-g` and that you
do not 'strip' the final executable. Try to avoid optimizing the code as well,
remove `-O`, `-O2` etc from the compiler options.
Run the program until it cores.
Run your debugger on the core file, like `<debugger> curl core`. `<debugger>`
should be replaced with the name of your debugger, in most cases that is
`gdb`, but `dbx` and others also occur.
When the debugger has finished loading the core file and presents you a
prompt, enter `where` (without quotes) and press return.
The list that is presented is the stack trace. If everything worked, it is
supposed to contain the chain of functions that were called when curl
crashed. Include the stack trace with your detailed bug report, it helps a
lot.
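Put together, the workflow described above might look roughly like this (the
paths and URL are illustrative and the configure invocation depends on your
setup):
```sh
# build with debug symbols and without optimization
./configure CFLAGS="-g -O0"
make
ulimit -c unlimited                # allow a core file to be written
./src/curl https://example.com/    # reproduce the crash so a core file is produced
gdb ./src/curl core                # load the core, then type "where" at the prompt
```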
## Bugs in libcurl bindings
There are of course bugs in libcurl bindings. You should then primarily
approach the team that works on that particular binding and see what you can
do to help them fix the problem.
If you suspect that the problem exists in the underlying libcurl, then please
convert your program over to plain C and follow the steps outlined above.
## Bugs in old versions
The curl project typically releases new versions every other month, and we
fix several hundred bugs per year. For a huge table of releases, number of
bug fixes and more, see: https://curl.se/docs/releases.html
The developers in the curl project do not have bandwidth or energy enough to
maintain several branches or to spend much time on hunting down problems in
old versions when chances are we already fixed them or at least that they have
changed nature and appearance in later versions.
When you experience a problem and want to report it, you really SHOULD
include the version number of the curl you are using when you experience the
issue. If that version number shows us that you are using an out-of-date curl,
you should also try out a modern curl version to see if the problem persists
or how/if it has changed in appearance.
Even if you cannot immediately upgrade your application/system to run the
latest curl version, you can most often at least run a test version or
experimental build or similar, to get this confirmed or not.
At times people insist that they cannot upgrade to a modern curl version, but
instead, they "just want the bug fixed". That is fine, just do not count on us
spending many cycles on trying to identify which single commit, if that is
even possible, that at some point in the past fixed the problem you are now
experiencing.
Security-wise, it is almost always a bad idea to lag behind the current curl
versions by a lot. We keep discovering and reporting security problems
over time, as you can see in [this
table](https://curl.se/docs/vulnerabilities.html).
# Bug fixing procedure
## What happens on first filing
When a new issue is posted in the issue tracker or on the mailing list, the
team of developers first needs to see the report. Maybe they took the day off,
maybe they are off in the woods hunting. Have patience. Allow at least a few
days before expecting someone to have responded.
In the issue tracker, you can expect that some labels are set on the issue to
help categorize it.
## First response
If your issue/bug report was not perfect at once (and few are), chances are
that someone asks follow-up questions. Which version did you use? Which
options did you use? How often does the problem occur? How can we reproduce
this problem? Which protocols does it involve? Or perhaps much more specific
and deep diving questions. It all depends on your specific issue.
You should then respond to these follow-up questions and provide more info
about the problem, so that we can help you figure it out. Or maybe you can
help us figure it out. An active back-and-forth communication is important
and the key for finding a cure and landing a fix.
## Not reproducible
We may require further work from you, the person who actually sees or
experiences the problem, if we cannot reproduce it and cannot understand it
even after having
gotten all the info we need and having studied the source code over again.
## Unresponsive
If the problem has not been understood or reproduced, and there is nobody
responding to follow-up questions or questions asking for clarifications or
for discussing possible ways to move forward with the task, we take that as a
strong suggestion that the bug is unimportant.
Unimportant issues are closed as inactive sooner or later as they cannot be
fixed. The inactivity period (waiting for responses) should not be shorter
than two weeks but may extend to months.
## Lack of time/interest
Bugs that are filed and are understood can unfortunately end up in the
"nobody cares enough about it to work on it" category. Such bugs are
perfectly valid problems that *should* get fixed but apparently are not. We
try to mark such bugs as `KNOWN_BUGS material` after a time of inactivity and
if no activity is noticed after yet some time those bugs are added to the
`KNOWN_BUGS` document and are closed in the issue tracker.
## `KNOWN_BUGS`
This is a list of known bugs. Bugs we know exist and that have been pointed
out but that have not yet been fixed. The reasons for why they have not been
fixed can involve anything really, but the primary reason is that nobody has
considered these problems to be important enough to spend the necessary time
and effort to have them fixed.
The `KNOWN_BUGS` items are always up for grabs and we love the ones who bring
one of them back to life and offer solutions to them.
The `KNOWN_BUGS` document has a sibling document known as `TODO`.
## `TODO`
Issues that are filed or reported that are not really bugs but more missing
features or ideas for future improvements and so on are marked as
*enhancement* or *feature-request* and get added to the `TODO` document and
the issues are closed. We do not keep TODO items open in the issue tracker.
The `TODO` document is full of ideas and suggestions of what we can add or
fix one day. You are always encouraged and free to grab one of those items and
take up a discussion with the curl development team on how that could be
implemented or provided in the project so that you can work on ticking it off
that document.
If an issue is rather a bug and not a missing feature or functionality, it is
listed in `KNOWN_BUGS` instead.
## Closing off stalled bugs
The [issue and pull request trackers](https://github.com/curl/curl) only hold
"active" entries open (using a non-precise definition of what active actually
is, but they are at least not completely dead). Those that are abandoned or
in other ways dormant are closed and sometimes added to `TODO` and
`KNOWN_BUGS` instead.
This way, we only have "active" issues open on GitHub. Irrelevant issues and
pull requests do not distract developers or casual visitors.

336
deps/curl/docs/CIPHERS-TLS12.md vendored Normal file
View File

@ -0,0 +1,336 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# TLS 1.2 cipher suites
| Id | IANA name | OpenSSL name | RFC |
|--------|-----------------------------------------------|------------------------------------|--------------------|
| 0x0001 | TLS_RSA_WITH_NULL_MD5 | NULL-MD5 | [RFC5246] |
| 0x0002 | TLS_RSA_WITH_NULL_SHA | NULL-SHA | [RFC5246] |
| 0x0003 | TLS_RSA_EXPORT_WITH_RC4_40_MD5 | EXP-RC4-MD5 | [RFC4346][RFC6347] |
| 0x0004 | TLS_RSA_WITH_RC4_128_MD5 | RC4-MD5 | [RFC5246][RFC6347] |
| 0x0005 | TLS_RSA_WITH_RC4_128_SHA | RC4-SHA | [RFC5246][RFC6347] |
| 0x0006 | TLS_RSA_EXPORT_WITH_RC2_CBC_40_MD5 | EXP-RC2-CBC-MD5 | [RFC4346] |
| 0x0007 | TLS_RSA_WITH_IDEA_CBC_SHA | IDEA-CBC-SHA | [RFC8996] |
| 0x0008 | TLS_RSA_EXPORT_WITH_DES40_CBC_SHA | EXP-DES-CBC-SHA | [RFC4346] |
| 0x0009 | TLS_RSA_WITH_DES_CBC_SHA | DES-CBC-SHA | [RFC8996] |
| 0x000A | TLS_RSA_WITH_3DES_EDE_CBC_SHA | DES-CBC3-SHA | [RFC5246] |
| 0x000B | TLS_DH_DSS_EXPORT_WITH_DES40_CBC_SHA | EXP-DH-DSS-DES-CBC-SHA | [RFC4346] |
| 0x000C | TLS_DH_DSS_WITH_DES_CBC_SHA | DH-DSS-DES-CBC-SHA | [RFC8996] |
| 0x000D | TLS_DH_DSS_WITH_3DES_EDE_CBC_SHA | DH-DSS-DES-CBC3-SHA | [RFC5246] |
| 0x000E | TLS_DH_RSA_EXPORT_WITH_DES40_CBC_SHA | EXP-DH-RSA-DES-CBC-SHA | [RFC4346] |
| 0x000F | TLS_DH_RSA_WITH_DES_CBC_SHA | DH-RSA-DES-CBC-SHA | [RFC8996] |
| 0x0010 | TLS_DH_RSA_WITH_3DES_EDE_CBC_SHA | DH-RSA-DES-CBC3-SHA | [RFC5246] |
| 0x0011 | TLS_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA | EXP-DHE-DSS-DES-CBC-SHA | [RFC4346] |
| 0x0012 | TLS_DHE_DSS_WITH_DES_CBC_SHA | DHE-DSS-DES-CBC-SHA | [RFC8996] |
| 0x0013 | TLS_DHE_DSS_WITH_3DES_EDE_CBC_SHA | DHE-DSS-DES-CBC3-SHA | [RFC5246] |
| 0x0014 | TLS_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA | EXP-DHE-RSA-DES-CBC-SHA | [RFC4346] |
| 0x0015 | TLS_DHE_RSA_WITH_DES_CBC_SHA | DHE-RSA-DES-CBC-SHA | [RFC8996] |
| 0x0016 | TLS_DHE_RSA_WITH_3DES_EDE_CBC_SHA | DHE-RSA-DES-CBC3-SHA | [RFC5246] |
| 0x0017 | TLS_DH_anon_EXPORT_WITH_RC4_40_MD5 | EXP-ADH-RC4-MD5 | [RFC4346][RFC6347] |
| 0x0018 | TLS_DH_anon_WITH_RC4_128_MD5 | ADH-RC4-MD5 | [RFC5246][RFC6347] |
| 0x0019 | TLS_DH_anon_EXPORT_WITH_DES40_CBC_SHA | EXP-ADH-DES-CBC-SHA | [RFC4346] |
| 0x001A | TLS_DH_anon_WITH_DES_CBC_SHA | ADH-DES-CBC-SHA | [RFC8996] |
| 0x001B | TLS_DH_anon_WITH_3DES_EDE_CBC_SHA | ADH-DES-CBC3-SHA | [RFC5246] |
| 0x001C | | FZA-NULL-SHA | |
| 0x001D | | FZA-FZA-CBC-SHA | |
| 0x001E | TLS_KRB5_WITH_DES_CBC_SHA | KRB5-DES-CBC-SHA | [RFC2712] |
| 0x001F | TLS_KRB5_WITH_3DES_EDE_CBC_SHA | KRB5-DES-CBC3-SHA | [RFC2712] |
| 0x0020 | TLS_KRB5_WITH_RC4_128_SHA | KRB5-RC4-SHA | [RFC2712][RFC6347] |
| 0x0021 | TLS_KRB5_WITH_IDEA_CBC_SHA | KRB5-IDEA-CBC-SHA | [RFC2712] |
| 0x0022 | TLS_KRB5_WITH_DES_CBC_MD5 | KRB5-DES-CBC-MD5 | [RFC2712] |
| 0x0023 | TLS_KRB5_WITH_3DES_EDE_CBC_MD5 | KRB5-DES-CBC3-MD5 | [RFC2712] |
| 0x0024 | TLS_KRB5_WITH_RC4_128_MD5 | KRB5-RC4-MD5 | [RFC2712][RFC6347] |
| 0x0025 | TLS_KRB5_WITH_IDEA_CBC_MD5 | KRB5-IDEA-CBC-MD5 | [RFC2712] |
| 0x0026 | TLS_KRB5_EXPORT_WITH_DES_CBC_40_SHA | EXP-KRB5-DES-CBC-SHA | [RFC2712] |
| 0x0027 | TLS_KRB5_EXPORT_WITH_RC2_CBC_40_SHA | EXP-KRB5-RC2-CBC-SHA | [RFC2712] |
| 0x0028 | TLS_KRB5_EXPORT_WITH_RC4_40_SHA | EXP-KRB5-RC4-SHA | [RFC2712][RFC6347] |
| 0x0029 | TLS_KRB5_EXPORT_WITH_DES_CBC_40_MD5 | EXP-KRB5-DES-CBC-MD5 | [RFC2712] |
| 0x002A | TLS_KRB5_EXPORT_WITH_RC2_CBC_40_MD5 | EXP-KRB5-RC2-CBC-MD5 | [RFC2712] |
| 0x002B | TLS_KRB5_EXPORT_WITH_RC4_40_MD5 | EXP-KRB5-RC4-MD5 | [RFC2712][RFC6347] |
| 0x002C | TLS_PSK_WITH_NULL_SHA | PSK-NULL-SHA | [RFC4785] |
| 0x002D | TLS_DHE_PSK_WITH_NULL_SHA | DHE-PSK-NULL-SHA | [RFC4785] |
| 0x002E | TLS_RSA_PSK_WITH_NULL_SHA | RSA-PSK-NULL-SHA | [RFC4785] |
| 0x002F | TLS_RSA_WITH_AES_128_CBC_SHA | AES128-SHA | [RFC5246] |
| 0x0030 | TLS_DH_DSS_WITH_AES_128_CBC_SHA | DH-DSS-AES128-SHA | [RFC5246] |
| 0x0031 | TLS_DH_RSA_WITH_AES_128_CBC_SHA | DH-RSA-AES128-SHA | [RFC5246] |
| 0x0032 | TLS_DHE_DSS_WITH_AES_128_CBC_SHA | DHE-DSS-AES128-SHA | [RFC5246] |
| 0x0033 | TLS_DHE_RSA_WITH_AES_128_CBC_SHA | DHE-RSA-AES128-SHA | [RFC5246] |
| 0x0034 | TLS_DH_anon_WITH_AES_128_CBC_SHA | ADH-AES128-SHA | [RFC5246] |
| 0x0035 | TLS_RSA_WITH_AES_256_CBC_SHA | AES256-SHA | [RFC5246] |
| 0x0036 | TLS_DH_DSS_WITH_AES_256_CBC_SHA | DH-DSS-AES256-SHA | [RFC5246] |
| 0x0037 | TLS_DH_RSA_WITH_AES_256_CBC_SHA | DH-RSA-AES256-SHA | [RFC5246] |
| 0x0038 | TLS_DHE_DSS_WITH_AES_256_CBC_SHA | DHE-DSS-AES256-SHA | [RFC5246] |
| 0x0039 | TLS_DHE_RSA_WITH_AES_256_CBC_SHA | DHE-RSA-AES256-SHA | [RFC5246] |
| 0x003A | TLS_DH_anon_WITH_AES_256_CBC_SHA | ADH-AES256-SHA | [RFC5246] |
| 0x003B | TLS_RSA_WITH_NULL_SHA256 | NULL-SHA256 | [RFC5246] |
| 0x003C | TLS_RSA_WITH_AES_128_CBC_SHA256 | AES128-SHA256 | [RFC5246] |
| 0x003D | TLS_RSA_WITH_AES_256_CBC_SHA256 | AES256-SHA256 | [RFC5246] |
| 0x003E | TLS_DH_DSS_WITH_AES_128_CBC_SHA256 | DH-DSS-AES128-SHA256 | [RFC5246] |
| 0x003F | TLS_DH_RSA_WITH_AES_128_CBC_SHA256 | DH-RSA-AES128-SHA256 | [RFC5246] |
| 0x0040 | TLS_DHE_DSS_WITH_AES_128_CBC_SHA256 | DHE-DSS-AES128-SHA256 | [RFC5246] |
| 0x0041 | TLS_RSA_WITH_CAMELLIA_128_CBC_SHA | CAMELLIA128-SHA | [RFC5932] |
| 0x0042 | TLS_DH_DSS_WITH_CAMELLIA_128_CBC_SHA | DH-DSS-CAMELLIA128-SHA | [RFC5932] |
| 0x0043 | TLS_DH_RSA_WITH_CAMELLIA_128_CBC_SHA | DH-RSA-CAMELLIA128-SHA | [RFC5932] |
| 0x0044 | TLS_DHE_DSS_WITH_CAMELLIA_128_CBC_SHA | DHE-DSS-CAMELLIA128-SHA | [RFC5932] |
| 0x0045 | TLS_DHE_RSA_WITH_CAMELLIA_128_CBC_SHA | DHE-RSA-CAMELLIA128-SHA | [RFC5932] |
| 0x0046 | TLS_DH_anon_WITH_CAMELLIA_128_CBC_SHA | ADH-CAMELLIA128-SHA | [RFC5932] |
| 0x0060 | | EXP1024-RC4-MD5 | |
| 0x0061 | | EXP1024-RC2-CBC-MD5 | |
| 0x0062 | | EXP1024-DES-CBC-SHA | |
| 0x0063 | | EXP1024-DHE-DSS-DES-CBC-SHA | |
| 0x0064 | | EXP1024-RC4-SHA | |
| 0x0065 | | EXP1024-DHE-DSS-RC4-SHA | |
| 0x0066 | | DHE-DSS-RC4-SHA | |
| 0x0067 | TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 | DHE-RSA-AES128-SHA256 | [RFC5246] |
| 0x0068 | TLS_DH_DSS_WITH_AES_256_CBC_SHA256 | DH-DSS-AES256-SHA256 | [RFC5246] |
| 0x0069 | TLS_DH_RSA_WITH_AES_256_CBC_SHA256 | DH-RSA-AES256-SHA256 | [RFC5246] |
| 0x006A | TLS_DHE_DSS_WITH_AES_256_CBC_SHA256 | DHE-DSS-AES256-SHA256 | [RFC5246] |
| 0x006B | TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 | DHE-RSA-AES256-SHA256 | [RFC5246] |
| 0x006C | TLS_DH_anon_WITH_AES_128_CBC_SHA256 | ADH-AES128-SHA256 | [RFC5246] |
| 0x006D | TLS_DH_anon_WITH_AES_256_CBC_SHA256 | ADH-AES256-SHA256 | [RFC5246] |
| 0x0080 | | GOST94-GOST89-GOST89 | |
| 0x0081 | | GOST2001-GOST89-GOST89 | |
| 0x0082 | | GOST94-NULL-GOST94 | |
| 0x0083 | | GOST2001-NULL-GOST94 | |
| 0x0084 | TLS_RSA_WITH_CAMELLIA_256_CBC_SHA | CAMELLIA256-SHA | [RFC5932] |
| 0x0085 | TLS_DH_DSS_WITH_CAMELLIA_256_CBC_SHA | DH-DSS-CAMELLIA256-SHA | [RFC5932] |
| 0x0086 | TLS_DH_RSA_WITH_CAMELLIA_256_CBC_SHA | DH-RSA-CAMELLIA256-SHA | [RFC5932] |
| 0x0087 | TLS_DHE_DSS_WITH_CAMELLIA_256_CBC_SHA | DHE-DSS-CAMELLIA256-SHA | [RFC5932] |
| 0x0088 | TLS_DHE_RSA_WITH_CAMELLIA_256_CBC_SHA | DHE-RSA-CAMELLIA256-SHA | [RFC5932] |
| 0x0089 | TLS_DH_anon_WITH_CAMELLIA_256_CBC_SHA | ADH-CAMELLIA256-SHA | [RFC5932] |
| 0x008A | TLS_PSK_WITH_RC4_128_SHA | PSK-RC4-SHA | [RFC4279][RFC6347] |
| 0x008B | TLS_PSK_WITH_3DES_EDE_CBC_SHA | PSK-3DES-EDE-CBC-SHA | [RFC4279] |
| 0x008C | TLS_PSK_WITH_AES_128_CBC_SHA | PSK-AES128-CBC-SHA | [RFC4279] |
| 0x008D | TLS_PSK_WITH_AES_256_CBC_SHA | PSK-AES256-CBC-SHA | [RFC4279] |
| 0x008E | TLS_DHE_PSK_WITH_RC4_128_SHA | DHE-PSK-RC4-SHA | [RFC4279][RFC6347] |
| 0x008F | TLS_DHE_PSK_WITH_3DES_EDE_CBC_SHA | DHE-PSK-3DES-EDE-CBC-SHA | [RFC4279] |
| 0x0090 | TLS_DHE_PSK_WITH_AES_128_CBC_SHA | DHE-PSK-AES128-CBC-SHA | [RFC4279] |
| 0x0091 | TLS_DHE_PSK_WITH_AES_256_CBC_SHA | DHE-PSK-AES256-CBC-SHA | [RFC4279] |
| 0x0092 | TLS_RSA_PSK_WITH_RC4_128_SHA | RSA-PSK-RC4-SHA | [RFC4279][RFC6347] |
| 0x0093 | TLS_RSA_PSK_WITH_3DES_EDE_CBC_SHA | RSA-PSK-3DES-EDE-CBC-SHA | [RFC4279] |
| 0x0094 | TLS_RSA_PSK_WITH_AES_128_CBC_SHA | RSA-PSK-AES128-CBC-SHA | [RFC4279] |
| 0x0095 | TLS_RSA_PSK_WITH_AES_256_CBC_SHA | RSA-PSK-AES256-CBC-SHA | [RFC4279] |
| 0x0096 | TLS_RSA_WITH_SEED_CBC_SHA | SEED-SHA | [RFC4162] |
| 0x0097 | TLS_DH_DSS_WITH_SEED_CBC_SHA | DH-DSS-SEED-SHA | [RFC4162] |
| 0x0098 | TLS_DH_RSA_WITH_SEED_CBC_SHA | DH-RSA-SEED-SHA | [RFC4162] |
| 0x0099 | TLS_DHE_DSS_WITH_SEED_CBC_SHA | DHE-DSS-SEED-SHA | [RFC4162] |
| 0x009A | TLS_DHE_RSA_WITH_SEED_CBC_SHA | DHE-RSA-SEED-SHA | [RFC4162] |
| 0x009B | TLS_DH_anon_WITH_SEED_CBC_SHA | ADH-SEED-SHA | [RFC4162] |
| 0x009C | TLS_RSA_WITH_AES_128_GCM_SHA256 | AES128-GCM-SHA256 | [RFC5288] |
| 0x009D | TLS_RSA_WITH_AES_256_GCM_SHA384 | AES256-GCM-SHA384 | [RFC5288] |
| 0x009E | TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 | DHE-RSA-AES128-GCM-SHA256 | [RFC5288] |
| 0x009F | TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 | DHE-RSA-AES256-GCM-SHA384 | [RFC5288] |
| 0x00A0 | TLS_DH_RSA_WITH_AES_128_GCM_SHA256 | DH-RSA-AES128-GCM-SHA256 | [RFC5288] |
| 0x00A1 | TLS_DH_RSA_WITH_AES_256_GCM_SHA384 | DH-RSA-AES256-GCM-SHA384 | [RFC5288] |
| 0x00A2 | TLS_DHE_DSS_WITH_AES_128_GCM_SHA256 | DHE-DSS-AES128-GCM-SHA256 | [RFC5288] |
| 0x00A3 | TLS_DHE_DSS_WITH_AES_256_GCM_SHA384 | DHE-DSS-AES256-GCM-SHA384 | [RFC5288] |
| 0x00A4 | TLS_DH_DSS_WITH_AES_128_GCM_SHA256 | DH-DSS-AES128-GCM-SHA256 | [RFC5288] |
| 0x00A5 | TLS_DH_DSS_WITH_AES_256_GCM_SHA384 | DH-DSS-AES256-GCM-SHA384 | [RFC5288] |
| 0x00A6 | TLS_DH_anon_WITH_AES_128_GCM_SHA256 | ADH-AES128-GCM-SHA256 | [RFC5288] |
| 0x00A7 | TLS_DH_anon_WITH_AES_256_GCM_SHA384 | ADH-AES256-GCM-SHA384 | [RFC5288] |
| 0x00A8 | TLS_PSK_WITH_AES_128_GCM_SHA256 | PSK-AES128-GCM-SHA256 | [RFC5487] |
| 0x00A9 | TLS_PSK_WITH_AES_256_GCM_SHA384 | PSK-AES256-GCM-SHA384 | [RFC5487] |
| 0x00AA | TLS_DHE_PSK_WITH_AES_128_GCM_SHA256 | DHE-PSK-AES128-GCM-SHA256 | [RFC5487] |
| 0x00AB | TLS_DHE_PSK_WITH_AES_256_GCM_SHA384 | DHE-PSK-AES256-GCM-SHA384 | [RFC5487] |
| 0x00AC | TLS_RSA_PSK_WITH_AES_128_GCM_SHA256 | RSA-PSK-AES128-GCM-SHA256 | [RFC5487] |
| 0x00AD | TLS_RSA_PSK_WITH_AES_256_GCM_SHA384 | RSA-PSK-AES256-GCM-SHA384 | [RFC5487] |
| 0x00AE | TLS_PSK_WITH_AES_128_CBC_SHA256 | PSK-AES128-CBC-SHA256 | [RFC5487] |
| 0x00AF | TLS_PSK_WITH_AES_256_CBC_SHA384 | PSK-AES256-CBC-SHA384 | [RFC5487] |
| 0x00B0 | TLS_PSK_WITH_NULL_SHA256 | PSK-NULL-SHA256 | [RFC5487] |
| 0x00B1 | TLS_PSK_WITH_NULL_SHA384 | PSK-NULL-SHA384 | [RFC5487] |
| 0x00B2 | TLS_DHE_PSK_WITH_AES_128_CBC_SHA256 | DHE-PSK-AES128-CBC-SHA256 | [RFC5487] |
| 0x00B3 | TLS_DHE_PSK_WITH_AES_256_CBC_SHA384 | DHE-PSK-AES256-CBC-SHA384 | [RFC5487] |
| 0x00B4 | TLS_DHE_PSK_WITH_NULL_SHA256 | DHE-PSK-NULL-SHA256 | [RFC5487] |
| 0x00B5 | TLS_DHE_PSK_WITH_NULL_SHA384 | DHE-PSK-NULL-SHA384 | [RFC5487] |
| 0x00B6 | TLS_RSA_PSK_WITH_AES_128_CBC_SHA256 | RSA-PSK-AES128-CBC-SHA256 | [RFC5487] |
| 0x00B7 | TLS_RSA_PSK_WITH_AES_256_CBC_SHA384 | RSA-PSK-AES256-CBC-SHA384 | [RFC5487] |
| 0x00B8 | TLS_RSA_PSK_WITH_NULL_SHA256 | RSA-PSK-NULL-SHA256 | [RFC5487] |
| 0x00B9 | TLS_RSA_PSK_WITH_NULL_SHA384 | RSA-PSK-NULL-SHA384 | [RFC5487] |
| 0x00BA | TLS_RSA_WITH_CAMELLIA_128_CBC_SHA256 | CAMELLIA128-SHA256 | [RFC5932] |
| 0x00BD | TLS_DHE_DSS_WITH_CAMELLIA_128_CBC_SHA256 | DHE-DSS-CAMELLIA128-SHA256 | [RFC5932] |
| 0x00BE | TLS_DHE_RSA_WITH_CAMELLIA_128_CBC_SHA256 | DHE-RSA-CAMELLIA128-SHA256 | [RFC5932] |
| 0x00BF | TLS_DH_anon_WITH_CAMELLIA_128_CBC_SHA256 | ADH-CAMELLIA128-SHA256 | [RFC5932] |
| 0x00C0 | TLS_RSA_WITH_CAMELLIA_256_CBC_SHA256 | CAMELLIA256-SHA256 | [RFC5932] |
| 0x00C3 | TLS_DHE_DSS_WITH_CAMELLIA_256_CBC_SHA256 | DHE-DSS-CAMELLIA256-SHA256 | [RFC5932] |
| 0x00C4 | TLS_DHE_RSA_WITH_CAMELLIA_256_CBC_SHA256 | DHE-RSA-CAMELLIA256-SHA256 | [RFC5932] |
| 0x00C5 | TLS_DH_anon_WITH_CAMELLIA_256_CBC_SHA256 | ADH-CAMELLIA256-SHA256 | [RFC5932] |
| 0x00FF | TLS_EMPTY_RENEGOTIATION_INFO_SCSV | | [RFC5746] |
| 0x5600 | TLS_FALLBACK_SCSV | | [RFC7507] |
| 0xC001 | TLS_ECDH_ECDSA_WITH_NULL_SHA | ECDH-ECDSA-NULL-SHA | [RFC8422] |
| 0xC002 | TLS_ECDH_ECDSA_WITH_RC4_128_SHA | ECDH-ECDSA-RC4-SHA | [RFC8422][RFC6347] |
| 0xC003 | TLS_ECDH_ECDSA_WITH_3DES_EDE_CBC_SHA | ECDH-ECDSA-DES-CBC3-SHA | [RFC8422] |
| 0xC004 | TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA | ECDH-ECDSA-AES128-SHA | [RFC8422] |
| 0xC005 | TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA | ECDH-ECDSA-AES256-SHA | [RFC8422] |
| 0xC006 | TLS_ECDHE_ECDSA_WITH_NULL_SHA | ECDHE-ECDSA-NULL-SHA | [RFC8422] |
| 0xC007 | TLS_ECDHE_ECDSA_WITH_RC4_128_SHA | ECDHE-ECDSA-RC4-SHA | [RFC8422][RFC6347] |
| 0xC008 | TLS_ECDHE_ECDSA_WITH_3DES_EDE_CBC_SHA | ECDHE-ECDSA-DES-CBC3-SHA | [RFC8422] |
| 0xC009 | TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA | ECDHE-ECDSA-AES128-SHA | [RFC8422] |
| 0xC00A | TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA | ECDHE-ECDSA-AES256-SHA | [RFC8422] |
| 0xC00B | TLS_ECDH_RSA_WITH_NULL_SHA | ECDH-RSA-NULL-SHA | [RFC8422] |
| 0xC00C | TLS_ECDH_RSA_WITH_RC4_128_SHA | ECDH-RSA-RC4-SHA | [RFC8422][RFC6347] |
| 0xC00D | TLS_ECDH_RSA_WITH_3DES_EDE_CBC_SHA | ECDH-RSA-DES-CBC3-SHA | [RFC8422] |
| 0xC00E | TLS_ECDH_RSA_WITH_AES_128_CBC_SHA | ECDH-RSA-AES128-SHA | [RFC8422] |
| 0xC00F | TLS_ECDH_RSA_WITH_AES_256_CBC_SHA | ECDH-RSA-AES256-SHA | [RFC8422] |
| 0xC010 | TLS_ECDHE_RSA_WITH_NULL_SHA | ECDHE-RSA-NULL-SHA | [RFC8422] |
| 0xC011 | TLS_ECDHE_RSA_WITH_RC4_128_SHA | ECDHE-RSA-RC4-SHA | [RFC8422][RFC6347] |
| 0xC012 | TLS_ECDHE_RSA_WITH_3DES_EDE_CBC_SHA | ECDHE-RSA-DES-CBC3-SHA | [RFC8422] |
| 0xC013 | TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA | ECDHE-RSA-AES128-SHA | [RFC8422] |
| 0xC014 | TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA | ECDHE-RSA-AES256-SHA | [RFC8422] |
| 0xC015 | TLS_ECDH_anon_WITH_NULL_SHA | AECDH-NULL-SHA | [RFC8422] |
| 0xC016 | TLS_ECDH_anon_WITH_RC4_128_SHA | AECDH-RC4-SHA | [RFC8422][RFC6347] |
| 0xC017 | TLS_ECDH_anon_WITH_3DES_EDE_CBC_SHA | AECDH-DES-CBC3-SHA | [RFC8422] |
| 0xC018 | TLS_ECDH_anon_WITH_AES_128_CBC_SHA | AECDH-AES128-SHA | [RFC8422] |
| 0xC019 | TLS_ECDH_anon_WITH_AES_256_CBC_SHA | AECDH-AES256-SHA | [RFC8422] |
| 0xC01A | TLS_SRP_SHA_WITH_3DES_EDE_CBC_SHA | SRP-3DES-EDE-CBC-SHA | [RFC5054] |
| 0xC01B | TLS_SRP_SHA_RSA_WITH_3DES_EDE_CBC_SHA | SRP-RSA-3DES-EDE-CBC-SHA | [RFC5054] |
| 0xC01C | TLS_SRP_SHA_DSS_WITH_3DES_EDE_CBC_SHA | SRP-DSS-3DES-EDE-CBC-SHA | [RFC5054] |
| 0xC01D | TLS_SRP_SHA_WITH_AES_128_CBC_SHA | SRP-AES-128-CBC-SHA | [RFC5054] |
| 0xC01E | TLS_SRP_SHA_RSA_WITH_AES_128_CBC_SHA | SRP-RSA-AES-128-CBC-SHA | [RFC5054] |
| 0xC01F | TLS_SRP_SHA_DSS_WITH_AES_128_CBC_SHA | SRP-DSS-AES-128-CBC-SHA | [RFC5054] |
| 0xC020 | TLS_SRP_SHA_WITH_AES_256_CBC_SHA | SRP-AES-256-CBC-SHA | [RFC5054] |
| 0xC021 | TLS_SRP_SHA_RSA_WITH_AES_256_CBC_SHA | SRP-RSA-AES-256-CBC-SHA | [RFC5054] |
| 0xC022 | TLS_SRP_SHA_DSS_WITH_AES_256_CBC_SHA | SRP-DSS-AES-256-CBC-SHA | [RFC5054] |
| 0xC023 | TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 | ECDHE-ECDSA-AES128-SHA256 | [RFC5289] |
| 0xC024 | TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 | ECDHE-ECDSA-AES256-SHA384 | [RFC5289] |
| 0xC025 | TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA256 | ECDH-ECDSA-AES128-SHA256 | [RFC5289] |
| 0xC026 | TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA384 | ECDH-ECDSA-AES256-SHA384 | [RFC5289] |
| 0xC027 | TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 | ECDHE-RSA-AES128-SHA256 | [RFC5289] |
| 0xC028 | TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 | ECDHE-RSA-AES256-SHA384 | [RFC5289] |
| 0xC029 | TLS_ECDH_RSA_WITH_AES_128_CBC_SHA256 | ECDH-RSA-AES128-SHA256 | [RFC5289] |
| 0xC02A | TLS_ECDH_RSA_WITH_AES_256_CBC_SHA384 | ECDH-RSA-AES256-SHA384 | [RFC5289] |
| 0xC02B | TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 | ECDHE-ECDSA-AES128-GCM-SHA256 | [RFC5289] |
| 0xC02C | TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 | ECDHE-ECDSA-AES256-GCM-SHA384 | [RFC5289] |
| 0xC02D | TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256 | ECDH-ECDSA-AES128-GCM-SHA256 | [RFC5289] |
| 0xC02E | TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384 | ECDH-ECDSA-AES256-GCM-SHA384 | [RFC5289] |
| 0xC02F | TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 | ECDHE-RSA-AES128-GCM-SHA256 | [RFC5289] |
| 0xC030 | TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 | ECDHE-RSA-AES256-GCM-SHA384 | [RFC5289] |
| 0xC031 | TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256 | ECDH-RSA-AES128-GCM-SHA256 | [RFC5289] |
| 0xC032 | TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384 | ECDH-RSA-AES256-GCM-SHA384 | [RFC5289] |
| 0xC033 | TLS_ECDHE_PSK_WITH_RC4_128_SHA | ECDHE-PSK-RC4-SHA | [RFC5489][RFC6347] |
| 0xC034 | TLS_ECDHE_PSK_WITH_3DES_EDE_CBC_SHA | ECDHE-PSK-3DES-EDE-CBC-SHA | [RFC5489] |
| 0xC035 | TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA | ECDHE-PSK-AES128-CBC-SHA | [RFC5489] |
| 0xC036 | TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA | ECDHE-PSK-AES256-CBC-SHA | [RFC5489] |
| 0xC037 | TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA256 | ECDHE-PSK-AES128-CBC-SHA256 | [RFC5489] |
| 0xC038 | TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA384 | ECDHE-PSK-AES256-CBC-SHA384 | [RFC5489] |
| 0xC039 | TLS_ECDHE_PSK_WITH_NULL_SHA | ECDHE-PSK-NULL-SHA | [RFC5489] |
| 0xC03A | TLS_ECDHE_PSK_WITH_NULL_SHA256 | ECDHE-PSK-NULL-SHA256 | [RFC5489] |
| 0xC03B | TLS_ECDHE_PSK_WITH_NULL_SHA384 | ECDHE-PSK-NULL-SHA384 | [RFC5489] |
| 0xC03C | TLS_RSA_WITH_ARIA_128_CBC_SHA256 | ARIA128-SHA256 | [RFC6209] |
| 0xC03D | TLS_RSA_WITH_ARIA_256_CBC_SHA384 | ARIA256-SHA384 | [RFC6209] |
| 0xC044 | TLS_DHE_RSA_WITH_ARIA_128_CBC_SHA256 | DHE-RSA-ARIA128-SHA256 | [RFC6209] |
| 0xC045 | TLS_DHE_RSA_WITH_ARIA_256_CBC_SHA384 | DHE-RSA-ARIA256-SHA384 | [RFC6209] |
| 0xC048 | TLS_ECDHE_ECDSA_WITH_ARIA_128_CBC_SHA256 | ECDHE-ECDSA-ARIA128-SHA256 | [RFC6209] |
| 0xC049 | TLS_ECDHE_ECDSA_WITH_ARIA_256_CBC_SHA384 | ECDHE-ECDSA-ARIA256-SHA384 | [RFC6209] |
| 0xC04A | TLS_ECDH_ECDSA_WITH_ARIA_128_CBC_SHA256 | ECDH-ECDSA-ARIA128-SHA256 | [RFC6209] |
| 0xC04B | TLS_ECDH_ECDSA_WITH_ARIA_256_CBC_SHA384 | ECDH-ECDSA-ARIA256-SHA384 | [RFC6209] |
| 0xC04C | TLS_ECDHE_RSA_WITH_ARIA_128_CBC_SHA256 | ECDHE-ARIA128-SHA256 | [RFC6209] |
| 0xC04D | TLS_ECDHE_RSA_WITH_ARIA_256_CBC_SHA384 | ECDHE-ARIA256-SHA384 | [RFC6209] |
| 0xC04E | TLS_ECDH_RSA_WITH_ARIA_128_CBC_SHA256 | ECDH-ARIA128-SHA256 | [RFC6209] |
| 0xC04F | TLS_ECDH_RSA_WITH_ARIA_256_CBC_SHA384 | ECDH-ARIA256-SHA384 | [RFC6209] |
| 0xC050 | TLS_RSA_WITH_ARIA_128_GCM_SHA256 | ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC051 | TLS_RSA_WITH_ARIA_256_GCM_SHA384 | ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC052 | TLS_DHE_RSA_WITH_ARIA_128_GCM_SHA256 | DHE-RSA-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC053 | TLS_DHE_RSA_WITH_ARIA_256_GCM_SHA384 | DHE-RSA-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC056 | TLS_DHE_DSS_WITH_ARIA_128_GCM_SHA256 | DHE-DSS-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC057 | TLS_DHE_DSS_WITH_ARIA_256_GCM_SHA384 | DHE-DSS-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC05C | TLS_ECDHE_ECDSA_WITH_ARIA_128_GCM_SHA256 | ECDHE-ECDSA-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC05D | TLS_ECDHE_ECDSA_WITH_ARIA_256_GCM_SHA384 | ECDHE-ECDSA-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC05E | TLS_ECDH_ECDSA_WITH_ARIA_128_GCM_SHA256 | ECDH-ECDSA-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC05F | TLS_ECDH_ECDSA_WITH_ARIA_256_GCM_SHA384 | ECDH-ECDSA-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC060 | TLS_ECDHE_RSA_WITH_ARIA_128_GCM_SHA256 | ECDHE-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC061 | TLS_ECDHE_RSA_WITH_ARIA_256_GCM_SHA384 | ECDHE-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC062 | TLS_ECDH_RSA_WITH_ARIA_128_GCM_SHA256 | ECDH-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC063 | TLS_ECDH_RSA_WITH_ARIA_256_GCM_SHA384 | ECDH-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC064 | TLS_PSK_WITH_ARIA_128_CBC_SHA256 | PSK-ARIA128-SHA256 | [RFC6209] |
| 0xC065 | TLS_PSK_WITH_ARIA_256_CBC_SHA384 | PSK-ARIA256-SHA384 | [RFC6209] |
| 0xC066 | TLS_DHE_PSK_WITH_ARIA_128_CBC_SHA256 | DHE-PSK-ARIA128-SHA256 | [RFC6209] |
| 0xC067 | TLS_DHE_PSK_WITH_ARIA_256_CBC_SHA384 | DHE-PSK-ARIA256-SHA384 | [RFC6209] |
| 0xC068 | TLS_RSA_PSK_WITH_ARIA_128_CBC_SHA256 | RSA-PSK-ARIA128-SHA256 | [RFC6209] |
| 0xC069 | TLS_RSA_PSK_WITH_ARIA_256_CBC_SHA384 | RSA-PSK-ARIA256-SHA384 | [RFC6209] |
| 0xC06A | TLS_PSK_WITH_ARIA_128_GCM_SHA256 | PSK-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC06B | TLS_PSK_WITH_ARIA_256_GCM_SHA384 | PSK-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC06C | TLS_DHE_PSK_WITH_ARIA_128_GCM_SHA256 | DHE-PSK-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC06D | TLS_DHE_PSK_WITH_ARIA_256_GCM_SHA384 | DHE-PSK-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC06E | TLS_RSA_PSK_WITH_ARIA_128_GCM_SHA256 | RSA-PSK-ARIA128-GCM-SHA256 | [RFC6209] |
| 0xC06F | TLS_RSA_PSK_WITH_ARIA_256_GCM_SHA384 | RSA-PSK-ARIA256-GCM-SHA384 | [RFC6209] |
| 0xC070 | TLS_ECDHE_PSK_WITH_ARIA_128_CBC_SHA256 | ECDHE-PSK-ARIA128-SHA256 | [RFC6209] |
| 0xC071 | TLS_ECDHE_PSK_WITH_ARIA_256_CBC_SHA384 | ECDHE-PSK-ARIA256-SHA384 | [RFC6209] |
| 0xC072 | TLS_ECDHE_ECDSA_WITH_CAMELLIA_128_CBC_SHA256 | ECDHE-ECDSA-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC073 | TLS_ECDHE_ECDSA_WITH_CAMELLIA_256_CBC_SHA384 | ECDHE-ECDSA-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC074 | TLS_ECDH_ECDSA_WITH_CAMELLIA_128_CBC_SHA256 | ECDH-ECDSA-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC075 | TLS_ECDH_ECDSA_WITH_CAMELLIA_256_CBC_SHA384 | ECDH-ECDSA-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC076 | TLS_ECDHE_RSA_WITH_CAMELLIA_128_CBC_SHA256 | ECDHE-RSA-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC077 | TLS_ECDHE_RSA_WITH_CAMELLIA_256_CBC_SHA384 | ECDHE-RSA-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC078 | TLS_ECDH_RSA_WITH_CAMELLIA_128_CBC_SHA256 | ECDH-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC079 | TLS_ECDH_RSA_WITH_CAMELLIA_256_CBC_SHA384 | ECDH-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC07A | TLS_RSA_WITH_CAMELLIA_128_GCM_SHA256 | CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC07B | TLS_RSA_WITH_CAMELLIA_256_GCM_SHA384 | CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC07C | TLS_DHE_RSA_WITH_CAMELLIA_128_GCM_SHA256 | DHE-RSA-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC07D | TLS_DHE_RSA_WITH_CAMELLIA_256_GCM_SHA384 | DHE-RSA-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC086 | TLS_ECDHE_ECDSA_WITH_CAMELLIA_128_GCM_SHA256 | ECDHE-ECDSA-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC087 | TLS_ECDHE_ECDSA_WITH_CAMELLIA_256_GCM_SHA384 | ECDHE-ECDSA-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC088 | TLS_ECDH_ECDSA_WITH_CAMELLIA_128_GCM_SHA256 | ECDH-ECDSA-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC089 | TLS_ECDH_ECDSA_WITH_CAMELLIA_256_GCM_SHA384 | ECDH-ECDSA-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC08A | TLS_ECDHE_RSA_WITH_CAMELLIA_128_GCM_SHA256 | ECDHE-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC08B | TLS_ECDHE_RSA_WITH_CAMELLIA_256_GCM_SHA384 | ECDHE-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC08C | TLS_ECDH_RSA_WITH_CAMELLIA_128_GCM_SHA256 | ECDH-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC08D | TLS_ECDH_RSA_WITH_CAMELLIA_256_GCM_SHA384 | ECDH-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC08E | TLS_PSK_WITH_CAMELLIA_128_GCM_SHA256 | PSK-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC08F | TLS_PSK_WITH_CAMELLIA_256_GCM_SHA384 | PSK-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC090 | TLS_DHE_PSK_WITH_CAMELLIA_128_GCM_SHA256 | DHE-PSK-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC091 | TLS_DHE_PSK_WITH_CAMELLIA_256_GCM_SHA384 | DHE-PSK-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC092 | TLS_RSA_PSK_WITH_CAMELLIA_128_GCM_SHA256 | RSA-PSK-CAMELLIA128-GCM-SHA256 | [RFC6367] |
| 0xC093 | TLS_RSA_PSK_WITH_CAMELLIA_256_GCM_SHA384 | RSA-PSK-CAMELLIA256-GCM-SHA384 | [RFC6367] |
| 0xC094 | TLS_PSK_WITH_CAMELLIA_128_CBC_SHA256 | PSK-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC095 | TLS_PSK_WITH_CAMELLIA_256_CBC_SHA384 | PSK-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC096 | TLS_DHE_PSK_WITH_CAMELLIA_128_CBC_SHA256 | DHE-PSK-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC097 | TLS_DHE_PSK_WITH_CAMELLIA_256_CBC_SHA384 | DHE-PSK-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC098 | TLS_RSA_PSK_WITH_CAMELLIA_128_CBC_SHA256 | RSA-PSK-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC099 | TLS_RSA_PSK_WITH_CAMELLIA_256_CBC_SHA384 | RSA-PSK-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC09A | TLS_ECDHE_PSK_WITH_CAMELLIA_128_CBC_SHA256 | ECDHE-PSK-CAMELLIA128-SHA256 | [RFC6367] |
| 0xC09B | TLS_ECDHE_PSK_WITH_CAMELLIA_256_CBC_SHA384 | ECDHE-PSK-CAMELLIA256-SHA384 | [RFC6367] |
| 0xC09C | TLS_RSA_WITH_AES_128_CCM | AES128-CCM | [RFC6655] |
| 0xC09D | TLS_RSA_WITH_AES_256_CCM | AES256-CCM | [RFC6655] |
| 0xC09E | TLS_DHE_RSA_WITH_AES_128_CCM | DHE-RSA-AES128-CCM | [RFC6655] |
| 0xC09F | TLS_DHE_RSA_WITH_AES_256_CCM | DHE-RSA-AES256-CCM | [RFC6655] |
| 0xC0A0 | TLS_RSA_WITH_AES_128_CCM_8 | AES128-CCM8 | [RFC6655] |
| 0xC0A1 | TLS_RSA_WITH_AES_256_CCM_8 | AES256-CCM8 | [RFC6655] |
| 0xC0A2 | TLS_DHE_RSA_WITH_AES_128_CCM_8 | DHE-RSA-AES128-CCM8 | [RFC6655] |
| 0xC0A3 | TLS_DHE_RSA_WITH_AES_256_CCM_8 | DHE-RSA-AES256-CCM8 | [RFC6655] |
| 0xC0A4 | TLS_PSK_WITH_AES_128_CCM | PSK-AES128-CCM | [RFC6655] |
| 0xC0A5 | TLS_PSK_WITH_AES_256_CCM | PSK-AES256-CCM | [RFC6655] |
| 0xC0A6 | TLS_DHE_PSK_WITH_AES_128_CCM | DHE-PSK-AES128-CCM | [RFC6655] |
| 0xC0A7 | TLS_DHE_PSK_WITH_AES_256_CCM | DHE-PSK-AES256-CCM | [RFC6655] |
| 0xC0A8 | TLS_PSK_WITH_AES_128_CCM_8 | PSK-AES128-CCM8 | [RFC6655] |
| 0xC0A9 | TLS_PSK_WITH_AES_256_CCM_8 | PSK-AES256-CCM8 | [RFC6655] |
| 0xC0AA | TLS_PSK_DHE_WITH_AES_128_CCM_8 | DHE-PSK-AES128-CCM8 | [RFC6655] |
| 0xC0AB | TLS_PSK_DHE_WITH_AES_256_CCM_8 | DHE-PSK-AES256-CCM8 | [RFC6655] |
| 0xC0AC | TLS_ECDHE_ECDSA_WITH_AES_128_CCM | ECDHE-ECDSA-AES128-CCM | [RFC7251] |
| 0xC0AD | TLS_ECDHE_ECDSA_WITH_AES_256_CCM | ECDHE-ECDSA-AES256-CCM | [RFC7251] |
| 0xC0AE | TLS_ECDHE_ECDSA_WITH_AES_128_CCM_8 | ECDHE-ECDSA-AES128-CCM8 | [RFC7251] |
| 0xC0AF | TLS_ECDHE_ECDSA_WITH_AES_256_CCM_8 | ECDHE-ECDSA-AES256-CCM8 | [RFC7251] |
| 0xC100 | TLS_GOSTR341112_256_WITH_KUZNYECHIK_CTR_OMAC | GOST2012-KUZNYECHIK-KUZNYECHIKOMAC | [RFC9189] |
| 0xC101 | TLS_GOSTR341112_256_WITH_MAGMA_CTR_OMAC | GOST2012-MAGMA-MAGMAOMAC | [RFC9189] |
| 0xC102 | TLS_GOSTR341112_256_WITH_28147_CNT_IMIT | IANA-GOST2012-GOST8912-GOST8912 | [RFC9189] |
| 0xCC13 | | ECDHE-RSA-CHACHA20-POLY1305-OLD | |
| 0xCC14 | | ECDHE-ECDSA-CHACHA20-POLY1305-OLD | |
| 0xCC15 | | DHE-RSA-CHACHA20-POLY1305-OLD | |
| 0xCCA8 | TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 | ECDHE-RSA-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCA9 | TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 | ECDHE-ECDSA-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCAA | TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256 | DHE-RSA-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCAB | TLS_PSK_WITH_CHACHA20_POLY1305_SHA256 | PSK-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCAC | TLS_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256 | ECDHE-PSK-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCAD | TLS_DHE_PSK_WITH_CHACHA20_POLY1305_SHA256 | DHE-PSK-CHACHA20-POLY1305 | [RFC7905] |
| 0xCCAE | TLS_RSA_PSK_WITH_CHACHA20_POLY1305_SHA256 | RSA-PSK-CHACHA20-POLY1305 | [RFC7905] |
| 0xD001 | TLS_ECDHE_PSK_WITH_AES_128_GCM_SHA256 | ECDHE-PSK-AES128-GCM-SHA256 | [RFC8442] |
| 0xE011 | | ECDHE-ECDSA-SM4-CBC-SM3 | |
| 0xE051 | | ECDHE-ECDSA-SM4-GCM-SM3 | |
| 0xE052 | | ECDHE-ECDSA-SM4-CCM-SM3 | |
| 0xFF00 | | GOST-MD5 | |
| 0xFF01 | | GOST-GOST94 | |
| 0xFF02 | | GOST-GOST89MAC | |
| 0xFF03 | | GOST-GOST89STREAM | |

273
deps/curl/docs/CIPHERS.md vendored Normal file
View File

@ -0,0 +1,273 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
## curl cipher options
A TLS handshake involves many parameters which take part in the negotiation
between client and server in order to agree on the TLS version and set of
algorithms to use for a connection.
What has become known as a "cipher" or better "cipher suite" in TLS
are names for specific combinations of
[key exchange](https://en.wikipedia.org/wiki/Key_exchange),
[bulk encryption](https://en.wikipedia.org/wiki/Link_encryption),
[message authentication code](https://en.wikipedia.org/wiki/Message_authentication_code)
and with TLSv1.3 the
[authenticated encryption](https://en.wikipedia.org/wiki/Authenticated_encryption).
In addition, there are other parameters that influence the TLS handshake, like
[DHE](https://en.wikipedia.org/wiki/Diffie%E2%80%93Hellman_key_exchange) "groups" and
[ECDHE](https://en.wikipedia.org/wiki/Elliptic-curve_Diffie%E2%80%93Hellman) with its
"curves".
### History
curl's way of letting users configure these settings closely followed OpenSSL
in its API. TLS learned new parameters, OpenSSL added new API functions and
curl added command line options.
Several other TLS backends followed the OpenSSL approach, more or less closely,
and curl maps the command line options to these TLS backends. Some TLS
backends do not support all of it and command line options are either
ignored or lead to an error.
Many examples below show the OpenSSL-like use of these options. GnuTLS
however chose a different approach. These are described in a separate
section further below.
## ciphers, the OpenSSL way
With curl's option
[`--tls13-ciphers`](https://curl.se/docs/manpage.html#--tls13-ciphers)
or
[`CURLOPT_TLS13_CIPHERS`](https://curl.se/libcurl/c/CURLOPT_TLS13_CIPHERS.html)
users can control which cipher suites to consider when negotiating TLS 1.3
connections. With option
[`--ciphers`](https://curl.se/docs/manpage.html#--ciphers)
or
[`CURLOPT_SSL_CIPHER_LIST`](https://curl.se/libcurl/c/CURLOPT_SSL_CIPHER_LIST.html)
users can control which cipher suites to consider when negotiating
TLS 1.2 (1.1, 1.0) connections.
By default, curl may negotiate TLS 1.3 and TLS 1.2 connections, so the cipher
suites considered when negotiating a TLS connection are a union of the TLS 1.3
and TLS 1.2 cipher suites. If you want curl to consider only TLS 1.3 cipher
suites for the connection, you have to set the minimum TLS version to 1.3 by
using [`--tlsv1.3`](https://curl.se/docs/manpage.html#--tlsv13)
or [`CURLOPT_SSLVERSION`](https://curl.se/libcurl/c/CURLOPT_SSLVERSION.html)
with `CURL_SSLVERSION_TLSv1_3`.
Both the TLS 1.3 and TLS 1.2 cipher options expect a list of cipher suites
separated by colons (`:`). This list is parsed opportunistically: cipher suites
that are not recognized or implemented are ignored. As long as there is at
least one recognized cipher suite in the list, the list is considered valid.
For both the TLS 1.3 and TLS 1.2 cipher options, the order in which the
cipher suites are specified determines their order of preference. When negotiating
a TLS connection the server picks a cipher suite from the intersection of the
cipher suites supported by the server and the cipher suites sent by curl. If
the server is configured to honor the client's cipher preference, the first
common cipher suite in the list sent by curl is chosen.
### TLS 1.3 cipher suites
Setting TLS 1.3 cipher suites is supported by curl with
OpenSSL (1.1.1+, curl 7.61.0+), LibreSSL (3.4.1+, curl 8.3.0+),
wolfSSL (curl 8.10.0+) and mbedTLS (3.6.0+, curl 8.10.0+).
The list of cipher suites that can be used for the `--tls13-ciphers` option:
```
TLS_AES_128_GCM_SHA256
TLS_AES_256_GCM_SHA384
TLS_CHACHA20_POLY1305_SHA256
TLS_AES_128_CCM_SHA256
TLS_AES_128_CCM_8_SHA256
```
#### wolfSSL notes
In addition to the above list, the following cipher suites can be used:
`TLS_SM4_GCM_SM3` `TLS_SM4_CCM_SM3` `TLS_SHA256_SHA256` `TLS_SHA384_SHA384`.
Usage of these cipher suites is not recommended. (The last two cipher suites
are NULL ciphers, offering no encryption whatsoever.)
### TLS 1.2 (1.1, 1.0) cipher suites
Setting TLS 1.2 cipher suites is supported by curl with OpenSSL, LibreSSL,
BoringSSL, mbedTLS (curl 8.8.0+), wolfSSL (curl 7.53.0+),
Secure Transport (curl 7.77.0+) and BearSSL (curl 7.83.0+). Schannel does not
support setting cipher suites directly, but does support setting algorithms
(curl 7.61.0+), see Schannel notes below.
For TLS 1.2 cipher suites there are multiple naming schemes, the two most used
are with OpenSSL names (e.g. `ECDHE-RSA-AES128-GCM-SHA256`) and IANA names
(e.g. `TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256`). IANA names of TLS 1.2 cipher
suites look similar to TLS 1.3 cipher suite names, to distinguish them note
that TLS 1.2 names contain `_WITH_`, while TLS 1.3 names do not. When setting
TLS 1.2 cipher suites with curl it is recommended that you use OpenSSL names
as these are most widely recognized by the supported SSL backends.
The complete list of cipher suites that may be considered for the `--ciphers`
option is extensive; it consists of more than 300 cipher suites. However,
nowadays the usage of most of them is discouraged, and support for a lot of
them has been removed from the various SSL backends, if it was ever implemented
at all.
A shortened list (based on [recommendations by
Mozilla](https://wiki.mozilla.org/Security/Server_Side_TLS)) of cipher suites,
which are (mostly) supported by all SSL backends, that can be used for the
`--ciphers` option:
```
ECDHE-ECDSA-AES128-GCM-SHA256
ECDHE-RSA-AES128-GCM-SHA256
ECDHE-ECDSA-AES256-GCM-SHA384
ECDHE-RSA-AES256-GCM-SHA384
ECDHE-ECDSA-CHACHA20-POLY1305
ECDHE-RSA-CHACHA20-POLY1305
DHE-RSA-AES128-GCM-SHA256
DHE-RSA-AES256-GCM-SHA384
DHE-RSA-CHACHA20-POLY1305
ECDHE-ECDSA-AES128-SHA256
ECDHE-RSA-AES128-SHA256
ECDHE-ECDSA-AES128-SHA
ECDHE-RSA-AES128-SHA
ECDHE-ECDSA-AES256-SHA384
ECDHE-RSA-AES256-SHA384
ECDHE-ECDSA-AES256-SHA
ECDHE-RSA-AES256-SHA
DHE-RSA-AES128-SHA256
DHE-RSA-AES256-SHA256
AES128-GCM-SHA256
AES256-GCM-SHA384
AES128-SHA256
AES256-SHA256
AES128-SHA
AES256-SHA
DES-CBC3-SHA
```
See this [list](https://github.com/curl/curl/blob/master/docs/CIPHERS-TLS12.md)
for a complete list of TLS 1.2 cipher suites.
#### OpenSSL notes
In addition to specifying a list of cipher suites, OpenSSL also accepts a
format with specific cipher strings (like `TLSv1.2`, `AESGCM`, `CHACHA20`) and
`!`, `-` and `+` operators. Refer to the
[OpenSSL cipher documentation](https://docs.openssl.org/master/man1/openssl-ciphers/#cipher-list-format)
for further information on that format.
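For instance, a sketch combining such keywords and operators (OpenSSL-family
backends only):
```sh
# allow only "high" strength cipher suites, excluding anonymous auth and MD5
curl --ciphers 'HIGH:!aNULL:!MD5' https://example.com/
```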
#### Schannel notes
Schannel does not support setting individual TLS 1.2 cipher suites directly.
It only allows the enabling and disabling of encryption algorithms. These are
in the form of `CALG_xxx`, see the [Schannel `ALG_ID`
documentation](https://docs.microsoft.com/windows/desktop/SecCrypto/alg-id)
for a list of these algorithms. Also, (since curl 7.77.0)
`SCH_USE_STRONG_CRYPTO` can be given to pass that flag to Schannel; look up the
[documentation for the Windows version in
use](https://learn.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel)
to see how that affects the cipher suite selection. When not specifying the
`--ciphers` and `--tls13-ciphers` options curl passes this flag by default.
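As a hedged sketch, assuming the colon-separated `CALG_xxx` form described
above is accepted, algorithm selection with Schannel might look like:
```sh
# Schannel: restrict the enabled algorithms (not full cipher suites)
curl --ciphers "CALG_AES_256:CALG_SHA_256" https://example.com/
```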
### Examples
```sh
curl \
--tls13-ciphers TLS_AES_128_GCM_SHA256:TLS_CHACHA20_POLY1305_SHA256 \
--ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:\
ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305 \
https://example.com/
```
Restrict ciphers to `aes128-gcm` and `chacha20`. Works with OpenSSL, LibreSSL,
mbedTLS and wolfSSL.
```sh
curl \
--tlsv1.3 \
--tls13-ciphers TLS_AES_128_GCM_SHA256:TLS_CHACHA20_POLY1305_SHA256 \
https://example.com/
```
Restrict to only TLS 1.3 with `aes128-gcm` and `chacha20` ciphers. Works with
OpenSSL, LibreSSL, mbedTLS, wolfSSL and Schannel.
```sh
curl \
--ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:\
ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305 \
https://example.com/
```
Restrict TLS 1.2 ciphers to `aes128-gcm` and `chacha20`, use default TLS 1.3
ciphers (if TLS 1.3 is available). Works with OpenSSL, LibreSSL, BoringSSL,
mbedTLS, wolfSSL, Secure Transport and BearSSL.
## ciphers, the GnuTLS way
With GnuTLS, curl allows configuration of all TLS parameters via option
[`--ciphers`](https://curl.se/docs/manpage.html#--ciphers)
or
[`CURLOPT_SSL_CIPHER_LIST`](https://curl.se/libcurl/c/CURLOPT_SSL_CIPHER_LIST.html)
only. The option
[`--tls13-ciphers`](https://curl.se/docs/manpage.html#--tls13-ciphers)
or
[`CURLOPT_TLS13_CIPHERS`](https://curl.se/libcurl/c/CURLOPT_TLS13_CIPHERS.html)
is being ignored.
`--ciphers` is used to set the GnuTLS **priority string** in
the following way:
* When the set string starts with '+', '-' or '!' it is *appended* to the
priority string libcurl itself generates (separated by ':'). This initial
priority depends on other settings such as CURLOPT_SSLVERSION(3),
CURLOPT_TLSAUTH_USERNAME(3) (for SRP) or if HTTP/3 (QUIC)
is being negotiated.
* Otherwise, the set string fully *replaces* the libcurl generated one. While
giving full control to the application, the set priority needs to
provide for everything the transfer may need to negotiate. Example: if
the set priority only allows TLSv1.2, all HTTP/3 attempts fail.
Users may specify via `--ciphers` anything that GnuTLS supports: ciphers,
key exchange, MAC, compression, TLS versions, signature algorithms, groups,
elliptic curves, certificate types. In addition, GnuTLS has a variety of
other keywords that tweak its operations. Applications or a system
may define new alias names for priority strings that can then be used here.
Since the order of items in priority strings is significant, it makes no
sense for curl to try to merge other SSL options into the string somehow.
`--ciphers` is the single way to change the priority.
### Examples
```sh
curl \
--ciphers '-CIPHER_ALL:+AES-128-GCM:+CHACHA20-POLY1305' \
https://example.com/
```
Restrict ciphers to `aes128-gcm` and `chacha20` in GnuTLS.
```sh
curl \
--ciphers 'NORMAL:-VERS-ALL:+TLS1.3:-AES-256-GCM' \
https://example.com/
```
Restrict to only TLS 1.3 without the `aes256-gcm` cipher.
```sh
curl \
--ciphers 'NORMAL:-VERS-ALL:+TLS1.2:-CIPHER_ALL:+CAMELLIA-128-GCM' \
https://example.com/
```
Restrict to only TLS 1.2 with the `CAMELLIA-128-GCM` cipher.
## Further reading
- [OpenSSL cipher suite names documentation](https://docs.openssl.org/master/man1/openssl-ciphers/#cipher-suite-names)
- [wolfSSL cipher support documentation](https://www.wolfssl.com/documentation/manuals/wolfssl/chapter04.html#cipher-support)
- [mbedTLS cipher suites reference](https://mbed-tls.readthedocs.io/projects/api/en/development/api/file/ssl__ciphersuites_8h/)
- [Schannel cipher suites documentation](https://learn.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel)
- [BearSSL supported crypto](https://www.bearssl.org/support.html)
- [Secure Transport cipher suite values](https://developer.apple.com/documentation/security/1550981-ssl_cipher_suite_values)
- [IANA cipher suites list](https://www.iana.org/assignments/tls-parameters/tls-parameters.xhtml#tls-parameters-4)
- [Wikipedia cipher suite article](https://en.wikipedia.org/wiki/Cipher_suite)
- [GnuTLS Priority Strings](https://gnutls.org/manual/html_node/Priority-Strings.html)

46
deps/curl/docs/CMakeLists.txt vendored Normal file

@ -0,0 +1,46 @@
#***************************************************************************
# _ _ ____ _
# Project ___| | | | _ \| |
# / __| | | | |_) | |
# | (__| |_| | _ <| |___
# \___|\___/|_| \_\_____|
#
# Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
#
# This software is licensed as described in the file COPYING, which
# you should have received as part of this distribution. The terms
# are also available at https://curl.se/docs/copyright.html.
#
# You may opt to use, copy, modify, merge, publish, distribute and/or sell
# copies of the Software, and permit persons to whom the Software is
# furnished to do so, under the terms of the COPYING file.
#
# This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
# KIND, either express or implied.
#
# SPDX-License-Identifier: curl
#
###########################################################################
if(BUILD_LIBCURL_DOCS)
add_subdirectory(libcurl)
endif()
if(ENABLE_CURL_MANUAL AND BUILD_CURL_EXE)
add_subdirectory(cmdline-opts)
endif()
if(BUILD_MISC_DOCS)
foreach(_man_misc IN ITEMS "curl-config" "mk-ca-bundle")
set(_man_target "${CMAKE_CURRENT_BINARY_DIR}/${_man_misc}.1")
add_custom_command(OUTPUT "${_man_target}"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMAND "${PERL_EXECUTABLE}" "${PROJECT_SOURCE_DIR}/scripts/cd2nroff" "${_man_misc}.md" > "${_man_target}"
DEPENDS "${_man_misc}.md"
VERBATIM
)
add_custom_target("curl-generate-${_man_misc}.1" ALL DEPENDS "${_man_target}")
if(NOT CURL_DISABLE_INSTALL)
install(FILES "${_man_target}" DESTINATION "${CMAKE_INSTALL_MANDIR}/man1")
endif()
endforeach()
endif()

38
deps/curl/docs/CODE_OF_CONDUCT.md vendored Normal file

@ -0,0 +1,38 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
Contributor Code of Conduct
===========================
As contributors and maintainers of this project, we pledge to respect all
people who contribute through reporting issues, posting feature requests,
updating documentation, submitting pull requests or patches, and other
activities.
We are committed to making participation in this project a harassment-free
experience for everyone, regardless of level of experience, gender, gender
identity and expression, sexual orientation, disability, personal appearance,
body size, race, ethnicity, age, or religion.
Examples of unacceptable behavior by participants include the use of sexual
language or imagery, derogatory comments or personal attacks, trolling, public
or private harassment, insults, or other unprofessional conduct.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct. Project maintainers who do not
follow the Code of Conduct may be removed from the project team.
This code of conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by opening an issue or contacting one or more of the project
maintainers.
This Code of Conduct is adapted from the [Contributor
Covenant](https://contributor-covenant.org/), version 1.1.0, available at
[https://contributor-covenant.org/version/1/1/0/](https://contributor-covenant.org/version/1/1/0/)

174
deps/curl/docs/CODE_REVIEW.md vendored Normal file

@ -0,0 +1,174 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# How to do code reviews for curl
Anyone and everyone is encouraged and welcome to review code submissions in
curl. This is a guide on what to check for and how to perform a successful
code review.
## All submissions should get reviewed
All pull requests and patches submitted to the project should be reviewed by
at least one experienced curl maintainer before that code is accepted and
merged.
## Let the tools and tests take the first rounds
On initial pull requests, let the tools and tests do their job first and then
start out by helping the submitter understand the test failures and tool
alerts.
## How to provide feedback to author
Be nice. Ask questions. Provide examples or suggestions of improvements.
Assume the best intentions. Remember language barriers.
All first-time contributors can become regulars. Let's help them go there.
## Is this a change we want?
If this is not a change that seems to be aligned with the project's path
forward and as such cannot be accepted, inform the author about this sooner
rather than later. Do it gently and explain why and possibly what could be
done to make it more acceptable.
## API/ABI stability or changed behavior
Changing the API and the ABI may be fine in a change but it needs to be done
deliberately and carefully. If not, a reviewer must help the author to realize
the mistake.
curl and libcurl are similarly strict on not modifying existing behavior. API
and ABI stability is not enough, the behavior should also remain intact as far
as possible.
## Code style
Most code style nits are detected by checksrc but not all. Only leave remarks
on style deviations that checksrc does not catch.
Minor nits from fresh submitters can also be handled by the maintainer when
merging, in case it seems like the submitter is not clear on what to do. We
want to make the process fun and exciting for new contributors.
## Encourage consistency
Make sure new code is written in a similar style as existing code. Naming,
logic, conditions, etc.
## Are pointers always non-NULL?
If a function or piece of code relies on pointers being non-NULL, take an
extra look at whether that seems to be a fair assumption.
## Asserts
Conditions that should never be false can be verified with `DEBUGASSERT()`
calls, so that they get caught in tests and make debugging easier, while having
no impact on final or release builds.
## Memory allocation
Can the mallocs be avoided? Do not introduce mallocs in any hot paths. If
there are (new) mallocs, can they be combined into fewer calls?
Are all allocations handled in error paths to avoid leaks and crashes?
## Thread-safety
We do not like static variables as they break thread-safety and prevent
functions from being reentrant.
## Should features be `#ifdef`ed?
Features and functionality may not be present everywhere and should therefore
be `#ifdef`ed. Additionally, some features should be possible to switch on/off
in the build.
Write `#ifdef`s to be as little of a "maze" as possible.
## Does it look portable enough?
curl runs "everywhere". Does the code take a reasonable stance and enough
precautions to be possible to build and run on most platforms?
Remember that we live by C89 restrictions.
## Tests and testability
New features should be added in conjunction with one or more test cases.
Ideally, functions should also be written so that unit tests can be done to
test individual functions.
## Documentation
New features or changes to existing functionality **must** be accompanied by
updated documentation. Submitting that in a separate follow-up pull request is
not OK. A code review must also verify that the submitted documentation update
matches the code submission.
English is not everyone's first language, be mindful of this and help the
submitter improve the text if it needs a rewrite to read better.
## Code should not be hard to understand
Source code should be written to maximize readability and be easy to
understand.
## Functions should not be large
A single function should never be large as that makes it hard to follow and
understand all the exit points and state changes. Some existing functions in
curl certainly violate this ground rule but when reviewing new code we should
propose splitting into smaller functions.
## Duplication is evil
Anything that looks like duplicated code is a red flag. Anything that seems to
introduce code that we *should* already have or provide needs a closer check.
## Sensitive data
When credentials are involved, take an extra look at what happens with this
data. Where it comes from and where it goes.
## Variable types differ
`size_t` is not a fixed size. `time_t` can be signed or unsigned and have
different sizes. Relying on variable sizes is a red flag.
Also remember that endianness and >= 32-bit accesses to unaligned addresses
are problematic areas.
## Integer overflows
Be careful about integer overflows. Some variable types can be either 32-bit
or 64-bit. Integer overflows must be detected and acted on *before* they
happen.
## Dangerous use of functions
Maybe use of `realloc()` should rather use the dynbuf functions?
Do not allow new code that grows buffers without using dynbuf.
C functions that rely on a terminating zero must only be used on data that
really has a terminating zero.
## Dangerous "data styles"
Take extra precautions and verify that memory buffers that need a terminating
zero always have exactly that. Buffers *without* a null-terminator must not be
used as input to string functions.
# Commit messages
Tightly coupled with a code review is making sure that the commit message is
good. It is the responsibility of the person who merges the code to make sure
that the commit message follows our standard (detailed in the
[CONTRIBUTE](CONTRIBUTE.md) document). This includes making sure the PR
identifies related issues and giving credit to reporters and helpers.

307
deps/curl/docs/CONTRIBUTE.md vendored Normal file

@ -0,0 +1,307 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Contributing to the curl project
This document is intended to offer guidelines on how to best contribute to the
curl project. This concerns new features as well as corrections to existing
flaws or bugs.
## Join the Community
Skip over to [https://curl.se/mail/](https://curl.se/mail/) and join
the appropriate mailing list(s). Read up on details before you post
questions. Read this file before you start sending patches. We prefer
questions sent to and discussions being held on the mailing list(s), not sent
to individuals.
Before posting to one of the curl mailing lists, please read up on the
[mailing list etiquette](https://curl.se/mail/etiquette.html).
We also hang out on IRC in #curl on libera.chat
If you are at all interested in the code side of things, consider clicking
'watch' on the [curl repository on GitHub](https://github.com/curl/curl) to be
notified of pull requests and new issues posted there.
## License and copyright
When contributing with code, you agree to put your changes and new code under
the same license curl and libcurl is already using unless stated and agreed
otherwise.
If you add a larger piece of code, you can opt to make that file or set of
files to use a different license as long as they do not enforce any changes to
the rest of the package and they make sense. Such "separate parts" can not be
GPL licensed (as we do not want copyleft to affect users of libcurl) but they
must use "GPL compatible" licenses (as we want to allow users to use libcurl
properly in GPL licensed environments).
When changing existing source code, you do not alter the copyright of the
original file(s). The copyright is still owned by the original creator(s) or
those who have been assigned copyright by the original author(s).
By submitting a patch to the curl project, you are assumed to have the right
to the code and to be allowed by your employer or whatever to hand over that
patch/code to us. We credit you for your changes as far as possible, to give
credit but also to keep a trace back to who made what changes. Please always
provide us with your full real name when contributing.
## What To Read
Source code, the man pages, the [INTERNALS
document](https://curl.se/dev/internals.html),
[TODO](https://curl.se/docs/todo.html),
[KNOWN_BUGS](https://curl.se/docs/knownbugs.html) and the [most recent
changes](https://curl.se/dev/sourceactivity.html) in git. Just lurking on the
[curl-library mailing list](https://curl.se/mail/list.cgi?list=curl-library)
gives you a lot of insights on what's going on right now. Asking there is a
good idea too.
## Write a good patch
### Follow code style
When writing C code, follow the
[CODE_STYLE](https://curl.se/dev/code-style.html) already established in
the project. Consistent style makes code easier to read and mistakes less
likely to happen. Run `make checksrc` before you submit anything, to make sure
you follow the basic style. That script does not verify everything, but if it
complains you know you have work to do.
### Non-clobbering All Over
When you write new functionality or fix bugs, it is important that you do not
fiddle all over the source files and functions. Remember that it is likely
that other people have done changes in the same source files as you have and
possibly even in the same functions. If you bring completely new
functionality, try writing it in a new source file. If you fix bugs, try to
fix one bug at a time and send them as separate patches.
### Write Separate Changes
It is annoying when you get a huge patch from someone that is said to fix 11
odd problems, but discussions and opinions do not agree with 10 of them - or 9
of them were already fixed in a different way. Then the person merging this
change needs to extract the single interesting patch from somewhere within the
huge pile of source, and that creates a lot of extra work.
Preferably, each fix that corrects a problem should be in its own patch/commit
with its own description/commit message stating exactly what they correct so
that all changes can be selectively applied by the maintainer or other
interested parties.
Also, separate changes enable bisecting much better for tracking problems
and regression in the future.
### Patch Against Recent Sources
Please try to get the latest available sources to make your patches against.
It makes the lives of the developers so much easier. The best is if you get
the most up-to-date sources from the git repository, but the latest release
archive is quite OK as well.
### Documentation
Writing docs is dead boring and one of the big problems with many open source
projects but someone's gotta do it. It makes things a lot easier if you submit
a small description of your fix or your new features with every contribution
so that it can be swiftly added to the package documentation.
Documentation is mostly provided as manpages or plain ASCII files. The
manpages are rendered from their source files that are usually written using
markdown. Most HTML files on the website and in the release archives are
generated from corresponding markdown and ASCII files.
### Test Cases
Since the introduction of the test suite, we can quickly verify that the main
features are working as they are supposed to. To maintain this situation and
improve it, all new features and functions that are added need to be tested in
the test suite. Every feature that is added should get at least one valid test
case that verifies that it works as documented. If every submitter also posts
a few test cases, it does not end up a heavy burden on a single person.
If you do not have test cases or perhaps you have done something that is hard
to write tests for, do explain exactly how you have otherwise tested and
verified your changes.
# Submit Your Changes
## Get your changes merged
Ideally you file a [pull request on
GitHub](https://github.com/curl/curl/pulls), but you can also send your plain
patch to [the curl-library mailing
list](https://curl.se/mail/list.cgi?list=curl-library).
If you opt to post a patch on the mailing list, chances are someone converts
it into a pull request for you, to have the CI jobs verify it properly before it
can be merged. Be prepared that some feedback on the proposed change might
then come on GitHub.
Your changes will be reviewed and discussed, and you are expected to correct flaws
pointed out and update accordingly, or the change risks stalling and
eventually just getting deleted without action. As a submitter of a change,
you are the owner of that change until it has been merged.
Respond on the list or on GitHub about the change and answer questions and/or
fix nits/flaws. This is important. We take lack of replies as a sign that you
are not anxious to get your patch accepted and we tend to simply drop such
changes.
## About pull requests
With GitHub it is easy to send a [pull
request](https://github.com/curl/curl/pulls) to the curl project to have
changes merged.
We strongly prefer pull requests to mailed patches, as it makes it a proper
git commit that is easy to merge and they are easy to track and not that easy
to lose in the flood of many emails, like they sometimes do on the mailing
lists.
Every pull request submitted is automatically tested in several different
ways. [See the CI document for more
information](https://github.com/curl/curl/blob/master/tests/CI.md).
Sometimes the tests fail due to a dependency service temporarily being offline
or otherwise unavailable, e.g. package downloads. In this case you can just
try to update your pull requests to rerun the tests later as described below.
You can update your pull requests by pushing new commits or force-pushing
changes to existing commits. Force-pushing an amended commit without any
actual content changed also allows you to retrigger the tests for that commit.
When you adjust your pull requests after review, consider squashing the
commits so that we can review the full updated version more easily.
A pull request sent to the project might get labeled `needs-votes` by a
project maintainer. This label means that in addition to meeting all other
checks and qualifications this pull request must also receive more "votes" of
user support. More signs that people want this to happen. It could be in the
form of messages saying so, or thumbs-up reactions on GitHub.
## When the pull request is approved
If it does not seem to get approved when you think it is ready - feel free to
ask for approval.
Once your pull request has been approved it can be merged by a maintainer.
For new features, or changes, we require that the *feature window* is open for
the pull request to be merged. This is typically a three week period that
starts ten days after a previous release. New features submitted as pull
requests while the window is closed simply have to wait until it opens to get
merged.
If time passes without your approved pull request getting merged: feel free to
ask what more you can do to make it happen.
## Making quality changes
Make the patch against as recent source versions as possible.
If you have followed the tips in this document and your patch still has not
been incorporated or responded to after some weeks, consider resubmitting it
to the list or better yet: change it to a pull request.
## Commit messages
How to write git commit messages in the curl project.
---- start ----
[area]: [short line describing the main effect]
-- empty line --
[full description, no wider than 72 columns that describes as much as
possible as to why this change is made, and possibly what things
it fixes and everything else that is related]
-- end --
The first line is a succinct description of the change and should ideally work
as a single line in the RELEASE NOTES.
- use the imperative, present tense: **change** not "changed" nor "changes"
- do not capitalize the first letter
- no period (.) at the end
The `[area]` in the first line can be `http2`, `cookies`, `openssl` or
similar. There is no fixed list to select from but using the same "area" as
other related changes could make sense.
## Commit message keywords
Use the following ways to improve the message and provide pointers to related
work.
- `Follow-up to {shorthash}` - if this fixes or continues a previous commit;
add a `Ref:` that commit's PR or issue if it is not a small, obvious fix;
followed by an empty line
- `Bug: URL` to the source of the report or more related discussion; use
`Fixes` for GitHub issues instead when that is appropriate.
- `Approved-by: John Doe` - credit someone who approved the PR.
- `Authored-by: John Doe` - credit the original author of the code; only use
this if you cannot use `git commit --author=...`.
- `Signed-off-by: John Doe` - we do not use this, but do not bother removing
it.
- `whatever-else-by:` credit all helpers, finders, doers; try to use one of
the following keywords if at all possible, for consistency: `Acked-by:`,
`Assisted-by:`, `Co-authored-by:`, `Found-by:`, `Reported-by:`,
`Reviewed-by:`, `Suggested-by:`, `Tested-by:`.
- `Ref: #1234` - if this is related to a GitHub issue or PR, possibly one that
has already been closed.
- `Ref: URL` to more information about the commit; use `Bug:` instead for a
reference to a bug on another bug tracker.
- `Fixes #1234` - if this fixes a GitHub issue; GitHub closes the issue once
this commit is merged.
- `Closes #1234` - if this merges a GitHub PR; GitHub closes the PR once this
commit is merged.
Do not forget to use commit with `--author` if you commit someone else's work,
and make sure that you have your own user and email setup correctly in git
before you commit.
Add whichever header lines as appropriate, with one line per person if more
than one person was involved. There is no need to credit yourself unless you
are using `--author` which hides your identity. Do not include people's email
addresses in headers to avoid spam, unless they are already public from a
previous commit; saying `{userid} on github` is OK.
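Putting the format and keywords above together, a hypothetical commit message
could look like the following (the area, hash, names and issue numbers are all
made up):
```
http2: fix handling of trailers on reset streams

Check the stream state before forwarding trailers so that a stream
that was already reset is not acted upon.

Follow-up to abcdef01234
Reported-by: Jane Doe
Fixes #1234
Closes #1256
```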
## Push Access
If you are a frequent contributor, you may be given push access to the git
repository and then you are able to push your changes straight into the git
repository instead of sending changes as pull requests or by mail as patches.
Just ask if this is what you would want. You are required to have posted
several high quality patches first, before you can be granted push access.
## Useful resources
- [Webinar on getting code into cURL](https://www.youtube.com/watch?v=QmZ3W1d6LQI)
# Update copyright and license information
There is a CI job called **REUSE compliance / check** that runs on every pull
request and commit to verify that the *REUSE state* of all files are still
fine.
This means that all files need to have their license and copyright information
clearly stated. Ideally by having the standard curl source code header, with
the `SPDX-License-Identifier` included. If the header does not work, you can
use a smaller header or add the information for a specific file to the
`REUSE.toml` file.
You can manually verify the copyright and compliance status by running the
[REUSE helper tool](https://github.com/fsfe/reuse-tool): `reuse lint`

191
deps/curl/docs/CURL-DISABLE.md vendored Normal file

@ -0,0 +1,191 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Code defines to disable features and protocols
## `CURL_DISABLE_ALTSVC`
Disable support for Alt-Svc: HTTP headers.
## `CURL_DISABLE_BINDLOCAL`
Disable support for binding the local end of connections.
## `CURL_DISABLE_COOKIES`
Disable support for HTTP cookies.
## `CURL_DISABLE_BASIC_AUTH`
Disable support for the Basic authentication methods.
## `CURL_DISABLE_BEARER_AUTH`
Disable support for the Bearer authentication methods.
## `CURL_DISABLE_DIGEST_AUTH`
Disable support for the Digest authentication methods.
## `CURL_DISABLE_KERBEROS_AUTH`
Disable support for the Kerberos authentication methods.
## `CURL_DISABLE_NEGOTIATE_AUTH`
Disable support for the negotiate authentication methods.
## `CURL_DISABLE_AWS`
Disable **aws-sigv4** support.
## `CURL_DISABLE_CA_SEARCH`
Disable unsafe CA bundle search in PATH on Windows.
## `CURL_DISABLE_DICT`
Disable the DICT protocol
## `CURL_DISABLE_DOH`
Disable DNS-over-HTTPS
## `CURL_DISABLE_FILE`
Disable the FILE protocol
## `CURL_DISABLE_FORM_API`
Disable the form API
## `CURL_DISABLE_FTP`
Disable the FTP (and FTPS) protocol
## `CURL_DISABLE_GETOPTIONS`
Disable the `curl_easy_options` API calls that let users get information
about existing options to `curl_easy_setopt`.
## `CURL_DISABLE_GOPHER`
Disable the GOPHER protocol.
## `CURL_DISABLE_HEADERS_API`
Disable the HTTP header API.
## `CURL_DISABLE_HSTS`
Disable the HTTP Strict Transport Security support.
## `CURL_DISABLE_HTTP`
Disable the HTTP(S) protocols. Note that this then also disables HTTP proxy
support.
## `CURL_DISABLE_HTTP_AUTH`
Disable support for all HTTP authentication methods.
## `CURL_DISABLE_IMAP`
Disable the IMAP(S) protocols.
## `CURL_DISABLE_LDAP`
Disable the LDAP(S) protocols.
## `CURL_DISABLE_LDAPS`
Disable the LDAPS protocol.
## `CURL_DISABLE_LIBCURL_OPTION`
Disable the `--libcurl` option from the curl tool.
## `CURL_DISABLE_MIME`
Disable MIME support.
## `CURL_DISABLE_MQTT`
Disable MQTT support.
## `CURL_DISABLE_NETRC`
Disable the netrc parser.
## `CURL_DISABLE_NTLM`
Disable support for NTLM.
## `CURL_DISABLE_OPENSSL_AUTO_LOAD_CONFIG`
Disable the auto load config support in the OpenSSL backend.
## `CURL_DISABLE_PARSEDATE`
Disable date parsing
## `CURL_DISABLE_POP3`
Disable the POP3 protocol
## `CURL_DISABLE_PROGRESS_METER`
Disable the built-in progress meter
## `CURL_DISABLE_PROXY`
Disable support for proxies
## `CURL_DISABLE_IPFS`
Disable the IPFS/IPNS protocols. This affects the curl tool only, where
IPFS/IPNS protocol support is implemented.
## `CURL_DISABLE_RTSP`
Disable the RTSP protocol.
## `CURL_DISABLE_SHA512_256`
Disable the SHA-512/256 hash algorithm.
## `CURL_DISABLE_SHUFFLE_DNS`
Disable the shuffle DNS feature
## `CURL_DISABLE_SMB`
Disable the SMB(S) protocols
## `CURL_DISABLE_SMTP`
Disable the SMTP(S) protocols
## `CURL_DISABLE_SOCKETPAIR`
Disable the use of `socketpair()` internally to allow waking up and canceling
`curl_multi_poll()`.
## `CURL_DISABLE_TELNET`
Disable the TELNET protocol
## `CURL_DISABLE_TFTP`
Disable the TFTP protocol
## `CURL_DISABLE_VERBOSE_STRINGS`
Disable verbose strings and error messages.
## `CURL_DISABLE_WEBSOCKETS`
Disable the WebSocket protocols.
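As a minimal sketch of how such defines are typically set at build time
(assuming a CMake build; not every define above has a dedicated CMake option,
so verify the option names against the curl version in use):
```sh
# Turn off FTP and cookie support via dedicated CMake options (illustrative).
cmake -DCURL_DISABLE_FTP=ON -DCURL_DISABLE_COOKIES=ON ..
cmake --build .
```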

168
deps/curl/docs/CURLDOWN.md vendored Normal file

@ -0,0 +1,168 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# curldown
A markdown-like syntax for libcurl man pages.
## Purpose
A text format for writing libcurl documentation in the shape of man pages.
Make it easier for users to contribute and write documentation. A format that
is easier on the eye in its source format.
Make it harder to do syntactical mistakes.
Use a format that allows creating man pages that end up looking exactly like
the man pages did when we wrote them in nroff format.
Take advantage of the fact that people these days are accustomed to markdown
by using a markdown-like syntax.
This allows us to fix issues in the nroff format easier since now we generate
them. For example: escaping minus to prevent them from being turned into
Unicode by man.
Generate nroff output that looks (next to) *identical* to the previous files,
so that the look, existing test cases, HTML conversions, existing
infrastructure etc remain mostly intact.
Contains meta-data in a structured way to allow better output (for example the
see also information) and general awareness of what the file is about.
## File extension
Since curldown looks similar to markdown, we use `.md` extensions on the
files.
## Conversion
Convert **from curldown to nroff** with `cd2nroff`. Generates nroff man pages.
Convert **from nroff to curldown** with `nroff2cd`. This is only meant to be
used for the initial conversion to curldown and should ideally never be needed
again.
Convert, check or clean up an existing curldown to nicer, better, cleaner
curldown with **cd2cd**.
Mass-convert all curldown files to nroff in specified directories with
`cdall`:
cdall [dir1] [dir2] [dir3] ..
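As a sketch (file paths are illustrative), converting a single curldown page
with `cd2nroff` and writing the nroff output to a man page file could look
like:
```sh
# Render one curldown source to an nroff man page (paths are illustrative).
perl scripts/cd2nroff docs/libcurl/opts/CURLOPT_AWS_SIGV4.md > CURLOPT_AWS_SIGV4.3
```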
## Known issues
The `cd2nroff` tool does not yet handle *italics* or **bold** where the start
and the end markers are used on separate lines.
The `nroff2cd` tool generates code style quotes for all `.fi` sections since
the nroff format does not carry a distinction.
# Format
Each curldown starts with a header with meta-data:
---
c: Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
Title: CURLOPT_AWS_SIGV4
Section: 3
Source: libcurl
Protocol:
- HTTP
See-also:
- CURLOPT_HEADEROPT (3)
- CURLOPT_HTTPAUTH (3)
TLS-backend:
- [name]
Added-in: [version or "n/a"]
---
All curldown files *must* have all the headers present and at least one
`See-also:` entry specified.
If the man page is for section 3 (library related), the `Protocol` list must
contain at least one protocol, which can be `*` if the option is virtually for
everything. If `*` is used, it must be the only listed protocol. Recognized
protocols are either URL schemes (in uppercase), `TLS` or `TCP`.
If the `Protocol` list contains `TLS`, then there must also be a `TLS-backend`
list, specifying `All` or a list of what TLS backends that work with this
option. The available TLS backends are:
- `BearSSL`
- `GnuTLS`
- `mbedTLS`
- `OpenSSL` (also covers BoringSSL, LibreSSL, quictls, AWS-LC and AmiSSL)
- `rustls`
- `Schannel`
- `Secure Transport`
- `wolfSSL`
- `All`: all TLS backends
Following the header in the file, is the manual page using markdown-like
syntax:
~~~
# NAME
a page - this is a page describing something
# SYNOPSIS
~~~c
#include <curl/curl.h>
CURLcode curl_easy_setopt(CURL *handle, CURLOPT_AWS_SIGV4, char *param);
~~~
~~~
Quoted source code should start with `~~~c` and end with `~~~` while regular
quotes can start with `~~~` or just be indented with 4 spaces.
Headers at top-level `#` get converted to `.SH`.
`nroff2cd` supports the `##` next level header which gets converted to `.IP`.
Write bold words or phrases within `**` like:
This is a **bold** word.
Write italics like:
This is *italics*.
Due to how man pages do not support backticks especially formatted, such
occurrences in the source are instead just using italics in the generated
output:
This `word` appears in italics.
When generating the nroff output, the tooling removes superfluous newlines,
meaning they can be used freely in the source file to make the text more
readable.
To make sure curldown documents render correctly as markdown, all literal
occurrences of `<` or `>` need to be escaped by a leading backslash.
## Generating contents
`# %PROTOCOLS%` - inserts a **PROTOCOLS** section based on the metadata
provided in the header.
`# %AVAILABILITY%` - inserts an **AVAILABILITY** section based on the metadata
provided in the header.
## Symbols
All mentioned curl symbols that have their own man pages, like
`curl_easy_perform(3)` are automatically rendered using italics in the output
without having to enclose them with asterisks. This helps ensure that they get
converted to links properly later in the HTML version on the website, as
converted with `roffit`. This makes the curldown text easier to read even when
mentioning many curl symbols.
This auto-linking works for patterns matching `(lib|)curl[^ ]*(3)`.

59
deps/curl/docs/DEPRECATE.md vendored Normal file

@ -0,0 +1,59 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Items to be removed from future curl releases
If any of these deprecated features is a cause for concern for you, please
email the
[curl-library mailing list](https://lists.haxx.se/listinfo/curl-library)
as soon as possible and explain to us why this is a problem for you and
how your use case cannot be satisfied properly using a workaround.
## TLS libraries without 1.3 support
curl drops support for TLS libraries without TLS 1.3 capability after May
2025.
It requires that a curl build using the library should be able to negotiate
and use TLS 1.3, or else it is not good enough.
As of May 2024, the libraries that need to get fixed to remain supported after
May 2025 are: BearSSL and Secure Transport.
## msh3 support
The msh3 backend for QUIC and HTTP/3 was introduced in April 2022 but has never
been made to work properly. It has seen no visible traction or developer
activity from the msh3 main author (or anyone else seemingly interested) in
two years. As a non-functional backend, it only adds friction and "weight" to
the development and maintenance.
Meanwhile, we have a fully working backend in ngtcp2, and two more fully
working backends in OpenSSL-QUIC and quiche that are well on their way to
leaving their experimental status in a future release.
We remove msh3 support from the curl source tree in July 2025.
## winbuild build system
curl drops support for the winbuild build method after September 2025.
We recommend migrating to CMake. See the migration guide in
`docs/INSTALL-CMAKE.md`.
## Past removals
- Pipelining
- axTLS
- PolarSSL
- NPN
- Support for systems without 64-bit data types
- NSS
- gskit
- MinGW v1
- NTLM_WB
- space-separated `NOPROXY` patterns
- hyper

286
deps/curl/docs/DISTROS.md vendored Normal file

@ -0,0 +1,286 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# curl distros
<!-- markdown-link-check-disable -->
Lots of organizations distribute curl packages to end users. This is a
collection of pointers to where to learn more about curl on and with each
distro. Those marked *Rolling Release* typically run the latest version of curl
and are therefore less likely to have back-ported patches to older versions.
We discuss curl distro issues, patches and collaboration on the [curl-distros
mailing list](https://lists.haxx.se/listinfo/curl-distros) ([list
archives](https://curl.se/mail/list.cgi?list=curl-distros)).
## AlmaLinux
- curl package source and patches: https://git.almalinux.org/rpms/curl/
- curl issues: https://bugs.almalinux.org/view_all_bug_page.php click Category and choose curl
- curl security: https://errata.almalinux.org/ search for curl
## Alpine Linux
- curl: https://pkgs.alpinelinux.org/package/edge/main/x86_64/curl
- curl issues: https://gitlab.alpinelinux.org/alpine/aports/-/issues
- curl security: https://security.alpinelinux.org/srcpkg/curl
- curl package source and patches: https://gitlab.alpinelinux.org/alpine/aports/-/tree/master/main/curl
## Alt Linux
- curl: http://www.sisyphus.ru/srpm/Sisyphus/curl
- curl issues: https://packages.altlinux.org/en/sisyphus/srpms/curl/issues/
- curl patches: https://git.altlinux.org/gears/c/curl.git?p=curl.git;a=tree;f=.gear
## Arch Linux
*Rolling Release*
- curl: https://archlinux.org/packages/core/x86_64/curl/
- curl issues: https://gitlab.archlinux.org/archlinux/packaging/packages/curl/-/issues
- curl security: https://security.archlinux.org/package/curl
- curl wiki: https://wiki.archlinux.org/title/CURL
## Buildroot
*Rolling Release*
- curl package source and patches: https://git.buildroot.net/buildroot/tree/package/libcurl
- curl issues: https://bugs.buildroot.org/buglist.cgi?quicksearch=curl
## Chimera
- curl package source and patches: https://github.com/chimera-linux/cports/tree/master/main/curl
## Clear Linux
*Rolling Release*
- curl: https://github.com/clearlinux-pkgs/curl
- curl issues: https://github.com/clearlinux/distribution/issues
## Conary
- curl: https://github.com/conan-io/conan-center-index/tree/master/recipes/libcurl
- curl issues: https://github.com/conan-io/conan-center-index/issues
- curl patches: https://github.com/conan-io/conan-center-index/tree/master/recipes/libcurl (in `all/patches/*`, if any)
## conda-forge
- curl: https://github.com/conda-forge/curl-feedstock
- curl issues: https://github.com/conda-forge/curl-feedstock/issues
## CRUX
- curl: https://crux.nu/portdb/?a=search&q=curl
- curl issues: https://git.crux.nu/ports/core/issues/?type=all&state=open&q=curl
## curl-for-win
(this is the official curl binaries for Windows shipped by the curl project)
*Rolling Release*
- curl: https://curl.se/windows/
- curl patches: https://github.com/curl/curl-for-win/blob/main/curl.patch (if any)
- build-specific issues: https://github.com/curl/curl-for-win/issues
Issues and patches for this are managed in the main curl project.
## Cygwin
- curl: https://cygwin.com/cgit/cygwin-packages/curl/tree/curl.cygport
- curl patches: https://cygwin.com/cgit/cygwin-packages/curl/tree
- curl issues: https://inbox.sourceware.org/cygwin/?q=s%3Acurl
## Cygwin (cross mingw64)
- mingw64-x86_64-curl: https://cygwin.com/cgit/cygwin-packages/mingw64-x86_64-curl/tree/mingw64-x86_64-curl.cygport
- mingw64-x86_64-curl patches: https://cygwin.com/cgit/cygwin-packages/mingw64-x86_64-curl/tree
- mingw64-x86_64-curl issues: https://inbox.sourceware.org/cygwin/?q=s%3Amingw64-x86_64-curl
## Debian
- curl: https://tracker.debian.org/pkg/curl
- curl issues: https://bugs.debian.org/cgi-bin/pkgreport.cgi?pkg=curl
- curl patches: https://udd.debian.org/patches.cgi?src=curl
- curl patches: https://salsa.debian.org/debian/curl (in debian/* branches, inside the folder debian/patches)
## Fedora
- curl: https://src.fedoraproject.org/rpms/curl
- curl issues: [bugzilla](https://bugzilla.redhat.com/buglist.cgi?bug_status=NEW&bug_status=ASSIGNED&classification=Fedora&product=Fedora&product=Fedora%20EPEL&component=curl)
- curl patches: [list of patches in package git](https://src.fedoraproject.org/rpms/curl/tree/rawhide)
## FreeBSD
- curl: https://cgit.freebsd.org/ports/tree/ftp/curl
- curl patches: https://cgit.freebsd.org/ports/tree/ftp/curl
- curl issues: https://bugs.freebsd.org/bugzilla/buglist.cgi?bug_status=__open__&order=Importance&product=Ports%20%26%20Packages&query_format=advanced&short_desc=curl&short_desc_type=allwordssubstr
## Gentoo Linux
*Rolling Release*
- curl: https://packages.gentoo.org/packages/net-misc/curl
- curl issues: https://bugs.gentoo.org/buglist.cgi?quicksearch=net-misc/curl
- curl package sources and patches: https://gitweb.gentoo.org/repo/gentoo.git/tree/net-misc/curl/
## GNU Guix
*Rolling Release*
- curl: https://git.savannah.gnu.org/gitweb/?p=guix.git;a=blob;f=gnu/packages/curl.scm;hb=HEAD
- curl issues: https://issues.guix.gnu.org/search?query=curl
## Homebrew
*Rolling Release*
- curl: https://formulae.brew.sh/formula/curl
Homebrew's policy is that all patches and issues should be submitted upstream
unless it is specific to Homebrew's way of packaging software.
## MacPorts
*Rolling Release*
- curl: https://github.com/macports/macports-ports/tree/master/net/curl
- curl issues: https://trac.macports.org/query?0_port=curl&0_port_mode=%7E&0_status=%21closed
- curl patches: https://github.com/macports/macports-ports/tree/master/net/curl/files
## Mageia
- curl: https://svnweb.mageia.org/packages/cauldron/curl/current/SPECS/curl.spec?view=markup
- curl issues: https://bugs.mageia.org/buglist.cgi?bug_status=NEW&bug_status=UNCONFIRMED&bug_status=NEEDINFO&bug_status=UPSTREAM&bug_status=ASSIGNED&component=RPM%20Packages&f1=cf_rpmpkg&list_id=176576&o1=casesubstring&product=Mageia&query_format=advanced&v1=curl
- curl patches: https://svnweb.mageia.org/packages/cauldron/curl/current/SOURCES/
- curl patches in stable distro releases: https://svnweb.mageia.org/packages/updates/<STABLE_VERSION>/curl/current/SOURCES/
- curl security: https://advisories.mageia.org/src_curl.html
## MSYS2
*Rolling Release*
- curl: https://github.com/msys2/MSYS2-packages/tree/master/curl
- curl issues: https://github.com/msys2/MSYS2-packages/issues
- curl patches: https://github.com/msys2/MSYS2-packages/tree/master/curl (`*.patch`)
## MSYS2 (mingw-w64)
*Rolling Release*
- curl: https://github.com/msys2/MINGW-packages/tree/master/mingw-w64-curl
- curl issues: https://github.com/msys2/MINGW-packages/issues
- curl patches: https://github.com/msys2/MINGW-packages/tree/master/mingw-w64-curl (`*.patch`)
## Muldersoft
*Rolling Release*
- curl: https://github.com/lordmulder/cURL-build-win32
- curl issues: https://github.com/lordmulder/cURL-build-win32/issues
- curl patches: https://github.com/lordmulder/cURL-build-win32/tree/master/patch
## NixOS
- curl: https://github.com/NixOS/nixpkgs/blob/master/pkgs/tools/networking/curl/default.nix
- curl issues: https://github.com/NixOS/nixpkgs
nixpkgs is the package repository used by the NixOS Linux distribution, but
can also be used on other distributions
## OmniOS
- curl: https://github.com/omniosorg/omnios-build/tree/master/build/curl
- curl issues: https://github.com/omniosorg/omnios-build/issues
- curl patches: https://github.com/omniosorg/omnios-build/tree/master/build/curl/patches
## OpenIndiana
- curl: https://github.com/OpenIndiana/oi-userland/tree/oi/hipster/components/web/curl
- curl issues: https://www.illumos.org/projects/openindiana/issues
- curl patches: https://github.com/OpenIndiana/oi-userland/tree/oi/hipster/components/web/curl/patches
## OpenSUSE
- curl source and patches: https://build.opensuse.org/package/show/openSUSE%3AFactory/curl
## Oracle Solaris
- curl: https://github.com/oracle/solaris-userland/tree/master/components/curl
- curl issues: https://support.oracle.com/ (requires support contract)
- curl patches: https://github.com/oracle/solaris-userland/tree/master/components/curl/patches
## OpenEmbedded / Yocto Project
*Rolling Release*
- curl: https://layers.openembedded.org/layerindex/recipe/5765/
- curl issues: https://bugzilla.yoctoproject.org/
- curl patches: https://git.openembedded.org/openembedded-core/tree/meta/recipes-support/curl
## PLD Linux
- curl package source and patches: https://github.com/pld-linux/curl
- curl issues: https://bugs.launchpad.net/pld-linux?field.searchtext=curl&search=Search&field.status%3Alist=NEW&field.status%3Alist=INCOMPLETE_WITH_RESPONSE&field.status%3Alist=INCOMPLETE_WITHOUT_RESPONSE&field.status%3Alist=CONFIRMED&field.status%3Alist=TRIAGED&field.status%3Alist=INPROGRESS&field.status%3Alist=FIXCOMMITTED&field.assignee=&field.bug_reporter=&field.omit_dupes=on&field.has_patch=&field.has_no_package=
## pkgsrc
- curl: https://github.com/NetBSD/pkgsrc/tree/trunk/www/curl
- curl issues: https://github.com/NetBSD/pkgsrc/issues
- curl patches: https://github.com/NetBSD/pkgsrc/tree/trunk/www/curl/patches
## Red Hat Enterprise Linux / CentOS Stream
- curl: https://kojihub.stream.centos.org/koji/packageinfo?packageID=217
- curl issues: https://issues.redhat.com/secure/CreateIssueDetails!init.jspa?pid=12332745&issuetype=1&components=12377466&priority=10300
- curl patches: https://gitlab.com/redhat/centos-stream/rpms/curl
## Rocky Linux
- curl: https://git.rockylinux.org/staging/rpms/curl/-/blob/r9/SPECS/curl.spec
- curl issues: https://bugs.rockylinux.org
- curl patches: https://git.rockylinux.org/staging/rpms/curl/-/tree/r9/SOURCES
## SerenityOS
- curl: https://github.com/SerenityOS/serenity/tree/master/Ports/curl
- curl issues: https://github.com/SerenityOS/serenity/issues?q=label%3Aports
- curl patches: https://github.com/SerenityOS/serenity/tree/master/Ports/curl/patches
## SmartOS
- curl: https://github.com/TritonDataCenter/illumos-extra/tree/master/curl
- curl issues: https://github.com/TritonDataCenter/illumos-extra/issues
- curl patches: https://github.com/TritonDataCenter/illumos-extra/tree/master/curl/Patches
## SPACK
- curl package source and patches: https://github.com/spack/spack/tree/develop/var/spack/repos/builtin/packages/curl
## vcpkg
*Rolling Release*
- curl: https://github.com/microsoft/vcpkg/tree/master/ports/curl
- curl issues: https://github.com/microsoft/vcpkg/issues
- curl patches: https://github.com/microsoft/vcpkg/tree/master/ports/curl (`*.patch`)
## Void Linux
*Rolling Release*
- curl: https://github.com/void-linux/void-packages/tree/master/srcpkgs/curl
- curl issues: https://github.com/void-linux/void-packages/issues
- curl patches: https://github.com/void-linux/void-packages/tree/master/srcpkgs/curl/patches
## Wolfi
*Rolling Release*
- curl: https://github.com/wolfi-dev/os/blob/main/curl.yaml

73
deps/curl/docs/EARLY-RELEASE.md vendored Normal file

@ -0,0 +1,73 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# How to determine if an early patch release is warranted
In the curl project we do releases every 8 weeks. Unless we break the cycle
and do an early patch release.
We do frequent releases partly to always have the next release "not too far
away".
## Bugfix
During the release cycle, and especially in the beginning of a new cycle (the
so-called "cool down" period), there are times when a bug is reported and
after it has been subsequently fixed correctly, the question might be asked:
is this bug and associated fix important enough for an early patch release?
The question can only be properly asked when a fix has been created and landed
in the git master branch.
## Early release
An early patch release means that we ship a new, complete and full release
called `major.minor.patch` where the `patch` part is increased by one since
the previous release. A curl release is a curl release. There is no small or
big and we never release just a patch. There is only "release".
## Questions to ask
- Is there a security advisory rated high or critical?
- Is there a data corruption bug?
- Did the bug cause an API/ABI breakage?
- Does the problem annoy a significant share of the user population?
If the answer is yes to one or more of the above, an early release might be
warranted.
More questions to ask ourselves when doing the assessment, if the answers to
the questions above are all 'no'.
- Does the bug cause curl to prematurely terminate?
- How common is the affected buggy option/feature/protocol/platform to get
used?
- How large is the estimated impacted user base?
- Does the bug block something crucial for applications or other adoption of
curl "out there" ?
- Does the bug cause problems for curl developers or others on "the curl
team" ?
- Is the bug limited to the curl tool only? That might have a smaller impact
than a bug also present in libcurl.
- Is there a (decent) workaround?
- Is it a regression? Is the bug introduced in this release?
- Can the bug be fixed "easily" by applying a patch?
- Does the bug break the build? Most users do not build curl themselves.
- How long is it until the already scheduled next release?
- Can affected users safely rather revert to a former release until the next
scheduled release?
- Is it a performance regression with no functionality side-effects? If so it
has to be substantial.
## If an early release is deemed necessary
Unless done for security or similarly important reasons, an early release
should not be done within a week of the previous release.
This, to enable us to collect and bundle more fixes into the same release to
make the release more worthwhile for everyone and to allow more time for fixes
to settle and things to get tested. Getting a release in shape and done in
style is work that should not be rushed.

478
deps/curl/docs/ECH.md vendored Normal file

@ -0,0 +1,478 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Building curl with HTTPS-RR and ECH support
We have added support for ECH to curl. It can use HTTPS RRs published in the
DNS if curl uses DoH, or else can accept the relevant ECHConfigList values
from the command line. This works with OpenSSL, wolfSSL, BoringSSL or AWS-LC as
the TLS provider.
This feature is EXPERIMENTAL. DO NOT USE IN PRODUCTION.
This should however provide enough of a proof-of-concept to prompt an informed
discussion about a good path forward for ECH support in curl.
## OpenSSL Build
To build our ECH-enabled OpenSSL fork:
```bash
cd $HOME/code
git clone https://github.com/defo-project/openssl
cd openssl
./config --libdir=lib --prefix=$HOME/code/openssl-local-inst
...stuff...
make -j8
...stuff (maybe go for coffee)...
make install_sw
...a little bit of stuff...
```
To build curl ECH-enabled, making use of the above:
```bash
cd $HOME/code
git clone https://github.com/curl/curl
cd curl
autoreconf -fi
LDFLAGS="-Wl,-rpath,$HOME/code/openssl-local-inst/lib/" ./configure --with-ssl=$HOME/code/openssl-local-inst --enable-ech
...lots of output...
WARNING: ECH HTTPSRR enabled but marked EXPERIMENTAL...
make
...lots more output...
```
If you do not get that WARNING at the end of the ``configure`` command, then
ECH is not enabled, so go back some steps and re-do whatever needs re-doing:-)
If you want to debug curl then you should add ``--enable-debug`` to the
``configure`` command.
In a recent (2024-05-20) build on one machine, configure failed to find the
ECH-enabled SSL library, apparently due to the existence of
``$HOME/code/openssl-local-inst/lib/pkgconfig`` as a directory containing
various settings. Deleting that directory worked around the problem but may
not be the best solution.
## Using ECH and DoH
curl supports using DoH for A/AAAA lookups so it was relatively easy to add
retrieval of HTTPS RRs in that situation. To use ECH and DoH together:
```bash
cd $HOME/code/curl
LD_LIBRARY_PATH=$HOME/code/openssl ./src/curl --ech true --doh-url https://one.one.one.one/dns-query https://defo.ie/ech-check.php
...
SSL_ECH_STATUS: success <img src="greentick-small.png" alt="good" /> <br/>
...
```
The output snippet above is within the HTML for the webpage, when things work.
The above works for these test sites:
```bash
https://defo.ie/ech-check.php
https://draft-13.esni.defo.ie:8413/stats
https://draft-13.esni.defo.ie:8414/stats
https://crypto.cloudflare.com/cdn-cgi/trace
https://tls-ech.dev
```
The list above has 4 different server technologies, implemented by 3 different
parties, and includes a case (the port 8414 server) where HelloRetryRequest
(HRR) is forced.
We currently support the following new curl command line arguments/options:
- ``--ech <config>`` - the ``config`` value can be one of:
- ``false`` says to not attempt ECH
- ``true`` says to attempt ECH, if possible
- ``grease`` if attempting ECH is not possible, then send a GREASE ECH extension
- ``hard`` hard-fail the connection if ECH cannot be attempted
- ``ecl:<b64value>`` a base64 encoded ECHConfigList, rather than one accessed from the DNS
- ``pn:<name>`` override the ``public_name`` from an ECHConfigList
Note that in the above "attempt ECH" means the client emitting a TLS
ClientHello with a "real" ECH extension, but that does not mean that the
relevant server can succeed in decrypting, as things can fail for other
reasons.
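For example, since ``grease`` needs no ECHConfig value at all, it can be tried
against more or less any TLS 1.3 server (build and library paths as in the
earlier examples):
```bash
LD_LIBRARY_PATH=$HOME/code/openssl ./src/curl --ech grease https://example.com/
```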
## Supplying an ECHConfigList on the command line
To supply the ECHConfigList on the command line, you might need a bit of
cut-and-paste, e.g.:
```bash
dig +short https defo.ie
1 . ipv4hint=213.108.108.101 ech=AED+DQA8PAAgACD8WhlS7VwEt5bf3lekhHvXrQBGDrZh03n/LsNtAodbUAAEAAEAAQANY292ZXIuZGVmby5pZQAA ipv6hint=2a00:c6c0:0:116:5::10
```
Then paste the base64 encoded ECHConfigList onto the curl command line:
```bash
LD_LIBRARY_PATH=$HOME/code/openssl ./src/curl --ech ecl:AED+DQA8PAAgACD8WhlS7VwEt5bf3lekhHvXrQBGDrZh03n/LsNtAodbUAAEAAEAAQANY292ZXIuZGVmby5pZQAA https://defo.ie/ech-check.php
...
SSL_ECH_STATUS: success <img src="greentick-small.png" alt="good" /> <br/>
...
```
The output snippet above is within the HTML for the webpage.
If you paste in the wrong ECHConfigList (it changes hourly for ``defo.ie``) you
should get an error like this:
```bash
LD_LIBRARY_PATH=$HOME/code/openssl ./src/curl -vvv --ech ecl:AED+DQA8yAAgACDRMQo+qYNsNRNj+vfuQfFIkrrUFmM4vogucxKj/4nzYgAEAAEAAQANY292ZXIuZGVmby5pZQAA https://defo.ie/ech-check.php
...
* OpenSSL/3.3.0: error:0A00054B:SSL routines::ech required
...
```
There is a reason to want this command line option - for use before publishing
an ECHConfigList in the DNS as per the Internet-draft [A well-known URI for
publishing ECHConfigList values](https://datatracker.ietf.org/doc/draft-ietf-tls-wkech/).
If you do use a wrong ECHConfigList value, then the server might return a
good value, via the ``retry_configs`` mechanism. You can see that value in
the verbose output, e.g.:
```bash
LD_LIBRARY_PATH=$HOME/code/openssl ./src/curl -vvv --ech ecl:AED+DQA8yAAgACDRMQo+qYNsNRNj+vfuQfFIkrrUFmM4vogucxKj/4nzYgAEAAEAAQANY292ZXIuZGVmby5pZQAA https://defo.ie/ech-check.php
...
* ECH: retry_configs AQD+DQA8DAAgACBvYqJy+Hgk33wh/ZLBzKSPgwxeop7gvojQzfASq7zeZQAEAAEAAQANY292ZXIuZGVmby5pZQAA/g0APEMAIAAgXkT5r4cYs8z19q5rdittyIX8gfQ3ENW4wj1fVoiJZBoABAABAAEADWNvdmVyLmRlZm8uaWUAAP4NADw2ACAAINXSE9EdXzEQIJZA7vpwCIQsWqsFohZARXChgPsnfI1kAAQAAQABAA1jb3Zlci5kZWZvLmllAAD+DQA8cQAgACASeiD5F+UoSnVoHvA2l1EifUVMFtbVZ76xwDqmMPraHQAEAAEAAQANY292ZXIuZGVmby5pZQAA
* ECH: retry_configs for defo.ie from cover.defo.ie, 319
...
```
At that point, you could copy the base64 encoded value above and try again.
For now, this only works for the OpenSSL and BoringSSL/AWS-LC builds.
## Default settings
curl has various ways to configure default settings, e.g. in ``$HOME/.curlrc``,
so one can set the DoH URL and enable ECH that way:
```bash
cat ~/.curlrc
doh-url=https://one.one.one.one/dns-query
silent
ech=true
```
Note that when you use the system's curl command (rather than our ECH-enabled
build), it is liable to warn that ``ech`` is an unknown option. If that is an
issue (e.g. if some script re-directs stdout and stderr somewhere) then adding
the ``silent`` line above seems to be a good enough fix. (Though of
course, yet another script could depend on non-silent behavior, so you may have
to figure out what you prefer yourself.) That seems to have changed with the
latest build; previously ``silent=TRUE`` was what I used in ``~/.curlrc``, but
now that seems to cause a problem, so that the following line(s) are ignored.
If you want to always use our OpenSSL build you can set ``LD_LIBRARY_PATH``
in the environment:
```bash
export LD_LIBRARY_PATH=$HOME/code/openssl
```
When you do the above, there can be a mismatch between OpenSSL versions
for applications that check that. A ``git push`` for example fails so you
should unset ``LD_LIBRARY_PATH`` before doing that or use a different shell.
```bash
git push
OpenSSL version mismatch. Built against 30000080, you have 30200000
...
```
With all that setup as above the command line gets simpler:
```bash
./src/curl https://defo.ie/ech-check.php
...
SSL_ECH_STATUS: success <img src="greentick-small.png" alt="good" /> <br/>
...
```
The ``--ech true`` option is opportunistic, so it tries to do ECH but does not fail if
the client for example cannot find any ECHConfig values. The ``--ech hard``
option hard-fails if there is no ECHConfig found in DNS, so for now, that is not
a good option to set as a default. Once ECH has really been attempted by
the client, if decryption on the server side fails, then curl fails.
## Code changes for ECH support when using DoH
Code changes are ``#ifdef`` protected via ``USE_ECH`` or ``USE_HTTPSRR``:
- ``USE_HTTPSRR`` is used for HTTPS RR retrieval code that could be generically
used should non-ECH uses for HTTPS RRs be identified, e.g. use of ALPN values
or IP address hints.
- ``USE_ECH`` protects ECH specific code.
There are various obvious code blocks for handling the new command line
arguments which are not described here, but should be fairly clear.
As shown in the ``configure`` usage above, there are ``configure.ac`` changes
that allow separately dis/enabling ``USE_HTTPSRR`` and ``USE_ECH``. If ``USE_ECH``
is enabled, then ``USE_HTTPSRR`` is forced. In both cases ``CURL_DISABLE_DOH``
must not be enabled. (There may be some configuration conflicts available for the
determined :-)
The main functional change, as you would expect, is in ``lib/vtls/openssl.c``
where an ECHConfig, if available from command line or DNS cache, is fed into
the OpenSSL library via the new APIs implemented in our OpenSSL fork for that
purpose. This code also implements the opportunistic (``--ech true``) or hard-fail
(``--ech hard``) logic.
Other than that, the main additions are in ``lib/doh.c``
where we reuse ``dohprobe()`` to retrieve an HTTPS RR value for the target
domain. If such a value is found, that is stored using a new ``doh_store_https()``
function in a new field in the ``dohentry`` structure.
The qname for the DoH query is modified if the port number is not 443, as
defined in the SVCB specification.
When the DoH process has worked, ``Curl_doh_is_resolved()`` now also returns
the relevant HTTPS RR value data in the ``Curl_dns_entry`` structure.
That is later accessed when the TLS session is being established, if ECH is
enabled (from ``lib/vtls/openssl.c`` as described above).
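That port-based qname rewrite can be observed directly in DNS (an illustrative
``dig`` query; the hostname and port are only examples and the prefixed form
follows the SVCB/HTTPS RR specification):
```bash
# Default port (443): the HTTPS RR is queried for the plain name.
dig +short defo.ie https
# Non-default port (e.g. 8443): the qname gets a _port._https. prefix.
dig +short _8443._https.defo.ie https
```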
## Limitations
Things that need fixing, but that can probably be ignored for the
moment:
- We could easily add code to make use of an ``alpn=`` value found in an HTTPS
RR, passing that on to OpenSSL for use as the "inner" ALPN value, but have
yet to do that.
Current limitations (more interesting than the above):
- Only the first HTTPS RR value retrieved is actually processed as described
above, that could be extended in future, though picking the "right" HTTPS RR
could be non-trivial if multiple RRs are published - matching IP address hints
versus A/AAAA values might be a good basis for that. Last I checked, browsers
supporting ECH did not handle multiple HTTPS RRs well, though that needs
re-checking as it has been a while.
- It is unclear how one should handle any IP address hints found in an HTTPS RR.
It may be that a bit of consideration of how "multi-CDN" deployments might
emerge would provide good answers there, but for now, it is not clear how best
curl might handle those values when present in the DNS.
- The SVCB/HTTPS RR specification supports a new "CNAME at apex" indirection
("aliasMode") - the current code takes no account of that at all. One could
envisage implementing the equivalent of following CNAMEs in such cases, but
it is not clear if that'd be a good plan. (As of now, Chrome browsers do not
seem to have any support for that "aliasMode" and we have not checked Firefox
for that recently.)
- We have not investigated what related changes or additions might be needed
for applications using libcurl, as opposed to use of curl as a command line
tool.
- We have not yet implemented tests as part of the usual curl test harness as
doing so would seem to require re-implementing an ECH-enabled server as part
of the curl test harness. For now, we have a ``./tests/ech_test.sh`` script
that attempts ECH with various test servers and with many combinations of the
allowed command line options. While that is a useful test and has found issues,
it is not comprehensive and we are not (as yet) sure what would be the right
level of coverage. When running that script you should not have a
``$HOME/.curlrc`` file that affects ECH or some of the negative tests could
produce spurious failures.
## Building with cmake
To build with cmake, assuming our ECH-enabled OpenSSL is as before:
```bash
cd $HOME/code
git clone https://github.com/curl/curl
cd curl
mkdir build
cd build
cmake -DOPENSSL_ROOT_DIR=$HOME/code/openssl -DUSE_ECH=1 ..
...
make
...
[100%] Built target curl
```
The binary produced by the cmake build does not need any ECH-specific
``LD_LIBRARY_PATH`` setting.
## BoringSSL build
BoringSSL is also supported by curl and also supports ECH, so to build
with that, instead of our ECH-enabled OpenSSL:
```bash
cd $HOME/code
git clone https://boringssl.googlesource.com/boringssl
cd boringssl
cmake -DCMAKE_INSTALL_PREFIX:PATH=$HOME/code/boringssl/inst -DBUILD_SHARED_LIBS=1
make
...
make install
```
Then:
```bash
cd $HOME/code
git clone https://github.com/curl/curl
cd curl
autoreconf -fi
LDFLAGS="-Wl,-rpath,$HOME/code/boringssl/inst/lib" ./configure --with-ssl=$HOME/code/boringssl/inst --enable-ech
...lots of output...
WARNING: ECH HTTPSRR enabled but marked EXPERIMENTAL. Use with caution.
make
```
The BoringSSL/AWS-LC APIs are fairly similar to those in our ECH-enabled
OpenSSL fork, so code changes are also in ``lib/vtls/openssl.c``, protected
via ``#ifdef OPENSSL_IS_BORINGSSL`` and are mostly obvious API variations.
The BoringSSL/AWS-LC APIs however do not support the ``--ech pn:`` command
line variant as of now.
## wolfSSL build
wolfSSL also supports ECH and can be used by curl, so here's how:
```bash
cd $HOME/code
git clone https://github.com/wolfSSL/wolfssl
cd wolfssl
./autogen.sh
./configure --prefix=$HOME/code/wolfssl/inst --enable-ech --enable-debug --enable-opensslextra
make
make install
```
The install prefix (``inst``) in the above causes wolfSSL to be installed there
and we seem to need that for the curl configure command to work out. The
``--enable-opensslextra`` turns out (after much faffing about;-) to be
important or else we get build problems with curl below.
```bash
cd $HOME/code
git clone https://github.com/curl/curl
cd curl
autoreconf -fi
./configure --with-wolfssl=$HOME/code/wolfssl/inst --enable-ech
make
```
There are some known issues with the ECH implementation in wolfSSL:
- The main issue is that the client currently handles HelloRetryRequest
incorrectly ([HRR issue](https://github.com/wolfSSL/wolfssl/issues/6802)).
The HRR issue means that the client does not work for
[this ECH test web site](https://tls-ech.dev) and any other similarly configured
sites.
- There is also an issue related to so-called middlebox compatibility mode
([middlebox compatibility issue](https://github.com/wolfSSL/wolfssl/issues/6774)).
### Code changes to support wolfSSL
There are what seem like oddball differences:
- The DoH URL in ``$HOME/.curlrc`` can use `1.1.1.1` for OpenSSL but has to be
`one.one.one.one` for wolfSSL. The latter works for both, so OK, we use that.
- There seems to be some difference in CA databases too - the wolfSSL version
does not like ``defo.ie``, whereas the system and OpenSSL ones do. We can
ignore that for our purposes via ``--insecure``/``-k`` but would need to fix
for a real setup. (Browsers do like those certificates though.)
Then there are some functional code changes:
- tweak to ``configure.ac`` to check if wolfSSL has ECH or not
- added code to ``lib/vtls/wolfssl.c`` mirroring what's done in the
OpenSSL equivalent above.
- wolfSSL does not support ``--ech false`` or the ``--ech pn:`` command line
argument.
The lack of support for ``--ech false`` is because wolfSSL has decided to
always at least GREASE if built to support ECH. In other words, GREASE is
a compile time choice for wolfSSL, but a runtime choice for OpenSSL or
BoringSSL/AWS-LC. (Both are reasonable.)
## Additional notes
### Supporting ECH without DoH
All of the above only applies if DoH is being used. There should be a use-case
for ECH when DoH is not used by curl - if a system stub resolver supports DoT
or DoH, then, considering only ECH and the network threat model, it would make
sense for curl to support ECH without curl itself using DoH. The author for
example uses a combination of stubby+unbound as the system resolver listening
on localhost:53, so would fit this use-case. That said, it is unclear if
this is a niche that is worth trying to address. (The author is just as happy to
let curl use DoH to talk to the same public recursive that stubby might use:-)
Assuming for the moment this is a use-case we would like to support, then if
DoH is not being used by curl, it is not clear at this time how to provide
support for ECH. One option would seem to be to extend the ``c-ares`` library
to support HTTPS RRs, but in that case it is not now clear whether such
changes would be attractive to the ``c-ares`` maintainers, nor whether the
"tag=value" extensibility inherent in the HTTPS/SVCB specification is a good
match for the ``c-ares`` approach of defining structures specific to decoded
answers for each supported RRtype. We are also not sure how many downstream
curl deployments actually make use of the ``c-ares`` library, which would
affect the utility of such changes. Another option might be to consider using
some other generic DNS library that does support HTTPS RRs, but it is unclear
if such a library could or would be used by all or almost all curl builds and
downstream releases of curl.
Our current conclusion is that doing the above is likely best left until we
have some experience with the "using DoH" approach, so we are going to punt on
this for now.
### Debugging
Just a note to self as remembering this is a nuisance:
```bash
LD_LIBRARY_PATH=$HOME/code/openssl:./lib/.libs gdb ./src/.libs/curl
```
### Localhost testing
It can be useful to be able to run against a localhost OpenSSL ``s_server``
for testing. We have published instructions for such
[localhost tests](https://github.com/defo-project/ech-dev-utils/blob/main/howtos/localhost-tests.md)
in another repository. Once you have that set up, you can start a server
and then run curl against that:
```bash
cd $HOME/code/ech-dev-utils
./scripts/echsvr.sh -d
...
```
The ``echsvr.sh`` script supports many ECH-related options. Use ``echsvr.sh -h``
for details.
In another window:
```bash
cd $HOME/code/curl/
./src/curl -vvv --insecure --connect-to foo.example.com:8443:localhost:8443 --ech ecl:AD7+DQA6uwAgACBix2B78sX+EQhEbxMspDOc8Z3xVS5aQpYP0Cxpc2AWPAAEAAEAAQALZXhhbXBsZS5jb20AAA==
```
### Automated use of ``retry_configs`` not supported so far...
As of now we have not added support for using ``retry_config`` handling in the
application - for a command line tool, one can just use ``dig`` (or ``kdig``)
to get the HTTPS RR and pass the ECHConfigList from that on the command line,
if needed, or one can access the value from command line output in verbose mode
and then reuse that in another invocation.
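For example (a sketch; the extraction below assumes the usual ``dig``
presentation format where the ECHConfigList shows up as an ``ech=`` parameter
in the HTTPS RR):
```bash
# Fetch the HTTPS RR, pull out the ech= value and feed it straight to curl.
# (dig presentation format differs slightly between versions, hence the tr.)
ECL=$(dig +short defo.ie https | tr -d '"' | sed -n 's/.*ech=\([^ ]*\).*/\1/p' | head -1)
./src/curl --ech ecl:"$ECL" https://defo.ie/ech-check.php
```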
Both our OpenSSL fork and BoringSSL/AWS-LC have APIs for controlling GREASE
and for accessing and logging ``retry_configs``; wolfSSL seems to have neither.

90
deps/curl/docs/EXPERIMENTAL.md vendored Normal file
View File

@ -0,0 +1,90 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Experimental
Some features and functionality in curl and libcurl are considered
**EXPERIMENTAL**.
Experimental support in curl means:
1. Experimental features are provided to allow users to try them out and
provide feedback on functionality and API etc before they ship and get
"carved in stone".
2. You must enable the feature when invoking configure as otherwise curl is
not built with the feature present.
3. We strongly advise against using this feature in production.
4. **We reserve the right to change behavior** of the feature without sticking
to our API/ABI rules as we do for regular features, as long as it is marked
experimental.
5. Experimental features are clearly marked so in documentation. Beware.
## Graduation
1. Each experimental feature should have a set of documented requirements of
what is needed for that feature to graduate. Graduation means being removed
from the list of experiments.
2. An experiment should NOT graduate if it needs test cases to be disabled,
unless they are for minor features that are clearly documented as not
provided by the experiment; in that case the disabling should be managed
inside each affected test case.
## Experimental features right now
### HTTP/3 support (non-ngtcp2 backends)
Graduation requirements:
- The used libraries should be considered out-of-beta with a reasonable
expectation of a stable API going forward.
- Using HTTP/3 with the given build should perform without risking busy-loops
### The Rustls backend
Graduation requirements:
- a reasonable expectation of a stable API going forward.
### ECH
Use of the HTTPS resource record and Encrypted Client Hello (ECH) when using
DoH.
Graduation requirements:
- ECH support exists in at least one widely used TLS library apart from
BoringSSL and wolfSSL.
- feedback from users saying that ECH works for their use cases
- it has been given time to mature, so no earlier than April 2025 (twelve
months after being added here)
### SSL session import/export
Import/Export of SSL session tickets in libcurl and curl command line
option '--ssl-session <filename>' for faster TLS handshakes and use
of TLSv1.3/QUIC Early Data (0-RTT).
Graduation requirements:
- the implementation is considered safe
- feedback from users saying that session export works for their use cases
### HTTPS RR
HTTPS records support is a requirement for ECH but is provided as a
stand-alone feature that is itself considered EXPERIMENTAL.
Graduation requirements:
- HTTPS records work for DoH, c-ares and the threaded resolver
- HTTPS records can control ALPN and port number, at least
- There are options to control HTTPS use

249
deps/curl/docs/FEATURES.md vendored Normal file
View File

@ -0,0 +1,249 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Features -- what curl can do
## curl tool
- config file support
- multiple URLs in a single command line
- range "globbing" support: [0-13], {one,two,three}
- multiple file upload on a single command line
- redirect stderr
- parallel transfers
## libcurl
- URL RFC 3986 syntax
- custom maximum download time
- custom lowest download speed acceptable
- custom output result after completion
- guesses protocol from hostname unless specified
- supports .netrc
- progress bar with time statistics while downloading
- standard proxy environment variables support
- have run on 101 operating systems and 28 CPU architectures
- selectable network interface for outgoing traffic
- IPv6 support on Unix and Windows
- happy eyeballs dual-stack IPv4 + IPv6 connects
- persistent connections
- SOCKS 4 + 5 support, with or without local name resolving
- *pre-proxy* support, for *proxy chaining*
- supports username and password in proxy environment variables
- operations through HTTP proxy "tunnel" (using CONNECT)
- replaceable memory functions (malloc, free, realloc, etc)
- asynchronous name resolving
- both a push and a pull style interface
- international domain names (IDN)
- transfer rate limiting
- stable API and ABI
- TCP keep alive
- TCP Fast Open
- DNS cache (that can be shared between transfers)
- non-blocking single-threaded parallel transfers
- Unix domain sockets to server or proxy
- DNS-over-HTTPS
- uses non-blocking name resolves
- selectable name resolver backend
## URL API
- parses RFC 3986 URLs
- generates URLs from individual components
- manages "redirects"
## Header API
- easy access to HTTP response headers, from all contexts
- named headers
- iterate over headers
## TLS
- selectable TLS backend(s)
- TLS False Start
- TLS version control
- TLS session resumption
- key pinning
- mutual authentication
- Use dedicated CA cert bundle
- Use OS-provided CA store
- separate TLS options for HTTPS proxy
## HTTP
- HTTP/0.9 responses are optionally accepted
- HTTP/1.0
- HTTP/1.1
- HTTP/2, including multiplexing and server push
- GET
- PUT
- HEAD
- POST
- multipart formpost (RFC 1867-style)
- authentication: Basic, Digest, NTLM (9) and Negotiate (SPNEGO)
to server and proxy
- resume transfers
- follow redirects
- maximum amount of redirects to follow
- custom HTTP request
- cookie get/send fully parsed
- reads/writes the Netscape cookie file format
- custom headers (replace/remove internally generated headers)
- custom user-agent string
- custom referrer string
- range
- proxy authentication
- time conditions
- via HTTP proxy, HTTPS proxy or SOCKS proxy
- HTTP/2 or HTTP/1.1 to HTTPS proxy
- retrieve file modification date
- Content-Encoding support for deflate, gzip, brotli and zstd
- "Transfer-Encoding: chunked" support in uploads
- HSTS
- alt-svc
- ETags
- HTTP/1.1 trailers, both sending and getting
## HTTPS
- HTTP/3
- using client certificates
- verify server certificate
- via HTTP proxy, HTTPS proxy or SOCKS proxy
- select desired encryption
- select usage of a specific TLS version
- ECH
## FTP
- download
- authentication
- Kerberos 5
- active/passive using PORT, EPRT, PASV or EPSV
- single file size information (compare to HTTP HEAD)
- 'type=' URL support
- directory listing
- directory listing names-only
- upload
- upload append
- upload via http-proxy as HTTP PUT
- download resume
- upload resume
- custom ftp commands (before and/or after the transfer)
- simple "range" support
- via HTTP proxy, HTTPS proxy or SOCKS proxy
- all operations can be tunneled through proxy
- customizable to retrieve file modification date
- no directory depth limit
## FTPS
- implicit `ftps://` support that uses SSL on both connections
- explicit "AUTH TLS" and "AUTH SSL" usage to "upgrade" plain `ftp://`
connection to use SSL for both or one of the connections
## SSH (both SCP and SFTP)
- selectable SSH backend
- known hosts support
- public key fingerprinting
- both password and public key auth
## SFTP
- both password and public key auth
- with custom commands sent before/after the transfer
- directory listing
## TFTP
- download
- upload
## TELNET
- connection negotiation
- custom telnet options
- stdin/stdout I/O
## LDAP
- full LDAP URL support
## DICT
- extended DICT URL support
## FILE
- URL support
- upload
- resume
## SMB
- SMBv1 over TCP and SSL
- download
- upload
- authentication with NTLMv1
## SMTP
- authentication: Plain, Login, CRAM-MD5, Digest-MD5, NTLM, Kerberos 5 and
External
- send emails
- mail from support
- mail size support
- mail auth support for trusted server-to-server relaying
- multiple recipients
- via http-proxy
## SMTPS
- implicit `smtps://` support
- explicit "STARTTLS" usage to "upgrade" plain `smtp://` connections to use SSL
- via http-proxy
## POP3
- authentication: Clear Text, APOP and SASL
- SASL based authentication: Plain, Login, CRAM-MD5, Digest-MD5, NTLM,
Kerberos 5 and External
- list emails
- retrieve emails
- enhanced command support for: CAPA, DELE, TOP, STAT, UIDL and NOOP via
custom requests
- via http-proxy
## POP3S
- implicit `pop3s://` support
- explicit `STLS` usage to "upgrade" plain `pop3://` connections to use SSL
- via http-proxy
## IMAP
- authentication: Clear Text and SASL
- SASL based authentication: Plain, Login, CRAM-MD5, Digest-MD5, NTLM,
Kerberos 5 and External
- list the folders of a mailbox
- select a mailbox with support for verifying the `UIDVALIDITY`
- fetch emails with support for specifying the UID and SECTION
- upload emails via the append command
- enhanced command support for: EXAMINE, CREATE, DELETE, RENAME, STATUS,
STORE, COPY and UID via custom requests
- via http-proxy
## IMAPS
- implicit `imaps://` support
- explicit "STARTTLS" usage to "upgrade" plain `imap://` connections to use SSL
- via http-proxy
## MQTT
- Subscribe to and publish topics using URL scheme `mqtt://broker/topic`

202
deps/curl/docs/GOVERNANCE.md vendored Normal file
View File

@ -0,0 +1,202 @@
<!--
Copyright (C) Daniel Stenberg, <daniel@haxx.se>, et al.
SPDX-License-Identifier: curl
-->
# Decision making in the curl project
A rough guide to how we make decisions and who does what.
## BDFL
This project was started by and has to some extent been pushed forward over
the years with Daniel Stenberg as the driving force. It matches a standard
BDFL (Benevolent Dictator For Life) style project.
This setup has been used due to convenience and the fact that it has worked
fine this far. It is not because someone thinks of it as a superior project
leadership model. It also only works as long as Daniel manages to listen in to
what the project and the general user population wants and expects from us.
## Legal entity
There is no legal entity. The curl project is just a bunch of people scattered
around the globe with the common goal to produce source code that creates
great products. We are not part of any umbrella organization and we are not
located in any specific country. We are totally independent.
The copyrights in the project are owned by the individuals and organizations
that wrote those parts of the code.
## Decisions
The curl project is not a democracy, but everyone is entitled to state their
opinion and may argue for their sake within the community.
All and any changes that have been done or are done are eligible to bring up
for discussion, to object to or to praise. Ideally, we find consensus for the
appropriate way forward in any given situation or challenge.
If there is no obvious consensus, a maintainer who's knowledgeable in the
specific area takes an "executive" decision that they think is right for
the project.
## Donations
Donating plain money to curl is best done to curl's [Open Collective
fund](https://opencollective.com/curl). Open Collective is a US based
non-profit organization that holds on to funds for us. This fund is then used
for paying the curl security bug bounties, to reimburse project related
expenses etc.
Donations to the project can also come in the form of server hosting, providing
services and paying for people to work on curl related code etc. Usually, such
donations are services paid for directly by the sponsors.
We grade sponsors in a few different levels and if they meet the criteria,
they can be mentioned on the Sponsors page on the curl website.
## Commercial Support
The curl project does not do or offer commercial support. It only hosts
mailing lists, runs bug trackers etc to facilitate communication and work.
However, Daniel works for wolfSSL and we offer commercial curl support there.
# Key roles
## User
Someone who uses or has used curl or libcurl.
## Contributor
Someone who has helped the curl project, who has contributed to bring it
forward. Contributing could be to provide advice, debug a problem, file a bug
report, run test infrastructure or write code etc.
## Commit author
Sometimes also called 'committer'. Someone who has authored a commit in the
curl source code repository. Committers are recorded as `Author` in git.
## Maintainers
A maintainer in the curl project is an individual who has been given
permissions to push commits to one of the git repositories.
Maintainers are free to push commits to the repositories as they see fit.
Maintainers are however expected to listen to feedback from users and any
change that is non-trivial in size or nature *should* be brought to the
project as a Pull-Request (PR) to allow others to comment/object before merge.
## Former maintainers
A maintainer who stops being active in the project gets their push permissions
removed at some point. We do this for security reasons but also to make sure
that we always have the list of maintainers as "the team that pushes stuff to
curl".
Getting push permissions removed is not a punishment. Everyone who ever worked
on maintaining curl is considered a hero, for all time hereafter.
## Security team members
We have a security team. That is the team of people who are subscribed to the
curl-security mailing list; the receivers of security reports from users and
developers. This list of people varies over time but they are all skilled
developers familiar with the curl project.
The security team works best when it consists of a small set of active
persons. We invite new members when the team seems to need it, and we also
expect to retire security team members as they "drift off" from the project or
just find themselves unable to perform their duties there.
## Core team
There is a curl core team. It currently has the same set of members as the
security team. It can also be reached on the security email address.
The core team nominates and invites new members to the team when it sees fit.
There is no open member voting or formal ways to be a candidate. Active
participants in the curl project who want to join the core team can ask to
join.
The core team is a board of advisors. It deals with project management
subjects that need confidentiality or for other reasons cannot be dealt with
and discussed in the open (for example reports of code of conduct violations).
Project matters should always as far as possible be discussed on open mailing
lists.
## Server admins
We run a web server, a mailing list and more on the curl project's primary
server. That physical machine is owned and run by Haxx. Daniel is the primary
admin of all things curl related server stuff, but Björn Stenberg and Linus
Feltzing serve as backup admins for when Daniel is gone or unable.
The primary server is paid for by Haxx. The machine is physically located in a
server bunker in Stockholm Sweden, operated by the company Glesys.
The website contents are served to the web via Fastly and Daniel is the
primary curl contact with Fastly.
## BDFL
That is Daniel.
# Maintainers
A curl maintainer is a project volunteer who has the authority and rights to
merge changes into a git repository in the curl project.
Anyone can aspire to become a curl maintainer.
### Duties
There are no mandatory duties. We hope and wish that maintainers consider
reviewing patches and help merging them, especially when the changes are
within the area of personal expertise and experience.
### Requirements
- only merge code that meets our quality and style guide requirements.
- *never* merge code without doing a PR first, unless the change is "trivial"
- if in doubt, ask for input/feedback from others
### Recommendations
- we require two-factor authentication enabled on your GitHub account to
reduce risk of malicious source code tampering
- consider enabling signed git commits for additional verification of changes
### Merge advice
When you are merging patches/pull requests...
- make sure the commit messages follow our template
- squash patch sets into a few logical commits even if the PR did not, if
necessary
- avoid the "merge" button on GitHub, do it "manually" instead to get full
control and full audit trail (GitHub leaves out you as "Committer:")
- remember to credit the reporter and the helpers.
## Who are maintainers?
The [list of maintainers](https://github.com/orgs/curl/people). Be aware that
the level of presence and activity in the project vary greatly between
different individuals and over time.
### Become a maintainer?
If you think you can help make the project better by shouldering some
maintaining responsibilities, then please get in touch.
You are expected to be familiar with the curl project and its ways of working.
You need to have gotten a few quality patches merged as a proof of this.
### Stop being a maintainer
If you no longer appear to be active in the project, you may be removed as
a maintainer. Thank you for your service.

Some files were not shown because too many files have changed in this diff.