wasmtime/.azure-pipelines.yml
Alex Crichton 47eee434d4 Build Linux binary in an older docker container (#268)
Currently our Linux binaries aren't quite as portable as they otherwise could
be. There are two primary reasons for this. The first is that the binary is
produced in a relatively recent Linux distribution (Ubuntu 16.04), which means
it requires a correspondingly recent glibc at runtime. On OSX/Windows we can
set some flags to rely on older libc implementations, but on Linux we have to
actually build against an older version.
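
For reference, both of the OSX/Windows fixes are plain environment variables;
expressed as shell exports they're equivalent to what the Build job later in
this file sets:

    # Windows: statically link the MSVC C runtime into the binary.
    export RUSTFLAGS=-Ctarget-feature=+crt-static
    # OSX: target an older libSystem so binaries run on older OS releases.
    export MACOSX_DEPLOYMENT_TARGET=10.9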

This commit switches the build container to CentOS 6 instead of the default
Ubuntu 16.04. The main trick here is finding a C++11-capable compiler to
compile wabt; it turns out there's a helpful tutorial for this at
https://edwards.sdsu.edu/research/c11-on-centos-6/ and it was as easy as
installing a few packages.
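
Distilled into shell commands, the recipe (as encoded in the Build_linux job
later in this file) is roughly:

    # Enable Software Collections, then install a C++11-capable toolchain.
    sudo yum install -y centos-release-scl
    sudo yum install -y devtoolset-8-gcc devtoolset-8-binutils devtoolset-8-gcc-c++
    # Put the devtoolset compilers ahead of the ancient system gcc.
    export PATH=/opt/rh/devtoolset-8/root/usr/bin:$PATH

The nice property of devtoolset is that it's a modern gcc which still links
against the old system glibc, so we get C++11 support without raising the
glibc floor.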

The second portability concern is that our Linux binaries link dynamically
against `libstdc++.so`, which isn't always installed on target systems, and
even when it is it may be too old or have a different ABI. This is solved by
statically linking `libstdc++.a` into the Azure build, via a bit of trickery
around which libraries the linker can find.
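
A sketch of that trickery, assuming the binary ends up at
`target/release/wasmtime` (the path here is illustrative): the usual flag for
this would be g++'s `-static-libstdc++`, but since the final link is driven by
cargo rather than g++, removing the shared library so the linker falls back to
the static archive is the path of least resistance.

    # With libstdc++.so gone, -lstdc++ can only resolve to libstdc++.a.
    sudo rm -f /opt/rh/devtoolset-8/root/usr/lib/gcc/x86_64-redhat-linux/8/libstdc++.so
    # Sanity check: libstdc++ should no longer appear as a runtime dependency.
    ldd target/release/wasmtime | grep libstdc++ || echo "libstdc++ is linked statically"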

With these changes the glibc requirement drops from 2.18 (released in 2013)
to 2.6 (released in 2007), and users no longer need `libstdc++.so` installed.
We may eventually want to investigate fully-static musl binaries, but finding
a musl compiler for C++ is something I'm not that good at, so I figure this is
probably good enough for now.
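
One way to check a number like that, assuming binutils is installed and using
the same illustrative binary path as above, is to look at the highest
versioned glibc symbol the binary imports:

    # The largest GLIBC_x.y version referenced is the minimum glibc required.
    objdump -T target/release/wasmtime | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -1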
2019-08-10 01:07:16 +02:00


name: $(Build.SourceBranch)-$(date:yyyyMMdd)$(rev:.r)
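# Run CI for pushes to master and for every tag except the rolling "dev" tag,
# which the Publish job below deletes and re-creates on each master build.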
trigger:
  branches:
    include:
      - 'master'
  tags:
    include:
      - '*'
    exclude:
      - 'dev'
jobs:
- job: rustfmt
  pool:
    vmImage: 'macos-10.14'
  steps:
    - checkout: self
      submodules: true
    - template: ci/azure-install-rust.yml
    - script: rustup component add rustfmt
      displayName: Add rustfmt
    - script: cargo fmt --all -- --check
      displayName: Check formatting
  variables:
    toolchain: stable
# Smoke test to build docs on one builder, using OSX for now since it's the
# fastest
- job: docs
  pool:
    vmImage: 'macos-10.14'
  steps:
    - checkout: self
      submodules: true
    - template: ci/azure-install-rust.yml
    - script: cargo doc
      displayName: Build documentation
  variables:
    toolchain: stable
- job: Test
  strategy:
    matrix:
      windows-stable:
        imageName: 'vs2017-win2016'
        toolchain: stable
      linux-stable:
        imageName: 'ubuntu-16.04'
        toolchain: stable
      mac-stable:
        imageName: 'macos-10.14'
        toolchain: stable
      mac-beta:
        imageName: 'macos-10.14'
        toolchain: beta
      mac-nightly:
        imageName: 'macos-10.14'
        toolchain: nightly
  pool:
    vmImage: $(imageName)
  steps:
    - checkout: self
      submodules: true
    - template: ci/azure-install-rust.yml
    - script: cargo fetch
      displayName: Fetch cargo dependencies
    # Build and test all features except for lightbeam
    - bash: cargo test --all --exclude lightbeam --exclude wasmtime-wasi-c
      displayName: Cargo test
      env:
        RUST_BACKTRACE: 1
    # Build and test lightbeam if we're using the nightly toolchain
    - bash: cargo build --package lightbeam
      displayName: Cargo build lightbeam
      condition: and(succeeded(), eq(variables['toolchain'], 'nightly'))
    - bash: cargo test --package lightbeam
      displayName: Cargo test lightbeam
      # Lightbeam tests fail right now, but we don't want to block on that.
      continueOnError: true
      condition: and(succeeded(), eq(variables['toolchain'], 'nightly'))
      env:
        RUST_BACKTRACE: 1
- job: Build
  strategy:
    matrix:
      windows:
        imageName: 'vs2017-win2016'
        # Statically link against msvcrt to produce slightly more portable
        # binaries on Windows by reducing our binary compatibility requirements.
        RUSTFLAGS: -Ctarget-feature=+crt-static
      mac:
        imageName: 'macos-10.14'
        # Lower the deployment target from our build image in an attempt to
        # build more portable binaries that run on older releases. Note that
        # 10.9 here is arbitrarily chosen and just happens to be the lowest that
        # works at this time. Raising this is probably fine.
        MACOSX_DEPLOYMENT_TARGET: 10.9
  variables:
    toolchain: stable
  pool:
    vmImage: $(imageName)
  steps:
    - template: ci/azure-build-release.yml
# Build the Linux release binary in an older Linux container (in this case
# CentOS 6)
- job: Build_linux
  variables:
    toolchain: stable
  container:
    image: centos:6
    options: "--name ci-container -v /usr/bin/docker:/tmp/docker:ro"
  steps:
    # We're executing in the container as non-root but `yum` requires root. We
    # need to install `sudo` but to do that we need `sudo`. Do a bit of a weird
    # hack where we use the host `docker` executable to re-execute in our own
    # container with the root user to install `sudo`.
    - bash: /tmp/docker exec -t -u 0 ci-container sh -c "yum install -y sudo"
      displayName: Configure sudo
    # See https://edwards.sdsu.edu/research/c11-on-centos-6/ for where these
    # various commands came from.
    - bash: |
        set -e
        sudo yum install -y centos-release-scl cmake xz
        sudo yum install -y devtoolset-8-gcc devtoolset-8-binutils devtoolset-8-gcc-c++
        echo "##vso[task.prependpath]/opt/rh/devtoolset-8/root/usr/bin"
      displayName: Install system dependencies
    # Delete `libstdc++.so` to force gcc to link against `libstdc++.a` instead.
    # This is a hack and not the right way to do this, but it ends up doing the
    # right thing for now.
    - bash: sudo rm -f /opt/rh/devtoolset-8/root/usr/lib/gcc/x86_64-redhat-linux/8/libstdc++.so
      displayName: Force a static libstdc++
    - template: ci/azure-build-release.yml
- job: Publish
  dependsOn:
    - Build
    - Build_linux
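  # Only publish artifacts for push-triggered CI builds (not PRs or manual runs).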
  condition: and(succeeded(), in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI'))
  steps:
    # Checking out the sources is needed to be able to delete the "dev" tag, see below.
    - checkout: self
      persistCredentials: true
      submodules: false
    - task: DownloadPipelineArtifact@1
      inputs:
        targetPath: $(Build.ArtifactStagingDirectory)
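    # Extract the tag name from the refs/tags/<tag> ref so the release steps
    # below can reference it as $(tagName).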
    - script: |
        echo "##vso[task.setvariable variable=tagName;]`echo $BUILD_SOURCEBRANCH | sed -e 's|refs/tags/||'`"
      condition: startsWith(variables['Build.SourceBranch'], 'refs/tags/')
    - task: GitHubRelease@0
      inputs:
        gitHubConnection: 'tschneidereit-releases'
        target: '$(Build.SourceVersion)'
        tagSource: 'manual'
        tag: '$(tagName)'
        title: 'Wasmtime $(tagName)'
        assets: '$(Build.ArtifactStagingDirectory)/**'
        isDraft: false
        isPreRelease: true
      condition: and(startsWith(variables['Build.SourceBranch'], 'refs/tags/'), ne(variables['Build.SourceBranch'], 'refs/tags/dev'))
    # GitHub doesn't support doing rolling releases for a branch.
    # To simulate that for dev builds, always do a release for the "dev" tag.
    # While the `edit` action for the GitHubRelease task would replace any assets
    # associated with the tag, it wouldn't update the tag itself. Hence, delete the
    # tag if it exists, and re-create it every time.
    # Also explicitly delete the GitHub release, which would otherwise turn into a draft
    # and linger forever.
    - task: GitHubRelease@0
      inputs:
        gitHubConnection: 'tschneidereit-releases'
        action: 'delete'
        tag: 'dev'
      # This might fail in case the target repo doesn't yet have this tag, which is fine.
      continueOnError: true
      condition: in(variables['Build.SourceBranch'], 'refs/heads/master', 'refs/tags/dev')
    - script: |
        git -c http.extraheader="AUTHORIZATION: basic ***" push origin :dev
      # This might fail in case the target repo doesn't yet have this tag, which is fine.
      continueOnError: true
      condition: in(variables['Build.SourceBranch'], 'refs/heads/master', 'refs/tags/dev')
    - task: GitHubRelease@0
      inputs:
        gitHubConnection: 'tschneidereit-releases'
        action: 'create'
        target: '$(Build.SourceVersion)'
        tag: 'dev'
        tagSource: 'manual'
        title: 'Latest CI build'
        assets: '$(Build.ArtifactStagingDirectory)/**'
        isDraft: false
        isPreRelease: true
      condition: in(variables['Build.SourceBranch'], 'refs/heads/master', 'refs/tags/dev')