For wasm programs using SIMD vector types, the type known at function entry or exit may not match the type used within the body of the function, so we have to bitcast them. We have to check all calls and returns for this condition, which involves comparing a subset of a function's signature with the CLIF types we're trying to use.

Currently, this check heap-allocates a short-lived Vec for the appropriate subset of the signature. But most of the time none of the values need a bitcast. So this patch avoids allocating unless at least one bitcast is actually required, and only saves the pointers to values that need fixing up.

I leaned heavily on iterators to keep space usage constant until we discover allocation is necessary after all. I don't think it's possible to eliminate the allocation entirely, because the function signature we're examining is borrowed from the FuncBuilder, but we need to mutably borrow the FuncBuilder to insert the bitcast instructions. Fortunately, the FromIterator implementation for Vec doesn't reserve any heap space if the iterator is empty, so we can unconditionally collect into a Vec and still avoid unnecessary allocations.

Since the relationship between a function signature and a list of CLIF values is somewhat complicated, I refactored all the uses of `bitcast_arguments` and `wasm_param_types`. Instead there are `bitcast_wasm_params` and `bitcast_wasm_returns`, which share a helper that combines the previous pair of functions into one.

According to DHAT, when compiling the Sightglass Spidermonkey benchmark, this avoids 52k allocations averaging about 9 bytes each, which are freed on average within 2k instructions.

Most Sightglass benchmarks, including Spidermonkey, show no performance difference with this change. Only one has a slowdown, and it's small:

```
compilation :: nanoseconds :: benchmarks/shootout-matrix/benchmark.wasm

  Δ = 689373.34 ± 593720.78 (confidence = 99%)

  lazy-bitcast.so is 0.94x to 1.00x faster than main-e121c209f.so!
  main-e121c209f.so is 1.00x to 1.06x faster than lazy-bitcast.so!

  [19741713 21375844.46 32976047] lazy-bitcast.so
  [19345471 20686471.12 30872267] main-e121c209f.so
```

But several Sightglass benchmarks have notable speed-ups, with smaller improvements for shootout-ed25519, meshoptimizer, and pulldown-cmark, and larger ones as follows:

```
compilation :: nanoseconds :: benchmarks/bz2/benchmark.wasm

  Δ = 20071545.47 ± 2950014.92 (confidence = 99%)

  lazy-bitcast.so is 1.26x to 1.36x faster than main-e121c209f.so!
  main-e121c209f.so is 0.73x to 0.80x faster than lazy-bitcast.so!

  [55995164 64849257.68 89083031] lazy-bitcast.so
  [79382460 84920803.15 98016185] main-e121c209f.so

compilation :: nanoseconds :: benchmarks/blake3-scalar/benchmark.wasm

  Δ = 16620780.61 ± 5395162.13 (confidence = 99%)

  lazy-bitcast.so is 1.14x to 1.28x faster than main-e121c209f.so!
  main-e121c209f.so is 0.77x to 0.88x faster than lazy-bitcast.so!

  [54604352 79877776.35 103666598] lazy-bitcast.so
  [94011835 96498556.96 106200091] main-e121c209f.so

compilation :: nanoseconds :: benchmarks/intgemm-simd/benchmark.wasm

  Δ = 36891254.34 ± 12403663.50 (confidence = 99%)

  lazy-bitcast.so is 1.12x to 1.24x faster than main-e121c209f.so!
  main-e121c209f.so is 0.79x to 0.90x faster than lazy-bitcast.so!

  [131610845 201289587.88 247341883] lazy-bitcast.so
  [232065032 238180842.22 250957563] main-e121c209f.so
```
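To make the allocation-avoidance pattern described above concrete, here is a minimal, self-contained sketch. It is not the actual cranelift-wasm code: `Ty`, `Val`, and `values_needing_bitcast` are hypothetical stand-ins for CLIF's types, values, and the new shared helper. The point it illustrates is that collecting from an iterator that yields nothing never touches the heap, so the common no-bitcast case stays allocation-free.

```rust
// Hypothetical sketch only; `Ty` and `Val` stand in for CLIF's `ir::Type` and
// `ir::Value`, and the function name is invented for illustration.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct Ty(u8);

#[derive(Clone, Copy, Debug)]
struct Val(u32);

fn values_needing_bitcast(declared: &[Ty], actual: &[(Ty, Val)]) -> Vec<Val> {
    declared
        .iter()
        .zip(actual.iter())
        // Keep only the values whose type disagrees with the declared signature.
        .filter_map(|(want, (have, val))| if want != have { Some(*val) } else { None })
        .collect() // allocates only when at least one value needs fixing up
}

fn main() {
    let sig = [Ty(0), Ty(1)];
    let vals = [(Ty(0), Val(10)), (Ty(2), Val(11))];
    // Only the second value's type mismatches, so only it is collected.
    println!("{:?}", values_needing_bitcast(&sig, &vals));
}
```

In the real helper, only after this collection step would the caller take the mutable borrow of the builder to insert a bitcast for each collected value, which is why the collection itself cannot be skipped entirely.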
wasmtime
A standalone runtime for WebAssembly
A Bytecode Alliance project
Guide | Contributing | Website | Chat
Installation
The Wasmtime CLI can be installed on Linux and macOS with a small install script:
curl https://wasmtime.dev/install.sh -sSf | bash
Users on Windows, or anyone who would rather install manually, can download installers and binaries directly from the GitHub Releases page.
Example
If you've got the Rust compiler installed then you can take some Rust source code:
```rust
fn main() {
    println!("Hello, world!");
}
```
and compile/run it with:
```sh
$ rustup target add wasm32-wasi
$ rustc hello.rs --target wasm32-wasi
$ wasmtime hello.wasm
Hello, world!
```
Features
- Fast. Wasmtime is built on the optimizing Cranelift code generator to quickly generate high-quality machine code either at runtime or ahead of time. Wasmtime's runtime is also optimized for cases such as efficient instantiation, low-overhead transitions between the embedder and wasm, and scalability of concurrent instances.
- Secure. Wasmtime's development is strongly focused on the correctness of its implementation, with 24/7 fuzzing donated by Google's OSS-Fuzz, leveraging Rust's API and runtime safety guarantees, careful design of features and APIs through an RFC process, a security policy in place for when things go wrong, and a release policy for patching older versions as well. We follow best practices for defense-in-depth and known protections and mitigations for issues like Spectre. Finally, we're working to push the state of the art by collaborating with academic researchers to formally verify critical parts of Wasmtime and Cranelift.
- Configurable. Wasmtime supports a rich set of APIs and build-time configuration options, including means of restricting WebAssembly beyond its basic guarantees, such as limiting its CPU and memory consumption (a minimal sketch of one such knob follows this list). Wasmtime also runs in tiny environments all the way up to massive servers with many concurrent instances.
- WASI. Wasmtime supports a rich set of APIs for interacting with the host environment through the WASI standard.
- Standards Compliant. Wasmtime passes the official WebAssembly test suite, implements the official C API of wasm, and implements future proposals to WebAssembly as well. Wasmtime developers are intimately engaged with the WebAssembly standards process all along the way too.
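As one illustration of that configurability, here is a minimal sketch of capping a store's memory with the `wasmtime` Rust crate (plus `anyhow` for error handling). The `HostState` wrapper and the specific limits are illustrative assumptions, not prescribed configuration, and API details may vary between crate versions.

```rust
use wasmtime::{Engine, Store, StoreLimits, StoreLimitsBuilder};

// Per-store host state; Wasmtime looks up the resource limits through it.
struct HostState {
    limits: StoreLimits,
}

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();

    // Illustrative limits: cap linear memory at 16 MiB and allow one instance.
    let state = HostState {
        limits: StoreLimitsBuilder::new()
            .memory_size(16 * 1024 * 1024)
            .instances(1)
            .build(),
    };

    let mut store = Store::new(&engine, state);
    store.limiter(|state| &mut state.limits);

    // Modules instantiated in `store` will now fail cleanly if they try to
    // grow memory beyond the configured cap.
    Ok(())
}
```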
Language Support
You can use Wasmtime from a variety of different languages through embeddings of the implementation:
- Rust - the `wasmtime` crate
- C - the `wasm.h`, `wasi.h`, and `wasmtime.h` headers, CMake, or the `wasmtime` Conan package
- C++ - the `wasmtime-cpp` repository, or use the `wasmtime-cpp` Conan package
- Python - the `wasmtime` PyPI package
- .NET - the `Wasmtime` NuGet package
- Go - the `wasmtime-go` repository
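As a concrete starting point for the Rust embedding listed above, here is a minimal sketch using the `wasmtime` crate (plus `anyhow` for error handling). The inline WAT module and its `add` export are made-up examples, and exact API details may vary between crate versions.

```rust
use wasmtime::{Engine, Instance, Module, Store};

fn main() -> anyhow::Result<()> {
    // Compile a small WebAssembly module from WAT text (hypothetical module).
    let engine = Engine::default();
    let module = Module::new(
        &engine,
        r#"(module
             (func (export "add") (param i32 i32) (result i32)
               local.get 0
               local.get 1
               i32.add))"#,
    )?;

    // Instantiate it and call the exported function through a typed signature.
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;
    let add = instance.get_typed_func::<(i32, i32), i32>(&mut store, "add")?;
    println!("1 + 2 = {}", add.call(&mut store, (1, 2))?);
    Ok(())
}
```

To try it, add `wasmtime` and `anyhow` as dependencies in `Cargo.toml` and run it with `cargo run`.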
Documentation
📚 Read the Wasmtime guide here! 📚
The Wasmtime guide is the best starting point to learn about what Wasmtime can do for you or to help answer your questions about Wasmtime. If you're curious about contributing to Wasmtime, it can also help you do that!
It's Wasmtime.