This commit implements a few optimizations, mainly inlining, that should improve the performance of calling a WebAssembly function. This code path can be quite hot depending on the embedding use case, and we hadn't previously put much effort into optimizing its nitty-gritty details. The changes are:

- Add `#[inline]` to trivial functions so performance improves without having to compile with LTO. This is the predominant optimization here.
- Call `lazy_per_thread_init` when traps are initialized per-thread (when a `Store` is created) rather than on every function call.
- Store the unwind reason in `CallThreadState` as a `MaybeUninit` to avoid checks in the common case about whether its variants need to be dropped, since in the happy path it is never dropped (sketched below).
- Optimize out a few checks when `async` support is disabled, for a small speed boost.

In a small benchmark where Wasmtime calls a simple wasm function, the overhead on my macOS machine dropped from 110ns to 86ns, a roughly 20% decrease. The macOS overhead is still largely dominated by the global lock acquisition and hash table management for traps, but I suspect the Linux overhead is much better (it should be on the order of ~30ns). We still have a long way to go to compete with SpiderMonkey which, in testing, seems to have ~6ns of overhead calling the same wasm function on my machine.
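For illustration, here is a minimal sketch of the `MaybeUninit` idea in the third bullet. It is not the actual wasmtime code: the `UnwindReason` payload types, the field name, and the helper methods are assumptions made up for this example.

use std::cell::Cell;
use std::mem::MaybeUninit;

// Hypothetical stand-in for the payload recorded when a call unwinds; the
// real variants carry trap and panic data whose drop glue is what the
// optimization avoids touching on the happy path.
enum UnwindReason {
    Panic(Box<dyn std::any::Any + Send>),
    Trap(String),
}

struct CallThreadState {
    // Keeping the reason as `MaybeUninit` means a normal return never
    // initializes it, so there is no value to drop and no check for one.
    unwind: Cell<MaybeUninit<UnwindReason>>,
}

impl CallThreadState {
    fn new() -> Self {
        CallThreadState { unwind: Cell::new(MaybeUninit::uninit()) }
    }

    // Only the unwinding (trap/panic) path pays for writing the reason.
    fn set_unwind(&self, reason: UnwindReason) {
        self.unwind.set(MaybeUninit::new(reason));
    }

    // Safety: must only be called after `set_unwind`, which is the
    // invariant the surrounding code would uphold by construction.
    unsafe fn take_unwind(&self) -> UnwindReason {
        self.unwind.replace(MaybeUninit::uninit()).assume_init()
    }
}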
wasmtime
A standalone runtime for WebAssembly
A Bytecode Alliance project
Guide | Contributing | Website | Chat
Installation
The Wasmtime CLI can be installed on Linux and macOS with a small install script:
$ curl https://wasmtime.dev/install.sh -sSf | bash
Windows users, or anyone who prefers a direct download, can get installers and binaries from the GitHub Releases page.
Example
If you've got the Rust compiler installed then you can take some Rust source code:
fn main() {
println!("Hello, world!");
}
and compile/run it with:
$ rustup target add wasm32-wasi
$ rustc hello.rs --target wasm32-wasi
$ wasmtime hello.wasm
Hello, world!
Features
- Lightweight. Wasmtime is a standalone runtime for WebAssembly that scales with your needs. It fits on tiny chips as well as makes use of huge servers. Wasmtime can be embedded into almost any application too.
- Fast. Wasmtime is built on the optimizing Cranelift code generator to quickly generate high-quality machine code at runtime.
- Configurable. Whether you need to precompile your wasm ahead of time, generate code blazingly fast with Lightbeam, or interpret it at runtime, Wasmtime has you covered for all your wasm-executing needs.
- WASI. Wasmtime supports a rich set of APIs for interacting with the host environment through the WASI standard.
- Standards Compliant. Wasmtime passes the official WebAssembly test suite, implements the official C API of wasm, and implements future proposals to WebAssembly as well. Wasmtime developers are intimately engaged with the WebAssembly standards process all along the way too.
Language Support
You can use Wasmtime from a variety of different languages through embeddings of the implementation:
- Rust - the wasmtime crate (a brief embedding sketch follows this list)
- C - the wasm.h, wasi.h, and wasmtime.h headers
- Python - the wasmtime PyPI package
- .NET - the Wasmtime NuGet package
- Go - the wasmtime-go repository
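As a brief illustration of the Rust embedding, the sketch below compiles a tiny module and calls an exported function. It assumes a reasonably recent version of the wasmtime crate (the embedding API has changed across releases) with anyhow available for error handling; the module text and the "add" export are made up for this example.

use wasmtime::{Engine, Instance, Module, Store};

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();
    // A tiny module exporting an "add" function; the WebAssembly text
    // format is accepted here because the crate's `wat` feature is
    // enabled by default.
    let module = Module::new(
        &engine,
        r#"(module
             (func (export "add") (param i32 i32) (result i32)
               local.get 0
               local.get 1
               i32.add))"#,
    )?;
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;
    // Fetch the export with a concrete Rust signature so the call below
    // is statically typed.
    let add = instance.get_typed_func::<(i32, i32), i32>(&mut store, "add")?;
    println!("1 + 2 = {}", add.call(&mut store, (1, 2))?);
    Ok(())
}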
Documentation
📚 Read the Wasmtime guide here! 📚
The Wasmtime guide is the best starting point to learn about what Wasmtime can do for you or help answer your questions about Wasmtime. If you're interested in contributing to Wasmtime, it can also help you do that!
It's Wasmtime.