Currently the "sequential" and "parallel" benchmarks report somewhat different timings: for sequential it's the time to instantiate one instance, but for parallel it's the time to instantiate 10k instances. The parallelism in the parallel benchmark can also theoretically be affected by rayon's work-stealing. For example, if rayon doesn't actually do any work stealing at all then this ends up being a sequential test again, and otherwise it's possible for some threads to finish much earlier since rayon isn't guaranteed to keep all threads busy. This commit applies a few updates to the benchmark:

* First, an `InstancePre<T>` is now used instead of a `Linker<T>` to front-load type-checking and avoid it on each instantiation (this is generally the fastest path to instantiate right now).

* Next, the instantiation benchmark is changed to measure one instantiation per iteration, so the per-instance cost can be compared directly with the sequential numbers.

* Finally, rayon is removed in favor of manually creating background threads that do work in a loop until we tell them to stop. These background threads are guaranteed to be working for the entire time the benchmark is executing and should, in theory, exhibit the situation where there are N units of work all happening at once.

I also applied some minor updates here, such as defining the parallel instantiation benchmark conditionally for multiple modules, as well as upping the limits of the pooling allocator to handle a large module (rustpython.wasm) that I threw at it.
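As a rough sketch of the new shape of the parallel measurement (illustrative only, not the code in this commit), the idea looks roughly like the following. It assumes a Wasmtime version where `Linker::instantiate_pre` takes just the module (older releases also take a store), and the empty module, the three background threads, and the plain loop standing in for the benchmark harness's per-iteration closure are all placeholders:

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use wasmtime::{Engine, Linker, Module, Store};

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();
    // Placeholder module; the real benchmark instantiates actual wasm files.
    let module = Module::new(&engine, "(module)")?;

    // Front-load type-checking once via `InstancePre<T>` instead of paying
    // for it through a `Linker<T>` on every instantiation.
    let linker: Linker<()> = Linker::new(&engine);
    let pre = linker.instantiate_pre(&module)?;

    // Spawn background threads that instantiate in a loop until told to
    // stop, so the measured thread is always competing with N units of work.
    let stop = Arc::new(AtomicBool::new(false));
    let workers: Vec<_> = (0..3)
        .map(|_| {
            let stop = stop.clone();
            let engine = engine.clone();
            let pre = pre.clone();
            std::thread::spawn(move || {
                while !stop.load(Ordering::Relaxed) {
                    let mut store = Store::new(&engine, ());
                    pre.instantiate(&mut store).unwrap();
                }
            })
        })
        .collect();

    // Measure one instantiation per iteration; in the real benchmark this
    // body would live inside the harness's iteration closure, making the
    // result directly comparable to the sequential numbers.
    for _ in 0..10_000 {
        let mut store = Store::new(&engine, ());
        pre.instantiate(&mut store)?;
    }

    stop.store(true, Ordering::Relaxed);
    for worker in workers {
        worker.join().unwrap();
    }
    Ok(())
}
```

Because the background threads never wait on a work queue, every measured iteration runs under the same amount of concurrent instantiation pressure, which is the property the rayon-based version couldn't guarantee.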
wasmtime
A standalone runtime for WebAssembly
A Bytecode Alliance project
Guide | Contributing | Website | Chat
Installation
The Wasmtime CLI can be installed on Linux and macOS with a small install script:
$ curl https://wasmtime.dev/install.sh -sSf | bash
Users on Windows, or anyone who prefers a manual installation, can download installers and binaries directly from the GitHub Releases page.
Example
If you've got the Rust compiler installed then you can take some Rust source code:
fn main() {
println!("Hello, world!");
}
and compile/run it with:
$ rustup target add wasm32-wasi
$ rustc hello.rs --target wasm32-wasi
$ wasmtime hello.wasm
Hello, world!
Features
- Lightweight. Wasmtime is a standalone runtime for WebAssembly that scales with your needs. It fits on tiny chips as well as makes use of huge servers, and it can be embedded into almost any application (a minimal Rust embedding sketch follows this list).
- Fast. Wasmtime is built on the optimizing Cranelift code generator to quickly generate high-quality machine code at runtime.
- Configurable. Whether you need to precompile your wasm ahead of time or interpret it at runtime, Wasmtime has you covered for all your wasm-executing needs.
- WASI. Wasmtime supports a rich set of APIs for interacting with the host environment through the WASI standard.
- Standards Compliant. Wasmtime passes the official WebAssembly test suite, implements the official C API of wasm, and implements future proposals to WebAssembly as well. Wasmtime developers are intimately engaged with the WebAssembly standards process all along the way too.
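As a minimal sketch of what embedding looks like from Rust (assuming the wasmtime crate and anyhow for error handling; the inline WAT module and its add export are made up for illustration):

```rust
use wasmtime::{Engine, Instance, Module, Store};

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();
    // A tiny example module with one exported function, written inline as WAT.
    let module = Module::new(
        &engine,
        r#"(module
             (func (export "add") (param i32 i32) (result i32)
               local.get 0
               local.get 1
               i32.add))"#,
    )?;
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;

    // Look up the export with its static type and call it from the host.
    let add = instance.get_typed_func::<(i32, i32), i32>(&mut store, "add")?;
    println!("2 + 3 = {}", add.call(&mut store, (2, 3))?);
    Ok(())
}
```

Compiling a Module and instantiating it are separate steps, so a single compiled module can back many instances across many stores.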
Language Support
You can use Wasmtime from a variety of different languages through embeddings of the implementation:
- Rust - the wasmtime crate
- C - the wasm.h, wasi.h, and wasmtime.h headers, or use the wasmtime Conan package
- C++ - the wasmtime-cpp repository, or use the wasmtime-cpp Conan package
- Python - the wasmtime PyPI package
- .NET - the Wasmtime NuGet package
- Go - the wasmtime-go repository
Documentation
📚 Read the Wasmtime guide here! 📚
The Wasmtime guide is the best starting point to learn about what Wasmtime can do for you or to help answer your questions about Wasmtime. If you're curious about contributing to Wasmtime, it can also help you do that!
It's Wasmtime.