Alex Crichton 43b37944ff Tweak parallelism and the instantiation benchmark (#3775)
Currently the "sequential" and "parallel" benchmarks reports somewhat
different timings. For sequential it's time-to-instantiate, but for
parallel it's time-to-instantiate-10k instances. The parallelism in the
parallel benchmark can also theoretically be affected by rayon's
work-stealing. For example if rayon doesn't actually do any work
stealing at all then this ends up being a sequential test again.
Otherwise though it's possible for some threads to finish much earlier
as rayon isn't guaranteed to keep threads busy.

This commit applies a few updates to the benchmark:

* First, an `InstancePre<T>` is now used instead of a `Linker<T>` to
  front-load type-checking and avoid repeating it on each instantiation
  (this is generally the fastest instantiation path right now).

* Next, the instantiation benchmark is changed to measure one
  instantiation per iteration, so per-instance instantiation time can be
  compared directly with the sequential numbers.

* Finally, rayon is removed in favor of manually creating background
  threads that do work in a loop until we tell them to stop. These
  background threads are guaranteed to be working for the entire time
  the benchmark is executing, which should better model the situation
  where there are N units of work all happening at once (see the sketch
  after this list).

I also applied some minor updates here, such as defining the parallel
instantiation benchmark conditionally for multiple modules, as well as
upping the limits of the pooling allocator to handle a large module
(rustpython.wasm) that I threw at it.
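
For reference, a hedged sketch of enabling the pooling allocator with
raised limits is below. The configuration types and setter names have
changed across Wasmtime releases (older versions used `ModuleLimits` and
`InstanceLimits`), so treat the names here as assumptions against a
recent API, and the limit value as purely illustrative.

use anyhow::Result;
use wasmtime::{Config, Engine, InstanceAllocationStrategy, PoolingAllocationConfig};

fn pooled_engine() -> Result<Engine> {
    let mut pool = PoolingAllocationConfig::default();
    // Raise the per-memory limit so a large module (like rustpython.wasm)
    // fits in the pool; the setter name is assumed from a recent Wasmtime.
    pool.max_memory_size(1 << 30);

    let mut config = Config::new();
    config.allocation_strategy(InstanceAllocationStrategy::Pooling(pool));
    Engine::new(&config)
}
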
2022-02-07 15:55:38 -08:00

wasmtime

A standalone runtime for WebAssembly

A Bytecode Alliance project


Guide | Contributing | Website | Chat

Installation

The Wasmtime CLI can be installed on Linux and macOS with a small install script:

$ curl https://wasmtime.dev/install.sh -sSf | bash

Windows users, or anyone who prefers a direct download, can get installers and binaries from the GitHub Releases page.

Example

If you've got the Rust compiler installed then you can take some Rust source code:

fn main() {
    println!("Hello, world!");
}

and compile/run it with:

$ rustup target add wasm32-wasi
$ rustc hello.rs --target wasm32-wasi
$ wasmtime hello.wasm
Hello, world!

Features

  • Lightweight. Wasmtime is a standalone runtime for WebAssembly that scales with your needs. It fits on tiny chips and scales up to huge servers. Wasmtime can be embedded into almost any application too.

  • Fast. Wasmtime is built on the optimizing Cranelift code generator to quickly generate high-quality machine code at runtime.

  • Configurable. Whether you need to precompile your wasm ahead of time, or interpret it at runtime, Wasmtime has you covered for all your wasm-executing needs (a short precompilation example follows this list).

  • WASI. Wasmtime supports a rich set of APIs for interacting with the host environment through the WASI standard.

  • Standards Compliant. Wasmtime passes the official WebAssembly test suite, implements the official C API of wasm, and implements future proposals to WebAssembly as well. Wasmtime developers are intimately engaged with the WebAssembly standards process all along the way too.
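
For example, ahead-of-time compilation is available through the CLI's
compile subcommand (flag spellings can vary slightly between Wasmtime
versions, so treat the exact invocation as a sketch):

$ wasmtime compile hello.wasm
$ wasmtime run --allow-precompiled hello.cwasm
Hello, world!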

Language Support

You can use Wasmtime from a variety of different languages through embeddings of the implementation; a minimal Rust embedding is sketched below.
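
As an illustration of the Rust embedding (bindings for other languages
follow the same shape), a small program that loads a module and calls an
exported function might look like the sketch below; the inline WAT and
the `add` export are made up for the example.

use anyhow::Result;
use wasmtime::{Engine, Instance, Module, Store};

fn main() -> Result<()> {
    let engine = Engine::default();
    // A tiny module with one exported function, written inline as WAT.
    let module = Module::new(
        &engine,
        r#"(module
             (func (export "add") (param i32 i32) (result i32)
               local.get 0
               local.get 1
               i32.add))"#,
    )?;

    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;
    let add = instance.get_typed_func::<(i32, i32), i32>(&mut store, "add")?;
    println!("1 + 2 = {}", add.call(&mut store, (1, 2))?);
    Ok(())
}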

Documentation

📚 Read the Wasmtime guide here! 📚

The wasmtime guide is the best starting point to learn about what Wasmtime can do for you or help answer your questions about Wasmtime. If you're curious about contributing to Wasmtime, it can also help you do that!


It's Wasmtime.
