# //tvix/store
This contains the code hosting the tvix-store.
For the local store, Nix realizes files on the filesystem in `/nix/store` (and maintains some metadata in a SQLite database). For "remote stores", it communicates this metadata in NAR (Nix ARchive) and NARInfo format.
Compared to the Nix model, `tvix-store` stores data at a much more granular level, which provides more deduplication possibilities and more granular copying. However, enough information is preserved to still be able to render NAR and NARInfo when needed.
## More Information
The store consists of two different gRPC services: `tvix.castore.v1` for the low-level content-addressed bits, and `tvix.store.v1` for the Nix- and StorePath-specific bits.

Check the `protos/` subfolder both here and in `castore` for the definition of the exact RPC methods and messages.
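For a quick overview of which RPCs each service exposes, you can also skim the service declarations straight from the checked-in proto files. A minimal sketch, run from this directory (the relative path to the castore protos is an assumption about the checkout layout):

```console
# Print each `service` declaration and the line following it.
$ grep -A1 '^service' protos/*.proto ../castore/protos/*.proto
```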
## Interacting with the gRPC service manually
The shell environment in `//tvix` provides `evans`, which is an interactive REPL-based gRPC client.

You can use it to connect to a `tvix-store` and call the various RPC methods.
```console
$ cargo run -- daemon &
$ evans --host localhost --port 8000 -r repl
  ______
 |  ____|
 | |__    __   __   __ _   _ __    ___
 |  __|   \ \ / /  / _. | | '_ \  / __|
 | |____   \ V /  | (_| | | | | | \__ \
 |______|   \_/    \__,_| |_| |_| |___/

 more expressive universal gRPC client

localhost:8000> package tvix.castore.v1

tvix.castore.v1@localhost:8000> service BlobService

tvix.castore.v1.BlobService@localhost:8000> call Put --bytes-from-file
data (TYPE_BYTES) => /run/current-system/system
{
  "digest": "KOM3/IHEx7YfInAnlJpAElYezq0Sxn9fRz7xuClwNfA="
}

tvix.castore.v1.BlobService@localhost:8000> call Read --bytes-as-base64
digest (TYPE_BYTES) => KOM3/IHEx7YfInAnlJpAElYezq0Sxn9fRz7xuClwNfA=
{
  "data": "eDg2XzY0LWxpbnV4"
}

$ echo eDg2XzY0LWxpbnV4 | base64 -d
x86_64-linux
```
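The same RPCs can be exercised with other generic gRPC clients as well. As a minimal sketch, assuming `grpcurl` is available and the daemon started above is still listening in plaintext on `localhost:8000`, the `Read` call from the session can be replayed like this (bytes fields are passed as base64 in the JSON request):

```console
# Fetch the blob uploaded above by its digest (a base64-encoded bytes field).
$ grpcurl -plaintext \
    -d '{"digest": "KOM3/IHEx7YfInAnlJpAElYezq0Sxn9fRz7xuClwNfA="}' \
    localhost:8000 tvix.castore.v1.BlobService/Read
```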
Thanks to `tvix-store` providing gRPC Server Reflection (with the `reflection` feature), you don't need to point `evans` to the `.proto` files.
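The reflection data can be browsed with any reflection-aware client. For example (again a sketch assuming `grpcurl` and the plaintext listener on `localhost:8000` from above), listing and describing services works without any `.proto` files on hand:

```console
# List all services exposed via server reflection.
$ grpcurl -plaintext localhost:8000 list

# Show the RPC methods of the blob service.
$ grpcurl -plaintext localhost:8000 describe tvix.castore.v1.BlobService
```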