The Fluence Labs Developer Hub

Welcome to the Fluence Labs developer hub. You'll find comprehensive guides and documentation to help you start working with Fluence Labs as quickly as possible, as well as support if you get stuck. Let's jump right in!

Should you have any questions, feel free to join our Discord or Telegram!

Get Started

JavaScript SDK

The Fluence JS SDK is a bridge to the Fluence Network. It provides a local Fluence Peer, enabling you to develop your application in a p2p fashion.

The SDK gives you the following capabilities:

  • Means to create and manage the identity of your local peer and applications
  • Ability to execute AIR scripts on a local WASM runtime
  • Define the behavior of local function calls (i.e., when a script calls a function on your local peer)
  • Automatically forward AIR script execution to remote peers, according to the script-defined topology

How to install

With npm

npm install @fluencelabs/fluence

With yarn

yarn add @fluencelabs/fluence

Getting started

Pick a node to connect to the Fluence network. The easiest way to do so is by using the fluence-network-environment package:

import { dev } from '@fluencelabs/fluence-network-environment';

export const relayNode = dev[0];

Initialize client

import { createClient, FluenceClient } from '@fluencelabs/fluence';

const client = await createClient(relayNode);

Add response service function calls

subscribeToEvent(client, 'helloService', 'helloFunction', (args) => {
    const [networkInfo] = args;
    console.log(networkInfo);
});

Make a particle

const particle = new Particle(
    `
    (seq
        (call myRelay ("op" "identify") [] result)
        (call %init_peer_id% ("helloService" "helloFunction") [result])
    )`,
    {
        myRelay: client.relayPeerId,
    },
);

Send it to the network

await sendParticle(client, particle);

Observe the result in the browser console

{
    "external_addresses": [ "/ip4/", "/dns4/" ]
}

Fluence network basics

To interact with the network, the JS SDK provides the FluenceClient class. It is a bridge between JavaScript in the browser (or on Node.js) and the Aquamarine environment.

Working with the client

To create a client, use the createClient function as follows:

const client = await createClient(nodeToConnectTo, selfPeerId);

Fluence is a p2p network, so you have to know a node to connect to. The multiaddress of the node is passed as the first parameter. You can find one in the fluence-network-environment package.

The second parameter is the peer id of your client. You can read about identity and peer ids in the next section. By default, a random peer id is generated.

The client exposes several useful methods. For example, you can get your own peer id with the client.selfPeerId property and the peer id of the relay with client.relayPeerId.

By default, the client connects to the network automatically. You can disconnect from the network with the disconnect() function and connect back with connect(address).

See the reference for the full API description.

There is no restriction on where to store the client variable. You can store it with the local state of your application or globally, say, as a global variable.

Managing identity

Managing identity is crucial, as your identity is used to locate your peer in the network and to establish ownership of services you create on remote peers.

In JS, PeerId holds 3 related things:

  • Private key: PeerId.privKey()
  • Public key: PeerId.pubKey()
  • PeerId in base58: PeerId.toB58String()

Together these make up a peer's identity: the ability to sign things and the peer's location. You can generate and use a PeerId like the following.

import { generatePeerId, createClient } from '@fluencelabs/fluence'

const peerId = await generatePeerId();
const client = await createClient(multiaddr, peerId);

The peerIdToSeed and seedToPeerId functions simplify extracting the private key and converting it back. You can pair them with some secure browser storage and follow standard key management guidelines.

import {peerIdToSeed, seedToPeerId} from '@fluencelabs/fluence'

const seed = peerIdToSeed(peerId); // store it somewhere and then transform back to a peer id
const restoredPeerId = seedToPeerId(seed);

Defining services

With the JS SDK you can define your own services that are accessible from Aquamarine. Unlike services running inside nodes, frontend apps are allowed to define any handlers without the need to provide and register blueprints.

In Fluence, Aquamarine is the only way to transmit data between peers. This is true for frontend services as well, so providing handlers on the client side is crucial.

Register service function

The primary way to provide a handler is by using registerServiceFunction. The function takes four arguments:

  • FluenceClient, created with the createClient function
  • serviceId, which is used in the Aquamarine call
  • fnName, which is used in the Aquamarine call
  • the handler function of two arguments: the arguments themselves and the tetraplets

You can read about tetraplets in the Security section.

Have a look at the example usage:

import { registerServiceFunction } from '@fluencelabs/fluence';

registerServiceFunction(client, 'greeting', 'hello_world', (args, tetraplets) => {
    const [ str ] = args as [string];
    return "Hello, " + str;
});

Here we create a handler that takes a single string argument and returns a new string with "Hello, " prepended to it. Now this code will be executed if someone makes the following call from Aquamarine:

(call our_client_peer_id ("greeting" "hello_world") ["John"] result)

they will get "Hello, John" in the result variable.

Note: keep in mind that the callback code is executed synchronously, thus blocking the execution of the particle. Do not put lots of heavy computation inside callbacks, and return as soon as possible.
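One way to honor this constraint (a generic sketch, not an SDK API; makeDeferredHandler is a hypothetical helper) is to acknowledge immediately and schedule the expensive part for a later event-loop turn:

```javascript
// Sketch: wrap a heavy computation so the handler returns right away.
// The handler replies immediately, keeping particle execution
// unblocked, while the heavy work runs on a later event-loop turn.
function makeDeferredHandler(heavyWork) {
    return (args, tetraplets) => {
        // Defer the expensive computation; the particle does not wait for it.
        setTimeout(() => heavyWork(args), 0);
        // Return a lightweight acknowledgement immediately.
        return 'accepted';
    };
}
```

The resulting function can then be passed as the callback to registerServiceFunction; the deferred heavyWork receives the call arguments once the particle has already moved on, so its result cannot flow back into the script.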

Register event callback

In case you don't need to return any value, i.e., you use your service call as a notification (much like a websocket's onmessage listener), you can use the convenience subscribeToEvent function. It takes exactly the same arguments as registerServiceFunction and works the same way, with the only difference that it returns an empty response immediately and executes the callback asynchronously.

Below is an example usage of subscribeToEvent. It defines a handler similar to console.log. We'll show how to call it in the next step.

import { subscribeToEvent } from '@fluencelabs/fluence';

subscribeToEvent(client, 'console', 'log', (args, tetraplets) => {
    console.log(`log: ${args}`);
});

Working with Aquamarine

Aquamarine is used for all communications in the Fluence network. The unit of execution in the Aquamarine environment is called a Particle. The JS SDK provides a way to create a particle and send it into the network.

Creating particles

To create a particle, use the Particle class. Its constructor takes up to three arguments:

  • The AIR script which defines the execution of the particle.
  • Optional variables passed to the particle, either as a JS Map or as a JS object with keys representing variable names and values representing the corresponding values. This parameter can be omitted if you don't need to pass any variables.
  • Time to live, a timeout after which the particle's execution is stopped by Aquamarine. This parameter is optional and has a default.

The code below creates a particle and sends it into the network.

import { Particle, sendParticle } from '@fluencelabs/fluence';

const particle = new Particle(
    `
    (seq
        (call relay ("op" "identify") [] result)
        (call %init_peer_id% ("console" "log") [result])
    )`,
    {
        relay: client.relayPeerId,
    },
);

const particleId = await sendParticle(client, particle);

Using variables

Values can be passed inline (i.e., values in quotes). Using literals can be tedious if you need to repeat values or wish to keep the script short and readable. To avoid that, you can use variables that refer to particle data.

const script = `(call %init_peer_id% ("console" "log") [msg])`;
const particle = new Particle(script, {msg: 'hello'});
await sendParticle(client, particle);

To learn more about writing AIR scripts, refer to AIR doc.

Calling functions

As AIR scripts describe the topology of function execution on peers, we can write a script that calls a function on our local console service from the example in the previous section.

A script could be as follows:

(call %init_peer_id% ("console" "log") ["hello" "from" "WASM"])

%init_peer_id% refers to the peer that initiated script execution. In this case, it is us, so the call to console.log will invoke the function previously defined on the console service.

Here's how this can be expressed in terms of Fluence JS SDK.

import { createClient, Particle, sendParticle } from '@fluencelabs/fluence';

const client = await createClient();

// call is an instruction that takes the following parameters
// (call <Peer location> <Function location> <[argument list]> <optional output>)
const script = `(call %init_peer_id% ("console" "log") ["hello"])`;

// Wrap script into particle, so it can be executed by local WASM runtime
const particle = new Particle(script);

await sendParticle(client, particle);
// "[hello]" should be printed in the console

Using built-in functions

There are a number of built-in functions which simplify common operations with the Fluence Network.

Upload wasm modules

Services are created from wasm modules, and these modules are currently distributed by manually uploading them to desired nodes.

To learn more about the service lifecycle, refer to doc.

Since modules are .wasm files, they first need to be converted to base64 strings before uploading. Here we'll assume you already possess a base64 string of the desired modules.

One more thing about modules: they are referred to by their names. When you upload a module, you specify its name. You can later use these names when specifying dependencies in a blueprint.

// connect to some Fluence node
let client = await createClient(multiaddr);
let moduleBs64 = ...; // load .wasm module into base64 string

// upload module under "module_name" to connected node
await uploadModule(client, "module_name", moduleBs64);

// uploading to different node by specifying its PeerId
let remotePeerId = "123DTargetNode";
await uploadModule(client, "module_name", moduleBs64, remotePeerId);

Create service

In Fluence, services are created from blueprints. Blueprints specify a list of modules required to create the service and some meta-information. To learn more about blueprints, refer to doc.

Modules specified in the blueprint can be interlinked, i.e., they can import functions from each other. The important thing is that the last module in the dependency list is a facade module, i.e., the only module that can be called directly from AIR scripts. The facade module is "public," while all other modules are "private" in that sense. To learn more about building services, refer to doc.

You can create a service from modules ["module_a", "module_b", "module_facade"] as follows.

// modules could be linked to each other. If so, dependent modules should be specified after dependencies.
let blueprintId = await addBlueprint(client, "great_service", ["module_a", "module_b", "module_facade"], remotePeerId);
let serviceId = await createService(client, blueprintId, remotePeerId);

blueprintId can now be used to create instances of great_service. Remember that blueprints must be added to a node before you can use them.

serviceId can be used to call functions on the created service, the same way console was used in the Calling functions section above. Here's an example of calling the function greet on the created service.

let script = `
    (seq
        (call relay ("op" "identity") [])
        (call remotePeerId (serviceId "greet") [name])
    )`;

let data = new Map();
data.set("name", "folex");
data.set("serviceId", serviceId);
data.set("remotePeerId", remotePeerId); // the target node's peer id, as in the upload example above
data.set("relay", client.relayPeerId);

let particle = new Particle(script, data);
await sendParticle(client, particle);

Service aliases

To learn about service aliasing, refer to doc on Aliases.

The JS SDK wraps these scripts into a JS API.


Read the doc on authentication & authorization patterns to learn more about the possibilities.

Showcase: relaying & remote execution

The Fluence network is made of peers with various execution power, availability guarantees, and, most importantly, various connectivity. To allow peers from non-public networks to communicate, Fluence employs relay mechanics. Currently, any Fluence Node can be used as a relay.

To learn more about relaying, refer to the doc.

For now, we'll use a relay to connect two browser peers. You can emulate two peers by opening two browser tabs, for example. We'll assume you have done so, and their peer ids are 123DPeerIdA and 123DPeerIdB.

We'll use the following relays:

  • /dns4/
  • /dns4/

In the first browser tab, connect to the first relay and register a service with a single function as follows.

import { createClient, subscribeToEvent } from '@fluencelabs/fluence';
import { dev } from '@fluencelabs/fluence-network-environment';

const client = await createClient(dev[0]);

subscribeToEvent(client, 'console', 'log', (args, tetraplets) => {
    console.log(`log: ${args}`);
});

console.log(`First PeerId: ${client.selfPeerId}`);

In the second browser tab, connect to the second relay and call the remote console.log as follows.

import { createClient, Particle, sendParticle } from '@fluencelabs/fluence';
import { dev } from '@fluencelabs/fluence-network-environment';

const client = await createClient(dev[1]);

console.log(`Second PeerId: ${client.selfPeerId}`);

const script = `
    (seq
        (call second-relay ("op" "identity") [])
        (seq
            (call first-relay ("op" "identity") [])
            (call first-peer ("console" "log") [msg])
        )
    )`;

const data = {
    "first-peer": "123DPeerIdA", // <== Do not forget to change 123DPeerIdA to the actual peer id
    "second-relay": "12D3KooWHk9BjDQBUqnavciRPhAYFvqKBe4ZiPPvde7vDaqgn5er",
    "first-relay": "12D3KooWEXNUbCXooUwHrHBbrmjsrpHXoEphPwbjQXEGyzbqKnE9",
    "msg": "hello",
};

const particle = new Particle(script, data);
await sendParticle(client, particle);

After that, you should see the message log: [hello] in the console of the first browser.

To learn more about AIR scripts, refer to doc.
