With Ease Magazine https://withease.effector.dev The collection of articles about Effector and related topics. It is not a replacement for the official documentation, but it can help you to understand some concepts better. Sat, 09 Aug 2025 09:37:45 GMT https://validator.w3.org/feed/docs/rss2.html https://github.com/jpmonette/feed en Copyright (c) 2023-present, Igor Kamyşev <![CDATA[Catch Scope-less Calls]]> https://withease.effector.dev/magazine/scopefull.html https://withease.effector.dev/magazine/scopefull.html Tue, 20 Aug 2024 00:00:00 GMT Catch Scope-less Calls

Fork API is one of Effector's killer features. It allows you to run any number of application instances in parallel in a single thread, which is great for testing and SSR. Fork API comes with some rules to follow, and this article is about automated validation of them.

The Problem

Some violations of the Fork API rules can be detected by static analysis tools like ESLint with the effector/scope preset. But some rules require runtime validation. For example, it is illegal to imperatively call an Event without an explicit Scope. However, it is almost impossible for ESLint to detect such calls.

In this case, we need to listen to all messages that pass through Effector's kernel and analyze them. If we find a message with no Scope, we can log it.

The Solution

Effector has a special API for listening to messages that pass through the library. It is called the Inspect API. You can use it to catch all messages and analyze them. This API is great for debugging and testing, which is exactly what we need.

The usage of the Inspect API is quite simple: call the inspect function with a callback that will be invoked for each message. The callback receives a message object that contains all the information about the message, which you can analyze however you like.

ts
import { inspect, type Message } from 'effector/inspect';

inspect({
  /**
   * Explicitly define that we will
   * catch only messages where Scope is undefined
   */
  scope: undefined,
  fn: (m: Message) => {
    const name = `${m.kind} ${m.name}`;
    const error = new Error(`${name} is not bound to scope`);

    console.error(error);
  },
});
]]>
<![CDATA[Migrating from Redux to Effector]]> https://withease.effector.dev/magazine/migration_from_redux.html https://withease.effector.dev/magazine/migration_from_redux.html Fri, 08 Mar 2024 00:00:00 GMT Migrating from Redux to Effector

This guide explains how to perform a gradual, non-blocking code migration from Redux to Effector.

Preparation

Install effector

First, you need to install the effector package. See the official documentation for instructions.

TIP

It is also highly recommended to set up the official Effector ESLint Plugin, so it is easier for you to follow Effector's best practices.

It is also recommended to read at least some of Effector's docs, so it is easier to follow this guide. E.g., you can read about Effector-related terminology here.

Install @withease/redux

This guide uses the @withease/redux package, which is a minimalistic set of helpers to simplify the migration, so it is recommended to install it too.

See the package documentation for detailed installation instructions.

Create Redux interoperability object

In order for Redux and Effector to communicate effectively with each other, a special object must be created.

You should do it by using the createReduxIntegration method of @withease/redux, somewhere near the Redux Store configuration itself.

INFO

Redux Toolkit's configureStore is used here as an example; @withease/redux supports any kind of Redux Store.

ts
// src/redux-store
import { createReduxIntegration } from '@withease/redux';
import { configureStore } from '@reduxjs/toolkit';

import { appStarted } from 'root/shared/app-lifecycle';

export const myReduxStore = configureStore({
  // ...
});

export const reduxInterop = createReduxIntegration({
  reduxStore: myReduxStore,
  setup: appStarted,
});

Avoiding dependency cycles

Note that the overload of createReduxIntegration with an explicit reduxStore allows for better TypeScript type inference, but might also result in cyclic dependencies.

If you encounter this issue, you can use the "async" setup of the reduxInterop object instead. This will lead to null-checks later, because in that case the Redux Store may not be initialized yet while the reduxInterop object is already in use.

See the package documentation for more details.

ts
// src/shared/redux-interop
import { createEvent } from 'effector';
import { createReduxIntegration } from '@withease/redux';
import type { Store as ReduxStore } from 'redux';

export const startReduxInterop = createEvent<ReduxStore>();
export const reduxInterop = createReduxIntegration({
  setup: startReduxInterop,
});

// src/entrypoint.ts
import { configureStore } from '@reduxjs/toolkit';

import { startReduxInterop } from 'shared/redux-interop';

const myReduxStore = configureStore({
  // ...
});

startReduxInterop(myReduxStore);

☝️ This allows you to access the reduxInterop object while avoiding possible dependency cycles, if reduxInterop is imported somewhere along with some reducers or middlewares.

Explicit setup

Notice how an explicit setup event is required to initialize the interoperability. Usually it would be an appStarted event or any other "app lifecycle" event.

You can read more about this best-practice in the "Explicit start of the app" article.

It is recommended to pick a place in your project architecture and add a model for the app lifecycle events declaration:

ts
// e.g. shared/app-lifecycle/index.ts
import { createEvent } from 'effector';

export const appStarted = createEvent();

Then call this event at the point that corresponds to the "start of the app" - usually somewhere near the render.

tsx
// src/entrypoint.ts
import { appStarted } from 'root/shared/app-lifecycle';

appStarted();

render(<App />);

After that, you have everything ready to start a gradual migration.

Migration

Now you have existing Redux code that implements the features of your product. There is no point in stopping development altogether to migrate between technologies; this process should be integrated into product development.

TIP

It is a good idea to select one of the existing features in your code, rewrite it with the new technology, and show the resulting Pull Request to your colleagues before starting a full-fledged migration.

This way you can evaluate whether this technology helps you solve your problems and how well it suits your team.

This is a list of cases with examples of organizing a migration from Redux code to Effector code.

Migrating existing feature

The first thing you need to do is create an Effector model wherever you want to put the new implementation.

Effector API for the Redux code

At first, the new model will only contain "mirrored" stores and events that read from and send updates to the Redux Store:

ts
// src/features/user-info/model.ts
import { combine } from 'effector';

import { reduxInterop } from 'root/redux-store';
export const $userName = combine(
  reduxInterop.$state,
  (state) => state.userInfo.name ?? ''
);
export const updateName = reduxInterop.dispatch.prepend((name: string) =>
  userInfoSlice.updateName(name)
);

TIP

It is recommended to use the .prepend API of the reduxInterop.dispatch Effect to create separate Effector events connected to their Redux action counterparts.

The same is recommended for reduxInterop.$state - it is better to create separate stores via combine for "slices" of the Redux state, because it makes gradual migration easier.

But since reduxInterop.dispatch is a normal Effect and reduxInterop.$state is a normal Store, you can also safely use both of them directly.

This model can then be used anywhere in place of classic actions and selectors.

E.g. a UI component:

tsx
import { useUnit } from 'effector-react';

function UserInfoForm() {
  const { name, nameUpdated } = useUnit({
    name: $userName,
    nameUpdated: updateName,
  });

  return (
    <Wrapper>
      <Input
        value={name}
        onChange={(e) => {
          nameUpdated(e.currentTarget.value);
        }}
      />
    </Wrapper>
  );
}

You can find API reference of UI-framework integrations in the Effector's documentation.

Testing

Now that we have an Effector API for the old code, we can write some tests for it, so that the behavior of the Redux code is captured and we won't break anything when porting the feature implementation to Effector.

TIP

Notice that we also need to create a mock version of the Redux Store, so this test is independent of any others.

A testable version of the Redux Store should also properly mock any thunks or custom middlewares which are used in the test.

ts
import { fork, allSettled } from 'effector';
import { configureStore } from '@reduxjs/toolkit';

import { $userName, updateName } from 'root/features/user-info';
import { reduxInterop } from 'root/redux-store';
import { appStarted } from 'root/shared/app-lifecycle';

test('username is updated', async () => {
  const mockStore = configureStore({
    // ...
  });

  const scope = fork({
    values: [
      // Providing mock version of the redux store
      [reduxInterop.$reduxStore, mockStore],
    ],
  });

  await allSettled(appStarted, { scope });

  expect(scope.getState($userName)).toBe('');

  await allSettled(updateName, { scope, params: 'John' });

  expect(scope.getState($userName)).toBe('John');
});

Such tests will allow us to notice any changes in logic early on.

INFO

You can find more details about Effector-way testing in the "Writing tests" guide in the documentation.

Gradual rewrite

We can now extend this model with new logic or carry over existing logic from Redux, while keeping the public API of the Effector units unchanged.

ts
// src/features/user-info/model.ts
import { combine, createEvent, sample } from 'effector';

import { reduxInterop } from 'root/redux-store';
export const $userName = combine(
  reduxInterop.$state,
  (state) => state.userInfo.name ?? ''
);
export const updateName = createEvent<string>();

sample({
  clock: updateName,
  filter: (name) => name.length <= 20,
  target: [
    reduxInterop.dispatch.prepend((name: string) =>
      userInfoSlice.updateName(name)
    ),
  ],
});

☝️ The Effector model for the feature is extended with new logic (the name can't be longer than 20 characters), but the public API of the $userName store and the updateName event is unchanged, and the username state still lives inside Redux.

Moving the state

Eventually you should end up with a situation where:

  1. The state of the feature is still stored in Redux.
  2. But all related logic and side effects are now managed by Effector.
  3. All external consumers (UI components, other features, etc.) interact with the feature through its Effector model.

After that you can safely move the state into the model and get rid of Redux-reducer for it:

ts
// src/features/user-info/model.ts
import { createStore, createEvent, sample } from 'effector';
export const $userName = createStore('');
export const updateName = createEvent<string>();

sample({
  clock: updateName,
  filter: (name) => name.length <= 20,
  target: $userName,
});

☝️ The feature is completely ported to Effector; reduxInterop is not used here anymore.

Edge-case

If there is still code that consumes this state via the Redux Store selector, and there is currently no way to move that consumer to use the Effector model, it is still possible to "sync" the state back into Redux as a read-only mirror of the Effector model state:

ts
// src/features/user-info/model.ts

// ...main code

// sync state back to Redux
sample({
  clock: $userName,
  target: [
    reduxInterop.dispatch.prepend((name: string) =>
      userInfoSlice.syncNameFromEffector(name)
    ),
  ],
});

☝️ But it's important to make sure that this is a read-only mirror that won't be changed in Redux in any other way - otherwise there would be two parallel versions of this state, which would probably lead to nasty bugs.

New feature

Adding a new feature on Effector to a Redux project is not much different from the initial step of migrating an existing feature:

  1. Any new code is written in Effector.
  2. Any dependencies on the Redux Store should work through the reduxInterop API.

Special cases

Middleware with side effects

Sometimes Redux actions do not change state, but trigger side effects via middlewares.

Suppose the Redux Store has a middleware that reacts to actions like { type: SEND_ANALYTICS_EVENT, payload } and sends the event to our analytics.

Analytics calls are usually spread across almost all of the application's code, so migrating such a feature will be more complicated.

In this case, the recommended upgrade path is as follows:

Mirror of the action

First, create a mirror Effector event for the SEND_ANALYTICS_EVENT action by using its action creator:

ts
// src/shared/analytics/model.ts
import { reduxInterop } from 'root/redux-store';
import { sendAnalyticsEventAction } from './actions';

export const sendAnalytics = reduxInterop.dispatch.prepend((payload) =>
  sendAnalyticsEventAction(payload)
);

Move to event instead of an action

As a second step, gradually change all dispatches of this action to an event call.

E.g. instead of

ts
import { sendAnalyticsEventAction } from 'root/analytics';

dispatch(sendAnalyticsEventAction(payload));

do

ts
import { sendAnalytics } from 'root/analytics';

sendAnalytics(payload);

This is safe to do, because the sendAnalytics(payload) call here is a full equivalent of dispatch(sendAnalyticsEventAction(payload)) and can be used in its place - the action will still be dispatched by reduxInterop.dispatch under the hood.

In the end, Redux, Effector, and your UI framework should all use this event instead of dispatching the action.

Move the implementation

Now that all analytics are sent via this event, it is possible to move fully from the analytics middleware to an Effector model:

ts
// src/shared/analytics/model.ts
import { createEvent, createEffect, sample } from 'effector';
import { sendEvent } from 'root/shared/analytics-client';

export const sendAnalytics = createEvent();

const sendEventFx = createEffect(sendEvent);

sample({
  clock: sendAnalytics,
  target: sendEventFx,
});

Redux Thunks

Redux Thunks are a standard approach to writing asynchronous logic in Redux apps and are commonly used for data fetching, so your app probably already has a bunch of thunks, which should also be migrated at some point.

The closest equivalent of a Thunk in Effector is an Effect, which is a container for any function that produces side effects (like fetching data from a remote source), so Thunks should be converted to Effects.

Create an Effect representation for a Thunk

You can convert any Thunk to an Effect by using Effector's attach operator to wrap reduxInterop.dispatch.

ts
import { createAsyncThunk } from '@reduxjs/toolkit';
import { attach } from 'effector';

import { reduxInterop } from 'root/redux-store';

const someThunk = createAsyncThunk(
  'some/thunk',
  async (p: number, thunkApi) => {
    // thunk code
  }
);

/**
 * This is a redux-thunk, converted into an effector Effect.
 *
 * This allows gradual migration from redux-thunks to effector Effects
 */
const someFx = attach({
  mapParams: (p: number) => someThunk(p),
  effect: reduxInterop.dispatch,
});

Now you can use it in any new code with Effector:

ts
sample({
  clock: doSomeButtonClicked,
  target: someFx,
});

INFO

Adding the Fx postfix to Effects is Effector's naming convention, just like prefixing Store names with $.

It is described in detail in the "Naming convention" article in the docs.

Use this Effect instead of original Thunk

The created Effect can be safely used anywhere you would use the original Thunk - this will allow you to simply swap the Effect's implementation away from the Thunk later.

UI Component
tsx
const doSome = useUnit(someFx);

return <button onClick={doSome}>Do thunk</button>;
Other Thunk
ts
const makeASandwichWithSecretSauce = (clientName) => async (dispatch) => {
  try {
    const result = await sandwichApi.getSandwichFor(clientName);

    dispatch(sandwichSlice.ready(result));
  } catch (error) {
    dispatch(sandwichSlice.failed(error));
  }
};

const makeASandwichFx = attach({
  mapParams(client) {
    return makeASandwichWithSecretSauce(client)
  },
  effect: reduxInterop.dispatch,
})

function makeSandwichesForEverybody() {
  return function (dispatch, getState) {
    if (!getState().sandwiches.isShopOpen) {
      return Promise.resolve();
    }

    return dispatch(makeASandwichWithSecretSauce('My Grandma'))
      .then(() =>
        Promise.all([
          makeASandwichFx('Me'),
          // ☝️ Notice, that this Effect is intertwined with the Thunk flow
          dispatch(makeANormalSandwich('My wife')),
        ])
      )
  };
}

Swap Effect's implementation

Once this Effect is used everywhere instead of the Thunk, you can safely swap the implementation:

ts
// If Thunk was dispatching some actions internally, you can also preserve this logic in Effector's model
// and then migrate for it by following "Migrating existing feature" part of this guide
const sandwichReady = reduxInterop.dispatch.prepend((result) =>
  sandwichSlice.ready(result)
);
const sandwichFailed = reduxInterop.dispatch.prepend((error) =>
  sandwichSlice.failed(error)
);

const makeASandwichFx = createEffect((clientName) =>
  sandwichApi.getSandwichFor(clientName)
);

sample({
  clock: makeASandwichFx.doneData,
  target: sandwichReady,
});

sample({
  clock: makeASandwichFx.failData,
  target: [
    sandwichFailed,
    reportErrorToSentry,
    // ...
  ],
});

That's it, the Thunk is now an Effect!

Redux Sagas

Redux-Saga is a side effect management library for Redux. Coincidentally, side effect management is also the main focus of Effector, so to migrate you will simply need to rewrite your sagas using Effector's concepts.

Thanks to @withease/redux, you can do it partially and in any order. Here are a few examples of Saga code ported to Effector.

TIP

These examples show the ported code, but the use of Redux actions and state is left as is, since other sagas (and any middlewares in general) may depend on them.

See the "Migrating existing feature" part of this guide for how to migrate from dispatchers and selectors to Events and Stores completely.

Data fetching

ts
function* fetchPosts() {
  yield put(actions.requestPosts());
  const page = yield select((state) => state.currentPage);
  const products = yield call(fetchApi, '/products', page);
  yield put(actions.receivePosts(products));
}

function* watchFetch() {
  while (yield take('FETCH_POSTS')) {
    yield call(fetchPosts); // waits for the fetchPosts task to terminate
  }
}
ts
const $page = combine(reduxInterop.$state, (state) => state.currentPage);
const postsRequested = reduxInterop.dispatch.prepend(actions.requestPosts);
const postsReceived = reduxInterop.dispatch.prepend(actions.receivePosts);
// This event should be used to dispatch this action in place of original dispatch
// See "Middleware with side-effects" part of this guide for explanation
const fetchPosts = reduxInterop.dispatch.prepend(() => ({
  type: 'FETCH_POSTS',
}));

const fetchProductsByPageFx = createEffect((page) =>
  fetchApi('/products', page)
);

// this sample describes the key part of the saga's logic
sample({
  clock: postsRequested,
  source: $page,
  target: fetchProductsByPageFx,
});

// Notice, that these two `sample`s here are used only to preserve actions dispatching,
// as there might be other Redux code depending on them
sample({
  clock: fetchPosts,
  target: postsRequested,
});

sample({
  clock: fetchProductsByPageFx.doneData,
  target: postsReceived,
});

Throttle, delay and debounce

TIP

You can implement debounce, delay and throttle logic in Effector by yourself.

But since those are common patterns, it is recommended to use Patronum - the official utility library for Effector.

ts
import { throttle, debounce, delay } from 'redux-saga/effects';

function* handleInput(input) {
  // ...
}

function* throttleInput() {
  yield throttle(500, 'INPUT_CHANGED', handleInput);
}

function* debounceInput() {
  yield debounce(1000, 'INPUT_CHANGED', handleInput);
}

function* delayInput() {
  yield take('INPUT_CHANGED');
  yield delay(5000);
}
ts
import { debounce, delay, throttle } from 'patronum';
import { createEffect, createEvent, sample } from 'effector';

const inputChanged = createEvent();
const handleInputChangeFx = createEffect((input) => {
  // ...
});

sample({
  clock: [
    throttle({
      source: inputChanged,
      timeout: 500,
    }),
    debounce({
      source: inputChanged,
      timeout: 1000,
    }),
    delay({
      source: inputChanged,
      timeout: 5000,
    }),
  ],
  target: handleInputChangeFx,
});

Background task

ts
function* bgSync() {
  try {
    while (true) {
      yield put(actions.requestStart());
      const result = yield call(someApi);
      yield put(actions.requestSuccess(result));
      yield delay(5000);
    }
  } finally {
    if (yield cancelled()) yield put(actions.requestFailure('Sync cancelled!'));
  }
}

function* main() {
  while (yield take('START_BACKGROUND_SYNC')) {
    // starts the task in the background
    const bgSyncTask = yield fork(bgSync);

    // wait for the user stop action
    yield take('STOP_BACKGROUND_SYNC');
    // user clicked stop. cancel the background task
    // this will cause the forked bgSync task to jump into its finally block
    yield cancel(bgSyncTask);
  }
}
ts
import { createStore, sample, createEffect } from 'effector';
import { delay } from 'patronum';

import { reduxInterop } from 'root/redux-store';

const startRequested = reduxInterop.dispatch.prepend(actions.requestStart);
const requestSuccess = reduxInterop.dispatch.prepend(actions.requestSuccess);

export const backgroundSyncStarted = reduxInterop.dispatch.prepend(
  actions.startBackgroundSync
);
export const backgroundSyncStopped = reduxInterop.dispatch.prepend(
  actions.stopBackgroundSync
);

const $needSync = createStore(false)
  .on(backgroundSyncStarted, () => true)
  .on(backgroundSyncStopped, () => false);
const someApiFx = createEffect(someApi);

// This sample will run someApiFx in cycle with 5 second delays,
// until background sync is stopped
sample({
  clock: [
    backgroundSyncStarted,
    delay({
      source: someApiFx.done,
      timeout: 5_000,
    }),
  ],
  filter: $needSync,
  target: [
    // Dispatching original action for compatibility
    // with the rest of the project
    startRequested,
    // Calling the API
    someApiFx,
  ],
});

// Dispatching original action for compatibility
// with the rest of the project
sample({
  clock: someApiFx.doneData,
  target: requestSuccess,
});

Partial Saga migration

The previous examples showed a full rewrite of sagas, but that is not necessary. You can move parts of the logic from any saga step by step, without rewriting the whole thing:

  1. To call an Effector Event or Effect from a Saga, you can use the call operator, like yield call(effectorEvent, argument).
  2. To read the state of an Effector Store in a Saga, you can also use call + the getState() method of the store, like this: yield call(() => $someStore.getState()).

WARNING

Note that it is generally not recommended to call the getState method of Effector Stores, because it is imperative and non-reactive. This method is an escape hatch for cases where there is no other way.

But you can sometimes use it in Sagas, because they are imperative and non-reactive themselves, and you will not always have the option to rewrite them to Effector right away.

Here is the earlier "Data fetching" example in a state of partial rewrite.

ts
// effector model
const $page = combine(reduxInterop.$state, (state) => state.currentPage);

const postsRequested = reduxInterop.dispatch.prepend(actions.requestPosts);
const postsReceived = reduxInterop.dispatch.prepend(actions.receivePosts);

export const fetchPosts = reduxInterop.dispatch.prepend(() => ({
  type: 'FETCH_POSTS',
}));

const fetchProductsByPageFx = attach({
  source: $page,
  effect(page, filter) {
    return fetchApi('/products', page, filter);
  },
});

// saga
import { $filters } from 'root/features/filters';

import { postsRequested, postsReceived, fetchProductsByPageFx } from './model';

function* fetchPosts() {
  yield call(postsRequested);
  const filters = yield call(() => $filters.getState());
  const products = yield call(fetchProductsByPageFx, filters);
  yield call(postsReceived, products);
}

function* watchFetch() {
  while (yield take('FETCH_POSTS')) {
    yield call(fetchPosts); // waits for the fetchPosts task to terminate
  }
}

☝️ Notice how yield call(effectorEvent, argument) is used instead of yield put(action) here. Since these events are created via reduxInterop.dispatch.prepend, calling them both triggers the Effector event (for Effector-based code) and dispatches the corresponding action (for Redux-based code).

Summary

To perform a gradual, non-blocking code migration from Redux to Effector you will need to:

  1. Install @withease/redux helpers package.
  2. Convert a single feature to Effector, so you and your colleagues are able to evaluate if it fits you.
  3. Rewrite Redux code to Effector, by converting entities of the former to their counterparts of the latter. You can do it gradually over the course of months and years, without stopping feature development of your product.
  4. Remove @withease/redux, once there is no more Redux code left.
]]>
<![CDATA[Explicit start of the app]]> https://withease.effector.dev/magazine/explicit_start.html https://withease.effector.dev/magazine/explicit_start.html Fri, 26 Jan 2024 00:00:00 GMT Explicit start of the app

In Effector, Events cannot be triggered implicitly. This gives you more control over the app's lifecycle and helps you avoid unexpected behavior.

The code

In the simplest case, you can just create something like an appStarted Event and trigger it right after the app initialization. Let us go through the code line by line and explain what is going on.

  1. Create start Event

This Event will be used to trigger the start of the app. For example, you can attach some global listeners after it.

ts
import { createEvent, fork, allSettled } from "effector";

const appStarted = createEvent();

const scope = fork();

await allSettled(appStarted, { scope });
  2. Create isolated Scope

Fork API allows you to create isolated Scope which will be used across the app. It helps you to prevent using global state and avoid unexpected behavior.

ts
import { createEvent, fork, allSettled } from "effector";

const appStarted = createEvent();

const scope = fork();

await allSettled(appStarted, { scope });
  3. Trigger the start Event on a particular Scope

The allSettled function allows you to start an Event on a particular Scope and wait until all computations are finished.

ts
import { createEvent, fork, allSettled } from "effector";

const appStarted = createEvent();

const scope = fork();

await allSettled(appStarted, { scope });

The reasons

The main reason for this approach is that it allows you to control the app's lifecycle. It helps you avoid unexpected behavior and makes your app more predictable in some cases. Let us say we have a module with the following code:

ts
// app.ts
import { createStore, createEvent, createEffect, sample, scopeBind } from 'effector';

const $counter = createStore(0);
const increment = createEvent();

const startIncrementationIntervalFx = createEffect(() => {
  const boundIncrement = scopeBind(increment, { safe: true });

  setInterval(() => {
    boundIncrement();
  }, 1000);
});

sample({
  clock: increment,
  source: $counter,
  fn: (counter) => counter + 1,
  target: $counter,
});

startIncrementationIntervalFx();

Tests

We believe that any serious application has to be testable, so we have to isolate the application lifecycle inside a particular test case. In case of an implicit start (starting model logic on module execution), it is impossible to test the app's behavior in different states.

TIP

The scopeBind function allows you to bind an Event to a particular Scope; you can find more details in the article about Fork API rules.

Now, to test the app's behavior, we have to mock the setInterval function and check that the $counter value is correct after a particular time.

ts
// app.test.ts
import { $counter } from './app';

test('$counter should be 5 after 5 seconds', async () => {
  // ... test
});

test('$counter should be 10 after 10 seconds', async () => {
  // ... test
});

But the counter will be started immediately after the module execution, and we will not be able to test the app's behavior in different states.

SSR

In the case of SSR, we have to start all of the application's logic on every user request, which is impossible with an implicit start.

ts
// server.ts
import * as app from './app';

function handleRequest(req, res) {
  // ...
}

But the counter will be started immediately after the module execution (aka application initialization), and we will not be able to start the app's logic on every user request.

Add explicit start

Let us rewrite the code and add explicit start of the app.

ts
// app.ts
import { createStore, createEvent, createEffect, sample, scopeBind } from 'effector';

const $counter = createStore(0);
const increment = createEvent();

const startIncrementationIntervalFx = createEffect(() => {
  const boundIncrement = scopeBind(increment, { safe: true });

  setInterval(() => {
    boundIncrement();
  }, 1000);
});

sample({
  clock: increment,
  source: $counter,
  fn: (counter) => counter + 1,
  target: $counter,
});

// startIncrementationIntervalFx(); // ⛔️ removed: no longer called on module execution
const appStarted = createEvent();
sample({ clock: appStarted, target: startIncrementationIntervalFx });

That is it! Now we can test the app's behavior in different states and start the app's logic on every user's request.

TIP

In real-world applications, it is better to add not only an explicit start of the app, but also an explicit stop. It will help you avoid memory leaks and unexpected behavior.

One more thing

In this recipe, we used an application-wide appStarted Event to trigger the start of the app. However, in real-world applications, it is better to use more granular Events to trigger the start of a particular part of the app.

Recap

  • Do not execute any logic just on module execution
  • Use explicit start Event of the application
]]>
<![CDATA[Fork API rules]]> https://withease.effector.dev/magazine/fork_api_rules.html https://withease.effector.dev/magazine/fork_api_rules.html Fri, 26 Jan 2024 00:00:00 GMT Fork API rules

Fork API allows you to run multiple instances of the same application in a single process. It is useful for testing, SSR, and other cases. It is a powerful mechanism, but it has some rules that you should follow to avoid unexpected behavior.

TIP

Some of the rules can be validated by static analysis tools like the scope preset of eslint-plugin-effector, but others require runtime validation. Please refer to the tutorial to learn how to set up such validations in your project.

Prefer declarative code

All Effector's operators (like sample or combine) support Fork API out of the box. If you describe your application logic declaratively with Effector's operators, you do not have to do anything extra to make it work with Fork API.

Of course, in some cases you have to implement some logic without Effector's operators; in those cases, you have to follow some rules.

Do not mix Effects and async functions

It is illegal to mix Effects and plain async functions inside an Effect handler's body. This code will lead to unexpected behavior:

ts
import { createEffect } from "effector";

async function regularAsyncFunction() {
  // do stuff
}

const asyncFunctionInFx = createEffect(async () => {
  // do other stuff
});

const doAllStuffFx = createEffect(async () => {
  await regularAsyncFunction(); // 🔴 regular async function
  await asyncFunctionInFx(); // 🔴 effect
});

Actually, it can be fixed in a simple way: just wrap all async functions into Effects:

ts
import { createEffect } from "effector";

async function regularAsyncFunction() {
  // do stuff
}
const regularAsyncFunctionFx = createEffect(regularAsyncFunction);

const asyncFunctionInFx = createEffect(async () => {
  // do other stuff
});

const doAllStuffFx = createEffect(async () => {
  await regularAsyncFunctionFx(); // 🟢 effect
  await asyncFunctionInFx(); // 🟢 effect
});

One more thing

The last example is supported by Fork API, but there is a better way to do it. You can use the sample operator to express the same logic:

ts
import { createEvent, sample } from "effector";

const doAllStuff = createEvent();

sample({ clock: doAllStuff, target: regularAsyncFunctionFx });
sample({ clock: regularAsyncFunctionFx.done, target: asyncFunctionInFx });

It is more declarative and extensible. For example, you can easily handle errors from these Effects independently:

ts
sample({ clock: regularAsyncFunctionFx.fail, target: logError });
sample({ clock: asyncFunctionInFx.fail, target: showErrorMessage });

Promise.all and Promise.race

Fork API supports Promise.all and Promise.race out of the box. You can use them in your code without any restrictions.

ts
const doAllStuffFx = createEffect(async () => {
  // 🟢 valid
  await Promise.all([regularAsyncFunctionFx(), asyncFunctionInFx()]);
});

const doRaceStuffFx = createEffect(async () => {
  // 🟢 valid
  await Promise.race([regularAsyncFunctionFx(), asyncFunctionInFx()]);
});

Bind Events to a particular Scope

Another important rule is to bind Events to a particular Scope if you call them imperatively from outside of Effector. For example, if you pass them as a callback to some external library, or if you call them from the UI layer as an event handler.

useUnit

For UI libraries (like SolidJS or React), Effector has special hooks that help you bind Events to the current Scope automatically:

tsx
import { useUnit } from 'effector-solid';

const doStuff = createEvent();

function Component() {
  const handleClick = useUnit(doStuff);

  return <button onClick={handleClick}>Click me</button>;
}
tsx
import { useUnit } from 'effector-react';

const doStuff = createEvent();

function Component() {
  const handleClick = useUnit(doStuff);

  return <button onClick={handleClick}>Click me</button>;
}

Also, you have to provide the current Scope to the UI library through a context. Read more about it in the official documentation.

scopeBind

However, sometimes you have to call Events from external sources, for example, when passing them as a callback to an external library or to DOM APIs. In this case, you have to use the scopeBind function:

ts
import { createEvent, createEffect, scopeBind, sample } from 'effector'

const appStarted = createEvent();
const windowGotFocus = createEvent();

const setupListenersFx = createEffect(async () => {
  const boundWindowGotFocus = scopeBind(windowGotFocus);
  addEventListener('focus', boundWindowGotFocus);
});

sample({ clock: appStarted, target: setupListenersFx });

TIP

In this example we have to call scopeBind inside an Effect, because the Effect is executed with the current Scope. To call this Effect, we use an explicit application start Event.

Use explicit start of the application

The last rule is to use an explicit start of the application. It is important because you have to provide the current Scope to Effector itself. To fulfill this requirement, you can trigger the application start Event with the current Scope through the allSettled method:

ts
import { allSettled } from 'effector';

await allSettled(appStarted, { scope });

Recap

  • One side effect is one Effect: do not use plain asynchronous functions inside an Effect's body
  • Always use scopeBind for Events that are passed to external sources
  • Do not forget to use useUnit (or its analogs) for Events that are used in the UI layer
  • Do not execute any logic on module execution, prefer an explicit start of the application
]]>
<![CDATA[Handle Effector's Events in UI-frameworks]]> https://withease.effector.dev/magazine/handle_events_in_ui_frameworks.html https://withease.effector.dev/magazine/handle_events_in_ui_frameworks.html Fri, 26 Jan 2024 00:00:00 GMT Handle Effector's Events in UI-frameworks

Sometimes you need to do something on the UI-framework layer when an Event is fired on the Effector layer. For example, you may want to show a notification when a data request fails. In this article, we will look into a way to do it.

The problem

TIP

In this article, we will use React as an example of a UI-framework. However, the same principles can be applied to any other UI-framework.

Let us imagine that we have an application that uses Ant Design and its notification system. It is pretty straightforward to show a notification on the UI layer:

tsx
import { notification } from 'antd';

function App() {
  const [api, contextHolder] = notification.useNotification();

  const showNotification = () => {
    api.info({
      message: 'Hello, React',
      description: 'Notification from UI-layer',
    });
  };

  return (
    <>
      {contextHolder}
      <button onClick={showNotification}>Show notification</button>
    </>
  );
}

But what if we want to show a notification when a data request fails? The whole data-flow of the application should not be exposed to the UI layer. So, we need to find a way to handle Events on the UI layer without exposing the whole data-flow.

Let us say that we have an Event responsible for data loading failure:

ts
// model.ts
import { createEvent } from 'effector';

export const dataLoadingFailed = createEvent<{ reason: string }>();

Our application calls it every time a data request fails, and we need to listen to it on the UI layer.

The solution

We need to bind dataLoadingFailed and notification.useNotification somehow.

Let us take a look at an ideal solution and a couple of not-so-good ones.

🟢 Save notification instance to a Store

The best way is to save the notification API instance to a Store and use it through an Effect. Let us create a couple of new units to do it.

ts
// notifications.ts
import { createEvent, createStore, sample } from 'effector';

// We will use instance from this Store in the application
const $notificationApi = createStore(null);

// It has to be called every time when a new instance of notification API is created
export const notificationApiChanged = createEvent();

// Save new instance to the Store
sample({ clock: notificationApiChanged, target: $notificationApi });

Now we have to call notificationApiChanged to save the notification API instance to the Store $notificationApi.

tsx
import { notification } from 'antd';
import { useEffect } from 'react';
import { useUnit } from 'effector-react';

import { notificationApiChanged } from './notifications';

function App() {
  // use useUnit to respect Fork API rules
  const onNewApiInstance = useUnit(notificationApiChanged);
  const [api, contextHolder] = notification.useNotification();

  // call onNewApiInstance on every change of api
  useEffect(() => {
    onNewApiInstance(api);
  }, [api]);

  return (
    <>
      {contextHolder}
      {/* ...the rest of the application */}
    </>
  );
}

After that, we have a valid Store $notificationApi with the notification API instance. We can use it in any place of the application. Let us create an Effect to work with it comfortably.

ts
// notifications.ts
import { attach } from 'effector';

// ...

export const showWarningFx = attach({
  source: $notificationApi,
  effect(api, { message, description }) {
    if (!api) {
      throw new Error('Notification API is not ready');
    }

    api.warning({ message, description });
  },
});

TIP

attach is a function that allows you to bind a specific Store to an Effect. It means that we can use $notificationApi in showWarningFx without passing it as a parameter. Read more in Effector's documentation.

The Effect showWarningFx can be used in any place of the application without any additional hassle.

ts
// model.ts
import { createEvent, sample } from 'effector';

import { showWarningFx } from './notifications';

export const dataLoadingFailed = createEvent<{ reason: string }>();

// Show a warning when dataLoadingFailed happens
sample({
  clock: dataLoadingFailed,
  fn: ({ reason }) => ({ message: reason }),
  target: showWarningFx,
});

Now we have a valid solution for handling Events on the UI layer without exposing the whole data-flow.

However, if you want to know why other (maybe more obvious) solutions are not so good, you can read about them below 👇

Not-so-good solutions

🔴 Global notification service

Ant Design allows using global notification instance.

ts
// model.ts
import { createEvent, createEffect, sample } from 'effector';
import { notification } from 'antd';

const dataLoadingFailed = createEvent<{ reason: string }>();

// Create an Effect to show a notification
const showWarningFx = createEffect((params: { message: string }) => {
  notification.warning(params);
});

// Execute it when dataLoadingFailed happens
sample({
  clock: dataLoadingFailed,
  fn: ({ reason }) => ({ message: reason }),
  target: showWarningFx,
});

In this solution it is not possible to use any of Ant's settings from React Context, because the global instance does not have access to React at all. It means that notifications will not be styled properly and could look different from the rest of the application.

So, this is not a solution.

🔴 Just .watch an Event in a component

It is possible to call .watch-method of an Event in a component.

tsx
import { useEffect } from 'react';
import { notification } from 'antd';

import { dataLoadingFailed } from './model';

function App() {
  const [api, contextHolder] = notification.useNotification();

  useEffect(
    () =>
      dataLoadingFailed.watch(({ reason }) => {
        api.warning({
          message: reason,
        });
      }),
    [api]
  );

  return (
    <>
      {contextHolder}
      {/* ...the rest of the application */}
    </>
  );
}

In this solution we do not respect the Fork API rules, which means we could have memory leaks and problems with test environments and Storybook-like tools.

So, this is not a solution.

Summary

To bind some UI-framework specific API to Effector's data-flow we need to follow these steps:

  1. Retrieve API-instance from the framework.
  2. Save it to a Store.
  3. Create an Effect to work with it.
  4. Use this Effect in the application.
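As a rough, framework-free sketch of those four steps (all names here are illustrative; the real implementation uses a Store plus attach, as shown above):

```typescript
// A framework-free sketch of the pattern: save an API instance in a
// "store", then call it through an "effect" that reads from that store.
// Names (NotificationApi, showWarning, apiChanged) are made up.
type NotificationApi = { warning: (msg: string) => void };

let storedApi: NotificationApi | null = null; // step 2: the "store"

// step 1: the UI layer hands us its API instance
function apiChanged(api: NotificationApi): void {
  storedApi = api;
}

// step 3: an "effect" bound to the stored instance
function showWarning(message: string): void {
  if (!storedApi) throw new Error('Notification API is not ready');
  storedApi.warning(message);
}

// step 4: business logic calls the effect without knowing about the UI
const shown: string[] = [];
apiChanged({ warning: (msg) => shown.push(msg) });
showWarning('data loading failed');
console.log(shown); // -> ['data loading failed']
```

The real code replaces the mutable variable with a Store and the plain function with an attached Effect, which is what makes the pattern work per Scope.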
]]>
<![CDATA[You Don't Need Domains]]> https://withease.effector.dev/magazine/no_domains.html https://withease.effector.dev/magazine/no_domains.html Fri, 26 Jan 2024 00:00:00 GMT You Don't Need Domains

A Domain in Effector is a namespace for Events, Effects, and Stores. It could be used for two purposes:

  1. Semantic grouping of units
  2. Bulk operations on units

However, in most cases, you do not need Domains at all. Let us see why.

Semantic Grouping

JavaScript already has a mechanism for semantic grouping of entities: modules. Since you do not have an option not to use modules, you will be using them to group your units anyway. So, why do you need another grouping mechanism?

ts
// 👇 all units are already grouped by module
// src/features/counter.ts

import { createEvent, createStore, sample } from 'effector';

export const increment = createEvent();
export const decrement = createEvent();

export const $counter = createStore(0);

sample({
  source: $counter,
  clock: increment,
  fn: (counter) => counter + 1,
  target: $counter,
});

sample({
  source: $counter,
  clock: decrement,
  fn: (counter) => counter - 1,
  target: $counter,
});
ts
// 👇 all units are already grouped by module
// src/features/counter.ts

import { createDomain, createEvent, createStore, sample } from 'effector';

// AND by domain, so it is redundant
const counterDomain = createDomain();

export const increment = createEvent({ domain: counterDomain });
export const decrement = createEvent({ domain: counterDomain });

export const $counter = createStore(0, { domain: counterDomain });

sample({
  source: $counter,
  clock: increment,
  fn: (counter) => counter + 1,
  target: $counter,
});

sample({
  source: $counter,
  clock: decrement,
  fn: (counter) => counter - 1,
  target: $counter,
});

Bulk Operations

But Domains are not only about grouping. They also allow you to perform bulk operations on units.

For example, you can reset values of all Stores in the Domain with the following code:

ts
import { createDomain, createStore, createEvent } from 'effector';

const domain = createDomain();

export const someEvent = createEvent({ domain });

export const $store1 = createStore(0, { domain });
export const $store2 = createStore(0, { domain });
export const $store3 = createStore(0, { domain });

// 👇 callback will be called on every Store in the Domain
domain.onCreateStore((store) => {
  store.reset(someEvent);
});

This approach has a significant drawback: it is implicit. If you create a new Store in the Domain, you have to remember that triggering someEvent will reset the new Store as well. It is really easy to forget about it.

Things become even worse if you have more than one bulk operation in the Domain.

Instead of using Domains, you can explicitly perform bulk operations on units. The previous example can be rewritten as follows:

ts
import { createStore, createEvent } from 'effector';

export const someEvent = createEvent();

export const $store1 = createStore(0);
export const $store2 = createStore(0);
export const $store3 = createStore(0);

// 👇 now it is explicit
resetMany({ stores: [$store1, $store2, $store3], reset: someEvent });

function resetMany({ stores, reset }) {
  for (const unit of stores) {
    unit.reset(reset);
  }
}

This approach is not only more explicit but also less verbose, because you do not need to specify the Domain for every unit.

Summary

  • Do not use Domains for semantic grouping - use modules instead
  • Do not use Domains for bulk operations - use explicit functions instead
]]>
<![CDATA[Prefer Operators to Methods]]> https://withease.effector.dev/magazine/no_methods.html https://withease.effector.dev/magazine/no_methods.html Fri, 26 Jan 2024 00:00:00 GMT Prefer Operators to Methods

In Effector, there are two ways to create a new unit from an existing one:

  • Methods, e.g. event.map(...), event.filter(...), store.map(...)
  • Operators, e.g. combine(...) and sample(...)

In most cases, operators are more powerful and flexible than methods: you can add new features without rewriting the code. Let us see how it works with a few examples.

combine

Let us say you have a derived Store that calculates a discount percentage for a user:

ts
const $discountPercentage = $user.map((user) => {
  if (user.isPremium) return 20;
  return 0;
});

Some time later, you need to add a new feature: use current market conditions to calculate the discount percentage. In this case, you have to completely rewrite the code:

ts
// Before:
const $discountPercentage = $user.map((user) => {
  if (user.isPremium) return 20;
  return 0;
});

// After:
const $discountPercentage = combine(
  { user: $user, market: $market },
  ({ user, market }) => {
    if (user.isPremium) return 20;
    if (market.isChristmas) return 10;
    return 0;
  }
);

But if you use combine from the very beginning, you will be able to add a new feature without rewriting the code:

ts
const $discountPercentage = combine(
  {
    user: $user,
    market: $market, 
  },
  ({ user, market }) => {
    if (user.isPremium) return 20;
    if (market.isChristmas) return 10; 
    return 0;
  }
);

sample

It is even more noticeable when you need to filter an Event by its payload. Let us say you have an Event representing form submission and a derived Event representing valid form submission:

ts
const formSubmitted = createEvent();

const validFormSubmitted = formSubmitted.filter({
  fn: (form) => {
    return form.isValid();
  },
});

Some time later, you need to add a new feature: use an external service to validate the form instead of the isValid method. In this case, you have to completely rewrite the code:

ts
// Before:
const validFormSubmitted = formSubmitted.filter({
  fn: (form) => {
    return form.isValid();
  },
});

// After:
const validFormSubmitted = sample({
  clock: formSubmitted,
  source: $externalValidator,
  filter: (validator, form) => validator(form),
});

But if you use sample from the very beginning, you can add the new feature just by adding a source:

ts
const validFormSubmitted = sample({
  clock: formSubmitted,
  source: $externalValidator,
  filter: (validator, form) => validator(form),
});

With sample we can go even further and add a payload transformation just by adding a new field:

ts
const validFormSubmitted = sample({
  clock: formSubmitted,
  source: $externalValidator,
  filter: (validator, form) => validator(form),
  fn: (_, form) => form.toJson(), 
});

Cool, right? But that is not the end. We can add one more feature: use an external Store to enrich the payload:

ts
const validFormSubmitted = sample({
  clock: formSubmitted,
  source: {
    validator: $externalValidator,
    userName: $userName, 
  },
  filter: ({ validator }, form) => validator(form),
  fn: ({ userName }, form) => ({
    ...form.toJson(),
    userName, 
  }),
});

Summary

Prefer sample to event.filter/event.map and combine to store.map to make your code more extensible.

Exception

There is only one exception where you have to use a method instead of an operator: event.prepend(...) does not have an operator equivalent.

]]>
<![CDATA[.watch calls are (not) weird]]> https://withease.effector.dev/magazine/watch_calls.html https://withease.effector.dev/magazine/watch_calls.html Fri, 26 Jan 2024 00:00:00 GMT .watch calls are (not) weird

Sometimes you can notice weird behavior in your code if you use .watch to track Store changes. Let us explain what is going on and how to deal with it.

Effector's main mantra

Summary

The .watch method on a Store immediately executes its callback with the current value of the Store.

Effector is based on the idea of explicit initialization: module execution should not produce any side effects. It is a good practice because it allows you to control the order of execution and avoid unexpected behavior. This mantra leads us to the idea of an explicit start of the app.

However, there is one exception to this rule: the callback of a .watch call on a Store is executed immediately with the current value of the Store. This behavior is not quite obvious, but it was introduced on purpose.

Why?

Effector introduced this behavior in the early stages of development to be compatible with the default behavior of Redux. It also allows using Effector Stores in Svelte as native stores without any additional compatibility layers.

Neither is the case anymore, but we still keep this behavior for historical reasons.

The problem and solutions

Now, let us consider the following example:

ts
const $store = createStore('original value');

$store.watch((value) => {
  console.log(value);
});

const scope = fork({
  values: [[$store, 'forked value']],
});

// -> 'original value'

In this example, the console will print only "original value", since the fork call does not produce any side effects.

Even if we change the order of calls, the behavior stays the same:

ts
const $store = createStore('original value');

const scope = fork({
  values: [[$store, 'forked value']],
});

$store.watch((value) => {
  console.log(value);
});

// -> 'original value'

It could be confusing, but it is not a bug. The .watch callback executes only with the current value of the Store outside of any Scope. In real-world applications, this means that you probably should not use .watch.

Current value?

Actually, yes. The callback executes with the current value of the Store outside of any Scope. It means you can change the value of the Store before the .watch call, and the new value will be printed to the console:

ts
const $store = createStore('original value');

$store.setState('something new');

$store.watch((value) => {
  console.log(value);
});

// -> 'something new'

However, it is a dangerous way, and you have to avoid it in application code.

In general, .watch can be useful for debugging and as a way to track Store changes and react to them. Since it is not a good idea to use it in production code, let us consider some alternatives.

Debug

Effector's ecosystem provides a way more powerful tool for debugging: patronum/debug. It works correctly with Fork API and has a lot of other useful features.

First, install it as a dependency:

sh
pnpm install patronum
sh
yarn add patronum
sh
npm install patronum

Then, mark the Store with the debug method and register the Scope with the debug.registerScope method:

ts
import { createStore, fork } from 'effector';
import { debug } from 'patronum';

const $store = createStore('original value');

debug($store);

const scope = fork({
  values: [[$store, 'forked value']],
});

debug.registerScope(scope, { name: 'myAppScope' });

// -> [store] $store [getState] original value
// -> [store] (scope: myAppScope) $store [getState] forked value
ts
import { createStore, fork } from 'effector';

const $store = createStore('original value');

$store.watch((value) => console.log('[store] $store', value));

const scope = fork({
  values: [[$store, 'forked value']],
});

// -> [store] $store original value

That is it! Furthermore, you can use the debug method not only to inspect the value of a Store but also to track the execution of other units like Events or Effects, to trace chains of calls, and so on. For more details, please check the patronum/debug documentation.

TIP

Do not forget to remove debug calls from production code. To ensure that, you can use the effector/no-patronum-debug rule for ESLint.

React on changes

If you need to react to changes in a Store, you can use its .updates property. It is an Event that fires with the new value of the Store on each update. In combination with sample and an Effect, it allows you to create side effects on Store changes in a declarative and robust way.

ts
import {
  createEffect,
  createStore,
  createEvent,
  sample,
  fork,
  allSettled,
} from 'effector';

const someSideEffectFx = createEffect((storeValue) => {
  console.log('side effect with ', storeValue);
});

const $store = createStore('original value');

const appInited = createEvent();

sample({
  clock: [appInited, $store.updates],
  source: $store,
  target: someSideEffectFx,
});

const scope = fork({
  values: [[$store, 'forked value']],
});

allSettled(appInited, { scope });

// -> side effect with forked value
ts
import { createStore, fork } from 'effector';

const $store = createStore('original value');

$store.watch((value) => console.log('side effect with ', value));

const scope = fork({
  values: [[$store, 'forked value']],
});

// -> side effect with original value

TIP

Since Effector is based on the idea of explicit triggers, in this example we use an explicit start of the app.

This approach not only solves the problems mentioned above but also increases code readability and maintainability. For example, real-world side effects can sometimes fail, and you need to handle errors. With the .watch approach, you have to handle errors in each callback. With the Effect approach, you can handle errors in a seamless declarative way, because an Effect has a built-in .fail property: an Event that fires on each failure.

Summary

  • Do not use .watch for debugging - use patronum/debug instead
  • Do not use .watch for logic and side effects - use Effects instead
]]>
<![CDATA[Dependency injection]]> https://withease.effector.dev/magazine/dependency_injection.html https://withease.effector.dev/magazine/dependency_injection.html Mon, 19 Jun 2023 00:00:00 GMT Dependency injection

Effector provides a simple way to inject dependencies into your application — Fork API. Let us take a look at how it works.

TIP

Your application has to follow some rules to work with Fork API.

Why

Sometimes you need to inject dependencies into your application in a particular environment. For example, you want to disable the logger in tests. The easiest way to do it is to declare a global variable and check it in your code:

ts
// app.ts
import { createEffect, createEvent, sample } from "effector";

const somethingHappened = createEvent<string>();

// 👇 the logger is disabled in tests
const logEnabled = !process.env.IS_TEST;

const logFx = createEffect((message: string) => {
  if (!logEnabled) {
    return;
  }

  console.log(message);
});

sample({ clock: somethingHappened, target: logFx });

But it is not the best way. What if we want to enable the logger back for a particular test? We would have to change the code and support one more variable, which leads to a mess in the code.

Another reason is that you may want to use different implementations of the logger in different environments. For example, in the browser you want to send logs to an external system (like Rollbar or Sentry), and on the server you want to write logs to stdout.

How

To solve these problems we can use Fork API. It allows us to create a new instance of the application with different dependencies. Let us take a look at how it works.

ts
// app.ts
import { createStore, createEvent, attach, sample } from "effector";

const somethingHappened = createEvent<string>();

// Keep the logger instance in a Store
const $logger = createStore<((message: string) => void) | null>(null);

const logFx = attach({
  source: $logger,
  effect: (logger, message) => logger?.(message),
});

sample({ clock: somethingHappened, target: logFx });

That is it, now we can inject a logger into our application.

ts
import { fork, allSettled } from "effector";

describe("app", () => {
  it("should not log anything", async () => {
    const scope = fork({
      values: [[$logger, null]],
    });

    await allSettled(somethingHappened, { scope });

    expect(console.log).not.toBeCalled();
  });
});
ts
import { fork, allSettled } from "effector";

async function handleHttp(req, res) {
  const scope = fork({
    values: [[$logger, console.log]],
  });

  await allSettled(somethingHappened, { scope });

  // render the app
}
ts
import { fork, allSettled } from "effector";

const scope = fork({
  values: [[$logger, Rollbar.log]],
});

await allSettled(somethingHappened, { scope });

We can inject any dependencies into our application in a particular environment without changing the code.

Recap

  • Follow the rules to work with Fork API
  • Use Fork API as a dependency injection
]]>
<![CDATA[Global variables]]> https://withease.effector.dev/magazine/global_variables.html https://withease.effector.dev/magazine/global_variables.html Mon, 02 Jan 2023 00:00:00 GMT Global variables

What problems do we have with the following code?

js
axios.interceptors.request.use(function (config) {
  config.headers['X-Custom-Token'] = getTokenSomehow();

  return config;
});

Actually, quite a lot, but let us focus on the global variable axios and its operations.

TL;DR

It can cause state mixing between different users during SSR, makes tests slower, and makes stories harder to write.

Environments

In the modern world our frontend applications can run in different environments:

  • browser as a standalone application
  • browser as a part of a bigger application (e.g. in Storybook)
  • Node.js as a test-case
  • Node.js as a server-side rendering application

Let us take a closer look and find out how global variables can affect our application in each of them.

✅ Standalone application

In this case, we have only one instance of our application in a single process. It means that we can use global variables to store our application state. It is safe.

🟨 Embedded application (e.g. in Storybook)

This case is valid only for development mode; it will not affect production.

Typically, we have a lot of stories inside a single browser tab while using tools like Storybook. It means that we can have more than one instance of our application in a single process. It can be a bit dangerous to use global variables to store application state, because different stories can interfere with each other.

However, some tools in this category provide their own way to isolate different stories from each other. So, it could be safe to use global variables in this case.

🟨 Test-case

This case is valid only for development mode; it will not affect production.

Tests run in Node.js, which is single-threaded by default. It means that we can have more than one instance of our application in a single process. We have to be careful with global variables for application state, because otherwise different tests can interfere with each other.

To simplify this, some test runners provide their own way to isolate tests from each other, but due to limited access to the code's internals, their solutions can significantly decrease test performance. Still, it could be safe to use global variables in this case.

🔴 Server-side rendering

Server-side rendering is the process of rendering the application on a server and sending the result to the browser. Because of the single-threaded nature of Node.js, we can have more than one instance of our application in a single process during rendering. If we use global variables to store application state and change it for one user, it can affect another user. In general, it is not safe to use global variables in the case of SSR.
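This race is easy to reproduce in a few lines of plain TypeScript (hypothetical names; the token stands in for any per-user state):

```typescript
// A minimal demonstration of why module-level state breaks SSR:
// two overlapping "requests" share the same global token.
let globalToken = ''; // hypothetical per-user state stored globally

async function renderForUser(token: string): Promise<string> {
  globalToken = token;
  // Simulate async work (data fetching) in the middle of rendering:
  await new Promise((resolve) => setTimeout(resolve, 10));
  // By now another request may have overwritten globalToken:
  return `rendered with token ${globalToken}`;
}

async function main() {
  // Two users hit the server at almost the same time:
  const [a, b] = await Promise.all([
    renderForUser('token-of-alice'),
    renderForUser('token-of-bob'),
  ]);
  console.log(a); // -> 'rendered with token token-of-bob' (!)
  console.log(b); // -> 'rendered with token token-of-bob'
}

main();
```

Alice's page is rendered with Bob's token. Per-request state (a React context value or an Effector Scope) removes the shared variable and the race with it.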

The problem

As you can see, in almost all environments we can have more than one instance of our application in a single process. It means that we cannot safely use global variables to store our application state. Let us see how we can solve this problem.

Q: I do not use SSR, so I can use global variables, right?

A: Yes, but if avoiding global variables costs you almost nothing, why not do it? It will make your application more predictable and easier to test. And if you ever need SSR in the future, you would have to refactor your code anyway.

Because of the global axios instance and the global state applied to it (via the getTokenSomehow function), requests can be sent with the wrong token in SSR or tests.

Theoretical solution

The root of this problem is global state. Let us see how to avoid it in different frameworks.

React-way

React-way is to use React Context to store our application state.

TIP

React used as an example, but almost all frontend frameworks have similar concepts.

We can use a value from a context 👇

tsx
// app.tsx
function App() {
  const userId = useContext(UserIdContext);

  return (
    <main>
      <h1>Hello, world!</h1>
      <p>{userId}</p>
    </main>
  );
}

And pass it in particular environment independently through a context provider 👇

tsx
import { createRoot } from 'react-dom/client';

// In client-side environment we can read a value from a browser
createRoot(document.getElementById('root')).render(
  <UserIdContext.Provider value={readUserIdFromBrowser()}>
    <App />
  </UserIdContext.Provider>
);
tsx
import { renderToString } from 'react-dom/server';

function handleRequest(req, res) {
  // In server-side environment we can read a value from a request
  const html = renderToString(
    <UserIdContext.Provider value={readUserIdFromRequest(req)}>
      <App />
    </UserIdContext.Provider>
  );

  res.send(html);
}
tsx
import { render } from '@testing-library/react';

describe('App', () => {
  it('should render userId', () => {
    // In test environment we can use a mock value
    const { getByText } = render(
      <UserIdContext.Provider value={'42'}>
        <App />
      </UserIdContext.Provider>
    );

    expect(getByText('42')).toBeInTheDocument();
  });
});
tsx
export default {
  component: App,
  title: 'Any random title',
};

export const Default = () => {
  // In Storybook environment we can use a mock value as well
  return (
    <UserIdContext.Provider value={'mockUserId'}>
      <App />
    </UserIdContext.Provider>
  );
};

Now, it is bulletproof. We can render any number of instances of our application in a single process, and they will not interfere with each other. It is a good solution, but it is not suitable for non-React contexts (like the business logic layer). Let us see how we can solve this problem with Effector.

Effector-way

TIP

To work correctly with a Scope-full runtime, your application has to follow some rules.

Effector has its own API to isolate application state, called Fork API: the fork function returns a new Scope, which is a container for all application state. Let us see how we can use it in all the mentioned environments.

Let us save a user ID in a Store 👇

ts
// app.ts
import { createStore } from 'effector';

const $userId = createStore(null);

Later we can replace a value in a Store during fork call 👇

ts
import { fork } from 'effector';

// In client-side environment we can read a value from a browser
const scope = fork({ values: [[$userId, readUserIdFromBrowser()]] });
tsx
import { fork } from 'effector';

function handleRequest(req, res) {
  // In server-side environment we can read a value from a request
  const scope = fork({ values: [[$userId, readUserIdFromRequest(req)]] });

  // ...
}
tsx
import { fork } from 'effector';

describe('App', () => {
  it('should pass userId', () => {
    // In test environment we can use a mock value
    const scope = fork({ values: [[$userId, '42']] });

    expect(scope.getState($userId)).toBe('42');
  });
});

Much more powerful

Fork API can be used not only for avoiding global variables, but also as a full-featured dependency injection mechanism, because it allows you to replace the value of any Store and the handler of any Effect in a particular Scope during the fork call.

UI-libraries integration

To connect a UI library to Effector, you have to use an integration library; for example, for React, you can use the effector-react library. It supports Fork API, let us see how we can use it 👇

tsx
// app.tsx
import { useUnit } from "effector-react";

function App() {
  const userId = useUnit($userId);

  return (
    <main>
      <h1>Hello, world!</h1>
      <p>{userId}</p>
    </main>
  );
}

And pass your Scope to the integration library through a context provider 👇

tsx
import { createRoot } from "react-dom/client";
import { fork } from "effector";
import { Provider } from "effector-react";

// In client-side environment we can read a value from a browser
const scope = fork({ values: [[$userId, readUserIdFromBrowser()]] });

createRoot(document.getElementById("root")).render(
  <Provider value={scope}>
    <App />
  </Provider>
);
tsx
import { renderToString } from "react-dom/server";
import { fork } from "effector";
import { Provider } from "effector-react";

function handleRequest(req, res) {
  // In server-side environment we can read a value from a request
  const scope = fork({ values: [[$userId, readUserIdFromRequest(req)]] });

  const html = renderToString(
    <Provider value={scope}>
      <App />
    </Provider>
  );

  res.send(html);
}
tsx
import { render } from "@testing-library/react";
import { fork } from "effector";
import { Provider } from "effector-react";

describe("App", () => {
  it("should render userId", () => {
    // In test environment we can use a mock value
    const scope = fork({ values: [[$userId, "42"]] });

    const { getByText } = render(
      <Provider value={scope}>
        <App />
      </Provider>
    );

    expect(getByText("42")).toBeInTheDocument();
  });
});
tsx
import { fork } from "effector";
import { Provider } from "effector-react";

export default {
  component: App,
  title: "Any random title",
};

// In Storybook environment we can use a mock value as well
const scope = fork({ values: [[$userId, "mockUserId"]] });

export const Default = () => {
  return (
    <Provider value={scope}>
      <App />
    </Provider>
  );
};

TIP

React is used as an example, but you can use any UI library that has an integration with Effector.

The solution

So, let us return to the original problem with a global interceptor on a global axios instance. We can save an instance in a Store and apply an interceptor to it exclusively 👇

ts
// app.ts
import { createStore, createEvent, sample, attach } from 'effector';
import axios from 'axios';

// Will be filled later, during fork
const $userToken = createStore(null);

const $axios = createStore(null, {
  /*
   * It is important to exclude the $axios Store
   * from serialization to avoid possible SSR errors
   * inside units of a forked Scope
   */
  serialize: 'ignore',
});

// An event that will be fired when the application is started
const applicationStared = createEvent();

const setupAxiosFx = attach({
  source: { userToken: $userToken },
  effect({ userToken }) {
    const instance = axios.create();

    instance.interceptors.request.use((config) => {
      config.headers['X-Custom-Token'] = userToken;
      return config;
    });

    return instance;
  },
});

sample({ clock: applicationStared, target: setupAxiosFx });
sample({ clock: setupAxiosFx.doneData, target: $axios });

TIP

In this example, we use an explicit start Event of the application. It is a good practice, because it allows you to control when the application starts.

After that, we can fork the application and pass a new value of our Stores for every particular environment 👇

ts
import { fork, allSettled } from 'effector';

const scope = fork({
  values: [[$userToken, readUserTokenFromBrowserCookie()]],
});

await allSettled(applicationStared, { scope });
tsx
import { fork, allSettled } from 'effector';

async function handleRequest(req, res) {
  const scope = fork({
    values: [[$userToken, readUserTokenFromRequestCookies(req)]],
  });

  await allSettled(applicationStared, { scope });
}
tsx
import { fork, allSettled } from 'effector';

describe('App', () => {
  it('should start an app', async () => {
    // Do not pass any values to the fork, because we do not need them in tests
    // $userToken will be filled with null
    const scope = fork();

    await allSettled(applicationStared, { scope });
  });
});

That is it! Now we can use the same code for all environments and not worry about global state, because it is isolated in the Scope.

TIP

Read more about allSettled function in the article about explicit start Event of the application.

Usage of the resulting $axios Store is the same as usage of any other Store:

ts
import { attach } from 'effector';

const fetchUserFx = attach({
  source: $axios,
  effect: (axios, { id }) => axios.get(`/users/${id}`),
});

Recap

  • Global state is a bad idea, because it can lead to unpredictable behavior in tests, SSR and other environments.
  • Effector has its own API for isolating application state: the Fork API. The fork function returns a new Scope, which is a container for all application state.
  • An application that uses Fork API must follow some rules.
  • To use Fork API with a UI library, you have to use an integration library; for example, for React, you can use the effector-react library.
]]>