Is it possible to add a new embedded worker to a running StateFun cluster?


Here is the deal:

I'm trying to add a new (embedded) worker to a running cluster (Flink StateFun 2.2.1).

As you can see, the new task manager registers with the cluster:

Screenshot of the newly deployed task manager

But it doesn't initialize (it doesn't deploy the sources).

What am I missing here? (Do the master and the workers have to have the same jar files, or should deploying the task manager with the jar file be enough?)

Any help would be appreciated. Thanks.

1 answer

Answer by David Anderson:

Flink supports two different approaches to rescaling: active and reactive.

Reactive mode is new in Flink 1.13 (released just this week), and works the way you expected: add (or remove) a task manager, and your application will adjust to the new parallelism. You can read about elastic scaling and reactive mode in the docs. Reactive mode is currently a work in progress, but it may meet your needs.
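As a rough sketch only: as of Flink 1.13, reactive mode is limited to standalone application-mode deployments, and you enable it with the `scheduler-mode` setting and then scale by starting or stopping task managers. The snippet below assumes a default standalone layout; the script paths are assumptions, not taken from your setup:

    # flink-conf.yaml (Flink 1.13+, standalone application mode)
    scheduler-mode: reactive

    # Then rescale simply by adding or removing task managers:
    ./bin/taskmanager.sh start   # job automatically scales up
    ./bin/taskmanager.sh stop    # ...and scales back down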

In broad strokes, for active mode rescaling you need to:

  1. Do a stop with savepoint to bring down your current job while taking a snapshot of its state.
  2. Relaunch with the new parallelism, using the savepoint as the starting point.

The exact details depend on how your cluster is deployed.
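With a standalone session cluster and the Flink CLI, the two steps look roughly like the sketch below. The job id, savepoint directory, jar name, and target parallelism are placeholders, and `statefun-job.jar` just stands in for however your StateFun job is packaged and launched:

    # 1. Stop the job while taking a savepoint of its state.
    #    The command prints the path of the completed savepoint.
    ./bin/flink stop --savepointPath /tmp/savepoints <job-id>

    # 2. Relaunch from that savepoint with the new parallelism.
    ./bin/flink run -s /tmp/savepoints/savepoint-<id> -p 8 statefun-job.jar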

For a step-by-step tutorial, see Upgrading & Rescaling a Job in the Flink Operations Playground.


The above applies to rescaling StateFun embedded functions. Being stateless, remote functions can be rescaled more straightforwardly.