I am trying to implement a fitting routine for experimentally obtained data. The function that I am trying to optimize is a black box - I don't know anything about its internals - but I can call it with some parameters.
I am trying to find optimal parameters for a function f(x), where x is the list of parameters to optimize.
The function f() returns a single value as a result.
I am trying to use Particle Swarm Optimization to find the optimal parameters for x. I have bounds for all the parameters inside x, and I also have initial guesses for almost all of them.
As a toy problem, I am trying to get this code working:
import pyswarms as ps
import numpy as np
# Define the function to optimize
def f1(x:list) -> float:
return x[0]**2 + x[1]**2 + x[2]**2
# Define the bounds for the parameters to optimize
# Create bounds
max_bound = 5 * np.ones(3)
min_bound = - max_bound
bounds = (min_bound, max_bound)
print(bounds)
# Set up the optimization options
options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9}
# Perform the optimization
dimensions = 3 # because we have 3 inputs for f1()??
# how to give the PSO initial values for all optimization parameters?
# init_pos =
optimizer = ps.single.GlobalBestPSO(n_particles=100, dimensions=dimensions, options=options, bounds=bounds, init_pos=None)
cost, pos = optimizer.optimize(f1, iters=1000)
# Print the optimized parameters and the cost
optimized_params = pos
print("Optimized parameters: ", optimized_params)
print("Cost: ", cost)
It gives an error here:
ValueError: operands could not be broadcast together with shapes (3,) (100,)
What am I doing wrong?
If I pass n_particles=3 instead, it actually runs, but it can't find the minimum of the function and it is really slow. That is strange, so I am pretty confused.
Note: in my real application the number of elements in the x-list can be relatively large - approximately 100.
And the real application must vary all the components inside the x-list... Maybe someone can suggest a Python module to use PSO efficiently?
How can I give the optimizer the initial guesses for the parameters in this case?
The reason for dimensions=3 is indeed that the optimization function is also of dimension 3.
The reason for the slow training is the options you provided.
Try:
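For example, lowering the inertia weight tends to make the swarm converge faster on a simple problem like this one (these values are only an illustrative starting point, not something tuned for your real application):

options = {'c1': 0.5, 'c2': 0.3, 'w': 0.7}  # lower inertia weight 'w' favours faster convergence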
The function you have defined is the same as the pyswarms sphere function (see here), and it caters for dynamic dimensions due to its implementation.
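In other words, pyswarms calls the objective function with the whole swarm at once - a NumPy array of shape (n_particles, dimensions) - and expects an array of n_particles costs back. Your f1 indexes the first three particles (rows) instead of the three dimensions of each particle, which is where the (3,) vs (100,) broadcasting error comes from. Below is a minimal sketch of the toy problem rewritten to follow that convention, including one way to seed the swarm from an initial guess via init_pos (as far as I remember, init_pos expects an array of shape (n_particles, dimensions) - double-check against the pyswarms docs):

import numpy as np
import pyswarms as ps

def f1(x):
    # x has shape (n_particles, dimensions); return one cost per particle
    return np.sum(x**2, axis=1)

dimensions = 3
n_particles = 100
max_bound = 5 * np.ones(dimensions)
min_bound = -max_bound
bounds = (min_bound, max_bound)

# Initial guess for all parameters (here simply all ones), replicated for every
# particle with a little random jitter so the swarm does not start collapsed
# onto a single point.
initial_guess = np.ones(dimensions)
init_pos = initial_guess + 0.1 * np.random.uniform(-1.0, 1.0, size=(n_particles, dimensions))

options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9}
optimizer = ps.single.GlobalBestPSO(n_particles=n_particles, dimensions=dimensions,
                                    options=options, bounds=bounds, init_pos=init_pos)
cost, pos = optimizer.optimize(f1, iters=1000)
print("Optimized parameters: ", pos)
print("Cost: ", cost)

The same pattern scales to ~100 parameters: only dimensions, the bounds arrays, and initial_guess need to change.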