MLHEP2020: Optimisation of FEL


    • In this competition, participants are tasked with improving the performance of a free-electron laser (FEL) by developing an algorithm for focusing the incoming electron beam.
      The laser setup is a sequence of 10 quadrupole magnetic lenses followed by an undulator.

      Performance of the FEL is measured by a single number (the objective function, a function of the resulting radiation from the undulator) — lower values of the objective function mean better performance.

      The values of the objective function typically range from 0 to 3.

      The objective function depends on:
      - strength of the magnets: 10 numbers each from [-2, 2], each number represents the gradient of the magnetic field of the corresponding quadrupole;
      - the geometry of the FEL (distance between quadrupoles);
      - the shape of the incoming beam.

      The strength of the magnets is directly under the control of the focusing algorithm; the geometry of the FEL and the shape of the beam are hidden parameters that change from run to run (but stay fixed during focusing).

      Formally, this is a black-box optimization problem, i.e., given the values of the hidden parameters H:

      objective(x | H) → min

      where:
        x ∈ [-2, 2]^10

      The main challenge is that the FEL has a state: in order to evaluate the objective function, the FEL has to be reconfigured, i.e. the strength of the magnets takes time to change, and measuring the current performance is not instantaneous:
      - a measurement takes 1 abstract second,
      - changing the configuration of the FEL from x0 to x takes max(|x - x0|) abstract seconds, i.e. the magnets change their strength simultaneously at a limited speed of 1 unit/sec.

      Your goal is to minimize the objective function within 128 abstract seconds.
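
      Under this timing model, the cost of a single request can be computed directly. The helper below is a sketch for illustration; `request_cost` is a hypothetical name, not part of the competition code:

```python
import numpy as np

def request_cost(x0, x1):
    """Abstract seconds spent moving the magnets from x0 to x1
    and then taking one measurement."""
    # The magnets move simultaneously at 1 unit/sec, so the travel time
    # is the largest per-magnet change (the Chebyshev distance);
    # the measurement itself adds 1 second.
    return np.max(np.abs(np.asarray(x1) - np.asarray(x0))) + 1.0
```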

    • Evaluation

      Optimization algorithms are tested on a (secret) set of fixed configurations (geometry of the FEL + shape of the beam). For each configuration:
      - the optimizer is run inside an ask-tell loop;
      - the whole optimization trajectory is recorded;
      - the score of the optimizer is the minimal observed value of the objective function.

      The final score of an optimizer is the average of its scores on test configurations.

      For each configuration, an optimizer has a fixed budget of 128 abstract seconds. If the current configuration of the FEL is x0 and the optimizer requests configuration x1, then the cost of this request is:

      max(|x1 - x0|) + 1
      

      Note that the total cost of the requests depends on their order; e.g., for a 1D problem with x0 = 0, requesting [-1, 2, 1] costs:

      (|0 - (-1)| + 1) + (|-1 - 2| + 1) + (|2 - 1| + 1) = 2 + 4 + 2 = 8

      but requesting [-1, 1, 2] costs:

      (|0 - (-1)| + 1) + (|-1 - 1| + 1) + (|1 - 2| + 1) = 2 + 3 + 2 = 7
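
      The two totals above can be verified with a short helper function (hypothetical, not part of the starting kit):

```python
import numpy as np

def trajectory_cost(x0, requests):
    """Total abstract seconds for a sequence of requested configurations."""
    total = 0.0
    current = np.asarray(x0, dtype=float)
    for x in requests:
        x = np.asarray(x, dtype=float)
        total += np.max(np.abs(x - current)) + 1.0  # travel + measurement
        current = x
    return total

# 1D example from the text, starting at x0 = 0
print(trajectory_cost([0.0], [[-1.0], [2.0], [1.0]]))  # 8.0
print(trajectory_cost([0.0], [[-1.0], [1.0], [2.0]]))  # 7.0
```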

      Starting kit

      The starting kit is located at https://github.com/yandexdataschool/MLHEP2020-black-box.

      Before installing the starting kit, one should install the ocelot package; installing the optional dependencies (numexpr, pyfftw, numba) is highly recommended, as they significantly speed up computations.

      Then:

      git clone https://github.com/yandexdataschool/MLHEP2020-black-box.git xfel
      cd xfel
      pip install -e .
      

      An example of a submission file is also included.

      Solution

      Your solution should implement the following class (note spelling: Optimiser):

      class Optimiser(object):
          def __init__(self, x0):
              """
              The optimizer should accept the initial configuration x0.
              """
              pass

          def reset(self):
              """
              Auxiliary method: resets the internal state of the optimizer.
              """
              raise NotImplementedError()

          def ask(self):
              """
              Returns the next configuration to probe.
              """
              raise NotImplementedError()

          def tell(self, x, f):
              """
              Callback method:
              `x` - configuration returned by the `ask()` method (possibly clipped to satisfy the bounds),
              `f` - value of the objective function at the point `x`.
              """
              raise NotImplementedError()
      
      • constructor (__init__) accepts initial configuration x0 as the only argument, i.e. __init__(self, x0 : array(shape=10, dtype=float32));
      • ask method with signature (self, ) -> array(shape=10, dtype=float32);
      • tell method with signature (self, x: array(shape=10, dtype=float32), f: float) -> None
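
      To illustrate how this interface is driven, here is a sketch of an ask-tell loop with budget accounting; the actual evaluation harness may differ, and `run_optimiser` is a hypothetical name:

```python
import numpy as np

def run_optimiser(optimiser, objective, x0, budget=128.0):
    """Sketch of an ask-tell evaluation loop with budget accounting."""
    x_current = np.asarray(x0, dtype=np.float32)
    spent = 0.0
    best = float("inf")
    while True:
        # Requested configurations are clipped to the [-2, 2] box.
        x = np.clip(np.asarray(optimiser.ask(), dtype=np.float32), -2.0, 2.0)
        # Travel time (Chebyshev distance) plus one second of measurement.
        cost = float(np.max(np.abs(x - x_current))) + 1.0
        if spent + cost > budget:
            break  # out of time
        spent += cost
        f = objective(x)
        optimiser.tell(x, f)
        x_current = x
        best = min(best, f)
    # The score is the minimal observed value of the objective.
    return best
```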

      The class that implements the solution must be named Optimiser (with an S, not a Z).

      The Optimiser class should be submitted inside a zip file that contains a single Python script named optimiser.py. Please note that optimiser.py is spelled with an S, not a Z.

      Please make sure nothing else is evaluated inside optimiser.py, e.g., by putting everything else under an if __name__ == '__main__' guard:

      class Optimiser(object):
        ...
      
      if __name__ == '__main__':
        launch_rocket()
      

      Example optimizer

      The following optimizer implements a special case of Simulated Annealing: with zero temperature, only improvements are ever accepted (greedy local search).

      import numpy as np

      class Optimiser(object):
          def __init__(self, x0):
              self.x0 = x0
              self.rng = np.random.RandomState(seed=1234)

              self.x = x0
              self.f = None

              self.scale = 1e-2

          def ask(self):
              # Propose a small Gaussian perturbation of the current point.
              return self.x + self.rng.normal(size=self.x.shape, scale=self.scale)

          def tell(self, x, f):
              if self.f is None:
                  # First observation: accept unconditionally.
                  self.f = f
                  self.x = x
              elif self.f < f:
                  # The new point is worse: reject it.
                  pass
              else:
                  # The new point is at least as good: accept it.
                  self.f = f
                  self.x = x

          def reset(self):
              self.x = self.x0
              self.f = None
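
      For comparison, below is a self-contained sketch of a full Simulated Annealing variant that also accepts worse points with a temperature-dependent probability. The class name, hyperparameters, and cooling schedule are illustrative, not part of the starting kit:

```python
import numpy as np

class SAOptimiser(object):
    """Illustrative full Simulated Annealing optimiser: worse points are
    accepted with probability exp(-(f - f_current) / T), and the
    temperature T decays after every observation."""

    def __init__(self, x0, scale=1e-1, t0=1.0, decay=0.95, seed=1234):
        self.x0 = np.asarray(x0, dtype=np.float32)
        self.rng = np.random.RandomState(seed=seed)
        self.scale = scale
        self.t0 = t0
        self.decay = decay
        self.reset()

    def reset(self):
        self.x = self.x0
        self.f = None
        self.t = self.t0

    def ask(self):
        # Gaussian perturbation of the current point, kept inside the bounds.
        step = self.rng.normal(size=self.x.shape, scale=self.scale)
        return np.clip(self.x + step, -2.0, 2.0)

    def tell(self, x, f):
        accept = (
            self.f is None                                   # first observation
            or f < self.f                                    # improvement
            or self.rng.uniform() < np.exp((self.f - f) / self.t)  # Metropolis rule
        )
        if accept:
            self.x, self.f = x, f
        self.t *= self.decay  # cool down
```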
      
  • Make your submission using GitHub
