On this page...

  1. Likelihood
  2. Forward model

To ease the development of new exotic models that can be plugged into the rest of the infrastructure, we have developed a Julia binding. The choice of Julia is motivated by the following aspects:

  • a language fairly close to other existing scripted vectorization frameworks like Python/NumPy or R. This makes it easy for new adopters to contribute new models.
  • an efficient Just-In-Time compiler and optimization passes that allow quite efficient machine code to be generated from compact scripts.
  • a relatively easy way to bind the Julia runtime to the existing C++ core.
  • a large availability of packages for scientific and machine-learning computing, as well as the possibility of importing industry-backed neural network packages (like TensorFlow).

Python could not easily satisfy all these aspects, and Julia offers an attractive middle ground between the fairly abstract C++ code base of BORG3 and the high-level representation in Python.

When the Julia core is activated inside libLSS, a Julia virtual machine is started and a number of libLSS primitives are made available to Julia code, which libLSS calls at specific points. Currently there are two such entry points:

  • the generic likelihood
  • a generic forward model

While the main user of the Julia likelihood/forward model is the HADES_JULIA main program, it can be used by any other user of the libLSS framework of ARES/BORG3. In HADES_JULIA, the likelihood (and, in the future, the model) is provided by two simple statements in the text configuration file: the main Julia source file of the likelihood (as given in the example below) and the name of the module defined in that file.
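As an illustration only, such a configuration fragment could look like the sketch below. The section and key names ([julia], likelihood_path, likelihood_module) are assumptions used for illustration and should be checked against the configuration reference of your HADES_JULIA version.

[julia]
# Julia source file holding the likelihood module (key name assumed for illustration)
likelihood_path=SomeLikelihood.jl
# Name of the module defined in that file (key name assumed for illustration)
likelihood_module=SomeLikelihood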

1.  Likelihood

The likelihood model is designed to allow some flexibility and easy parallelization for quick experimentation on large supercomputers. We have opted to export the basic C++ code path for likelihoods while refraining from exposing the entirety of its subtleties to the user. Also, the available I/Os are limited to the ones provided by the ARES/BORG3 core, to allow for safety and correct error reporting.

The file describing the likelihood must follow this pattern:

  • define a specific module
  • in that module, import the symbols from "libLSS"
  • four functions must be defined: initialize, get_required_planes, likelihood and adjoint_gradient
  • to sample the bias parameters, an additional function named "get_step_hint" must also be provided (a minimal sketch is given after the example module below)
module SomeLikelihood
    # libLSS module provides the mapping to the current state of the C++ part of
    # libLSS
    using libLSS

    import libLSS.State
    import libLSS.print, libLSS.LOG_INFO, libLSS.LOG_VERBOSE, libLSS.LOG_DEBUG

    # This is called when initializing the MCMC loop
    function initialize(state::State)
        # You can print messages to console using libLSS system to ensure
        # clean and ordered output.
        print(LOG_VERBOSE, "Initialize likelihood")
        jbias = libLSS.new_array(state, "julia_bias", 10, Float64)
        data = libLSS.get_array_3d(state, "galaxy_data_0", Float64)
    end

    # This is called to get information on additional 2d planes that are required
    # locally in case of MPI parallelization. The C++ part will retrieve the data
    # from the other node as passive data (for density) and active (for adjoint gradient)
    # and will handle all necessary communications.
    # An Int64 array must be returned.
    function get_required_planes(state::State)
       return [1,2,3]  # I need plane 1, 2 and 3.
    end

    # This is called to evaluate the likelihood. It must return the equivalent
    # of a Float64
    # The three parameters are the State, the GhostPlanes manager and the density array. It is advised to keep
    # the array type as "AbstractArray", with 3 dimensions.
    function likelihood(state::State, ghosts::libLSS.GhostPlanes, array::AbstractArray{Float64,3})
        print(LOG_VERBOSE, "Likelihood evaluation in Julia")

        # This retrieves the 2d ghost plane corresponding to the slice '0'. The returned
        # array is Float64 compatible.
        some_ghost = libLSS.get_ghost_plane(ghosts, 0)


        # The input data is available in the state object.
        data = libLSS.get_array_3d(state, "galaxy_data_0", Float64)

        L = sum((data .- 1 .- array).^2)/100
        print(LOG_VERBOSE, "Likelihood is " * repr(L))

        return L
    end

    # This is the adjoint gradient. "ag" must be filled with the adequate gradient vector.
    function adjoint_gradient(state::State, density::AbstractArray{Float64,3}, ghosts::libLSS.GhostPlanes, ag::AbstractArray{Float64,3})
        # Retrieve the data and fill "ag" in place. Note the broadcasted assignment ".=":
        # a plain "=" would only rebind the local name and leave the C++ buffer untouched.
        data = libLSS.get_array_3d(state, "galaxy_data_0", Float64)
        ag .= -2 .* (data .- 1 .- density) ./ 100
    end

    # This function generates mock data compatible with the bias and likelihood described in the module,
    # with the parameters provided in the state.
    # The result of the generation must be put in 'array'.
    function generate_mock_data(state::State, ghosts::libLSS.GhostPlanes, array::AbstractArray{Float64,3})
        # Left empty in this example; fill 'array' with mock data consistent with the likelihood above.
    end
end
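If the bias parameters are to be sampled, the module must additionally expose "get_step_hint", as listed above. The fragment below is a minimal sketch only: the documented requirement is merely the function name, so the argument list shown here (the state plus the name of the sampled parameter) and the constant return value are assumptions used for illustration and should be checked against the C++ side.

    # Sketch only: return a typical step size for the sampler of a bias parameter.
    # The exact signature expected by libLSS is an assumption here.
    function get_step_hint(state::State, parameter_name::String)
        return 0.1
    end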

2.  Forward model

The C++ side also provides a way of plugging Julia code in as a forward model. Thanks to the chaining framework for forward models, it is thus possible to couple it to a very complex chain of non-linear models. This part is still in heavy development and we only provide an early example.

module SomeForwardModel
    using libLSS

    import libLSS.State

    # Called when the cosmological parameters are changed on the C++ side.
    function update_cosmology()
    end

    # Apply the forward model: read the density field in 'input' and fill 'output'.
    # 'adjointNext' indicates whether an adjoint evaluation will be requested afterwards.
    function forward(input::AbstractArray{Float64, 3}, output::AbstractArray{Float64,3}, adjointNext::Bool)
    end

    # Adjoint counterpart of 'forward': from the gradient in 'input_ag', fill 'output_ag'.
    function adjoint(input_ag::AbstractArray{Float64,3}, output_ag::AbstractArray{Float64,3})
    end

end
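To make the calling convention concrete, here is a sketch of a trivial linear model. It assumes, based only on the argument names above, that forward reads 'input' and writes the transformed field into 'output', and that adjoint receives in 'input_ag' the gradient with respect to the model output and must store in 'output_ag' the gradient with respect to the model input; these conventions should be verified against the C++ side before relying on them.

module ScaledDensityModel
    using libLSS

    import libLSS.State

    # Nothing cosmology-dependent in this toy model.
    function update_cosmology()
    end

    # Toy transform: scale the density field by a constant factor of 2.
    function forward(input::AbstractArray{Float64,3}, output::AbstractArray{Float64,3}, adjointNext::Bool)
        output .= 2 .* input
    end

    # The Jacobian of the transform is 2 times the identity, so the adjoint
    # simply scales the incoming gradient by the same factor.
    function adjoint(input_ag::AbstractArray{Float64,3}, output_ag::AbstractArray{Float64,3})
        output_ag .= 2 .* input_ag
    end
end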