Bayesian imaging meets radio reality

Venue

Arrive here, then ask the gatekeeper for the old lecture hall.

Schedule

Participants

Closing Thoughts

Landman Bester

The MGVI technique in NIFTy could be used to convert almost any existing maximum-likelihood algorithm based on a Newton scheme into a fully Bayesian algorithm. This is especially true if an explicit representation of the maximum-likelihood Hessian has been implemented and if the parameter space is much smaller than the data space (so that the Hessian, or some sparse representation of it, fits into memory). In principle, besides the description of the prior, the only additional feature required is an implicit representation of the adjoint of the Jacobian operator. This makes for a powerful Bayesian hammer for which many nails already exist.
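A minimal sketch of that one extra ingredient, using a toy numpy model rather than NIFTy's actual API: with matrix-free Jacobian and adjoint-Jacobian applications, the Gauss-Newton/Fisher metric can be applied and inverted by conjugate gradients without the dense Hessian ever being formed. The response R, the forward model, and all dimensions below are hypothetical placeholders.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n_par, n_data = 100, 400
R = rng.standard_normal((n_data, n_par))   # toy linear response
noise_inv = np.full(n_data, 4.0)           # diagonal N^{-1}

def forward(theta):
    # Nonlinear toy model: a positive sky sent through a linear response.
    return R @ np.exp(theta)

def jac(theta, v):
    # Jacobian-vector product J v (tangent map of the forward model).
    return R @ (np.exp(theta) * v)

def jac_adjoint(theta, w):
    # Adjoint J^T w -- the one additional piece named in the text.
    return np.exp(theta) * (R.T @ w)

def metric(theta):
    # M v = J^T N^{-1} J v + v (unit prior metric in standardized coordinates).
    mv = lambda v: jac_adjoint(theta, noise_inv * jac(theta, v)) + v
    return LinearOperator((n_par, n_par), matvec=mv)

# One Newton/natural-gradient step: solve M dtheta = -gradient by CG,
# so the metric is only ever applied, never stored as a matrix.
theta = np.zeros(n_par)
data = forward(rng.standard_normal(n_par)) + rng.standard_normal(n_data) / 2
grad = jac_adjoint(theta, noise_inv * (forward(theta) - data)) + theta
step, info = cg(metric(theta), -grad)
theta = theta + step
```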

Oleg Smirnov

The next two steps necessary to turn RESOLVE into a practical tool for, e.g., MeerKAT would be adding a frequency axis, and segmenting the image so that the imaging problem runs on independent islands, à la Landman's dimensionality-reduction procedure.
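A schematic of the island idea (hypothetical thresholds and stand-in data; not Landman's actual procedure): label connected emission regions in a model image and cut out bounding boxes that could then be imaged as independent sub-problems.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.random((256, 256))
image[image < 0.999] = 0.0          # sparse bright pixels as stand-in sources
image = ndimage.gaussian_filter(image, sigma=3)

mask = image > 0.1 * image.max()    # hypothetical detection threshold
labels, n_islands = ndimage.label(mask)
boxes = ndimage.find_objects(labels)

# Each box is a (row_slice, col_slice) pair; an imaging run could be
# dispatched per island and the results stitched back into the full image.
for i, box in enumerate(boxes, start=1):
    sub = image[box]
    print(f"island {i}: shape {sub.shape}")
```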

Ulrich A. Mbou Sob

You can compress data in different ways provided you have some prior knowledge about the underlying structure of the data. Several schemes are currently being used for visibility averaging, notably baseline-dependent averaging (BDA). Since we have enough information about the data measured by the different baselines, I think we can use techniques from Information Field Theory to find the optimal schemes. The only caveat is whether or not we have enough computing power for such expensive optimisations.
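A toy illustration of the BDA idea mentioned above: shorter baselines see slower fringe rates, so their visibilities can be averaged over longer time windows without significant smearing. The baseline lengths, window rule, and data below are illustrative placeholders, not a real smearing criterion.

```python
import numpy as np

rng = np.random.default_rng(2)
n_times = 1024
baseline_lengths = np.array([100.0, 1000.0, 8000.0])   # metres, toy values
max_length = baseline_lengths.max()

vis = rng.standard_normal((len(baseline_lengths), n_times)) \
    + 1j * rng.standard_normal((len(baseline_lengths), n_times))

for b, length in enumerate(baseline_lengths):
    # Averaging window inversely proportional to baseline length, so the
    # longest baseline keeps full time resolution (window = 1 sample).
    window = max(1, int(max_length / length))
    n_out = n_times // window
    averaged = vis[b, :n_out * window].reshape(n_out, window).mean(axis=1)
    print(f"baseline {length:7.0f} m: {n_times} -> {averaged.size} samples")
```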

Valentina Vacca

A Bayesian approach to calibration and self-calibration can turn out to be useful not only for improving the scientific outcome of a single data set, but also for building a full model of the instrument. Each time new data are acquired, the previous model can be used as a prior and further refined during the data reduction process. This steadily improves the knowledge of the instrument, and therefore the overall quality of the data reduction, as new data become available.
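A minimal conjugate-Gaussian sketch of this posterior-becomes-prior loop, with a single scalar gain standing in for the instrument model; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
true_gain, noise_std = 1.3, 0.5

# Start from a broad prior on the gain.
mean, var = 1.0, 1.0

for epoch in range(5):
    data = true_gain + noise_std * rng.standard_normal(50)  # new observations
    # Conjugate update: precisions add, means combine precision-weighted.
    data_prec = data.size / noise_std**2
    post_var = 1.0 / (1.0 / var + data_prec)
    post_mean = post_var * (mean / var + data_prec * data.mean())
    mean, var = post_mean, post_var       # posterior -> prior for next epoch
    print(f"after data set {epoch + 1}: gain = {mean:.3f} +/- {var**0.5:.3f}")
```

The uncertainty shrinks monotonically across data sets, which is the sense in which the instrument model is "further refined" with every new observation.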

Philipp Arras

My impression is that many issues which arise in current radio pipelines, and in calibration algorithms in particular, stem from their being maximum-likelihood or maximum-a-posteriori algorithms, which are prone to overfitting. Pushing forward the development of Bayesian calibration algorithms is therefore probably a good idea.
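A small toy demonstration of the overfitting concern, under made-up numbers: solving for one gain per time sample by maximum likelihood drives the data residual to exactly zero while absorbing the noise into the gain solution.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
model = np.full(n, 2.0)                                  # model visibilities
true_gain = 1.0 + 0.05 * np.sin(np.linspace(0, 6, n))    # slowly varying truth
data = true_gain * model + 0.3 * rng.standard_normal(n)

g_ml = data / model            # per-sample maximum-likelihood gain

residual = data - g_ml * model
gain_err = g_ml - true_gain
print("data residual rms:", np.sqrt(np.mean(residual**2)))  # exactly 0: "perfect" fit
print("gain error rms:   ", np.sqrt(np.mean(gain_err**2)))  # noise moved into gains
```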

Torsten Enßlin

Beam forming might be regarded as data compression: the sky brightness is the original data (stored in the past light cone of the telescope), and the instrument's response to the sky represents the compressed data. Maybe this leads to insight into how to form better beams and steer telescopes, maybe not.
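Read literally, beam forming is a linear compression y = W^H x of many antenna voltages into a few beam outputs. A sketch with made-up geometry and a narrow-band, far-field steering model, purely to make the compression view concrete:

```python
import numpy as np

rng = np.random.default_rng(5)
n_ant, n_beams = 64, 4
positions = rng.uniform(0, 100, size=(n_ant, 2))      # antenna layout, metres
wavelength = 0.21                                      # e.g. HI line, metres

def steering_vector(direction):
    # Plane-wave phase per antenna for a unit vector in the aperture plane.
    phase = 2 * np.pi / wavelength * positions @ direction
    return np.exp(1j * phase) / np.sqrt(n_ant)

directions = rng.standard_normal((n_beams, 2))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
W = np.stack([steering_vector(d) for d in directions], axis=1)  # (n_ant, n_beams)

x = rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)  # voltage snapshot
y = W.conj().T @ x           # 64 numbers compressed to 4 beam outputs
print(x.size, "->", y.size)
```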
