Measuring and using information gained by observing diffraction data.

Published version
Peer-reviewed

Type

Article

Authors

McCoy, Airlie J 

Abstract

The information gained by making a measurement, termed the Kullback-Leibler divergence, assesses how much more precisely the true quantity is known after the measurement was made (the posterior probability distribution) than before (the prior probability distribution). It provides an upper bound for the contribution that an observation can make to the total likelihood score in likelihood-based crystallographic algorithms. This makes information gain a natural criterion for deciding which data can legitimately be omitted from likelihood calculations. Many existing methods use an approximation for the effects of measurement error that breaks down for very weak and poorly measured data. For such methods a different (higher) information threshold is appropriate compared with methods that account well for even large measurement errors. Concerns are raised about a current trend to deposit data that have been corrected for anisotropy, sharpened and pruned without including the original unaltered measurements. If not checked, this trend will have serious consequences for the reuse of deposited data by those who hope to repeat calculations using improved new methods.
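The information gain described above can be illustrated numerically. The sketch below (an illustration only, not the paper's method) computes the Kullback-Leibler divergence between an assumed broad Gaussian prior and a narrower Gaussian posterior, standing in for the distributions of a quantity before and after a measurement; the distribution parameters are hypothetical.

```python
import numpy as np

def kl_divergence(posterior, prior, dx):
    """Numerical KL divergence D(posterior || prior) in bits,
    integrated on a uniform grid with spacing dx."""
    mask = posterior > 0
    return np.sum(posterior[mask] * np.log2(posterior[mask] / prior[mask])) * dx

def gaussian(x, mu, sigma):
    """Normal probability density with mean mu and standard deviation sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Hypothetical example: a broad prior is sharpened by a measurement.
x = np.linspace(-10.0, 10.0, 100001)
dx = x[1] - x[0]
prior = gaussian(x, 0.0, 3.0)      # before the measurement
posterior = gaussian(x, 1.0, 1.0)  # after the measurement

gain = kl_divergence(posterior, prior, dx)
```

A near-zero gain would indicate that the measurement barely changed the prior, which is the situation in which an observation contributes little to the total likelihood and is a candidate for omission.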

Keywords

Anisotropy, Information Gain, Translational Noncrystallographic Symmetry, Diffraction Intensities

Sponsorship
NIH HHS (P01GM063210)
Wellcome Trust (209407/Z/17/Z)