
How Far to Shift?

Once the shift vector $\Delta\mathbf{x}$ has been calculated, the fraction of this vector to apply, $\lambda$, must be determined. We want to find the value of $\lambda$ which makes $f(\mathbf{x}_0 + \lambda\,\Delta\mathbf{x})$ as small as possible. Since we know $\mathbf{x}_0$ and now know $\Delta\mathbf{x}$, the problem is a one-dimensional minimization problem. The strategy used in TNT to find $\lambda$ is

  1. Guess a value for $\lambda$.
  2. Calculate $f(\mathbf{x}_0 + \lambda\,\Delta\mathbf{x})$.
  3. Fit a parabola to the known data.
  4. Set $\lambda$ to the minimum of the parabola.
  5. Go to step 2.
The process repeats until the change in $\lambda$ from one pass through the loop to the next is insignificant.
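This search is simple enough to sketch in a few lines of code. The fragment below is only an illustration of the idea, not the actual TNT implementation: the function f, the starting parameters x0, the shift vector dx, the starting guess for $\lambda$, and the reading of the five-percent rule (discussed in the next paragraph) as a fraction of the improvement already obtained are all assumptions made for the sketch.

import numpy as np

def shift_fraction(f, x0, dx, lam_guess=1.0, frac_tol=0.05, max_loops=10):
    """Illustrative parabolic search for the shift fraction lambda.

    f         -- function being minimized, taking the full parameter vector
    x0, dx    -- current parameters and the shift vector from the long loop
    lam_guess -- initial guess for lambda (step 1)
    frac_tol  -- stop when the predicted further improvement is below this
                 fraction of the improvement already obtained (an assumed
                 reading of the five-percent rule)
    """
    # Known data to start with: the function value at lambda = 0 (the
    # current model), at half the guess, and at the guess itself.
    lams = [0.0, 0.5 * lam_guess, lam_guess]
    vals = [f(x0 + lam * dx) for lam in lams]

    for _ in range(max_loops):
        # Step 3: fit a parabola a*lam**2 + b*lam + c to the three most
        # recent samples.
        a, b, c = np.polyfit(lams[-3:], vals[-3:], 2)
        if a <= 0.0:
            break                       # no interior minimum; keep best point
        # Step 4: move lambda to the minimum of the parabola.
        lam_new = -b / (2.0 * a)
        f_pred = c - b * b / (4.0 * a)  # parabola's predicted value there
        # Predicted improvement relative to the best value seen so far.
        predicted_gain = min(vals) - f_pred
        improvement = vals[0] - min(vals)
        # Step 2 (again): evaluate the true function at the new lambda.
        lams.append(lam_new)
        vals.append(f(x0 + lam_new * dx))
        # Stop when the predicted improvement has become insignificant.
        if improvement > 0.0 and predicted_gain < frac_tol * improvement:
            break

    return lams[int(np.argmin(vals))]   # the lambda giving the lowest value

With x0 and dx held as arrays and f returning the value of the function being minimized, the value returned is the fraction of the shift vector to apply before the next long loop.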

The particular definition of ``insignificant'' used is ``when the predicted improvement in function value is less than five percent''. The function value at the point $\mathbf{x}_0 + \lambda\,\Delta\mathbf{x}$ was smaller than it was at the start. The function value should be lower still at the new value of $\lambda$. From the parabola we can predict how much lower the function value will be at the new point. This difference is the predicted improvement in function value. If the predicted improvement is small there is little reason to continue deliberating the merits of different values of $\lambda$; it is time to start an entirely new cycle.
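For a fitted parabola these quantities can be written explicitly (the coefficients $a$, $b$ and $c$ below belong only to the fitted parabola and are not symbols defined elsewhere in the text):

\[
  f(\lambda) \approx a\lambda^{2} + b\lambda + c, \qquad
  \lambda_{\mathrm{new}} = -\frac{b}{2a}, \qquad
  f(\lambda_{\mathrm{new}}) \approx c - \frac{b^{2}}{4a} ,
\]

so the predicted improvement in moving from the current $\lambda$ to $\lambda_{\mathrm{new}}$ is $f(\lambda) - f(\lambda_{\mathrm{new}}) = a\,(\lambda - \lambda_{\mathrm{new}})^{2}$. When this predicted improvement falls below the five-percent threshold the short loop stops and an entirely new cycle begins.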

The one-dimensional minimization loops require calculation of the value of the function but no gradients or curvatures, and they take only a small proportion of the computer time of an entire cycle of refinement. They are therefore called ``short loops''. In a fit of bad analogy, the name ``long loop'' has been given to the portion of a cycle which calculates the shift vector. A cycle of refinement normally consists of a single long loop and two short loops. If the approximations underlying the calculations are violated in some fashion, more short loops will be required. When additional short loops are executed there may be nothing wrong with the cycle, but it should be examined for error.




