Thursday, March 05, 2009

3.5.09

More 236 lecturing today... more holding my breath and breathing through my mouth. We're finally going over inequality-constrained optimization, which is exactly the problem I'm facing now for correcting static disturbances. I still think it would be badass if I could one day implement some kind of interior-point algorithm in real time, even if the performance would be crap with dynamic noise. Some other stuff:

- Finished writing the modal SPGD script. It performs similarly to the non-modal case, which isn't surprising since using all the modes reconstructs the commands exactly. The main difference is that the cost function being fed back is now the norm of the modal coefficient vector, not the wavefront error itself. A rough sketch of the loop is below.
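
For my own reference, here's a minimal sketch of the idea. The 31 channels are real, but the modal basis, the measurement model, and the gains below are stand-ins, not the bench:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 31 actuator channels as on the bench; everything else is made up.
n_act, n_modes = 31, 10
M = np.linalg.qr(rng.standard_normal((n_act, n_modes)))[0]  # orthonormal modal basis
w0 = rng.standard_normal(n_act)                             # static disturbance

def measure(u):
    """Pretend wavefront measurement: disturbance plus applied correction."""
    return w0 + u

def modal_cost(u):
    """Cost fed back is the norm of the modal coefficient vector, not the raw WFE."""
    a = M.T @ measure(u)            # project the measurement onto the modes
    return np.linalg.norm(a)

def spgd_step(u, gamma=50.0, sigma=0.01):
    """One two-sided SPGD iteration on the actuator commands."""
    delta = sigma * rng.choice([-1.0, 1.0], size=u.shape)   # Bernoulli perturbation
    dJ = modal_cost(u + delta) - modal_cost(u - delta)      # measured cost difference
    return u - gamma * dJ * delta                           # stochastic gradient step

u = np.zeros(n_act)
for _ in range(2000):
    u = spgd_step(u)
print(modal_cost(u))   # modal error should fall to a small dither floor
```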

- Normalizing the estimated gradient helps performance greatly, although so far it's still not as good as using only positive perturbations. Naturally the gain has to be rescaled to match, but with these descent algorithms the direction is really what matters. Here's a comparison between the unnormalized (v=0.011) and normalized (v=100) cases, with a sketch of the normalized update after the plots:

[Plots: unnormalized (v=0.011) vs. normalized (v=100)]
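
The normalized variant only changes one line: divide the gradient estimate by its norm so that only the direction survives, and let the gain set the step length outright. A toy version follows; the v in this sketch is not comparable to the gains quoted above, since the cost scales differ:

```python
import numpy as np

rng = np.random.default_rng(1)

def spgd_step_normalized(u, cost, v, sigma=0.01):
    """SPGD update that keeps only the direction of the gradient estimate."""
    delta = sigma * rng.choice([-1.0, 1.0], size=u.shape)
    dJ = cost(u + delta) - cost(u - delta)
    g = dJ * delta                     # stochastic gradient estimate
    g /= np.linalg.norm(g) + 1e-12     # normalize: direction only
    return u - v * g                   # the gain v now sets the step length directly

# Toy quadratic just to exercise the update.
cost = lambda u: float(u @ u)
u = np.full(5, 3.0)
for _ in range(300):
    u = spgd_step_normalized(u, cost, v=0.05)
print(np.linalg.norm(u))   # settles to within roughly v of the minimum
```
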
- I'm now looking into estimating the gradient using least squares (sketch below), but I'm not sure whether this is equivalent to, worse than, or better than the current method. Comparing the methods by estimating a known gradient (of a random quadratic function) wasn't conclusive. Theoretically, 31 perturbations are needed to fully identify the 31 entries of the gradient, which means a lot of time spent capturing an image for each one. However, if m modes are used instead, the gradient only has m terms to identify, so there might be some benefit to using the modes there. All of this should hopefully be faster once I have the new SH sensor.
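
The least-squares idea in code: stack the perturbations as rows of a matrix D, measure the two-sided cost differences, and solve D g = ΔJ/2. Here's a quick check against the known gradient of a random quadratic, as in the comparison above; the sizes match the bench, everything else is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def ls_gradient(cost, u, n_pert, sigma=1e-3):
    """Least-squares gradient estimate from n_pert random perturbations.

    Each two-sided difference gives J(u+d) - J(u-d) ~= 2 d.g, so stacking the
    perturbations as rows of D and solving D g = dJ/2 recovers g, provided
    n_pert >= len(u) -- 31 here, or just m if working with m modes.
    """
    D = sigma * rng.standard_normal((n_pert, u.size))
    dJ = np.array([cost(u + d) - cost(u - d) for d in D])
    g, *_ = np.linalg.lstsq(D, dJ / 2.0, rcond=None)
    return g

# Known gradient of a random quadratic.
n = 31
G = rng.standard_normal((n, n))
A = G @ G.T                                   # random SPD Hessian
b = rng.standard_normal(n)
cost = lambda u: 0.5 * u @ A @ u + b @ u
u0 = rng.standard_normal(n)
g_true = A @ u0 + b
g_est = ls_gradient(cost, u0, n_pert=n)
print(np.linalg.norm(g_est - g_true) / np.linalg.norm(g_true))
# tiny: the two-sided difference is exact for a quadratic
```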

- If I can find a good way to estimate the Hessian at each iteration, I could use the full Newton's method with all the corresponding bells and whistles (one way around the explicit Hessian is sketched below). I should probably look up some more recent papers.
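
For the record, one standard way to get Newton-like behavior without estimating the Hessian at each iteration is a quasi-Newton (BFGS) update that builds curvature out of successive gradient estimates; whether that counts as "full Newton" is debatable. A toy sketch, with an exact gradient standing in for a measured one:

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation from step s and
    gradient change y; curvature accumulates from successive gradients,
    so the Hessian itself is never measured."""
    rho = 1.0 / (y @ s)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def backtracking(cost, u, g, p, alpha=1.0, beta=0.5, c=1e-4):
    """Simple Armijo line search to keep the unit Newton step from overshooting."""
    while cost(u + alpha * p) > cost(u) + c * alpha * (g @ p):
        alpha *= beta
    return alpha

# Toy quadratic with an exactly known gradient; on the bench the gradient
# would come from one of the estimators above.
rng = np.random.default_rng(3)
n = 5
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)                     # random SPD Hessian
cost = lambda u: 0.5 * u @ A @ u
grad = lambda u: A @ u

u = rng.standard_normal(n)
H, g = np.eye(n), grad(u)
for _ in range(25):
    p = -H @ g                                  # quasi-Newton search direction
    alpha = backtracking(cost, u, g, p)
    u_new = u + alpha * p
    g_new = grad(u_new)
    H = bfgs_update(H, u_new - u, g_new - g)
    u, g = u_new, g_new
print(np.linalg.norm(g))                        # gradient norm driven toward zero
```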
