Tuesday, July 28, 2009

Nope

Last week I finished writing a new slope calculation script based on the idea of using two linear operators and dividing the results. Is this faster than looping through each sub-aperture in Matlab? Nope.

The problem is that the matrix to calculate the unnormalized slopes is 2N*M, where N is the number of sub-apertures and M is the total number of pixels in the image. Since the WFS has around 1600 sub-apertures, and the image is more than 1000x1000, this matrix is gigantic. Even though it's sparse, it takes up more than 1.5 MB of space. The matrix to calculate the total intensity of each sub-aperture is around the same order of magnitude, so combined the script has to load over 2 MB of data each time a WFS image is captured. That load takes more than a second in Matlab, so that option is off the table for now. Simply loading the matrices into the workspace beforehand and passing them to the script is much faster, but still around the same speed as using loops.
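Just to put numbers on it, here's a back-of-envelope in Python (the per-sub-aperture pixel count is my assumption from the ~40x40 lenslet grid, not a measurement):

```python
# Rough dimensions of the unnormalized-slope operator, using the numbers
# from above: ~1600 sub-apertures and a 1280x1024 image.
n_subaps = 1600
n_pixels = 1280 * 1024

rows = 2 * n_subaps   # x and y slopes stacked
cols = n_pixels
print(rows, cols)     # 3200 x 1310720 -- hopeless if stored densely

# Each row only touches the pixels under one sub-aperture, so the matrix
# is extremely sparse; assuming a ~40x40 lenslet grid, that's on the
# order of (1280//40) * (1024//40) nonzeros per row.
nnz_per_row = (1280 // 40) * (1024 // 40)
print(nnz_per_row)    # 800
```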

Ultimately, when I get the DM working faster (still waiting for word on that from the manufacturer), I'll have to write the slope algorithm as a mex file and run it in C. What's another wasted week?

Wednesday, July 22, 2009

Faster

I've heard from the company that manufactures our DM and WFS that they can achieve frame rates an order of magnitude higher than what I'm capable of doing, around 800 Hz. This blows my mind, since I have no idea how I could make more than incremental improvements with the Matlab commands I'm using now. In a sense this pisses me off, since I'm not sure how we would have found this out if my advisor hadn't brought it up. It seems common in this industry for companies to ignore what most people are doing, write their own barely adequate proprietary software, and then go into stealth mode with the support.

Anyway, we're focusing on the DM first since that's most likely the harder problem. I'm actually skeptical they can get the frame rates they claim, but who knows. I've sent them the code I'm currently using (code that was adapted from stuff they sent us), so I should hear something soon. I fully expect to hear complaints about a variety of minor coding transgressions and other shit that doesn't matter.

Nonetheless, I'm assuming that I can eventually speed things up significantly on the hardware side, which means that my software will have to change as well. Right now the most significant software bottleneck is the script for calculating the slope vector from the WFS image. For a full-frame 1280x1024 Hartmann image, calculating the slopes in Matlab takes around 0.08 seconds for a grid of around 40x40 lenslets. The problem, as I mentioned before, is that finding the centroids is not a linear operation, since you have to divide by the sub-aperture intensity; thus you can't just multiply the image by some matrix. Loops in Matlab are shit, so looping over all the sub-apertures is what causes most of the delay.
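For reference, the loop I'm trying to kill looks roughly like this (a Python/NumPy stand-in for the Matlab; the grid geometry and sub-aperture size are assumptions):

```python
import numpy as np

def centroid_slopes_loop(img, n_side=40, sub=25):
    # img: Hartmann image with one sub-aperture per (sub x sub) pixel block.
    # Returns x and y centroid positions for each sub-aperture -- the slow,
    # loop-over-every-sub-aperture version.
    xs = np.arange(sub)
    sx = np.empty(n_side * n_side)
    sy = np.empty(n_side * n_side)
    for i in range(n_side):
        for j in range(n_side):
            spot = img[i*sub:(i+1)*sub, j*sub:(j+1)*sub]
            L = spot.sum()                  # total sub-aperture intensity
            # centroid = intensity-weighted mean pixel position; the
            # division by L is what makes this non-linear in the image
            sx[i*n_side + j] = (spot.sum(axis=0) * xs).sum() / L
            sy[i*n_side + j] = (spot.sum(axis=1) * xs).sum() / L
    return sx, sy
```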

Right now I'd like to try to reduce the computation time in Matlab by reducing the centroid calculation to two linear operations on the image. If I is the vectorized image, then there is a (sparse, large) matrix A such that S1 = A*I, where S1 is a vector of unnormalized slopes. There is also another matrix Q such that L = Q*I, where L contains the total intensity in each sub-aperture. The slope vector is then (in Matlab speak) S = S1./[L;L] (since S1 contains both x and y slopes).
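As a sketch of the two-operator idea (Python/SciPy here rather than Matlab; the grid geometry is assumed, and the operators are built entry-by-entry for clarity, not efficiency):

```python
import numpy as np
from scipy import sparse

def build_operators(n_side=40, sub=25):
    # Build A (unnormalized slopes) and Q (per-sub-aperture intensity),
    # both acting on the vectorized image I, so S1 = A*I and L = Q*I.
    n_sub = n_side * n_side
    w = sub * n_side                  # image is w x w, vectorized row-major
    ra, ca, va = [], [], []           # triplets for A
    rq, cq = [], []                   # triplets for Q (all values are 1)
    for i in range(n_side):
        for j in range(n_side):
            k = i * n_side + j
            for p in range(sub):      # row within the sub-aperture
                for q in range(sub):  # column within the sub-aperture
                    col = (i*sub + p) * w + (j*sub + q)
                    rq.append(k); cq.append(col)
                    # x slope weights by column index, y slope by row index
                    ra += [k, k + n_sub]
                    ca += [col, col]
                    va += [q, p]
    A = sparse.csr_matrix((va, (ra, ca)), shape=(2 * n_sub, w * w))
    Q = sparse.csr_matrix((np.ones(len(rq)), (rq, cq)), shape=(n_sub, w * w))
    return A, Q

def slopes(A, Q, img):
    I = img.ravel()                     # vectorize the image
    S1 = A @ I                          # unnormalized slopes
    L = Q @ I                           # total intensity per sub-aperture
    return S1 / np.concatenate([L, L])  # Matlab's S = S1./[L;L]
```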

I have no idea if this is really faster, but there's enough evidence to give it a try. Coming up with A efficiently isn't easy, though, but hopefully I'll finish that tomorrow.

Sunday, July 19, 2009

7.19.09

Last weekend I ran a couple of long SPGD experiments with and without the SLM to see if it had any noticeable effect on the performance. Each ran for around 10,000 iterations, and since I can only update the SLM every 3 seconds, they each took over 8 frickin hours to run. I feel like a biologist.



For the "w/ SLM" case the image was set to a focus with a sinusoidally varying intensity, such that the maximum phase distortion occurs at the center of the focus at the peak sinusoidal amplitude. Maybe I'm just cynical, but it's not really clear to me that the SLM has any effect. It's nice that the algorithm can sort of maximize the image cost function (J), ignoring those unexplained spikes, but it's hard to distinguish any difference between the two cases. Control starts at k=1000.

Here's a close up of a section of the uncontrolled iterations.



The SLM obviously has an effect on the objective function, but it's not much, and it doesn't look like that effect is any bigger than the noise from calculating the image objective function. What is clear, though, is that unless I can speed things up, using the SLM at all is totally impractical. This 3-second bullshit can't go on if I'm going to really be using it.

Thursday, July 09, 2009

I'm not totally convinced the SLM is doing much to the phase. I'm pretty sure it's working, but since I'm not measuring the phase directly, I don't know if a single wavelength of change is much.

It's clear that something is happening. Here's a plot of the target image variance (J) and the slope vector norm with the SLM active. At 300 iterations it starts displaying a focus shape (not that easy to calculate efficiently!) whose amplitude varies sinusoidally from the maximum displacement to the minimum.
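For the record, the pattern I'm commanding is roughly this (a Python stand-in; the normalization, period, and amplitude scaling are assumptions on my part):

```python
import numpy as np

def focus_frame(n, k, period=100, max_amp=2*np.pi):
    # Quadratic (defocus) phase screen whose peak amplitude swings
    # sinusoidally with iteration number k between 0 and max_amp.
    y, x = np.mgrid[:n, :n]
    r2 = ((x - n/2)**2 + (y - n/2)**2) / (n/2)**2   # normalized radius^2
    amp = 0.5 * max_amp * (1 + np.sin(2*np.pi*k/period))
    return np.mod(amp * r2, 2*np.pi)   # the SLM wraps phase to [0, 2*pi)
```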



The good news is that the sinusoidal pattern is reflected in both objective functions. But although the change is visible, it's a minuscule percentage of what's available with the DM. As a comparison, the maximum stroke of the DM is in the range of 10 microns, while the SLM is limited to 2pi rad, about 650 nm. This wouldn't matter much if I were mapping the phase measurement back to the range [0,2pi], such as with an SRI. But since I'm not measuring the phase directly, I'm not sure how variations like that map onto a Hartmann sensor.
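The wrapping I'm talking about, for reference (illustrative numbers only; treating the 10-micron stroke directly as optical path is a simplification):

```python
import numpy as np

# How a DM-scale excursion would fold into the SLM's one-wave range if the
# phase were measured directly (as with an SRI).
wavelength = 650e-9                      # the SLM's 2*pi range, in meters
stroke = 10e-6                           # DM maximum stroke
phase = 2*np.pi * stroke / wavelength    # ~15 waves of phase
wrapped = np.mod(phase, 2*np.pi)         # what actually fits on the SLM
print(phase / (2*np.pi), wrapped)
```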

The other problem is the ludicrously slow response time of the SLM, which I still have to limit to 1/3 Hz. Given the shit this thing has given me over the last few months, I think the best plan of action is to convince my advisor to ditch the SLM and get another DM. That will require me to exhaust all obvious ways to speed things up, but at this rate that shouldn't be hard.

Wednesday, July 08, 2009

Damn the torpedoes

I'm finally back in the swing of things after my little excursion to the Old World. It's amazing how difficult it is to get used to "work" again.

The SLM is still heinously slow, still limited to around 1/3 Hz... yes, 1 frame every 3 seconds. Personally, I think it's unlikely I'll ever find a solution to this, but right now the best hope for a fix is to try using it with the other SLM's hardware (which is working fine). I'm also going to email BNL, but since I didn't write the modified code that controls it from Matlab, I doubt that'll result in anything useful.

Nevertheless, I'm going on as if everything were operating normally. I've devoted this week to determining whether the SLM is doing anything at all. Here's a plot showing a few relevant objective functions as a focus bias with sinusoidally varying intensity was applied to the SLM. Since the bitch has been fighting me every step of the way, I was surprised to see actual sinusoidal responses here, even though their amplitude might be too small to be useful. The DM was stationary in the bias position for these experiments, and the SLM is set to zero until iteration 300:



Of course, at 1/3 Hz, these experiments take half an hour to perform. Nonetheless, there's at least an indication that the SLM may be useful for generating disturbances if I can get it moving faster. For tomorrow, I'd like to begin adding such SLM disturbances into the SPGD algorithm, even though a full experiment might take a day to run (I can do it overnight if necessary).

As another side project, my advisor wants me to characterize the response times of each component in my system. Unfortunately he's starting to think about real-time experiments, which seems infeasible now considering the problems I'm already having.