Commit 9bc1f17
Optimization infrastructure: pluggable backends, losses, batched GPU processing (#531)
* add pluggable optimizer backends (lbfgs, nelder_mead, grid_search), convergence criteria, and `use_gradients` flag
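A pluggable backend layer of this kind can be sketched as a dispatch table over PyTorch optimizers. This is an illustrative sketch, not the actual waveorder API: the function names, the `optimize` entry point, and the backend set are hypothetical, but it shows the pattern (including validating the method name and letting L-BFGS's line search pick the step size with `lr=1.0`, as later commits in this PR describe):

```python
import torch


def _lbfgs(params, loss_fn, max_iterations):
    # lr=1.0: the strong-Wolfe line search chooses the actual step size
    opt = torch.optim.LBFGS(
        params, lr=1.0, max_iter=max_iterations, line_search_fn="strong_wolfe"
    )

    def closure():
        opt.zero_grad()
        loss = loss_fn()
        loss.backward()
        return loss

    opt.step(closure)


def _nadam(params, loss_fn, max_iterations):
    opt = torch.optim.NAdam(params)
    for _ in range(max_iterations):
        opt.zero_grad()
        loss = loss_fn()
        loss.backward()
        opt.step()


BACKENDS = {"lbfgs": _lbfgs, "nadam": _nadam}


def optimize(method, params, loss_fn, max_iterations=100):
    # validate the method name up front instead of failing mid-run
    if method not in BACKENDS:
        raise ValueError(
            f"unknown optimizer {method!r}; expected one of {sorted(BACKENDS)}"
        )
    BACKENDS[method](params, loss_fn, max_iterations)
```

Swapping backends is then a one-string configuration change, which is what makes criteria like `use_gradients` cheap to support.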
* rename num_iterations to max_iterations
* add CacheSpec and disk-backed TransferFunctionCache for multi-tile TF reuse
* add benchmark_optimizers for comparing optimization strategies
* add optimizer and TF cache benchmark examples
* add pluggable, normalized loss functions with 3D support
* fix widget tests: use model_validator for loss config instead of discriminated union
* Fix: import PrintLogger in optimize.py for nelder_mead/grid_search defaults
* add batched (B,Z,Y,X) reconstruction support to all 4 models
Accept both (Z,Y,X) and (B,Z,Y,X) inputs following PyTorch convention.
Internal code unsqueezes/squeezes at boundaries so all operations use
(B,Z,Y,X). Unbatched path is bit-exact with prior behavior.
- util: pad_zyx_along_z, inten_normalization, inten_normalization_3D
- isotropic_thin_3d: calculate_transfer_function (batched tilt angles),
calculate_singular_system, apply_inverse_transfer_function, reconstruct
- isotropic_fluorescent_thin_3d: apply_inverse_transfer_function, reconstruct
- phase_thick_3d: apply_inverse_transfer_function, reconstruct
- isotropic_fluorescent_thick_3d: apply_inverse_transfer_function, reconstruct
- optics: compute_weak_object_transfer_function_2d now supports broadcasting
- optics: generate_tilted_pupil documents batched output shape
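The unsqueeze-at-the-boundary pattern described above can be sketched as follows (the function body is illustrative, not waveorder's actual reconstruction math): accept either rank, promote to `(B, Z, Y, X)` on entry, and demote on exit so the unbatched path keeps its original shape.

```python
import torch


def reconstruct(zyx_or_bzyx: torch.Tensor) -> torch.Tensor:
    """Accept (Z, Y, X) or (B, Z, Y, X); internal math always sees (B, Z, Y, X)."""
    unbatched = zyx_or_bzyx.ndim == 3
    data = zyx_or_bzyx.unsqueeze(0) if unbatched else zyx_or_bzyx  # -> (B, Z, Y, X)

    # all internal ops use (B, Z, Y, X); here a stand-in FFT round trip
    # over the spatial dims, which broadcasts over the batch dim for free
    spectrum = torch.fft.fftn(data, dim=(-3, -2, -1))
    result = torch.fft.ifftn(spectrum, dim=(-3, -2, -1)).real

    # squeeze back at the boundary so the unbatched path is shape-preserving
    return result.squeeze(0) if unbatched else result
```

Because only the boundaries branch on rank, the `(Z, Y, X)` path runs the identical `(B, Z, Y, X)` kernel with `B = 1`, which is what makes the bit-exactness claim checkable.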
* add batched API input and per-tile optimization support
API layer: phase and fluorescence apply_inverse_transfer_function accept
list[xr.DataArray] for batched processing. Adds _wrap_output_tensor
helper to reduce wrapping duplication.
Optimizer: when data.ndim == 4, parameters become (B,) tensors for
independent per-tile optimization. Standard Adam with (B,) tensors
gives B independent optimizers (verified bit-exact vs sequential).
Loss is summed per-tile; cross-tile gradient terms are zero since
tiles are decoupled in the forward pass.
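The per-tile trick relies on Adam being elementwise: a single `(B,)` parameter tensor behaves like B independent optimizers because each element's moment estimates depend only on its own gradient. A minimal sketch with a hypothetical per-tile scale parameter (not waveorder's actual forward model):

```python
import torch

torch.manual_seed(0)
B = 4
tiles = torch.randn(B, 8, 16, 16)  # (B, Z, Y, X) stack of tiles
target = torch.randn(B, 8, 16, 16)
scale = torch.ones(B, requires_grad=True)  # one parameter per tile

opt = torch.optim.Adam([scale], lr=0.05)
losses = []
for _ in range(100):
    opt.zero_grad()
    pred = scale.view(B, 1, 1, 1) * tiles  # tiles stay decoupled in the forward pass
    # summing per-tile losses yields zero cross-tile gradient terms,
    # since tile b's loss depends only on scale[b]
    loss = ((pred - target) ** 2).mean(dim=(1, 2, 3)).sum()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

Each `scale[b]` therefore follows exactly the trajectory it would have taken in a standalone single-tile run, which is the bit-exactness property the commit verifies.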
* add batched reconstruction tests for all 4 models
Tests B=4 batched vs sequential bit-exactness, B=1 matches unbatched,
and per-tile tilt angles for isotropic_thin_3d.
* restore torch.linalg.svd in calculate_singular_system
The norm-based decomposition (U=identity) introduced a regression
vs waveorder 3.0.0 by ignoring cross-channel coupling between
absorption and phase transfer functions. Restoring the real SVD
reduces the mean reconstruction error from 0.077 to 0.0005 on
OPS data (verified against production phenotyping_phase_2d.zarr).
torch.linalg.svd supports backpropagation through complex tensors,
so gradient-based optimization still works.
* add use_svd flag to calculate_singular_system for gradient compatibility
torch.linalg.svd backward fails with complex tensors on some GPU types
due to singular vector phase ambiguity. When gradients are needed
(optimization), reconstruct() auto-selects the norm-based decomposition
(use_svd=False) which supports backpropagation on all devices. When no
gradients are needed (final reconstruction), full SVD is used for best
accuracy.
The no-grad path remains bit-exact with the previous SVD-only behavior.
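The selection logic can be sketched as below. The signature and the norm-based fallback are illustrative (the real `calculate_singular_system` operates on transfer-function stacks); the point is that both branches return the same `(U, S, Vh)` triple, with the fallback using `U = I` so it stays differentiable on all devices:

```python
import torch


def calculate_singular_system(tf: torch.Tensor, use_svd: bool = True):
    """Return (U, S, Vh) for a stack of small transfer-function matrices.

    use_svd=True  -> torch.linalg.svd: most accurate, but its backward can
                     fail for complex inputs (singular-vector phase ambiguity).
    use_svd=False -> norm-based decomposition (U = identity): differentiable
                     everywhere, but ignores cross-channel coupling.
    """
    if use_svd:
        return torch.linalg.svd(tf, full_matrices=False)
    S = torch.linalg.norm(tf, dim=-1)  # row norms stand in for singular values
    Vh = tf / S.unsqueeze(-1).clamp_min(1e-12)  # unit-norm rows
    U = torch.eye(tf.shape[-2], dtype=tf.dtype, device=tf.device).expand(tf.shape)
    return U, S, Vh
```

A caller like `reconstruct()` then passes `use_svd=not needs_grad`: the differentiable path during optimization, the full SVD for the final reconstruction.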
* catch SVD/NaN errors in optimization loop and revert to last good params
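Guarding the loop amounts to snapshotting parameters after every successful step and restoring the snapshot when a step blows up. A sketch (the helper name and structure are illustrative, not the PR's actual code):

```python
import torch


def guarded_step(opt, params, loss_fn, last_good):
    """One optimization step; on SVD failure or non-finite loss, restore params."""
    try:
        opt.zero_grad()
        loss = loss_fn()
        if not torch.isfinite(loss):
            raise RuntimeError("non-finite loss")
        loss.backward()  # SVD backward failures surface here as RuntimeError
        opt.step()
    except RuntimeError:
        with torch.no_grad():
            for p, g in zip(params, last_good):
                p.copy_(g)  # revert to last good parameters
        return None
    for g, p in zip(last_good, params):  # snapshot after a successful step
        g.copy_(p.detach())
    return loss.item()
```

Returning `None` lets the caller decide whether to stop or retry with different settings, while the parameters are guaranteed to be the last values that produced a finite loss.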
* support batched parameters in calculate_transfer_function and apply_inverse_transfer_function
* Remove TF caching and benchmarking harness
Removes cache.py, benchmark.py, CacheSpec, and associated tests/examples.
TF caching showed marginal speedup with ~24% accuracy loss (nearest-neighbor)
and the benchmarking harness is better suited as standalone scripts.
* Remove CacheSpec from OptimizableFloat
* Add NAdam optimizer backend
* Validate optimization method name
* Use default lr=1.0 for L-BFGS line search step size
* Eliminate L-BFGS double forward pass
* Use np.linspace for grid search to avoid float-step accumulation

1 parent cadc965
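The accumulation pitfall is the classic one: repeatedly adding a float step like 0.1 drifts away from the exact grid points, so endpoints can be missed or duplicated. `np.linspace` computes each point from the two endpoints instead:

```python
import numpy as np

# accumulating the step drifts: after nine additions of 0.1 the running
# value is 0.8999999999999999, not 0.9, so comparisons against the
# intended grid break
acc = 0.0
accumulated = []
for _ in range(10):
    accumulated.append(acc)
    acc += 0.1

# linspace derives each point from the endpoints; the last element is
# exactly the requested stop value
grid = np.linspace(0.0, 0.9, 10)
```

For a grid search this matters because the parameter values must be reproducible and must hit the stated bounds exactly.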
File tree (24 files changed, +1928 −495 lines):
- docs/examples/optimization
- tests
  - cli_tests
  - models
  - optim
- waveorder
  - api
  - cli
  - models
  - optim