I was wondering what would happen if I maximize the compliance, in other words minimize the stiffness. Note that the compliance is defined as the scalar product <f, u> of the force (externally applied at the center of this bridge) and the displacement. Maximizing the compliance maximizes the mechanical energy <u, Ku>, which is equivalent since f = Ku. Here the first formulation is more interesting, as in this example the force is applied only at a single node. The simulation is done with IPOPT as optimizer; the animation shows that in a first attempt IPOPT weakens the material at the fixed points, but removing the material at the load point clearly allows the highest displacement at this single load point.
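The equivalence of the two compliance formulations is easy to check numerically. A minimal sketch (a toy spring system with an assumed stiffness matrix, not the actual bridge FEM model):

```python
import numpy as np

# Toy example: compliance of a small spring system. K is an (assumed)
# symmetric positive definite stiffness matrix, f a single point load.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
f = np.array([0.0, 1.0, 0.0])   # load applied at a single node

u = np.linalg.solve(K, f)       # solve the state equation K u = f

c_fu = f @ u                    # compliance as <f, u>
c_uKu = u @ K @ u               # compliance as <u, K u>

print(c_fu, c_uKu)              # both agree, since f = K u
```

Because f = Ku holds exactly at the solution of the state equation, the two values coincide up to round-off.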
In this (private) blog I write about my research in topology optimization. Piezoelectric topology optimization was the topic of my Ph.D. thesis, but I also work in other fields of topology optimization. I work for Prof. Stingl and am funded by the Excellence Cluster Engineering of Advanced Materials.
Thursday, 15 March 2007
Monday, 5 March 2007
Comparing IPOPT with the Optimality Condition
Here I compare the optimality condition against IPOPT on the well-known bridge example. Now I fix the lower right end (horizontally and vertically) but let the left end move horizontally (only the vertical displacement is fixed). I think this is a nice example of the impact of boundary conditions (compare with the 2D bridge example!).
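To make the boundary conditions concrete: in a displacement-based FEM code, fixing a degree of freedom typically means eliminating its row and column from the assembled system before solving. A minimal sketch (an assumed helper, not the code used for these results):

```python
import numpy as np

def apply_dirichlet(K, f, fixed_dofs):
    """Reduce the system K u = f by eliminating the fixed dofs.

    Fixing only the vertical dof of a node (as at the left end here)
    means listing just that one dof; fixing both (as at the right end)
    means listing both dofs of the node."""
    free = np.setdiff1d(np.arange(K.shape[0]), fixed_dofs)
    return K[np.ix_(free, free)], f[free], free
```

The reduced system is then solved for the free dofs only; the choice of which dofs go into `fixed_dofs` is exactly what changes between the two bridge setups.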
The video using IPOPT is here: IPOPT video; the video using the optimality condition is here: optimality condition video. The final images of the IPOPT and optimality condition optimizations are:
Note that the volume constraint (0.5) is always fulfilled! The "flickering" in the IPOPT animation can be better understood by having a look at the iteration plot. IPOPT quickly runs into a local minimum and it takes some time to leave it. One can then see several bold attempts to escape the next minimum (which could be close to the global minimum). Note that I do not let IPOPT terminate the optimization itself but rather limit the maximal number of iterations.
What I find noticeable is that with the optimality condition I have almost no grayness (I will have to quantify this!), which actually makes sense (and is what we want). In the literature and with [1] one finds more grayness at the silhouette (as with IPOPT). I am currently reimplementing the filtering of the gradient and now I also get more "aliasing". What do you get?
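For readers unfamiliar with the optimality condition method: in its standard textbook form it is a fixed-point density update with a bisection on the Lagrange multiplier of the volume constraint. A minimal sketch (the generic scheme from the literature, not my actual implementation; variable names and the move limit are assumptions):

```python
import numpy as np

def oc_update(x, dc, volfrac, move=0.2):
    """One optimality-criteria density update (sketch).

    x       -- element densities in (0, 1]
    dc      -- compliance sensitivities (non-positive for compliance)
    volfrac -- volume fraction enforced by bisection on the multiplier
    """
    l1, l2 = 1e-9, 1e9
    while (l2 - l1) / (l1 + l2) > 1e-4:
        lmid = 0.5 * (l1 + l2)
        # fixed-point update x * sqrt(-dc / lambda), clipped to move limits
        xnew = x * np.sqrt(np.maximum(-dc, 0.0) / lmid)
        xnew = np.clip(xnew,
                       np.maximum(x - move, 1e-3),
                       np.minimum(x + move, 1.0))
        if xnew.mean() > volfrac:
            l1 = lmid   # too much material: increase the multiplier
        else:
            l2 = lmid
    return xnew
```

Since the update drives each density toward points where the scaled sensitivity is constant, it tends to push densities to their bounds, which fits the observation of very little grayness.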
[Update 10.2010]
I found that this pretty old post is sometimes still read. The key point is that sensitivity filtering was used. It is well known that this disturbs the original gradient, and general purpose optimizers have problems with that: SNOPT normally fails. IPOPT seems to be more robust here, while MMA and OC are rock solid against it. So to compare with IPOPT you need rigorous regularization like density filtering, slope constraints or feature size control (e.g. MOLE), to mention a few relevant regularization techniques from THE BOOK.
Thursday, 1 March 2007
Used Toolchain
In this post I will add more and more of the tools I use.
- As external optimizer I use IPOPT, to be replaced soon by an MMA implementation.
A. Wächter and L. T. Biegler, On the Implementation of a Primal-Dual Interior Point Filter Line Search Algorithm for Large-Scale Nonlinear Programming, Mathematical Programming 106(1), pp. 25-57, 2006