title: Constraint System
dated: Fall 2023
At the beginning of phase 2, we weren’t sure what to do about constraints. Based on past exploration, constraints seemed to hold a lot of promise but were notoriously unreliable. A good constraint system could unify and power a number of key aspects of the dynamic medium we seek. It would allow for mechanical constructions, like a rope and pulley, that feel physically correct. It would allow for a computational model that bridged between algebraic formulae and tangible objects, like an architectural drawing that changed in response to edits in a parameter table. It’d allow for true bidirectional computation, instead of the limited / biased computation we had in Crosscut.
But constraint systems are hard to implement. They tend to suffer from the following issues:
- “Floaty-ness”, where manipulating one parameter causes other parameters to drift in ways that feel unnatural or unrealistic.
- “Blow-ups”, where manipulating part of the system causes other parts of the system to collapse inward or explode out to infinity (divergence toward or away from zero, rather than convergence on a reasonable value).
- Poor performance, either due to slow convergence or to excess recalculation when constraints are added or removed.
Early in this phase Alex was optimistic that with a bit of effort these issues could be overcome. After a few weeks he was able to show very promising results. This early success built on his earlier work on [propagation of knowns] to reduce the number of variables whose values had to be determined by the solver. Then, by leveraging equality constraints (and more generally, any linear relationship between variables, y = mx + b where m is not zero), he further reduced the number of dimensions that the solver had to worry about.
As an example, consider an angle constraint that “exports” two variables: one whose value is the angle in degrees, and another whose value is in radians. While either or both of those variables may be referenced by other constraints or scrubbed by the user, they collapse to a single variable as far as the solver is concerned — after all, once you know the value of one of them, you can compute the value of the other.
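In code, the idea looks roughly like the sketch below. The names (`Variable`, `absorb`, `propagate`) are illustrative, not the prototype's actual types; the point is that any variable tied to another by an invertible linear relationship never reaches the solver.

```typescript
interface Variable {
  value: number;
  // When set, this variable is not solved directly; its value is derived
  // from the canonical variable via value = m * canonical.value + b.
  linearRelation?: { canonical: Variable; m: number; b: number };
}

// Register that `dependent` is always m * canonical + b (m must be nonzero).
function absorb(dependent: Variable, canonical: Variable, m: number, b: number) {
  if (m === 0) throw new Error("m must be nonzero so the relation is invertible");
  dependent.linearRelation = { canonical, m, b };
}

// Only variables without a linear relation are handed to the solver.
function solverVariables(vars: Variable[]): Variable[] {
  return vars.filter((v) => v.linearRelation === undefined);
}

// After the solver updates canonical values, recompute the derived ones.
function propagate(vars: Variable[]) {
  for (const v of vars) {
    if (v.linearRelation) {
      const { canonical, m, b } = v.linearRelation;
      v.value = m * canonical.value + b;
    }
  }
}

// Example: an angle constraint exports degrees and radians.
const radians: Variable = { value: Math.PI / 4 };
const degrees: Variable = { value: 45 };
absorb(degrees, radians, 180 / Math.PI, 0); // degrees = (180 / π) * radians
console.log(solverVariables([radians, degrees]).length); // 1: the solver sees a single variable

radians.value = Math.PI / 2; // pretend the solver moved the canonical variable
propagate([radians, degrees]);
console.log(degrees.value); // 90
```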
This technique is not just an optimization that helps the solver converge more quickly; it often enables the solver to find a solution where it otherwise couldn't. It also improves the feel of our prototype by eliminating the "floaty-ness" that results from duplicated state. More importantly, it lets us avoid blow-ups when duplicated or "uneven" constraints are added to a model.
To illustrate this point, suppose the user has added an angle constraint between points A and B, which introduced a variable θ_AB. Now consider what happens when the user adds another angle constraint that involves the same points, but this time in the opposite direction. This is a situation that could lead to an unstable construction. Our new constraint system detects this, and instead of adding the second angle constraint, it introduces the variable θ_BA and the linear relationship θ_BA = θ_AB + π. So nothing changes for the solver: there is still only one variable (θ_AB) to solve for, and only one constraint to satisfy. But to the user, it looks like there are two separate angle constraints. They can further constrain θ_BA, or later remove the second angle constraint if they wish. (We are happy to report that all of the unstable constructions we documented in phase 1 are perfectly stable in our new constraint system.)
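Here is a hypothetical sketch of that detection step, building on the same linear-relationship mechanism; the map of angle variables and the key format are our own inventions for illustration, not the prototype's API.

```typescript
interface AngleVariable {
  value: number; // radians
  relation?: { canonical: AngleVariable; m: number; b: number };
}

// One angle variable per directed pair of points, keyed as "A->B".
const angleBetween = new Map<string, AngleVariable>();

function addAngleConstraint(a: string, b: string): AngleVariable {
  const existing = angleBetween.get(`${a}->${b}`);
  if (existing) return existing; // the same constraint already exists

  const reverse = angleBetween.get(`${b}->${a}`);
  const theta: AngleVariable = { value: 0 };
  if (reverse) {
    // An opposite-direction angle already exists: rather than giving the
    // solver a second variable, record that the new angle is the old one plus π.
    theta.relation = { canonical: reverse, m: 1, b: Math.PI };
    theta.value = reverse.value + Math.PI;
  }
  angleBetween.set(`${a}->${b}`, theta);
  return theta;
}

const thetaAB = addAngleConstraint("A", "B"); // a real solver variable
const thetaBA = addAngleConstraint("B", "A"); // derived: θ_BA = θ_AB + π
console.log(thetaBA.relation?.canonical === thetaAB); // true
```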
Another useful technique we developed for the constraint system is what we call “clustering”, i.e., the partitioning of the constraints and the variables on which they operate into clusters that can be solved independently. (As Avi pointed out, this is analogous to the notion of simulation islands in some physics engines.) By breaking up the solver’s workload into clusters, we reduced the number of dimensions it has to worry about, which has led to significant improvements in convergence. Clustering also opens up the possibility for us to parallelize the solver — this is something we can explore when we start to build larger projects/models that require better performance.
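A minimal sketch of the clustering step, assuming a simple union-find over variable ids (none of these names come from the prototype): constraints that share a variable, directly or transitively, land in the same cluster.

```typescript
type VarId = number;
interface Constraint { vars: VarId[] }

class UnionFind {
  private parent = new Map<VarId, VarId>();
  find(x: VarId): VarId {
    if (!this.parent.has(x)) this.parent.set(x, x);
    const p = this.parent.get(x)!;
    if (p === x) return x;
    const root = this.find(p);
    this.parent.set(x, root); // path compression
    return root;
  }
  union(a: VarId, b: VarId) {
    this.parent.set(this.find(a), this.find(b));
  }
}

function clusterConstraints(constraints: Constraint[]): Constraint[][] {
  const uf = new UnionFind();
  // Constraints that mention a common variable get merged into one group.
  for (const c of constraints) {
    for (let i = 1; i < c.vars.length; i++) uf.union(c.vars[0], c.vars[i]);
  }
  const clusters = new Map<VarId, Constraint[]>();
  for (const c of constraints) {
    const root = uf.find(c.vars[0]);
    if (!clusters.has(root)) clusters.set(root, []);
    clusters.get(root)!.push(c);
  }
  return [...clusters.values()]; // each cluster can be solved independently
}

// Constraints over {1,2}, {2,3} and {7,8} form two independent clusters.
console.log(
  clusterConstraints([{ vars: [1, 2] }, { vars: [2, 3] }, { vars: [7, 8] }]).length
); // 2
```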
While these analyses and techniques are an important part of our constraint system, they are not tied to a particular choice of solver. This gives us the freedom to plug in better options as they become available. In addition to Alex's relaxation-based solver, we tried several gradient descent-based solvers, which resulted in better convergence. The first was uncmin from numericjs. Later, we found a version of uncmin that was modified by Guillermo Webster for his g9 project, which gave us even better results. We hope to eventually try Avi Bryant's new solver, which uses auto-diff (instead of numerical differentiation) and WebAssembly for added performance and stability.
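Regardless of which solver is plugged in, the shape of the interface is roughly the same: each constraint contributes a residual, the residuals are squared and summed into a single error function, and an unconstrained minimizer drives that error toward zero. The sketch below uses a naive gradient descent with numerical differentiation as a stand-in for uncmin; the step size and iteration count are illustrative, not tuned values from the prototype.

```typescript
// Each constraint contributes one residual, which should be zero when the
// constraint is satisfied.
type Residual = (x: number[]) => number;

function totalError(residuals: Residual[], x: number[]): number {
  return residuals.reduce((sum, r) => sum + r(x) ** 2, 0);
}

// Naive gradient descent with forward-difference numerical derivatives,
// standing in for a real minimizer such as uncmin.
function minimize(residuals: Residual[], x0: number[], steps = 2000, lr = 0.01): number[] {
  let x = [...x0];
  const h = 1e-6;
  for (let step = 0; step < steps; step++) {
    const f0 = totalError(residuals, x);
    const grad = x.map((_, i) => {
      const xh = [...x];
      xh[i] += h;
      return (totalError(residuals, xh) - f0) / h;
    });
    x = x.map((xi, i) => xi - lr * grad[i]);
  }
  return x;
}

// Example: solve x0 + x1 = 10 and x0 - x1 = 2 (solution: x0 = 6, x1 = 4).
const solution = minimize(
  [(x) => x[0] + x[1] - 10, (x) => x[0] - x[1] - 2],
  [0, 0]
);
console.log(solution.map((v) => v.toFixed(2))); // ["6.00", "4.00"]
```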
Further gains were made by reframing the way values were represented in the system. For example, instead of representing positional relationships in Cartesian space (with x/y coordinates), we expressed them in polar space (with angle/distance coordinates).
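As a small illustration (with our own helper names), a relationship between two points can be stored as an angle and a distance rather than as x/y deltas, which can line up more directly with how angle and length constraints describe geometry.

```typescript
interface Point { x: number; y: number }
interface PolarRelation { angle: number; distance: number } // angle in radians

// Describe the position of b relative to a as an angle and a distance.
function toPolar(a: Point, b: Point): PolarRelation {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  return { angle: Math.atan2(dy, dx), distance: Math.hypot(dx, dy) };
}

// Recover the second point from the first point plus the polar relation.
function applyPolar(a: Point, rel: PolarRelation): Point {
  return {
    x: a.x + rel.distance * Math.cos(rel.angle),
    y: a.y + rel.distance * Math.sin(rel.angle),
  };
}

// In this representation an angle constraint touches only `angle`, and a
// length constraint touches only `distance`, instead of both pulling on
// shared x/y values.
const rel = toPolar({ x: 0, y: 0 }, { x: 3, y: 4 });
console.log(rel.distance);                    // 5
console.log(applyPolar({ x: 0, y: 0 }, rel)); // ≈ { x: 3, y: 4 }
```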