Prof David Bindel


Examples of lumped parameter simulations include:

- SPICE-level circuit simulation
  - nodal voltages vs. voltage distributions
- Structural simulation
  - beam end displacements vs. continuum field
- Chemical concentrations in a stirred tank reactor
  - concentrations in tank vs. spatially varying concentrations

- Typically ordinary differential equations (ODEs)
- Constraints: differential-algebraic equations (DAEs)

Often (not always) *sparse*.

Consider ODEs \(x' = f(x)\) (special case: \(f(x) = Ax\))

- Dependency graph: edge \((i,j)\) if \(f_j\) depends on \(x_i\)
- Sparsity means each \(f_j\) depends on only a few \(x_i\)
- Often arises from physical or logical locality
- Corresponds to \(A\) being sparse (mostly zeros)
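As a concrete sketch of this locality, consider \(f(x) = Ax\) for a tridiagonal \(A\) on a 1D chain (a hypothetical pure-Python example; the size and entries are illustrative, not from the slides):

```python
# Sketch: sparsity of f(x) = A x for a 1D chain (tridiagonal A).
# Hypothetical example; size and entries chosen for illustration.
n = 6

def f(x):
    # f_j depends only on x_{j-1}, x_j, x_{j+1}: locality in the chain.
    y = []
    for j in range(n):
        v = 2.0 * x[j]
        if j > 0:
            v -= x[j - 1]
        if j < n - 1:
            v -= x[j + 1]
        y.append(v)
    return y

# Dependency graph: edge (i, j) if f_j depends on x_i.
edges = [(i, j) for j in range(n) for i in range(n) if abs(i - j) <= 1]
# Only 3n - 2 edges out of n^2 possible: A is mostly zeros.
```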

Want to partition sparse graphs so that

- Subgraphs are the same size (load balance)
- Cut size is minimal (minimize communication)

We’ll talk more about this later.

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: \(f(x_*) = 0\)

- Boils down to \(Ax = b\) (e.g. for Newton-like steps)
- Can solve directly or iteratively
- Sparsity matters a lot!
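A minimal sketch of why sparsity matters in the solve: when the matrix is tridiagonal, the Thomas algorithm solves \(Ax = b\) in \(O(n)\) work instead of the \(O(n^3)\) of a dense factorization. (Pure Python; the Poisson-like test system is a hypothetical example.)

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system A x = d in O(n).
    a = sub-diagonal, b = main diagonal, c = super-diagonal
    (all length n; a[0] and c[n-1] are unused)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: -x_{i-1} + 2 x_i - x_{i+1} = 1 (discrete Poisson-like system)
n = 5
x = thomas([-1.0] * n, [2.0] * n, [-1.0] * n, [1.0] * n)
```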

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: \(x(t)\) for many \(t\)

- Involves time stepping (explicit or implicit)
- Implicit methods involve linear/nonlinear solves
- Need to understand stiffness and stability issues

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: eigenvalues of \(A\) or \(f'(x_*)\)
(These eigenvalues determine the stability of the time-stepping schemes in the examples that follow.)

- Example: forward Euler $$ x_{k+1} = x_k + (\Delta t) f(x_k) $$
- Next step depends only on earlier steps
- Simple algorithms
- May have stability/stiffness issues
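A minimal sketch of the stability issue, using the scalar test problem \(x' = \lambda x\) (hypothetical values, not from the slides): forward Euler multiplies by \(1 + \lambda \, \Delta t\) each step, so it decays only when \(|1 + \lambda \, \Delta t| < 1\).

```python
# Sketch: forward Euler on x' = lambda * x (hypothetical scalar test).
# Each step multiplies x by (1 + dt*lambda); stable iff |1 + dt*lambda| < 1.
lam = -10.0

def forward_euler(dt, steps, x0=1.0):
    x = x0
    for _ in range(steps):
        x = x + dt * lam * x
    return x

stable = abs(forward_euler(0.15, 100))    # factor |1 - 1.5| = 0.5: decays
unstable = abs(forward_euler(0.3, 100))   # factor |1 - 3.0| = 2.0: blows up
```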

- Example: backward Euler $$ x_{k+1} = x_k + (\Delta t) f(x_{k+1}) $$
- Next step depends on itself and on earlier steps
- Algorithms involve solves — complication, communication!
- Larger time steps, each step costs more
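A matching sketch for backward Euler on the scalar test problem \(x' = \lambda x\) with \(\lambda < 0\) (hypothetical values): each step requires a solve, but the step size is no longer limited by stability.

```python
# Sketch: backward Euler on x' = lambda * x (hypothetical scalar test).
# Each step solves (1 - dt*lambda) x_{k+1} = x_k; for a system of ODEs
# this becomes a (sparse) linear solve with the matrix I - dt*A.
lam = -10.0

def backward_euler(dt, steps, x0=1.0):
    x = x0
    for _ in range(steps):
        x = x / (1.0 - dt * lam)   # the "solve" in this 1-D case
    return x

# Decays for any dt > 0 (when lam < 0), at the cost of a solve per step.
```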

In all cases, lots of time in sparse matvec:

- Iterative linear solvers: repeated sparse matvec
- Iterative eigensolvers: repeated sparse matvec
- Explicit time marching: matvecs at each step
- Implicit time marching: iterative solves (involving matvecs)

We need to figure out how to make matvec fast!

- Sparse matrix \(\implies\) mostly zero entries
- Can also have “data sparseness” — representation with less than \(O(n^2)\) storage, even if most entries nonzero

- Could be implicit (e.g. directional differencing)
- Sometimes explicit representation is useful
- Easy to get lots of indirect indexing!
- Compressed sparse storage schemes help
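A minimal compressed sparse row (CSR) matvec sketch in pure Python, storing only the nonzeros plus two index arrays (the 3-by-3 matrix is a hypothetical example):

```python
# CSR storage for a 3x3 tridiagonal matrix [[2,-1,0],[-1,2,-1],[0,-1,2]]:
# vals holds the nonzeros row by row, cols their column indices, and
# rowptr[i]:rowptr[i+1] delimits row i within vals/cols.
vals   = [2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0]
cols   = [0,    1,    0,    1,   2,    1,    2]
rowptr = [0, 2, 5, 7]

def csr_matvec(vals, cols, rowptr, x):
    n = len(rowptr) - 1
    y = [0.0] * n
    for i in range(n):
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += vals[k] * x[cols[k]]   # note the indirect indexing
    return y

y = csr_matvec(vals, cols, rowptr, [1.0, 1.0, 1.0])
```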

This can be even more compact:

- Could organize by blocks (block CSR)
- Could compress column index data (16-bit vs 64-bit)
- Various other optimizations — see OSKI

- ODE and DAE models widely used in engineering
- Different analyses: static, dynamic, modal
- Sparse linear algebra is often key