Internals

Data structures

Pseudospectra.Portrait - Type
Portrait

Structure representing a spectral portrait; it includes a mesh for z = x + iy, values of the resolvent norm on that mesh, suitable contour levels, and some other metadata.
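
As a rough illustration (not the package's actual layout), a structure carrying this data might look like the following; the field names are hypothetical.

struct PortraitSketch
    x::Vector{Float64}        # real parts of the mesh, z = x + iy
    y::Vector{Float64}        # imaginary parts of the mesh
    Z::Matrix{Float64}        # resolvent norm evaluated on the mesh
    levels::Vector{Float64}   # contour levels chosen for plotting
    ax::Vector{Float64}       # axis limits, e.g. [xmin, xmax, ymin, ymax]
    computed::Bool            # whether Z has been filled in yet
end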


Arnoldi iteration

Pseudospectra.xeigs - Function
xeigs(A, B, channel=nothing; nev=6, ncv=max(20,2*nev+1), which=:LM,
      tol=0.0, maxiter=300, sigma=nothing, ritzvec=true, v0=zeros((0,)),
      wantH=true, options=nothing)

Compute (generalized) eigenpairs of Ax = λBx and projections.

A modified version of eigs() (from the Arpack package, q.v.) which optionally

  1. provides the projection matrices,
  2. provides intermediate states (typically for plotting).

For intermediates, the caller must provide a Channel argument; in this case xeigs is implemented as a producer which fills the channel. When finished, it puts (:finale, d,[v,],nconv,niter,nmult,resid[,H,V]) to the channel.

The type of A must provide methods for mul!, issymmetric, eltype and size. Some variants are not implemented for the case where A is not a factorable matrix.
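
A hedged usage sketch based on the signature above: a Channel bound to a producer task is handed to xeigs, and the consumer loop waits for the (:finale, ...) tuple. The test matrix, identity B, and channel handling are illustrative choices, not requirements of the package.

using Pseudospectra, LinearAlgebra, SparseArrays

n = 200
A = sprandn(n, n, 0.05) + 10I        # arbitrary example operator
B = sparse(1.0I, n, n)               # identity, i.e. a standard eigenproblem

# run xeigs as a producer bound to the channel
chan = Channel() do c
    Pseudospectra.xeigs(A, B, c; nev=6, which=:LM)
end

for item in chan
    if item isa Tuple && first(item) == :finale
        # (:finale, d, [v,], nconv, niter, nmult, resid[, H, V])
        println("converged Ritz values: ", item[2])
    end
end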

Pseudospectra.ArpackOptions - Type
ArpackOptions{T}(; nev=6, ncv=0, which=:LM,
                 tol=zero(T), maxiter=300,
                 v0=Vector{T}(0), have_v0=false,
                 sigma=nothing)

Constructs an object to manage the Arnoldi scheme; see the documentation for eigs in the Arpack package for the meaning of the fields.
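
A minimal construction sketch, assuming the keyword defaults above; how the object is then passed to the drivers is not shown here.

using Pseudospectra

# request a dozen rightmost eigenvalues with a larger iteration budget
opts = Pseudospectra.ArpackOptions{Float64}(nev=12, which=:LR, maxiter=500)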


Algorithm selection

IRAM means the implicitly restarted Arnoldi method, using the ARPACK implementation. "Large" and "small" are determined by built-in constants which might be made configurable someday.

Dense square matrix

  • If large and not opts[:direct], project via IRAM and go to the rectangular branch.
  • If small, use the SVD (see the sketch after this list).
  • Otherwise, use inverse Lanczos.
    • If a Schur decomposition is available, use it to simplify the solvers.
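
For the small dense branch, a minimal sketch of the computation at one grid point: the resolvent norm equals 1/σ_min(zI - A). The function name here is ours, not the package's.

using LinearAlgebra

# Resolvent norm at one grid point via a full SVD (the "small" dense branch).
# If a Schur factorization A = Q*T*Q' is at hand, zI - T has the same singular
# values, so the triangular T can be used in place of A.
resolvent_norm_svd(A::AbstractMatrix, z::Number) = 1 / minimum(svdvals(z*I - A))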

Sparse square matrix

  • If small and not opts[:keep_sparse], convert to dense and use the logic above.
  • If opts[:direct], use inverse Lanczos. This involves a sparse factorization for every grid point, so it is slow (see the sketch after this list).
  • Otherwise, project via IRAM and go to the rectangular branch.
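
A hedged sketch of the direct sparse branch, with plain inverse (power) iteration standing in for inverse Lanczos; it shows why one sparse factorization per grid point makes this branch slow. The function name is ours.

using LinearAlgebra, SparseArrays

# Estimate the resolvent norm at z by factoring (zI - A) and iterating with
# (zI - A)^{-*} (zI - A)^{-1}, whose largest eigenvalue is 1/σ_min².
function resolvent_norm_direct(A::SparseMatrixCSC, z::Number; iters=20)
    F = lu(z*I - A)                  # one sparse factorization per grid point
    q = normalize!(randn(ComplexF64, size(A, 1)))
    est = 0.0
    for _ in 1:iters
        w = F' \ (F \ q)             # apply the inverse normal operator
        est = norm(w)                # converges to 1/σ_min(zI - A)^2
        q = w / est
    end
    return sqrt(est)                 # ≈ ‖(zI - A)^{-1}‖₂
end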

Dense rectangular matrix

  • If not Hessenberg, factor first.
    • If very tall, factor by QR, otherwise QZ (generalized Schur).
  • Use QR for resolvents, then inverse Lanczos (see the sketch below).
    • Use a simplified QR for large Hessenberg.
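
A hedged sketch of the rectangular step: for the (m+1)-by-m Hessenberg matrix H produced by the projection, the quantity of interest is σ_min(zS - H), where S embeds the m-by-m identity; a thin QR gives a square triangular factor with the same singular values, on which inverse iteration (again standing in for inverse Lanczos) is cheap. The names here are ours.

using LinearAlgebra

function resolvent_norm_rect(H::AbstractMatrix, z::Number; iters=20)
    m1, m = size(H)                          # expect m1 == m + 1
    S = Matrix{eltype(H)}(I, m1, m)          # embedding of the m-by-m identity
    R = qr(z*S - H).R                        # m-by-m triangular, same singular values as zS - H
    q = normalize!(randn(ComplexF64, m))
    est = 0.0
    for _ in 1:iters
        w = R \ (R' \ q)                     # apply ((zS - H)' (zS - H))^{-1}
        est = norm(w)
        q = w / est
    end
    return sqrt(est)                         # ≈ 1/σ_min(zS - H)
end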