# Chapter 3 - Linear Algebra

Linear Algebra is to Quantum Computing as *Boolean Algebra* is to *Classical Computing*. Although we have to learn a new tool, it makes calculations much easier.

## Quantum States

### Column Vectors

- We write $\ket 0$ and $\ket 1$ as column vectors: $$\ket 0 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad \ket 1 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$
- It is easier to write superpositions this way. A generic qubit would be: $$\begin{aligned}
\ket \psi &= \alpha \ket 0 + \beta \ket 1 \newline
&= \alpha \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta \begin{pmatrix} 0 \\ 1 \end{pmatrix} \newline
&= \begin{pmatrix} \alpha \\ 0 \end{pmatrix} + \begin{pmatrix} 0 \\ \beta \end{pmatrix} \newline
&= \begin{pmatrix} \alpha \\ \beta \end{pmatrix}.
\end{aligned}$$
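The column-vector arithmetic above can be checked directly with NumPy (a minimal sketch; the amplitudes $\alpha$ and $\beta$ are example values chosen for illustration):

```python
import numpy as np

# |0> and |1> as column vectors
ket0 = np.array([[1], [0]], dtype=complex)
ket1 = np.array([[0], [1]], dtype=complex)

# A generic qubit |psi> = alpha|0> + beta|1>.
# Example amplitudes with |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# psi is the column vector (alpha, beta)^T
print(psi)
```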

### Row Vectors

- The *transpose* of a column vector is obtained by rewriting it as a *row vector*, and it is denoted by $^T$. $$\begin{pmatrix} \alpha \\ \beta \end{pmatrix}^T = \begin{pmatrix} \alpha & \beta \end{pmatrix}.$$
- In quantum computing, we typically use the *conjugate transpose*, which is obtained by taking the complex conjugate of each component of the transpose. It is denoted by $\dagger$. $$\begin{pmatrix} \alpha \\ \beta \end{pmatrix}^\dagger = \begin{pmatrix} \alpha^* & \beta^* \end{pmatrix}.$$
- A bra $\bra \psi$ is the *conjugate transpose* of a ket, and conversely, a ket is the conjugate transpose of a bra. $$\bra \psi = \ket{\psi}^\dagger, \qquad \ket \psi = \bra{\psi}^\dagger.$$
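In NumPy, the dagger corresponds to `.conj().T` (a sketch; the example amplitudes are illustrative):

```python
import numpy as np

# example ket with complex amplitudes
psi = np.array([[1 / np.sqrt(2)], [1j / np.sqrt(2)]])

# conjugate transpose (dagger): the bra is a row vector of conjugated amplitudes
bra = psi.conj().T

# applying dagger twice returns the original ket
assert np.allclose(bra.conj().T, psi)
```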

## Inner Products

### Inner Products Are Scalars

- The inner product of $\ket \psi = \begin{pmatrix} \alpha \\ \beta \end{pmatrix}$ and $\ket \phi = \begin{pmatrix} \gamma \\ \delta \end{pmatrix}$ is written $\braket{\psi | \phi}$, which is called a *bra-ket* or *bracket*: $$\braket{\psi | \phi} = \begin{pmatrix} \alpha^* & \beta^* \end{pmatrix}\begin{pmatrix} \gamma \\ \delta \end{pmatrix} = \alpha^*\gamma + \beta^*\delta,$$ which is a scalar value. That is why an inner product is also known as a scalar product.
- The inner product of $\ket \psi$ and $\ket \phi$ is just the complex conjugate of the inner product of $\ket \phi$ and $\ket \psi$: $$\braket{\psi | \phi} = \braket{\phi | \psi}^*.$$
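The conjugate-symmetry property can be verified numerically; note that NumPy's `np.vdot` conjugates its first argument, which matches the bra-ket convention (a sketch with illustrative states):

```python
import numpy as np

psi = np.array([1, 1j]) / np.sqrt(2)
phi = np.array([1, -1]) / np.sqrt(2)

# <psi|phi>: np.vdot conjugates its first argument, like forming a bra
ip = np.vdot(psi, phi)

# conjugate symmetry: <psi|phi> = <phi|psi>*
assert np.isclose(ip, np.conj(np.vdot(phi, psi)))
```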

### Orthonormality

- The inner product of $\ket \psi$ with itself, denoted $\braket{\psi | \psi}$, is just the total probability, i.e. $|\alpha|^2 + |\beta|^2$, and if it is $1$, then the state $\ket \psi$ is *normalized*.
- Any two states on opposite sides of the Bloch sphere have zero inner product, and states with zero inner product are called *orthogonal* states. **Orthonormal** states are states that are both *normalized* and *orthogonal* to each other, e.g. $\{\ket 0, \ket 1\}$ are orthonormal, and so are $\{\ket +, \ket -\}$ and $\{\ket i, \ket {-i}\}$.
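Orthonormality of these pairs is easy to check with inner products (a sketch; `is_orthonormal` is a helper defined here for illustration):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)   # |+>
minus = (ket0 - ket1) / np.sqrt(2)  # |->

def is_orthonormal(a, b):
    # both states normalized, and their inner product zero
    return (np.isclose(np.vdot(a, a), 1)
            and np.isclose(np.vdot(b, b), 1)
            and np.isclose(np.vdot(a, b), 0))

assert is_orthonormal(ket0, ket1)
assert is_orthonormal(plus, minus)
```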

### Projection, Measurement, and Change of Basis

- For an orthonormal basis $\{\ket a, \ket b\}$, the state of a qubit can be written as $$\ket \psi = \alpha \ket a + \beta \ket b,$$ where $\alpha = \braket{a | \psi}$ and $\beta = \braket{b | \psi}$. Here $\braket{a | \psi}$ is the amplitude of $\ket \psi$ in $\ket a$, i.e. the amount of $\ket \psi$ that is in $\ket a$, or the amount of overlap between $\ket \psi$ and $\ket a$, which in mathematical terms is called the *projection* of $\ket \psi$ onto $\ket a$.
- Inner products can be used to find the amplitudes of a state in a given basis and to check *orthonormality*, which provides a convenient way to change basis, and these calculations are easy to automate with a computer algebra system.
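The change of basis above can be sketched numerically: project an example state onto the $\{\ket +, \ket -\}$ basis and reconstruct it from the new amplitudes.

```python
import numpy as np

psi = np.array([0.6, 0.8j])  # example normalized state
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

# amplitudes in the {|+>, |->} basis via inner products (projections)
a = np.vdot(plus, psi)   # <+|psi>
b = np.vdot(minus, psi)  # <-|psi>

# reconstructing psi from its new-basis amplitudes recovers the state
assert np.allclose(a * plus + b * minus, psi)
```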

## Quantum Gates

### Gates as Matrices

- Quantum gates are matrices that keep the total probability equal to $1$.

### Common One-Qubit Gates as Matrices

- The previously introduced common one-qubit quantum gates can be represented as matrices: $$I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad Y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad Z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},$$ $$H = \frac{1}{\sqrt 2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad S = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}, \quad T = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi/4} \end{pmatrix}.$$

### Sequential Quantum Gates

- Using linear algebra, we can compute the effect of a sequence of quantum gates.
- For example, $HSTH\ket 0$ can be computed by multiplying the matrices together: $$HSTH\ket 0 = \frac{1}{\sqrt 2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi / 4} \end{pmatrix} \frac{1}{\sqrt 2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$
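The same product can be evaluated with NumPy's matrix multiplication; gates apply right-to-left, so $H$ acts on $\ket 0$ first (a minimal sketch):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]])
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])
ket0 = np.array([1, 0], dtype=complex)

# HSTH|0>: rightmost gate acts first
result = H @ S @ T @ H @ ket0
print(result)
```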

### Circuit Identities

- We can prove different circuit identities using Linear Algebra, e.g. $HXH = Z$ can be proven by multiplying the matrices together, either manually or, in most cases, using a computer program.
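For instance, the identity $HXH = Z$ can be checked in a few lines of NumPy (a sketch; `np.allclose` compares the matrices up to floating-point error):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# conjugating X by H yields Z
assert np.allclose(H @ X @ H, Z)
```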

### Unitarity

- Quantum gates are unitary matrices, and every unitary matrix is a valid quantum gate. A matrix $U$ is unitary if $U^\dagger U = I$.

### Reversibility

- A quantum gate $U$ is always reversible, and its inverse is $U^\dagger$.
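Unitarity and reversibility go together: because $U^\dagger U = I$, applying $U$ and then $U^\dagger$ restores any state. A sketch using the $T$ gate and an example state:

```python
import numpy as np

T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])  # example gate

# unitarity: T dagger times T is the identity
assert np.allclose(T.conj().T @ T, np.eye(2))

# reversibility: applying T and then T dagger restores the state
psi = np.array([0.6, 0.8j])  # example normalized state
assert np.allclose(T.conj().T @ (T @ psi), psi)
```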

## Outer Products

### Outer Products Are Matrices

- As opposed to *inner products* $\braket{\psi | \phi}$, which are scalars, an *outer product* $\ket \psi \bra \phi$ is always a matrix: $$\ket{\psi}\bra{\phi} = \begin{pmatrix}\alpha \\ \beta\end{pmatrix}\begin{pmatrix}\gamma^* & \delta^* \end{pmatrix} = \begin{pmatrix}\alpha\gamma^* & \alpha\delta^* \\ \beta\gamma^* & \beta\delta^*\end{pmatrix}.$$
- We can add *outer products* together to construct various quantum gates.
- The *outer product* of $\ket \phi$ and $\ket \psi$ is just the *conjugate transpose* of the outer product of $\ket \psi$ and $\ket \phi$: $$\ket \phi \bra \psi = (\ket \psi \bra \phi)^\dagger.$$
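As an illustration of building gates from outer products, $X = \ket 0 \bra 1 + \ket 1 \bra 0$ can be constructed and checked with NumPy (a sketch):

```python
import numpy as np

ket0 = np.array([[1], [0]], dtype=complex)
ket1 = np.array([[0], [1]], dtype=complex)

# outer products are 2x2 matrices; their sum builds the X gate:
# X = |0><1| + |1><0|
X = ket0 @ ket1.conj().T + ket1 @ ket0.conj().T
assert np.allclose(X, np.array([[0, 1], [1, 0]]))
```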

### Completeness Relation

- A complete *orthonormal* basis $\{\ket a, \ket b\}$ satisfies the completeness relation $$\ket a \bra a + \ket b \bra b = I.$$
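The completeness relation holds for any orthonormal basis, e.g. $\{\ket +, \ket -\}$; a quick numerical sketch:

```python
import numpy as np

plus = np.array([[1], [1]]) / np.sqrt(2)   # |+>
minus = np.array([[1], [-1]]) / np.sqrt(2)  # |->

# |+><+| + |-><-| = I
total = plus @ plus.conj().T + minus @ minus.conj().T
assert np.allclose(total, np.eye(2))
```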

## Summary

- The mathematical language of Quantum Computing is Linear Algebra.
- *Quantum states* are represented by column vectors called kets, and the *conjugate transpose* of a ket is a bra.
- Multiplying a bra and a ket is an *inner product*, which yields the *projection*, or amplitude, of one state onto another.
- States with zero *inner product* are *orthogonal*, and a state whose inner product with itself is $1$ is *normalized*.
- All quantum gates are unitary matrices.
- A quantum gate is always reversible.
- Multiplying a ket and a bra is an *outer product*, which is a matrix.
- A complete *orthonormal* basis satisfies the *completeness relation*, meaning the sum of the outer products of each basis vector with itself equals the identity matrix.