Quantum neural network

“Neural networks are not black boxes. They are a big pile of linear algebra.” - Randall Munroe, xkcd

Machine learning has a wide range of models for tasks such as classification, regression, and clustering. Neural networks are one of the most successful models, having experienced a resurgence in use over the past decade due to improvements in computational power and advanced software libraries. The typical structure of a neural network consists of a series of interacting layers that perform transformations on data passing through the network. An archetypal neural network structure is the feedforward neural network, visualized by the following example:


../_images/neural_network.svg


Here, the neural network depth is determined by the number of layers, while the maximum width is given by the layer with the greatest number of neurons. The network begins with an input layer of real-valued neurons, which feed forward onto a series of one or more hidden layers. Following the notation of [1], if the \(n\) neurons at one layer are given by the vector \(\mathbf{x} \in \mathbb{R}^{n}\), the \(m\) neurons of the next layer take the values

\[\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b}),\]

where

  • \(W \in \mathbb{R}^{m \times n}\) is a matrix,

  • \(\mathbf{b} \in \mathbb{R}^{m}\) is a bias vector, and

  • \(\varphi\) is a nonlinear function (also known as the activation function).

The matrix multiplication \(W \mathbf{x}\) is a linear transformation on \(\mathbf{x}\), while \(W \mathbf{x} + \mathbf{b}\) is an affine transformation. In principle, any nonlinear function can be chosen for \(\varphi\), but in practice it is usually drawn from a standard set of activations, such as the rectified linear unit (ReLU) or the sigmoid function, applied elementwise to each neuron. Finally, the output layer enacts an affine transformation on the last hidden layer, but its activation function may be linear (including the identity) or a different nonlinear function such as softmax (for classification).
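As a point of reference, a single classical layer \(\varphi(W \mathbf{x} + \mathbf{b})\) can be sketched in a few lines of NumPy; the sizes and the tanh activation below are arbitrary choices for illustration:

```python
import numpy as np

def dense_layer(x, W, b, activation=np.tanh):
    """Classical fully connected layer: phi(W x + b)."""
    return activation(W @ x + b)

# Hypothetical sizes: 3 input neurons feeding forward to 2 output neurons
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))    # weight matrix W in R^{2x3}
b = rng.normal(size=2)         # bias vector b in R^2
x = np.array([0.5, -1.0, 2.0])

y = dense_layer(x, W, b)
print(y.shape)  # (2,)
```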

Layers in the feedforward neural network above are called fully connected as every neuron in a given hidden layer or output layer can be connected to all neurons in the previous layer through the matrix \(W\). Over time, specialized versions of layers have been developed to focus on different problems. For example, convolutional layers have a restricted form of connectivity and are suited to machine learning with images. We focus here on fully connected layers as the most general type.

Training of neural networks uses variations of the gradient descent algorithm on a cost function characterizing the similarity between outputs of the neural network and training data. The gradient of the cost function can be calculated using automatic differentiation, with knowledge of the feedforward network structure.
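As a minimal illustration of this training loop, a single linear neuron can be fitted by gradient descent on a squared-error cost; here the gradient is written out by hand, whereas in a full network automatic differentiation performs this step (the data and learning rate are arbitrary choices):

```python
import numpy as np

# Toy data generated from y = 2x + 1 (an arbitrary target for illustration)
x = np.linspace(-1, 1, 20)
y = 2 * x + 1

w, b = 0.0, 0.0   # parameters of the model y_hat = w*x + b
lr = 0.1          # learning rate

for _ in range(500):
    y_hat = w * x + b
    # gradients of the mean squared error cost with respect to w and b
    grad_w = 2 * np.mean((y_hat - y) * x)
    grad_b = 2 * np.mean(y_hat - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # approaches 2.0 and 1.0
```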

Quantum neural networks aim to encode neural networks into a quantum system, with the intention of benefiting from quantum information processing. There have been numerous attempts to define a quantum neural network, each with varying advantages and disadvantages. The quantum neural network detailed below, following the work of [1], has a CV architecture and is realized using standard CV gates from Strawberry Fields. One advantage of this CV architecture is that it naturally accommodates the continuous nature of neural networks. Additionally, the CV model can easily apply nonlinear transformations using the phase space picture, a task that qubit-based models struggle with, often relying on measurement postselection, which has a probability of failure.

Implementation

A CV quantum neural network layer can be defined as

\[\mathcal{L} := \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1},\]

where

  • \(\mathcal{U}_{k}=U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) is an \(N\) mode interferometer,

  • \(\mathcal{D}=\otimes_{i=1}^{N}D(\alpha_{i})\) applies a single mode displacement gate (Dgate) to each mode, with complex displacement \(\alpha_{i} \in \mathbb{C}\),

  • \(\mathcal{S}=\otimes_{i=1}^{N}S(r_{i})\) is a single mode squeezing gate (Sgate) acting on each mode with squeezing parameter \(r_{i} \in \mathbb{R}\), and

  • \(\Phi=\otimes_{i=1}^{N}\Phi(\lambda_{i})\) is a non-Gaussian gate on each mode with parameter \(\lambda_{i} \in \mathbb{R}\).

Note

Any non-Gaussian gate such as the cubic phase gate (Vgate) represents a valid choice, but we recommend the Kerr gate (Kgate) for simulations in Strawberry Fields. The Kerr gate is more accurate numerically because it is diagonal in the Fock basis.

The layer is shown below as a circuit:


../_images/layer.svg


These layers can then be composed to form a quantum neural network. The width of the network can also be varied between layers [1].

Reproducing classical neural networks

Let’s see how the quantum layer can embed the transformation \(\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b})\) of a classical neural network layer. Suppose \(N\)-dimensional data is encoded in position eigenstates so that

\[\mathbf{x} \Leftrightarrow \ket{\mathbf{x}} := \ket{x_{1}} \otimes \ldots \otimes \ket{x_{N}}.\]

We want to perform the transformation

\[\ket{\mathbf{x}} \Rightarrow \ket{\varphi (W \mathbf{x} + \mathbf{b})}.\]

It turns out that the quantum circuit above can do precisely this! Consider first the affine transformation \(W \mathbf{x} + \mathbf{b}\). Leveraging the singular value decomposition, we can always write \(W = O_{2} \Sigma O_{1}\) with \(O_{k}\) orthogonal matrices and \(\Sigma\) a positive diagonal matrix. These orthogonal transformations can be carried out using interferometers without access to phase, i.e., with \(\boldsymbol{\phi}_{k} = 0\):

\[U_{k}(\boldsymbol{\theta}_{k},\mathbf{0})\ket{\mathbf{x}} = \ket{O_{k} \mathbf{x}}.\]

On the other hand, the diagonal matrix \(\Sigma = {\rm diag}\left(\{c_{i}\}_{i=1}^{N}\right)\) can be achieved through squeezing:

\[\otimes_{i=1}^{N}S(r_{i})\ket{\mathbf{x}} \propto \ket{\Sigma \mathbf{x}},\]

with \(r_{i} = \log (c_{i})\). Finally, the addition of a bias vector \(\mathbf{b}\) is done using position displacement gates:

\[\otimes_{i=1}^{N}D(\alpha_{i})\ket{\mathbf{x}} = \ket{\mathbf{x} + \mathbf{b}},\]

with \(\mathbf{b} = \{\alpha_{i}\}_{i=1}^{N}\) and \(\alpha_{i} \in \mathbb{R}\). Putting this all together, we see that the operation \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers and position displacement performs the transformation \(\ket{\mathbf{x}} \Rightarrow \ket{W \mathbf{x} + \mathbf{b}}\) on position eigenstates.
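The singular value decomposition underlying this argument can be checked directly in NumPy; the matrix below is an arbitrary example (note that `np.linalg.svd` returns the factors in the order \(O_{2}\), singular values, \(O_{1}\)):

```python
import numpy as np

rng = np.random.default_rng(42)
W = rng.normal(size=(3, 3))  # an arbitrary weight matrix

# SVD: W = O2 @ Sigma @ O1, with O1, O2 orthogonal and Sigma diagonal
O2, singular_values, O1 = np.linalg.svd(W)
Sigma = np.diag(singular_values)

print(np.allclose(W, O2 @ Sigma @ O1))    # True: the decomposition holds
print(np.allclose(O2 @ O2.T, np.eye(3)))  # True: O2 is orthogonal
print(np.allclose(O1 @ O1.T, np.eye(3)))  # True: O1 is orthogonal
```

The singular values are non-negative by construction, and strictly positive whenever \(W\) has full rank, matching the requirement that \(\Sigma\) be a positive diagonal matrix.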

Warning

The TensorFlow backend is the natural simulator for quantum neural networks in Strawberry Fields, but this backend cannot naturally accommodate position eigenstates, which require infinite squeezing. For simulation of position eigenstates in this backend, the best approach is to use a displaced squeezed state (prepare_displaced_squeezed_state) with high squeezing value r. However, to avoid significant numerical error, it is important to make sure that all initial states have negligible amplitude for Fock states \(\ket{n}\) with \(n\geq \texttt{cutoff_dim}\), where \(\texttt{cutoff_dim}\) is the cutoff dimension.
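As a simple illustration of this check, consider the special case of zero squeezing, where the prepared state is a coherent state \(|\alpha\rangle\) whose Fock distribution is Poissonian with mean \(|\alpha|^{2}\); the probability weight lost to truncation can then be estimated with a short tail sum (pure Python, with an arbitrary example \(\alpha\)):

```python
from math import exp, factorial

def coherent_tail_probability(alpha_abs, cutoff_dim):
    """Probability of a coherent state |alpha> occupying Fock states n >= cutoff_dim."""
    mean_n = alpha_abs ** 2  # Poisson-distributed photon number with mean |alpha|^2
    below = sum(exp(-mean_n) * mean_n ** n / factorial(n) for n in range(cutoff_dim))
    return 1.0 - below

# With |alpha| = 0.5 and cutoff_dim = 6, the truncated weight is negligible
print(coherent_tail_probability(0.5, 6) < 1e-6)  # True
```

For displaced squeezed states the distribution is more involved, but the same principle applies: increase `cutoff_dim` until the amplitude beyond the cutoff is negligible.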

Finally, the nonlinear function \(\varphi\) can be achieved through a restricted type of non-Gaussian gates \(\otimes_{i=1}^{N}\Phi(\lambda_{i})\) acting on each mode (see [1] for more details), resulting in the transformation

\[\otimes_{i=1}^{N}\Phi(\lambda_{i})\ket{\mathbf{x}} = \ket{\varphi(\mathbf{x})}.\]

The operation \(\mathcal{L} = \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers, position displacements, and restricted non-Gaussian gates can hence be seen as enacting a classical neural network layer \(\ket{\mathbf{x}} \Rightarrow \ket{\varphi(W \mathbf{x} + \mathbf{b})}\) on position eigenstates.

Extending to quantum neural networks

In fact, CV quantum neural network layers can be made more expressive than their classical counterparts. We can do this by lifting the above restrictions on \(\mathcal{L}\), i.e.:

  • Using arbitrary interferometers \(U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) with access to phase and general displacement gates (i.e., not necessarily position displacement). This allows \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) to represent a general Gaussian operation.

  • Using arbitrary non-Gaussian gates \(\Phi(\lambda_{i})\), such as the Kerr gate.

  • Encoding data outside of the position eigenbasis, for example using instead the Fock basis.

Moreover, the gates in a single layer form a universal gate set, making the CV quantum neural network a model for universal quantum computing: a sufficient number of layers can carry out any quantum algorithm implementable on a CV quantum computer.

CV quantum neural networks can be trained both through classical simulation and directly on quantum hardware. Strawberry Fields relies on classical simulation to evaluate cost functions of the CV quantum neural network and the resulting gradients with respect to the parameters of each layer. However, this becomes an intractable task with increasing network depth and width. Ultimately, direct evaluation on hardware will likely be necessary for large-scale networks; an approach for hardware-based training is mapped out in [2]. The PennyLane library provides tools for training hybrid quantum-classical machine learning models, using both simulators and real-world quantum hardware.

Example CV quantum neural network layers are shown, for one to four modes, below:


../_images/layer_1mode.svg

One mode layer


../_images/layer_2mode.svg

Two mode layer


../_images/layer_3mode.svg

Three mode layer


../_images/layer_4mode.svg

Four mode layer


Here, the multimode linear interferometers \(U_{1}\) and \(U_{2}\) have been decomposed into two-mode phaseless beamsplitters (BSgate) and single-mode phase shifters (Rgate) using the Clements decomposition [3]. The Kerr gate is used as the non-Gaussian gate.

Code

First, we import Strawberry Fields, TensorFlow, and NumPy:

import numpy as np
import tensorflow as tf
import strawberryfields as sf
from strawberryfields import ops

Before we begin defining our optimization problem, let’s first create some convenient utility functions.

Utility functions

The first step to writing a CV quantum neural network layer in Strawberry Fields is to define a function for the two interferometers:

def interferometer(params, q):
    """Parameterised interferometer acting on ``N`` modes.

    Args:
        params (list[float]): list of ``N*(N-1) + max(1, N-1)`` parameters.

            * The first ``N*(N-1)/2`` parameters are the beamsplitter angles
            * The second ``N*(N-1)/2`` parameters are the beamsplitter phases
            * The final ``max(1, N-1)`` parameters are the local phase rotations

        q (list[RegRef]): list of Strawberry Fields quantum registers the interferometer
            is to be applied to
    """
    N = len(q)
    theta = params[:N*(N-1)//2]
    phi = params[N*(N-1)//2:N*(N-1)]
    rphi = params[-N+1:]

    if N == 1:
        # the interferometer is a single rotation
        ops.Rgate(rphi[0]) | q[0]
        return

    n = 0  # keep track of free parameters

    # Apply the rectangular beamsplitter array
    # The array depth is N
    for l in range(N):
        for k, (q1, q2) in enumerate(zip(q[:-1], q[1:])):
            # skip even or odd pairs depending on layer
            if (l + k) % 2 != 1:
                ops.BSgate(theta[n], phi[n]) | (q1, q2)
                n += 1

    # apply the final local phase shifts to all modes except the last one
    for i in range(max(1, N - 1)):
        ops.Rgate(rphi[i]) | q[i]
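The rectangular mesh above applies exactly ``N*(N-1)/2`` beamsplitters. The pair-skipping logic can be checked standalone, with no Strawberry Fields required (the helper below is only for illustration):

```python
def count_beamsplitters(N):
    """Count the beamsplitters applied by the rectangular mesh above."""
    n = 0
    for l in range(N):
        for k in range(N - 1):  # adjacent mode pairs (k, k+1)
            # same skip rule as in the interferometer function
            if (l + k) % 2 != 1:
                n += 1
    return n

for N in range(2, 6):
    print(N, count_beamsplitters(N), N * (N - 1) // 2)  # counts agree
```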

Warning

The Interferometer class in Strawberry Fields does not reproduce the functionality above. Instead, Interferometer applies a given input unitary matrix according to the Clements decomposition.

Using the above interferometer function, an \(N\) mode CV quantum neural network layer is given by the function:

def layer(params, q):
    """CV quantum neural network layer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``2*(max(1, N-1) + N**2 + N)`` containing
            the parameters for the layer
        q (list[RegRef]): list of Strawberry Fields quantum registers the layer
            is to be applied to
    """
    N = len(q)
    M = int(N * (N - 1)) + max(1, N - 1)

    int1 = params[:M]
    s = params[M:M+N]
    int2 = params[M+N:2*M+N]
    dr = params[2*M+N:2*M+2*N]
    dp = params[2*M+2*N:2*M+3*N]
    k = params[2*M+3*N:2*M+4*N]

    # begin layer
    interferometer(int1, q)

    for i in range(N):
        ops.Sgate(s[i]) | q[i]

    interferometer(int2, q)

    for i in range(N):
        ops.Dgate(dr[i], dp[i]) | q[i]
        ops.Kgate(k[i]) | q[i]

Finally, we define one more utility function to help us initialize the TensorFlow weights for our quantum neural network layers:

def init_weights(modes, layers, active_sd=0.0001, passive_sd=0.1):
    """Initialize a 2D TensorFlow Variable containing normally-distributed
    random weights for an ``N`` mode quantum neural network with ``L`` layers.

    Args:
        modes (int): the number of modes in the quantum neural network
        layers (int): the number of layers in the quantum neural network
        active_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the active parameters
            (displacement, squeezing, and Kerr magnitude)
        passive_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the passive parameters
            (beamsplitter angles and all gate phases)

    Returns:
        tf.Variable[tf.float32]: A TensorFlow Variable of shape
        ``[layers, 2*(max(1, modes-1) + modes**2 + modes)]``, where the Lth
        row represents the layer parameters for the Lth layer.
    """
    # Number of interferometer parameters:
    M = int(modes * (modes - 1)) + max(1, modes - 1)

    # Create the TensorFlow variables
    int1_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    s_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    int2_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    dr_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    dp_weights = tf.random.normal(shape=[layers, modes], stddev=passive_sd)
    k_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)

    weights = tf.concat(
        [int1_weights, s_weights, int2_weights, dr_weights, dp_weights, k_weights], axis=1
    )

    weights = tf.Variable(weights)

    return weights
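The per-layer parameter count implied by the docstring can be verified with a little arithmetic: two interferometers with ``M = modes*(modes - 1) + max(1, modes - 1)`` parameters each, plus four sets of ``modes`` parameters (squeezing, displacement magnitude, displacement phase, and Kerr):

```python
def params_per_layer(modes):
    """Number of free parameters in one CV quantum neural network layer."""
    M = modes * (modes - 1) + max(1, modes - 1)  # parameters per interferometer
    return 2 * M + 4 * modes                     # two interferometers + S, Dr, Dp, K

for modes in range(1, 5):
    closed_form = 2 * (max(1, modes - 1) + modes ** 2 + modes)
    print(modes, params_per_layer(modes), closed_form)  # the two counts agree
```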

Optimization

Now that we have our utility functions, let's begin defining our optimization problem. In this example, we create a 1 mode CVQNN with 8 layers and a Fock-basis cutoff dimension of 6, and train it to output a desired target state: the single photon state.

# set the random seed
tf.random.set_seed(137)
np.random.seed(137)


# define width and depth of CV quantum neural network
modes = 1
layers = 8
cutoff_dim = 6


# defining desired state (single photon state)
target_state = np.zeros(cutoff_dim)
target_state[1] = 1
target_state = tf.constant(target_state, dtype=tf.complex64)

Now, let’s initialize an engine with the TensorFlow "tf" backend, and begin constructing our QNN program.

# initialize engine and program
eng = sf.Engine(backend="tf", backend_options={"cutoff_dim": cutoff_dim})
qnn = sf.Program(modes)

# initialize QNN weights
weights = init_weights(modes, layers) # our TensorFlow weights
num_params = np.prod(weights.shape)   # total number of parameters in our model

To construct the program, we must create and use Strawberry Fields symbolic gate arguments. These will be mapped to the TensorFlow variables on engine execution.

# Create array of Strawberry Fields symbolic gate arguments, matching
# the size of the weights Variable.
sf_params = np.arange(num_params).reshape(weights.shape).astype(str)
sf_params = np.array([qnn.params(*i) for i in sf_params])


# Construct the symbolic Strawberry Fields program by
# looping and applying layers to the program.
with qnn.context as q:
    for k in range(layers):
        layer(sf_params[k], q)

where sf_params is a real array of size [layers, 2*(max(1, modes-1) + modes**2 + modes)] containing the symbolic gate arguments for the quantum neural network.

Now that our QNN program is defined, we can create our cost function. Our cost function simply executes the QNN on our engine using the values of the input weights.

Since we want to maximize the fidelity \(f(w) = |\langle \psi_t | \psi(w)\rangle|^2\) between our QNN output state \(|\psi(w)\rangle\) and our target state \(|\psi_t\rangle\), we compute this overlap, as well as the summed absolute difference between the two state vectors, which serves as the loss to be minimized.

Finally, we also return the trace of the output QNN state. This should always have a value close to 1. If it deviates significantly from 1, this is an indication that we need to increase our Fock-basis cutoff.

def cost(weights):
    # Create a dictionary mapping from the names of the Strawberry Fields
    # symbolic gate parameters to the TensorFlow weight values.
    mapping = {p.name: w for p, w in zip(sf_params.flatten(), tf.reshape(weights, [-1]))}

    # run the engine
    state = eng.run(qnn, args=mapping).state
    ket = state.ket()

    difference = tf.reduce_sum(tf.abs(ket - target_state))
    fidelity = tf.abs(tf.reduce_sum(tf.math.conj(ket) * target_state)) ** 2
    return difference, fidelity, ket, tf.math.real(state.trace())

We are now ready to minimize our cost function using TensorFlow:

# set up the optimizer
opt = tf.keras.optimizers.Adam()
cost_before, fidelity_before, _, _ = cost(weights)

# Perform the optimization
for i in range(1000):
    # reset the engine if it has already been executed
    if eng.run_progs:
        eng.reset()

    with tf.GradientTape() as tape:
        loss, fid, ket, trace = cost(weights)

    # one repetition of the optimization
    gradients = tape.gradient(loss, weights)
    opt.apply_gradients(zip([gradients], [weights]))

    # Prints progress at every rep
    if i % 1 == 0:
        print("Rep: {} Cost: {:.4f} Fidelity: {:.4f} Trace: {:.4f}".format(i, loss, fid, trace))


print("\nFidelity before optimization: ", fidelity_before.numpy())
print("Fidelity after optimization: ", fid.numpy())
print("\nTarget state: ", target_state.numpy())
print("Output state: ", np.round(ket.numpy(), decimals=3))

Out:

Rep: 0 Cost: 2.0001 Fidelity: 0.0000 Trace: 1.0000
Rep: 1 Cost: 1.9978 Fidelity: 0.0001 Trace: 1.0000
Rep: 2 Cost: 1.9897 Fidelity: 0.0002 Trace: 1.0000
Rep: 3 Cost: 1.9794 Fidelity: 0.0006 Trace: 1.0000
Rep: 4 Cost: 1.9681 Fidelity: 0.0010 Trace: 1.0000
Rep: 5 Cost: 1.9632 Fidelity: 0.0016 Trace: 1.0000
...
Rep: 286 Cost: 0.2241 Fidelity: 0.9442 Trace: 0.9798
Rep: 287 Cost: 0.2219 Fidelity: 0.9452 Trace: 0.9798
Rep: 288 Cost: 0.2189 Fidelity: 0.9458 Trace: 0.9796
Rep: 289 Cost: 0.2180 Fidelity: 0.9458 Trace: 0.9793
Rep: 290 Cost: 0.2145 Fidelity: 0.9465 Trace: 0.9792
Rep: 291 Cost: 0.2099 Fidelity: 0.9476 Trace: 0.9792
Rep: 292 Cost: 0.2075 Fidelity: 0.9482 Trace: 0.9791
Rep: 293 Cost: 0.2049 Fidelity: 0.9483 Trace: 0.9788
Rep: 294 Cost: 0.2007 Fidelity: 0.9489 Trace: 0.9787
Rep: 295 Cost: 0.1992 Fidelity: 0.9500 Trace: 0.9787
Rep: 296 Cost: 0.1970 Fidelity: 0.9506 Trace: 0.9786
Rep: 297 Cost: 0.1953 Fidelity: 0.9510 Trace: 0.9783
Rep: 298 Cost: 0.1885 Fidelity: 0.9517 Trace: 0.9782
Rep: 299 Cost: 0.1882 Fidelity: 0.9521 Trace: 0.9779
Rep: 300 Cost: 0.1851 Fidelity: 0.9532 Trace: 0.9779
Rep: 301 Cost: 0.1803 Fidelity: 0.9545 Trace: 0.9780
Rep: 302 Cost: 0.1787 Fidelity: 0.9553 Trace: 0.9779
Rep: 303 Cost: 0.1751 Fidelity: 0.9559 Trace: 0.9778
Rep: 304 Cost: 0.1724 Fidelity: 0.9565 Trace: 0.9777
Rep: 305 Cost: 0.1683 Fidelity: 0.9575 Trace: 0.9777
Rep: 306 Cost: 0.1648 Fidelity: 0.9580 Trace: 0.9774
Rep: 307 Cost: 0.1643 Fidelity: 0.9581 Trace: 0.9771
Rep: 308 Cost: 0.1598 Fidelity: 0.9591 Trace: 0.9771
Rep: 309 Cost: 0.1561 Fidelity: 0.9601 Trace: 0.9772
Rep: 310 Cost: 0.1522 Fidelity: 0.9608 Trace: 0.9771
Rep: 311 Cost: 0.1495 Fidelity: 0.9611 Trace: 0.9769
Rep: 312 Cost: 0.1474 Fidelity: 0.9614 Trace: 0.9767
Rep: 313 Cost: 0.1415 Fidelity: 0.9625 Trace: 0.9768
Rep: 314 Cost: 0.1387 Fidelity: 0.9632 Trace: 0.9767
Rep: 315 Cost: 0.1365 Fidelity: 0.9636 Trace: 0.9764
Rep: 316 Cost: 0.1318 Fidelity: 0.9644 Trace: 0.9764
Rep: 317 Cost: 0.1275 Fidelity: 0.9651 Trace: 0.9764
Rep: 318 Cost: 0.1254 Fidelity: 0.9654 Trace: 0.9762
Rep: 319 Cost: 0.1225 Fidelity: 0.9660 Trace: 0.9761
Rep: 320 Cost: 0.1178 Fidelity: 0.9667 Trace: 0.9761
Rep: 321 Cost: 0.1140 Fidelity: 0.9673 Trace: 0.9761
Rep: 322 Cost: 0.1107 Fidelity: 0.9674 Trace: 0.9758
Rep: 323 Cost: 0.1071 Fidelity: 0.9679 Trace: 0.9757
Rep: 324 Cost: 0.1034 Fidelity: 0.9687 Trace: 0.9758
Rep: 325 Cost: 0.0998 Fidelity: 0.9690 Trace: 0.9757
Rep: 326 Cost: 0.0976 Fidelity: 0.9692 Trace: 0.9754
Rep: 327 Cost: 0.0922 Fidelity: 0.9698 Trace: 0.9753
Rep: 328 Cost: 0.0887 Fidelity: 0.9704 Trace: 0.9753
Rep: 329 Cost: 0.0858 Fidelity: 0.9707 Trace: 0.9751
Rep: 330 Cost: 0.0816 Fidelity: 0.9711 Trace: 0.9751
Rep: 331 Cost: 0.0777 Fidelity: 0.9715 Trace: 0.9751
Rep: 332 Cost: 0.0750 Fidelity: 0.9716 Trace: 0.9749
Rep: 333 Cost: 0.0716 Fidelity: 0.9718 Trace: 0.9747
Rep: 334 Cost: 0.0674 Fidelity: 0.9722 Trace: 0.9747
Rep: 335 Cost: 0.0660 Fidelity: 0.9725 Trace: 0.9746
Rep: 336 Cost: 0.0623 Fidelity: 0.9726 Trace: 0.9744
Rep: 337 Cost: 0.0572 Fidelity: 0.9724 Trace: 0.9741
Rep: 338 Cost: 0.0537 Fidelity: 0.9725 Trace: 0.9739
Rep: 339 Cost: 0.0510 Fidelity: 0.9729 Trace: 0.9740
Rep: 340 Cost: 0.0476 Fidelity: 0.9729 Trace: 0.9738
Rep: 341 Cost: 0.0433 Fidelity: 0.9728 Trace: 0.9734
Rep: 342 Cost: 0.0398 Fidelity: 0.9727 Trace: 0.9732
Rep: 343 Cost: 0.0355 Fidelity: 0.9728 Trace: 0.9732
Rep: 344 Cost: 0.0325 Fidelity: 0.9729 Trace: 0.9731
Rep: 345 Cost: 0.0296 Fidelity: 0.9727 Trace: 0.9728
Rep: 346 Cost: 0.0253 Fidelity: 0.9725 Trace: 0.9725
Rep: 347 Cost: 0.0221 Fidelity: 0.9722 Trace: 0.9722
Rep: 348 Cost: 0.0194 Fidelity: 0.9719 Trace: 0.9719
Rep: 349 Cost: 0.0225 Fidelity: 0.9719 Trace: 0.9719
Rep: 350 Cost: 0.0222 Fidelity: 0.9716 Trace: 0.9716
Rep: 351 Cost: 0.0264 Fidelity: 0.9713 Trace: 0.9713
Rep: 352 Cost: 0.0275 Fidelity: 0.9713 Trace: 0.9713
Rep: 353 Cost: 0.0274 Fidelity: 0.9714 Trace: 0.9715
Rep: 354 Cost: 0.0283 Fidelity: 0.9715 Trace: 0.9715
Rep: 355 Cost: 0.0270 Fidelity: 0.9714 Trace: 0.9715
Rep: 356 Cost: 0.0278 Fidelity: 0.9714 Trace: 0.9714
Rep: 357 Cost: 0.0247 Fidelity: 0.9715 Trace: 0.9715
Rep: 358 Cost: 0.0236 Fidelity: 0.9716 Trace: 0.9716
Rep: 359 Cost: 0.0213 Fidelity: 0.9717 Trace: 0.9718
Rep: 360 Cost: 0.0196 Fidelity: 0.9720 Trace: 0.9720
Rep: 361 Cost: 0.0184 Fidelity: 0.9720 Trace: 0.9720
Rep: 362 Cost: 0.0197 Fidelity: 0.9720 Trace: 0.9720
Rep: 363 Cost: 0.0202 Fidelity: 0.9725 Trace: 0.9725
Rep: 364 Cost: 0.0206 Fidelity: 0.9725 Trace: 0.9726
Rep: 365 Cost: 0.0249 Fidelity: 0.9723 Trace: 0.9723
Rep: 366 Cost: 0.0233 Fidelity: 0.9724 Trace: 0.9724
Rep: 367 Cost: 0.0248 Fidelity: 0.9726 Trace: 0.9727
Rep: 368 Cost: 0.0235 Fidelity: 0.9727 Trace: 0.9727
Rep: 369 Cost: 0.0232 Fidelity: 0.9725 Trace: 0.9725
Rep: 370 Cost: 0.0221 Fidelity: 0.9725 Trace: 0.9725
Rep: 371 Cost: 0.0218 Fidelity: 0.9726 Trace: 0.9727
Rep: 372 Cost: 0.0198 Fidelity: 0.9726 Trace: 0.9726
Rep: 373 Cost: 0.0211 Fidelity: 0.9723 Trace: 0.9723
Rep: 374 Cost: 0.0169 Fidelity: 0.9724 Trace: 0.9724
Rep: 375 Cost: 0.0214 Fidelity: 0.9722 Trace: 0.9722
Rep: 376 Cost: 0.0217 Fidelity: 0.9720 Trace: 0.9720
Rep: 377 Cost: 0.0193 Fidelity: 0.9722 Trace: 0.9722
Rep: 378 Cost: 0.0200 Fidelity: 0.9723 Trace: 0.9724
Rep: 379 Cost: 0.0198 Fidelity: 0.9723 Trace: 0.9723
Rep: 380 Cost: 0.0191 Fidelity: 0.9723 Trace: 0.9723
Rep: 381 Cost: 0.0188 Fidelity: 0.9724 Trace: 0.9724
Rep: 382 Cost: 0.0180 Fidelity: 0.9725 Trace: 0.9725
Rep: 383 Cost: 0.0169 Fidelity: 0.9726 Trace: 0.9727
Rep: 384 Cost: 0.0187 Fidelity: 0.9727 Trace: 0.9727
Rep: 385 Cost: 0.0190 Fidelity: 0.9728 Trace: 0.9728
Rep: 386 Cost: 0.0192 Fidelity: 0.9726 Trace: 0.9726
Rep: 387 Cost: 0.0187 Fidelity: 0.9727 Trace: 0.9727
Rep: 388 Cost: 0.0172 Fidelity: 0.9728 Trace: 0.9728
Rep: 389 Cost: 0.0187 Fidelity: 0.9726 Trace: 0.9726
Rep: 390 Cost: 0.0193 Fidelity: 0.9728 Trace: 0.9728
Rep: 391 Cost: 0.0172 Fidelity: 0.9726 Trace: 0.9726
Rep: 392 Cost: 0.0186 Fidelity: 0.9728 Trace: 0.9728
Rep: 393 Cost: 0.0174 Fidelity: 0.9727 Trace: 0.9727
Rep: 394 Cost: 0.0185 Fidelity: 0.9728 Trace: 0.9728
Rep: 395 Cost: 0.0202 Fidelity: 0.9727 Trace: 0.9727
Rep: 396 Cost: 0.0188 Fidelity: 0.9730 Trace: 0.9730
Rep: 397 Cost: 0.0171 Fidelity: 0.9729 Trace: 0.9729
Rep: 398 Cost: 0.0203 Fidelity: 0.9727 Trace: 0.9727
Rep: 399 Cost: 0.0167 Fidelity: 0.9730 Trace: 0.9730
Rep: 400 Cost: 0.0170 Fidelity: 0.9729 Trace: 0.9729
Rep: 401 Cost: 0.0240 Fidelity: 0.9726 Trace: 0.9726
Rep: 402 Cost: 0.0203 Fidelity: 0.9727 Trace: 0.9727
Rep: 403 Cost: 0.0246 Fidelity: 0.9730 Trace: 0.9730
Rep: 404 Cost: 0.0234 Fidelity: 0.9730 Trace: 0.9730
Rep: 405 Cost: 0.0198 Fidelity: 0.9727 Trace: 0.9727
Rep: 406 Cost: 0.0236 Fidelity: 0.9728 Trace: 0.9728
Rep: 407 Cost: 0.0205 Fidelity: 0.9731 Trace: 0.9731
Rep: 408 Cost: 0.0202 Fidelity: 0.9731 Trace: 0.9731
Rep: 409 Cost: 0.0222 Fidelity: 0.9727 Trace: 0.9727
Rep: 410 Cost: 0.0220 Fidelity: 0.9728 Trace: 0.9728
Rep: 411 Cost: 0.0190 Fidelity: 0.9732 Trace: 0.9733
Rep: 412 Cost: 0.0187 Fidelity: 0.9733 Trace: 0.9733
Rep: 413 Cost: 0.0202 Fidelity: 0.9729 Trace: 0.9729
Rep: 414 Cost: 0.0193 Fidelity: 0.9730 Trace: 0.9730
Rep: 415 Cost: 0.0201 Fidelity: 0.9733 Trace: 0.9734
Rep: 416 Cost: 0.0197 Fidelity: 0.9733 Trace: 0.9733
Rep: 417 Cost: 0.0188 Fidelity: 0.9730 Trace: 0.9730
Rep: 418 Cost: 0.0219 Fidelity: 0.9729 Trace: 0.9730
Rep: 419 Cost: 0.0230 Fidelity: 0.9732 Trace: 0.9732
Rep: 420 Cost: 0.0163 Fidelity: 0.9733 Trace: 0.9733
Rep: 421 Cost: 0.0276 Fidelity: 0.9731 Trace: 0.9732
Rep: 422 Cost: 0.0267 Fidelity: 0.9733 Trace: 0.9733
Rep: 423 Cost: 0.0221 Fidelity: 0.9736 Trace: 0.9736
Rep: 424 Cost: 0.0269 Fidelity: 0.9737 Trace: 0.9737
Rep: 425 Cost: 0.0231 Fidelity: 0.9735 Trace: 0.9735
Rep: 426 Cost: 0.0217 Fidelity: 0.9732 Trace: 0.9732
Rep: 427 Cost: 0.0229 Fidelity: 0.9732 Trace: 0.9732
Rep: 428 Cost: 0.0208 Fidelity: 0.9736 Trace: 0.9736
Rep: 429 Cost: 0.0217 Fidelity: 0.9736 Trace: 0.9736
Rep: 430 Cost: 0.0177 Fidelity: 0.9734 Trace: 0.9734
Rep: 431 Cost: 0.0219 Fidelity: 0.9733 Trace: 0.9733
Rep: 432 Cost: 0.0189 Fidelity: 0.9736 Trace: 0.9736
Rep: 433 Cost: 0.0196 Fidelity: 0.9737 Trace: 0.9737
Rep: 434 Cost: 0.0168 Fidelity: 0.9735 Trace: 0.9735
Rep: 435 Cost: 0.0219 Fidelity: 0.9732 Trace: 0.9732
Rep: 436 Cost: 0.0196 Fidelity: 0.9733 Trace: 0.9733
Rep: 437 Cost: 0.0198 Fidelity: 0.9737 Trace: 0.9737
Rep: 438 Cost: 0.0198 Fidelity: 0.9737 Trace: 0.9737
Rep: 439 Cost: 0.0203 Fidelity: 0.9734 Trace: 0.9734
Rep: 440 Cost: 0.0209 Fidelity: 0.9733 Trace: 0.9733
Rep: 441 Cost: 0.0177 Fidelity: 0.9736 Trace: 0.9736
Rep: 442 Cost: 0.0198 Fidelity: 0.9738 Trace: 0.9738
Rep: 443 Cost: 0.0180 Fidelity: 0.9737 Trace: 0.9737
Rep: 444 Cost: 0.0196 Fidelity: 0.9735 Trace: 0.9735
Rep: 445 Cost: 0.0211 Fidelity: 0.9736 Trace: 0.9736
Rep: 446 Cost: 0.0191 Fidelity: 0.9737 Trace: 0.9737
Rep: 447 Cost: 0.0187 Fidelity: 0.9737 Trace: 0.9737
Rep: 448 Cost: 0.0197 Fidelity: 0.9739 Trace: 0.9739
Rep: 449 Cost: 0.0177 Fidelity: 0.9739 Trace: 0.9739
Rep: 450 Cost: 0.0202 Fidelity: 0.9736 Trace: 0.9736
Rep: 451 Cost: 0.0198 Fidelity: 0.9737 Trace: 0.9737
Rep: 452 Cost: 0.0178 Fidelity: 0.9739 Trace: 0.9739
Rep: 453 Cost: 0.0191 Fidelity: 0.9740 Trace: 0.9740
Rep: 454 Cost: 0.0163 Fidelity: 0.9737 Trace: 0.9737
Rep: 455 Cost: 0.0194 Fidelity: 0.9736 Trace: 0.9736
Rep: 456 Cost: 0.0170 Fidelity: 0.9737 Trace: 0.9737
Rep: 457 Cost: 0.0200 Fidelity: 0.9741 Trace: 0.9741
Rep: 458 Cost: 0.0193 Fidelity: 0.9741 Trace: 0.9741
Rep: 459 Cost: 0.0168 Fidelity: 0.9739 Trace: 0.9739
Rep: 460 Cost: 0.0171 Fidelity: 0.9739 Trace: 0.9739
Rep: 461 Cost: 0.0179 Fidelity: 0.9740 Trace: 0.9740
Rep: 462 Cost: 0.0168 Fidelity: 0.9739 Trace: 0.9739
Rep: 463 Cost: 0.0157 Fidelity: 0.9739 Trace: 0.9739
Rep: 464 Cost: 0.0161 Fidelity: 0.9739 Trace: 0.9739
Rep: 465 Cost: 0.0162 Fidelity: 0.9738 Trace: 0.9738
Rep: 466 Cost: 0.0174 Fidelity: 0.9738 Trace: 0.9738
Rep: 467 Cost: 0.0148 Fidelity: 0.9740 Trace: 0.9740
Rep: 468 Cost: 0.0173 Fidelity: 0.9741 Trace: 0.9741
Rep: 469 Cost: 0.0171 Fidelity: 0.9741 Trace: 0.9741
Rep: 470 Cost: 0.0156 Fidelity: 0.9741 Trace: 0.9741
Rep: 471 Cost: 0.0206 Fidelity: 0.9742 Trace: 0.9742
Rep: 472 Cost: 0.0180 Fidelity: 0.9741 Trace: 0.9741
Rep: 473 Cost: 0.0201 Fidelity: 0.9739 Trace: 0.9740
Rep: 474 Cost: 0.0169 Fidelity: 0.9740 Trace: 0.9740
Rep: 475 Cost: 0.0220 Fidelity: 0.9743 Trace: 0.9744
Rep: 476 Cost: 0.0217 Fidelity: 0.9743 Trace: 0.9744
Rep: 477 Cost: 0.0168 Fidelity: 0.9741 Trace: 0.9741
Rep: 478 Cost: 0.0220 Fidelity: 0.9740 Trace: 0.9740
Rep: 479 Cost: 0.0171 Fidelity: 0.9742 Trace: 0.9742
Rep: 480 Cost: 0.0203 Fidelity: 0.9744 Trace: 0.9744
Rep: 481 Cost: 0.0185 Fidelity: 0.9744 Trace: 0.9744
Rep: 482 Cost: 0.0192 Fidelity: 0.9743 Trace: 0.9743
Rep: 483 Cost: 0.0199 Fidelity: 0.9744 Trace: 0.9744
Rep: 484 Cost: 0.0167 Fidelity: 0.9744 Trace: 0.9744
Rep: 485 Cost: 0.0186 Fidelity: 0.9742 Trace: 0.9742
Rep: 486 Cost: 0.0187 Fidelity: 0.9742 Trace: 0.9742
Rep: 487 Cost: 0.0168 Fidelity: 0.9744 Trace: 0.9744
Rep: 488 Cost: 0.0196 Fidelity: 0.9743 Trace: 0.9743
Rep: 489 Cost: 0.0177 Fidelity: 0.9744 Trace: 0.9744
Rep: 490 Cost: 0.0196 Fidelity: 0.9743 Trace: 0.9743
Rep: 491 Cost: 0.0192 Fidelity: 0.9743 Trace: 0.9743
Rep: 492 Cost: 0.0170 Fidelity: 0.9743 Trace: 0.9743
Rep: 493 Cost: 0.0199 Fidelity: 0.9744 Trace: 0.9744
Rep: 494 Cost: 0.0178 Fidelity: 0.9745 Trace: 0.9745
Rep: 495 Cost: 0.0165 Fidelity: 0.9744 Trace: 0.9744
Rep: 496 Cost: 0.0174 Fidelity: 0.9743 Trace: 0.9744
Rep: 497 Cost: 0.0154 Fidelity: 0.9745 Trace: 0.9745
Rep: 498 Cost: 0.0155 Fidelity: 0.9744 Trace: 0.9744
Rep: 499 Cost: 0.0159 Fidelity: 0.9744 Trace: 0.9744
Rep: 500 Cost: 0.0165 Fidelity: 0.9743 Trace: 0.9743
Rep: 501 Cost: 0.0166 Fidelity: 0.9746 Trace: 0.9746
Rep: 502 Cost: 0.0154 Fidelity: 0.9745 Trace: 0.9745
Rep: 503 Cost: 0.0180 Fidelity: 0.9743 Trace: 0.9743
Rep: 504 Cost: 0.0163 Fidelity: 0.9745 Trace: 0.9745
Rep: 505 Cost: 0.0173 Fidelity: 0.9747 Trace: 0.9747
Rep: 506 Cost: 0.0158 Fidelity: 0.9747 Trace: 0.9747
Rep: 507 Cost: 0.0175 Fidelity: 0.9745 Trace: 0.9745
Rep: 508 Cost: 0.0162 Fidelity: 0.9746 Trace: 0.9746
Rep: 509 Cost: 0.0157 Fidelity: 0.9746 Trace: 0.9746
Rep: 510 Cost: 0.0166 Fidelity: 0.9746 Trace: 0.9746
Rep: 511 Cost: 0.0153 Fidelity: 0.9746 Trace: 0.9746
Rep: 512 Cost: 0.0153 Fidelity: 0.9746 Trace: 0.9746
Rep: 513 Cost: 0.0165 Fidelity: 0.9748 Trace: 0.9748
Rep: 514 Cost: 0.0155 Fidelity: 0.9747 Trace: 0.9747
Rep: 515 Cost: 0.0167 Fidelity: 0.9746 Trace: 0.9747
Rep: 516 Cost: 0.0155 Fidelity: 0.9747 Trace: 0.9747
Rep: 517 Cost: 0.0173 Fidelity: 0.9747 Trace: 0.9747
Rep: 518 Cost: 0.0167 Fidelity: 0.9747 Trace: 0.9747
Rep: 519 Cost: 0.0166 Fidelity: 0.9746 Trace: 0.9746
Rep: 520 Cost: 0.0154 Fidelity: 0.9747 Trace: 0.9747
Rep: 521 Cost: 0.0171 Fidelity: 0.9749 Trace: 0.9750
Rep: 522 Cost: 0.0161 Fidelity: 0.9749 Trace: 0.9749
Rep: 523 Cost: 0.0175 Fidelity: 0.9747 Trace: 0.9747
Rep: 524 Cost: 0.0159 Fidelity: 0.9748 Trace: 0.9748
Rep: 525 Cost: 0.0164 Fidelity: 0.9750 Trace: 0.9750
Rep: 526 Cost: 0.0155 Fidelity: 0.9749 Trace: 0.9749
Rep: 527 Cost: 0.0163 Fidelity: 0.9748 Trace: 0.9748
Rep: 528 Cost: 0.0161 Fidelity: 0.9749 Trace: 0.9749
Rep: 529 Cost: 0.0166 Fidelity: 0.9749 Trace: 0.9749
Rep: 530 Cost: 0.0159 Fidelity: 0.9748 Trace: 0.9748
Rep: 531 Cost: 0.0159 Fidelity: 0.9749 Trace: 0.9749
Rep: 532 Cost: 0.0151 Fidelity: 0.9749 Trace: 0.9749
Rep: 533 Cost: 0.0147 Fidelity: 0.9750 Trace: 0.9750
Rep: 534 Cost: 0.0152 Fidelity: 0.9749 Trace: 0.9749
Rep: 535 Cost: 0.0164 Fidelity: 0.9749 Trace: 0.9749
Rep: 536 Cost: 0.0171 Fidelity: 0.9751 Trace: 0.9751
Rep: 537 Cost: 0.0177 Fidelity: 0.9752 Trace: 0.9752
Rep: 538 Cost: 0.0172 Fidelity: 0.9751 Trace: 0.9751
Rep: 539 Cost: 0.0156 Fidelity: 0.9751 Trace: 0.9751
Rep: 540 Cost: 0.0181 Fidelity: 0.9750 Trace: 0.9750
Rep: 541 Cost: 0.0167 Fidelity: 0.9749 Trace: 0.9749
Rep: 542 Cost: 0.0166 Fidelity: 0.9751 Trace: 0.9751
Rep: 543 Cost: 0.0171 Fidelity: 0.9752 Trace: 0.9752
Rep: 544 Cost: 0.0162 Fidelity: 0.9750 Trace: 0.9750
Rep: 545 Cost: 0.0189 Fidelity: 0.9748 Trace: 0.9749
Rep: 546 Cost: 0.0158 Fidelity: 0.9750 Trace: 0.9750
Rep: 547 Cost: 0.0187 Fidelity: 0.9753 Trace: 0.9753
Rep: 548 Cost: 0.0172 Fidelity: 0.9753 Trace: 0.9753
Rep: 549 Cost: 0.0191 Fidelity: 0.9750 Trace: 0.9750
Rep: 550 Cost: 0.0188 Fidelity: 0.9750 Trace: 0.9750
Rep: 551 Cost: 0.0174 Fidelity: 0.9752 Trace: 0.9753
Rep: 552 Cost: 0.0170 Fidelity: 0.9753 Trace: 0.9753
Rep: 553 Cost: 0.0173 Fidelity: 0.9752 Trace: 0.9752
Rep: 554 Cost: 0.0160 Fidelity: 0.9751 Trace: 0.9751
Rep: 555 Cost: 0.0199 Fidelity: 0.9753 Trace: 0.9753
Rep: 556 Cost: 0.0176 Fidelity: 0.9754 Trace: 0.9754
Rep: 557 Cost: 0.0181 Fidelity: 0.9752 Trace: 0.9752
Rep: 558 Cost: 0.0174 Fidelity: 0.9752 Trace: 0.9753
Rep: 559 Cost: 0.0188 Fidelity: 0.9754 Trace: 0.9754
Rep: 560 Cost: 0.0211 Fidelity: 0.9754 Trace: 0.9754
Rep: 561 Cost: 0.0164 Fidelity: 0.9752 Trace: 0.9752
Rep: 562 Cost: 0.0194 Fidelity: 0.9753 Trace: 0.9753
Rep: 563 Cost: 0.0190 Fidelity: 0.9755 Trace: 0.9755
Rep: 564 Cost: 0.0174 Fidelity: 0.9755 Trace: 0.9756
Rep: 565 Cost: 0.0184 Fidelity: 0.9754 Trace: 0.9754
Rep: 566 Cost: 0.0197 Fidelity: 0.9753 Trace: 0.9753
Rep: 567 Cost: 0.0162 Fidelity: 0.9754 Trace: 0.9754
Rep: 568 Cost: 0.0203 Fidelity: 0.9757 Trace: 0.9757
Rep: 569 Cost: 0.0215 Fidelity: 0.9757 Trace: 0.9757
Rep: 570 Cost: 0.0148 Fidelity: 0.9755 Trace: 0.9755
Rep: 571 Cost: 0.0243 Fidelity: 0.9750 Trace: 0.9750
Rep: 572 Cost: 0.0278 Fidelity: 0.9749 Trace: 0.9749
Rep: 573 Cost: 0.0226 Fidelity: 0.9751 Trace: 0.9751
Rep: 574 Cost: 0.0184 Fidelity: 0.9755 Trace: 0.9756
Rep: 575 Cost: 0.0206 Fidelity: 0.9757 Trace: 0.9757
Rep: 576 Cost: 0.0203 Fidelity: 0.9756 Trace: 0.9756
Rep: 577 Cost: 0.0186 Fidelity: 0.9754 Trace: 0.9754
Rep: 578 Cost: 0.0175 Fidelity: 0.9754 Trace: 0.9754
Rep: 579 Cost: 0.0205 Fidelity: 0.9756 Trace: 0.9756
Rep: 580 Cost: 0.0160 Fidelity: 0.9756 Trace: 0.9756
Rep: 581 Cost: 0.0213 Fidelity: 0.9755 Trace: 0.9755
Rep: 582 Cost: 0.0222 Fidelity: 0.9755 Trace: 0.9756
Rep: 583 Cost: 0.0160 Fidelity: 0.9758 Trace: 0.9758
Rep: 584 Cost: 0.0208 Fidelity: 0.9757 Trace: 0.9757
Rep: 585 Cost: 0.0199 Fidelity: 0.9756 Trace: 0.9756
Rep: 586 Cost: 0.0182 Fidelity: 0.9754 Trace: 0.9755
Rep: 587 Cost: 0.0175 Fidelity: 0.9756 Trace: 0.9756
Rep: 588 Cost: 0.0184 Fidelity: 0.9758 Trace: 0.9758
Rep: 589 Cost: 0.0171 Fidelity: 0.9758 Trace: 0.9758
Rep: 590 Cost: 0.0180 Fidelity: 0.9755 Trace: 0.9755
Rep: 591 Cost: 0.0187 Fidelity: 0.9755 Trace: 0.9755
Rep: 592 Cost: 0.0158 Fidelity: 0.9758 Trace: 0.9758
Rep: 593 Cost: 0.0170 Fidelity: 0.9759 Trace: 0.9759
Rep: 594 Cost: 0.0151 Fidelity: 0.9757 Trace: 0.9757
Rep: 595 Cost: 0.0153 Fidelity: 0.9757 Trace: 0.9757
Rep: 596 Cost: 0.0152 Fidelity: 0.9759 Trace: 0.9759
Rep: 597 Cost: 0.0151 Fidelity: 0.9759 Trace: 0.9759
Rep: 598 Cost: 0.0147 Fidelity: 0.9757 Trace: 0.9757
Rep: 599 Cost: 0.0143 Fidelity: 0.9757 Trace: 0.9757
Rep: 600 Cost: 0.0157 Fidelity: 0.9759 Trace: 0.9759
Rep: 601 Cost: 0.0155 Fidelity: 0.9759 Trace: 0.9759
Rep: 602 Cost: 0.0160 Fidelity: 0.9757 Trace: 0.9757
Rep: 603 Cost: 0.0144 Fidelity: 0.9758 Trace: 0.9758
Rep: 604 Cost: 0.0172 Fidelity: 0.9758 Trace: 0.9758
Rep: 605 Cost: 0.0154 Fidelity: 0.9758 Trace: 0.9758
Rep: 606 Cost: 0.0156 Fidelity: 0.9758 Trace: 0.9758
Rep: 607 Cost: 0.0164 Fidelity: 0.9758 Trace: 0.9758
Rep: 608 Cost: 0.0155 Fidelity: 0.9759 Trace: 0.9759
Rep: 609 Cost: 0.0165 Fidelity: 0.9760 Trace: 0.9760
Rep: 610 Cost: 0.0145 Fidelity: 0.9759 Trace: 0.9759
Rep: 611 Cost: 0.0175 Fidelity: 0.9758 Trace: 0.9758
Rep: 612 Cost: 0.0148 Fidelity: 0.9759 Trace: 0.9759
Rep: 613 Cost: 0.0198 Fidelity: 0.9762 Trace: 0.9762
Rep: 614 Cost: 0.0184 Fidelity: 0.9762 Trace: 0.9762
Rep: 615 Cost: 0.0165 Fidelity: 0.9759 Trace: 0.9759
Rep: 616 Cost: 0.0171 Fidelity: 0.9759 Trace: 0.9759
Rep: 617 Cost: 0.0149 Fidelity: 0.9761 Trace: 0.9761
Rep: 618 Cost: 0.0140 Fidelity: 0.9761 Trace: 0.9761
Rep: 619 Cost: 0.0166 Fidelity: 0.9759 Trace: 0.9759
Rep: 620 Cost: 0.0143 Fidelity: 0.9760 Trace: 0.9760
Rep: 621 Cost: 0.0153 Fidelity: 0.9760 Trace: 0.9760
Rep: 622 Cost: 0.0142 Fidelity: 0.9760 Trace: 0.9760
Rep: 623 Cost: 0.0136 Fidelity: 0.9760 Trace: 0.9760
Rep: 624 Cost: 0.0160 Fidelity: 0.9761 Trace: 0.9761
Rep: 625 Cost: 0.0160 Fidelity: 0.9761 Trace: 0.9761
Rep: 626 Cost: 0.0160 Fidelity: 0.9762 Trace: 0.9762
Rep: 627 Cost: 0.0154 Fidelity: 0.9760 Trace: 0.9760
Rep: 628 Cost: 0.0146 Fidelity: 0.9761 Trace: 0.9761
Rep: 629 Cost: 0.0161 Fidelity: 0.9763 Trace: 0.9763
Rep: 630 Cost: 0.0142 Fidelity: 0.9762 Trace: 0.9762
Rep: 631 Cost: 0.0169 Fidelity: 0.9760 Trace: 0.9761
Rep: 632 Cost: 0.0159 Fidelity: 0.9761 Trace: 0.9761
Rep: 633 Cost: 0.0169 Fidelity: 0.9763 Trace: 0.9763
Rep: 634 Cost: 0.0165 Fidelity: 0.9763 Trace: 0.9763
Rep: 635 Cost: 0.0148 Fidelity: 0.9761 Trace: 0.9761
Rep: 636 Cost: 0.0147 Fidelity: 0.9761 Trace: 0.9761
Rep: 637 Cost: 0.0143 Fidelity: 0.9762 Trace: 0.9762
Rep: 638 Cost: 0.0143 Fidelity: 0.9761 Trace: 0.9761
Rep: 639 Cost: 0.0157 Fidelity: 0.9761 Trace: 0.9761
Rep: 640 Cost: 0.0148 Fidelity: 0.9762 Trace: 0.9762
Rep: 641 Cost: 0.0163 Fidelity: 0.9763 Trace: 0.9763
Rep: 642 Cost: 0.0157 Fidelity: 0.9763 Trace: 0.9763
Rep: 643 Cost: 0.0153 Fidelity: 0.9763 Trace: 0.9763
Rep: 644 Cost: 0.0158 Fidelity: 0.9762 Trace: 0.9762
Rep: 645 Cost: 0.0133 Fidelity: 0.9763 Trace: 0.9764
Rep: 646 Cost: 0.0180 Fidelity: 0.9765 Trace: 0.9765
Rep: 647 Cost: 0.0157 Fidelity: 0.9765 Trace: 0.9765
Rep: 648 Cost: 0.0174 Fidelity: 0.9763 Trace: 0.9764
Rep: 649 Cost: 0.0163 Fidelity: 0.9764 Trace: 0.9764
Rep: 650 Cost: 0.0171 Fidelity: 0.9765 Trace: 0.9765
Rep: 651 Cost: 0.0161 Fidelity: 0.9765 Trace: 0.9765
Rep: 652 Cost: 0.0177 Fidelity: 0.9763 Trace: 0.9763
Rep: 653 Cost: 0.0172 Fidelity: 0.9763 Trace: 0.9763
Rep: 654 Cost: 0.0162 Fidelity: 0.9765 Trace: 0.9765
Rep: 655 Cost: 0.0157 Fidelity: 0.9764 Trace: 0.9764
Rep: 656 Cost: 0.0176 Fidelity: 0.9763 Trace: 0.9763
Rep: 657 Cost: 0.0172 Fidelity: 0.9763 Trace: 0.9763
Rep: 658 Cost: 0.0157 Fidelity: 0.9765 Trace: 0.9765
Rep: 659 Cost: 0.0148 Fidelity: 0.9765 Trace: 0.9765
Rep: 660 Cost: 0.0185 Fidelity: 0.9764 Trace: 0.9764
Rep: 661 Cost: 0.0176 Fidelity: 0.9764 Trace: 0.9764
Rep: 662 Cost: 0.0164 Fidelity: 0.9767 Trace: 0.9767
Rep: 663 Cost: 0.0170 Fidelity: 0.9767 Trace: 0.9767
Rep: 664 Cost: 0.0154 Fidelity: 0.9766 Trace: 0.9766
Rep: 665 Cost: 0.0171 Fidelity: 0.9765 Trace: 0.9765
Rep: 666 Cost: 0.0144 Fidelity: 0.9766 Trace: 0.9766
Rep: 667 Cost: 0.0147 Fidelity: 0.9765 Trace: 0.9765
Rep: 668 Cost: 0.0157 Fidelity: 0.9764 Trace: 0.9764
Rep: 669 Cost: 0.0143 Fidelity: 0.9765 Trace: 0.9765
Rep: 670 Cost: 0.0168 Fidelity: 0.9766 Trace: 0.9766
Rep: 671 Cost: 0.0153 Fidelity: 0.9765 Trace: 0.9765
Rep: 672 Cost: 0.0173 Fidelity: 0.9764 Trace: 0.9764
Rep: 673 Cost: 0.0155 Fidelity: 0.9765 Trace: 0.9765
Rep: 674 Cost: 0.0165 Fidelity: 0.9766 Trace: 0.9766
Rep: 675 Cost: 0.0147 Fidelity: 0.9766 Trace: 0.9766
Rep: 676 Cost: 0.0184 Fidelity: 0.9765 Trace: 0.9765
Rep: 677 Cost: 0.0172 Fidelity: 0.9766 Trace: 0.9766
Rep: 678 Cost: 0.0170 Fidelity: 0.9768 Trace: 0.9768
Rep: 679 Cost: 0.0174 Fidelity: 0.9768 Trace: 0.9768
Rep: 680 Cost: 0.0152 Fidelity: 0.9767 Trace: 0.9767
Rep: 681 Cost: 0.0150 Fidelity: 0.9767 Trace: 0.9767
Rep: 682 Cost: 0.0173 Fidelity: 0.9769 Trace: 0.9769
Rep: 683 Cost: 0.0167 Fidelity: 0.9769 Trace: 0.9769
Rep: 684 Cost: 0.0156 Fidelity: 0.9767 Trace: 0.9767
Rep: 685 Cost: 0.0157 Fidelity: 0.9767 Trace: 0.9767
Rep: 686 Cost: 0.0159 Fidelity: 0.9769 Trace: 0.9769
Rep: 687 Cost: 0.0152 Fidelity: 0.9769 Trace: 0.9769
Rep: 688 Cost: 0.0167 Fidelity: 0.9766 Trace: 0.9767
Rep: 689 Cost: 0.0153 Fidelity: 0.9767 Trace: 0.9767
Rep: 690 Cost: 0.0177 Fidelity: 0.9769 Trace: 0.9769
Rep: 691 Cost: 0.0175 Fidelity: 0.9770 Trace: 0.9770
Rep: 692 Cost: 0.0152 Fidelity: 0.9768 Trace: 0.9768
Rep: 693 Cost: 0.0160 Fidelity: 0.9767 Trace: 0.9767
Rep: 694 Cost: 0.0157 Fidelity: 0.9769 Trace: 0.9769
Rep: 695 Cost: 0.0149 Fidelity: 0.9770 Trace: 0.9770
Rep: 696 Cost: 0.0163 Fidelity: 0.9768 Trace: 0.9768
Rep: 697 Cost: 0.0142 Fidelity: 0.9769 Trace: 0.9769
Rep: 698 Cost: 0.0189 Fidelity: 0.9770 Trace: 0.9770
Rep: 699 Cost: 0.0185 Fidelity: 0.9770 Trace: 0.9770
Rep: 700 Cost: 0.0143 Fidelity: 0.9769 Trace: 0.9769
Rep: 701 Cost: 0.0146 Fidelity: 0.9769 Trace: 0.9769
Rep: 702 Cost: 0.0166 Fidelity: 0.9770 Trace: 0.9770
Rep: 703 Cost: 0.0152 Fidelity: 0.9770 Trace: 0.9770
Rep: 704 Cost: 0.0175 Fidelity: 0.9768 Trace: 0.9769
Rep: 705 Cost: 0.0168 Fidelity: 0.9769 Trace: 0.9769
Rep: 706 Cost: 0.0160 Fidelity: 0.9771 Trace: 0.9771
Rep: 707 Cost: 0.0163 Fidelity: 0.9771 Trace: 0.9771
Rep: 708 Cost: 0.0150 Fidelity: 0.9769 Trace: 0.9769
Rep: 709 Cost: 0.0140 Fidelity: 0.9770 Trace: 0.9770
Rep: 710 Cost: 0.0160 Fidelity: 0.9771 Trace: 0.9771
Rep: 711 Cost: 0.0141 Fidelity: 0.9771 Trace: 0.9771
Rep: 712 Cost: 0.0186 Fidelity: 0.9769 Trace: 0.9769
Rep: 713 Cost: 0.0177 Fidelity: 0.9770 Trace: 0.9770
Rep: 714 Cost: 0.0152 Fidelity: 0.9772 Trace: 0.9772
Rep: 715 Cost: 0.0159 Fidelity: 0.9771 Trace: 0.9772
Rep: 716 Cost: 0.0152 Fidelity: 0.9770 Trace: 0.9770
Rep: 717 Cost: 0.0145 Fidelity: 0.9771 Trace: 0.9771
Rep: 718 Cost: 0.0164 Fidelity: 0.9772 Trace: 0.9772
Rep: 719 Cost: 0.0144 Fidelity: 0.9772 Trace: 0.9772
Rep: 720 Cost: 0.0186 Fidelity: 0.9770 Trace: 0.9770
Rep: 721 Cost: 0.0187 Fidelity: 0.9770 Trace: 0.9770
Rep: 722 Cost: 0.0135 Fidelity: 0.9772 Trace: 0.9772
Rep: 723 Cost: 0.0158 Fidelity: 0.9772 Trace: 0.9772
Rep: 724 Cost: 0.0138 Fidelity: 0.9771 Trace: 0.9771
Rep: 725 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 726 Cost: 0.0139 Fidelity: 0.9772 Trace: 0.9772
Rep: 727 Cost: 0.0162 Fidelity: 0.9770 Trace: 0.9770
Rep: 728 Cost: 0.0158 Fidelity: 0.9771 Trace: 0.9771
Rep: 729 Cost: 0.0144 Fidelity: 0.9773 Trace: 0.9773
Rep: 730 Cost: 0.0153 Fidelity: 0.9773 Trace: 0.9773
Rep: 731 Cost: 0.0143 Fidelity: 0.9771 Trace: 0.9772
Rep: 732 Cost: 0.0159 Fidelity: 0.9772 Trace: 0.9772
Rep: 733 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 734 Cost: 0.0164 Fidelity: 0.9771 Trace: 0.9772
Rep: 735 Cost: 0.0170 Fidelity: 0.9771 Trace: 0.9771
Rep: 736 Cost: 0.0139 Fidelity: 0.9773 Trace: 0.9773
Rep: 737 Cost: 0.0141 Fidelity: 0.9774 Trace: 0.9774
Rep: 738 Cost: 0.0124 Fidelity: 0.9773 Trace: 0.9773
Rep: 739 Cost: 0.0139 Fidelity: 0.9772 Trace: 0.9772
Rep: 740 Cost: 0.0138 Fidelity: 0.9772 Trace: 0.9772
Rep: 741 Cost: 0.0137 Fidelity: 0.9773 Trace: 0.9773
Rep: 742 Cost: 0.0132 Fidelity: 0.9774 Trace: 0.9774
Rep: 743 Cost: 0.0148 Fidelity: 0.9772 Trace: 0.9772
Rep: 744 Cost: 0.0143 Fidelity: 0.9773 Trace: 0.9773
Rep: 745 Cost: 0.0145 Fidelity: 0.9775 Trace: 0.9775
Rep: 746 Cost: 0.0148 Fidelity: 0.9774 Trace: 0.9774
Rep: 747 Cost: 0.0164 Fidelity: 0.9773 Trace: 0.9773
Rep: 748 Cost: 0.0133 Fidelity: 0.9774 Trace: 0.9774
Rep: 749 Cost: 0.0153 Fidelity: 0.9775 Trace: 0.9775
Rep: 750 Cost: 0.0144 Fidelity: 0.9774 Trace: 0.9774
Rep: 751 Cost: 0.0148 Fidelity: 0.9773 Trace: 0.9773
Rep: 752 Cost: 0.0146 Fidelity: 0.9774 Trace: 0.9774
Rep: 753 Cost: 0.0137 Fidelity: 0.9775 Trace: 0.9775
Rep: 754 Cost: 0.0152 Fidelity: 0.9773 Trace: 0.9773
Rep: 755 Cost: 0.0152 Fidelity: 0.9773 Trace: 0.9773
Rep: 756 Cost: 0.0157 Fidelity: 0.9775 Trace: 0.9775
Rep: 757 Cost: 0.0133 Fidelity: 0.9776 Trace: 0.9776
Rep: 758 Cost: 0.0157 Fidelity: 0.9774 Trace: 0.9774
Rep: 759 Cost: 0.0139 Fidelity: 0.9775 Trace: 0.9775
Rep: 760 Cost: 0.0137 Fidelity: 0.9775 Trace: 0.9775
Rep: 761 Cost: 0.0156 Fidelity: 0.9775 Trace: 0.9775
Rep: 762 Cost: 0.0147 Fidelity: 0.9775 Trace: 0.9775
Rep: 763 Cost: 0.0128 Fidelity: 0.9776 Trace: 0.9776
Rep: 764 Cost: 0.0138 Fidelity: 0.9776 Trace: 0.9776
Rep: 765 Cost: 0.0130 Fidelity: 0.9776 Trace: 0.9776
Rep: 766 Cost: 0.0136 Fidelity: 0.9776 Trace: 0.9776
Rep: 767 Cost: 0.0147 Fidelity: 0.9775 Trace: 0.9775
Rep: 768 Cost: 0.0127 Fidelity: 0.9775 Trace: 0.9775
Rep: 769 Cost: 0.0140 Fidelity: 0.9776 Trace: 0.9776
Rep: 770 Cost: 0.0140 Fidelity: 0.9775 Trace: 0.9775
Rep: 771 Cost: 0.0135 Fidelity: 0.9776 Trace: 0.9776
Rep: 772 Cost: 0.0158 Fidelity: 0.9777 Trace: 0.9777
Rep: 773 Cost: 0.0135 Fidelity: 0.9777 Trace: 0.9777
Rep: 774 Cost: 0.0162 Fidelity: 0.9775 Trace: 0.9775
Rep: 775 Cost: 0.0142 Fidelity: 0.9776 Trace: 0.9776
Rep: 776 Cost: 0.0176 Fidelity: 0.9778 Trace: 0.9778
Rep: 777 Cost: 0.0179 Fidelity: 0.9779 Trace: 0.9779
Rep: 778 Cost: 0.0143 Fidelity: 0.9777 Trace: 0.9777
Rep: 779 Cost: 0.0164 Fidelity: 0.9775 Trace: 0.9775
Rep: 780 Cost: 0.0160 Fidelity: 0.9776 Trace: 0.9776
Rep: 781 Cost: 0.0154 Fidelity: 0.9778 Trace: 0.9778
Rep: 782 Cost: 0.0154 Fidelity: 0.9778 Trace: 0.9778
Rep: 783 Cost: 0.0173 Fidelity: 0.9777 Trace: 0.9777
Rep: 784 Cost: 0.0139 Fidelity: 0.9776 Trace: 0.9776
Rep: 785 Cost: 0.0176 Fidelity: 0.9778 Trace: 0.9778
Rep: 786 Cost: 0.0163 Fidelity: 0.9778 Trace: 0.9778
Rep: 787 Cost: 0.0158 Fidelity: 0.9777 Trace: 0.9777
Rep: 788 Cost: 0.0162 Fidelity: 0.9776 Trace: 0.9776
Rep: 789 Cost: 0.0142 Fidelity: 0.9778 Trace: 0.9778
Rep: 790 Cost: 0.0135 Fidelity: 0.9778 Trace: 0.9778
Rep: 791 Cost: 0.0170 Fidelity: 0.9777 Trace: 0.9777
Rep: 792 Cost: 0.0161 Fidelity: 0.9777 Trace: 0.9777
Rep: 793 Cost: 0.0151 Fidelity: 0.9779 Trace: 0.9779
Rep: 794 Cost: 0.0147 Fidelity: 0.9779 Trace: 0.9779
Rep: 795 Cost: 0.0161 Fidelity: 0.9778 Trace: 0.9778
Rep: 796 Cost: 0.0156 Fidelity: 0.9778 Trace: 0.9778
Rep: 797 Cost: 0.0153 Fidelity: 0.9779 Trace: 0.9780
Rep: 798 Cost: 0.0151 Fidelity: 0.9779 Trace: 0.9780
Rep: 799 Cost: 0.0153 Fidelity: 0.9778 Trace: 0.9778
Rep: 800 Cost: 0.0147 Fidelity: 0.9779 Trace: 0.9779
Rep: 801 Cost: 0.0158 Fidelity: 0.9780 Trace: 0.9780
Rep: 802 Cost: 0.0151 Fidelity: 0.9780 Trace: 0.9780
Rep: 803 Cost: 0.0156 Fidelity: 0.9779 Trace: 0.9779
Rep: 804 Cost: 0.0152 Fidelity: 0.9779 Trace: 0.9779
Rep: 805 Cost: 0.0152 Fidelity: 0.9781 Trace: 0.9781
Rep: 806 Cost: 0.0149 Fidelity: 0.9781 Trace: 0.9781
Rep: 807 Cost: 0.0152 Fidelity: 0.9780 Trace: 0.9780
Rep: 808 Cost: 0.0163 Fidelity: 0.9779 Trace: 0.9780
Rep: 809 Cost: 0.0140 Fidelity: 0.9781 Trace: 0.9781
Rep: 810 Cost: 0.0155 Fidelity: 0.9780 Trace: 0.9780
Rep: 811 Cost: 0.0134 Fidelity: 0.9779 Trace: 0.9779
Rep: 812 Cost: 0.0141 Fidelity: 0.9780 Trace: 0.9780
Rep: 813 Cost: 0.0140 Fidelity: 0.9781 Trace: 0.9781
Rep: 814 Cost: 0.0137 Fidelity: 0.9779 Trace: 0.9779
Rep: 815 Cost: 0.0132 Fidelity: 0.9779 Trace: 0.9779
Rep: 816 Cost: 0.0153 Fidelity: 0.9781 Trace: 0.9781
Rep: 817 Cost: 0.0149 Fidelity: 0.9781 Trace: 0.9782
Rep: 818 Cost: 0.0138 Fidelity: 0.9780 Trace: 0.9780
Rep: 819 Cost: 0.0154 Fidelity: 0.9780 Trace: 0.9780
...
Rep: 997 Cost: 0.0139 Fidelity: 0.9796 Trace: 0.9796
Rep: 998 Cost: 0.0129 Fidelity: 0.9795 Trace: 0.9795
Rep: 999 Cost: 0.0125 Fidelity: 0.9795 Trace: 0.9795

Fidelity before optimization:  3.8496677e-09
Fidelity after optimization:  0.9794778

Target state:  [0.+0.j 1.+0.j 0.+0.j 0.+0.j 0.+0.j 0.+0.j]
Output state:  [ 0.   +0.001j  0.99 -0.j    -0.001+0.j    -0.   +0.j    -0.   +0.j
  0.   +0.j   ]
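As a sanity check, the reported fidelity can be reproduced directly from the printed amplitudes. A minimal sketch using NumPy (the rounded amplitudes above are taken as-is, so the result only approximates the reported value):

```python
import numpy as np

# Target and (rounded, truncated) output state amplitudes printed above
target = np.array([0, 1, 0, 0, 0, 0], dtype=complex)
output = np.array([0 + 0.001j, 0.99 - 0j, -0.001 + 0j, 0, 0, 0], dtype=complex)

# Fidelity with a pure target state: |<target|output>|^2
# np.vdot conjugates its first argument, giving the inner product <target|output>
fidelity = np.abs(np.vdot(target, output)) ** 2
print(fidelity)  # ~0.98, consistent with the reported 0.9794778
```

The small discrepancy from the reported 0.9794778 comes from the four-decimal rounding of the printed amplitudes.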

For more applications of CV quantum neural networks, see the state learning and gate synthesis demonstrations.

References

[1] Nathan Killoran, Thomas R. Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, and Seth Lloyd. Continuous-variable quantum neural networks. arXiv preprint arXiv:1806.06871, 2018.

[2] Maria Schuld, Ville Bergholm, Christian Gogolin, Josh Izaac, and Nathan Killoran. Evaluating analytic gradients on quantum hardware. Physical Review A, 99(3):032331, 2019.

[3] William R. Clements, Peter C. Humphreys, Benjamin J. Metcalf, W. Steven Kolthammer, and Ian A. Walmsley. Optimal design for universal multiport interferometers. Optica, 3(12):1460–1465, 2016. doi:10.1364/OPTICA.3.001460.

Total running time of the script: (3 minutes 57.263 seconds)
