Quantum neural network

“Neural networks are not black boxes. They are a big pile of linear algebra.” - Randall Munroe, xkcd

Machine learning has a wide range of models for tasks such as classification, regression, and clustering. Neural networks are one of the most successful models, having experienced a resurgence in use over the past decade due to improvements in computational power and advanced software libraries. The typical structure of a neural network consists of a series of interacting layers that perform transformations on data passing through the network. An archetypal neural network structure is the feedforward neural network, visualized by the following example:


../_images/neural_network.svg


Here, the neural network depth is determined by the number of layers, while the maximum width is given by the layer with the greatest number of neurons. The network begins with an input layer of real-valued neurons, which feed forward onto a series of one or more hidden layers. Following the notation of [1], if the \(n\) neurons at one layer are given by the vector \(\mathbf{x} \in \mathbb{R}^{n}\), the \(m\) neurons of the next layer take the values

\[\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b}),\]

where

  • \(W \in \mathbb{R}^{m \times n}\) is a matrix,

  • \(\mathbf{b} \in \mathbb{R}^{m}\) is a vector, and

  • \(\varphi\) is a nonlinear function (also known as the activation function).

The matrix multiplication \(W \mathbf{x}\) is a linear transformation on \(\mathbf{x}\), while \(W \mathbf{x} + \mathbf{b}\) represents an affine transformation. In principle, any nonlinear function can be chosen for \(\varphi\), but often the choice is fixed from a standard set of activations that include the rectified linear unit (ReLU) and the sigmoid function acting on each neuron. Finally, the output layer enacts an affine transformation on the last hidden layer, but the activation function may be linear (including the identity), or a different nonlinear function such as softmax (for classification).
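As a point of reference, this layer transformation is a one-liner in classical code. The following is a minimal NumPy sketch; the ReLU activation and the dimensions are arbitrary illustrative choices:

import numpy as np

def dense_layer(x, W, b):
    # classical fully connected layer: varphi(W x + b), with ReLU as varphi
    return np.maximum(W @ x + b, 0.0)

# example: 3 input neurons feeding 2 output neurons
W = np.random.normal(size=(2, 3))
b = np.random.normal(size=2)
x = np.array([0.5, -1.0, 2.0])
print(dense_layer(x, W, b))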

Layers in the feedforward neural network above are called fully connected as every neuron in a given hidden layer or output layer can be connected to all neurons in the previous layer through the matrix \(W\). Over time, specialized versions of layers have been developed to focus on different problems. For example, convolutional layers have a restricted form of connectivity and are suited to machine learning with images. We focus here on fully connected layers as the most general type.

Training of neural networks uses variations of the gradient descent algorithm on a cost function characterizing the similarity between outputs of the neural network and training data. The gradient of the cost function can be calculated using automatic differentiation, with knowledge of the feedforward network structure.
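As a rough illustration of this idea, TensorFlow's automatic differentiation can compute such gradients for a toy layer; the input, target, and mean-squared-error cost below are arbitrary placeholders:

import tensorflow as tf

x = tf.constant([[0.2, -1.0, 0.5]])        # a single input vector
y = tf.constant([[1.0]])                   # a toy training target
W = tf.Variable(tf.random.normal([3, 1]))
b = tf.Variable(tf.zeros([1]))

with tf.GradientTape() as tape:
    out = tf.nn.relu(tf.matmul(x, W) + b)  # varphi(W x + b)
    loss = tf.reduce_mean((out - y) ** 2)  # cost comparing output to training data

grads = tape.gradient(loss, [W, b])        # gradients from automatic differentiation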

Quantum neural networks aim to encode neural networks into a quantum system, with the intention of benefiting from quantum information processing. There have been numerous attempts to define a quantum neural network, each with varying advantages and disadvantages. The quantum neural network detailed below, following the work of [1], has a CV architecture and is realized using standard CV gates from Strawberry Fields. One advantage of this CV architecture is that it naturally accommodates the continuous nature of neural networks. Additionally, the CV model can easily apply nonlinear transformations using the phase space picture, a task with which qubit-based models struggle, often relying on measurement postselection, which has a probability of failure.

Implementation

A CV quantum neural network layer can be defined as

\[\mathcal{L} := \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1},\]

where

  • \(\mathcal{U}_{k}=U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) is an \(N\) mode interferometer,

  • \(\mathcal{D}=\otimes_{i=1}^{N}D(\alpha_{i})\) is a single mode displacement gate (Dgate) acting on each mode with complex displacement \(\alpha_{i} \in \mathbb{C}\),

  • \(\mathcal{S}=\otimes_{i=1}^{N}S(r_{i})\) is a single mode squeezing gate (Sgate) acting on each mode with squeezing parameter \(r_{i} \in \mathbb{R}\), and

  • \(\Phi=\otimes_{i=1}^{N}\Phi(\lambda_{i})\) is a non-Gaussian gate on each mode with parameter \(\lambda_{i} \in \mathbb{R}\).

Note

Any non-Gaussian gate such as the cubic phase gate (Vgate) represents a valid choice, but we recommend the Kerr gate (Kgate) for simulations in Strawberry Fields. The Kerr gate is more accurate numerically because it is diagonal in the Fock basis.
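For a single mode, both interferometers reduce to rotation gates, so one layer can be written directly in Strawberry Fields. The gate parameters below are arbitrary placeholders; the general \(N\)-mode implementation is given in the Code section:

import strawberryfields as sf
from strawberryfields import ops

prog = sf.Program(1)

with prog.context as q:
    ops.Rgate(0.1) | q[0]        # U_1: a one-mode interferometer is a rotation
    ops.Sgate(0.05) | q[0]       # S: squeezing
    ops.Rgate(-0.2) | q[0]       # U_2
    ops.Dgate(0.1, 0.0) | q[0]   # D: displacement (magnitude and phase)
    ops.Kgate(0.01) | q[0]       # Phi: Kerr gate as the non-Gaussian element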

The layer is shown below as a circuit:


../_images/layer.svg


These layers can then be composed to form a quantum neural network. The width of the network can also be varied between layers [1].

Reproducing classical neural networks

Let’s see how the quantum layer can embed the transformation \(\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b})\) of a classical neural network layer. Suppose \(N\)-dimensional data is encoded in position eigenstates so that

\[\mathbf{x} \Leftrightarrow \ket{\mathbf{x}} := \ket{x_{1}} \otimes \ldots \otimes \ket{x_{N}}.\]

We want to perform the transformation

\[\ket{\mathbf{x}} \Rightarrow \ket{\varphi (W \mathbf{x} + \mathbf{b})}.\]

It turns out that the quantum circuit above can do precisely this! Consider first the affine transformation \(W \mathbf{x} + \mathbf{b}\). Leveraging the singular value decomposition, we can always write \(W = O_{2} \Sigma O_{1}\) with \(O_{k}\) orthogonal matrices and \(\Sigma\) a positive diagonal matrix. These orthogonal transformations can be carried out using interferometers without access to phase, i.e., with \(\boldsymbol{\phi}_{k} = 0\):

\[U_{k}(\boldsymbol{\theta}_{k},\mathbf{0})\ket{\mathbf{x}} = \ket{O_{k} \mathbf{x}}.\]

On the other hand, the diagonal matrix \(\Sigma = {\rm diag}\left(\{c_{i}\}_{i=1}^{N}\right)\) can be achieved through squeezing:

\[\otimes_{i=1}^{N}S(r_{i})\ket{\mathbf{x}} \propto \ket{\Sigma \mathbf{x}},\]

with \(r_{i} = \log (c_{i})\). Finally, the addition of a bias vector \(\mathbf{b}\) is done using position displacement gates:

\[\otimes_{i=1}^{N}D(\alpha_{i})\ket{\mathbf{x}} = \ket{\mathbf{x} + \mathbf{b}},\]

with \(\mathbf{b} = \{\alpha_{i}\}_{i=1}^{N}\) and \(\alpha_{i} \in \mathbb{R}\). Putting this all together, we see that the operation \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers and position displacement performs the transformation \(\ket{\mathbf{x}} \Rightarrow \ket{W \mathbf{x} + \mathbf{b}}\) on position eigenstates.
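This decomposition is easy to verify numerically. The sketch below uses NumPy's SVD on an arbitrary example matrix, and it ignores the normalization and \(\hbar\)-dependent scaling conventions that relate displacement amplitudes to position shifts:

import numpy as np

W = np.array([[1.2, -0.3], [0.4, 0.9]])  # example weight matrix
b = np.array([0.1, -0.2])                # example bias

O2, c, O1 = np.linalg.svd(W)             # W = O2 @ diag(c) @ O1, with O1, O2 orthogonal
r = np.log(c)                            # squeezing parameters realizing Sigma
alpha = b                                # real (position) displacements realizing the bias

# check that interferometer, squeezing, interferometer, displacement reproduces Wx + b
x = np.array([0.5, -1.0])
assert np.allclose(O2 @ (np.exp(r) * (O1 @ x)) + alpha, W @ x + b)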

Warning

The TensorFlow backend is the natural simulator for quantum neural networks in Strawberry Fields, but this backend cannot naturally accommodate position eigenstates, which require infinite squeezing. For simulation of position eigenstates in this backend, the best approach is to use a displaced squeezed state (prepare_displaced_squeezed_state) with high squeezing value r. However, to avoid significant numerical error, it is important to make sure that all initial states have negligible amplitude for Fock states \(\ket{n}\) with \(n\geq \texttt{cutoff_dim}\), where \(\texttt{cutoff_dim}\) is the cutoff dimension.

Finally, the nonlinear function \(\varphi\) can be achieved through a restricted type of non-Gaussian gates \(\otimes_{i=1}^{N}\Phi(\lambda_{i})\) acting on each mode (see [1] for more details), resulting in the transformation

\[\otimes_{i=1}^{N}\Phi(\lambda_{i})\ket{\mathbf{x}} = \ket{\varphi(\mathbf{x})}.\]

The operation \(\mathcal{L} = \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers, position displacements, and restricted non-Gaussian gates can hence be seen as enacting a classical neural network layer \(\ket{\mathbf{x}} \Rightarrow \ket{\varphi(W \mathbf{x} + \mathbf{b})}\) on position eigenstates.

Extending to quantum neural networks

In fact, CV quantum neural network layers can be made more expressive than their classical counterparts. We can do this by lifting the above restrictions on \(\mathcal{L}\), i.e.:

  • Using arbitrary interferometers \(U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) with access to phase and general displacement gates (i.e., not necessarily position displacement). This allows \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) to represent a general Gaussian operation.

  • Using arbitrary non-Gaussian gates \(\Phi(\lambda_{i})\), such as the Kerr gate.

  • Encoding data outside of the position eigenbasis, for example using instead the Fock basis.

Moreover, the gates in a single layer form a universal gate set, making the CV quantum neural network a model for universal quantum computing: a sufficient number of layers can carry out any quantum algorithm implementable on a CV quantum computer.

CV quantum neural networks can be trained both through classical simulation and directly on quantum hardware. Strawberry Fields relies on classical simulation to evaluate cost functions of the CV quantum neural network and the resultant gradients with respect to the parameters of each layer. However, this becomes an intractable task with increasing network depth and width. Ultimately, direct evaluation on hardware will likely be necessary for large-scale networks; an approach for hardware-based training is mapped out in [2]. The PennyLane library provides tools for training hybrid quantum-classical machine learning models, using both simulators and real-world quantum hardware.

Example CV quantum neural network layers for one to four modes are shown below:


../_images/layer_1mode.svg

One mode layer


../_images/layer_2mode.svg

Two mode layer


../_images/layer_3mode.svg

Three mode layer


../_images/layer_4mode.svg

Four mode layer


Here, the multimode linear interferometers \(U_{1}\) and \(U_{2}\) have been decomposed into two-mode beamsplitters (BSgate) and single-mode phase shifters (Rgate) using the Clements decomposition [3]. The Kerr gate is used as the non-Gaussian gate.

Code

First, we import Strawberry Fields, TensorFlow, and NumPy:

import numpy as np
import tensorflow as tf
import strawberryfields as sf
from strawberryfields import ops

Before we begin defining our optimization problem, let’s first create some convenient utility functions.

Utility functions

The first step to writing a CV quantum neural network layer in Strawberry Fields is to define a function for the two interferometers:

def interferometer(params, q):
    """Parameterised interferometer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``max(1, N-1) + (N-1)*N`` parameters.

            * The first ``N(N-1)/2`` parameters correspond to the beamsplitter angles
            * The second ``N(N-1)/2`` parameters correspond to the beamsplitter phases
            * The final ``N-1`` parameters correspond to local rotations on the first ``N-1`` modes

        q (list[RegRef]): list of Strawberry Fields quantum registers the interferometer
            is to be applied to
    """
    N = len(q)
    theta = params[:N*(N-1)//2]
    phi = params[N*(N-1)//2:N*(N-1)]
    rphi = params[-N+1:]

    if N == 1:
        # the interferometer is a single rotation
        ops.Rgate(rphi[0]) | q[0]
        return

    n = 0  # keep track of free parameters

    # Apply the rectangular beamsplitter array
    # The array depth is N
    for l in range(N):
        for k, (q1, q2) in enumerate(zip(q[:-1], q[1:])):
            # skip even or odd pairs depending on layer
            if (l + k) % 2 != 1:
                ops.BSgate(theta[n], phi[n]) | (q1, q2)
                n += 1

    # apply the final local phase shifts to all modes except the last one
    for i in range(max(1, N - 1)):
        ops.Rgate(rphi[i]) | q[i]

Warning

The Interferometer class in Strawberry Fields does not reproduce the functionality above. Instead, Interferometer applies a given input unitary matrix according to the Clements decomposition.
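As a quick sanity check of the parameter counting, the following sketch applies the interferometer function to a 4-mode program with randomly chosen (placeholder) angles:

import numpy as np
import strawberryfields as sf

N = 4
M = int(N * (N - 1)) + max(1, N - 1)  # 15 interferometer parameters for N = 4

prog = sf.Program(N)
with prog.context as q:
    interferometer(np.random.normal(size=M), q)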

Using the above interferometer function, an \(N\) mode CV quantum neural network layer is given by the function:

def layer(params, q):
    """CV quantum neural network layer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``2*(max(1, N-1) + N**2 + N)`` containing
            the parameters for the layer
        q (list[RegRef]): list of Strawberry Fields quantum registers the layer
            is to be applied to
    """
    N = len(q)
    M = int(N * (N - 1)) + max(1, N - 1)

    int1 = params[:M]
    s = params[M:M+N]
    int2 = params[M+N:2*M+N]
    dr = params[2*M+N:2*M+2*N]
    dp = params[2*M+2*N:2*M+3*N]
    k = params[2*M+3*N:2*M+4*N]

    # begin layer
    interferometer(int1, q)

    for i in range(N):
        ops.Sgate(s[i]) | q[i]

    interferometer(int2, q)

    for i in range(N):
        ops.Dgate(dr[i], dp[i]) | q[i]
        ops.Kgate(k[i]) | q[i]

Finally, we define one more utility function to help us initialize the TensorFlow weights for our quantum neural network layers:

def init_weights(modes, layers, active_sd=0.0001, passive_sd=0.1):
    """Initialize a 2D TensorFlow Variable containing normally-distributed
    random weights for an ``N`` mode quantum neural network with ``L`` layers.

    Args:
        modes (int): the number of modes in the quantum neural network
        layers (int): the number of layers in the quantum neural network
        active_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the active parameters
            (displacement, squeezing, and Kerr magnitude)
        passive_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the passive parameters
            (beamsplitter angles and all gate phases)

    Returns:
        tf.Variable[tf.float32]: A TensorFlow Variable of shape
        ``[layers, 2*(max(1, modes-1) + modes**2 + modes)]``, where the Lth
        row represents the layer parameters for the Lth layer.
    """
    # Number of interferometer parameters:
    M = int(modes * (modes - 1)) + max(1, modes - 1)

    # Create the TensorFlow variables
    int1_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    s_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    int2_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    dr_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    dp_weights = tf.random.normal(shape=[layers, modes], stddev=passive_sd)
    k_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)

    weights = tf.concat(
        [int1_weights, s_weights, int2_weights, dr_weights, dp_weights, k_weights], axis=1
    )

    weights = tf.Variable(weights)

    return weights
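As a quick check (purely illustrative), a single-mode network has ``2*(max(1, modes-1) + modes**2 + modes) = 6`` parameters per layer, so the weights for the 8-layer model used below have shape [8, 6]:

weights = init_weights(modes=1, layers=8)
print(weights.shape)  # (8, 6): one row of 6 parameters per layer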

Optimization

Now that we have our utility functions, let’s begin defining our optimization problem. In this particular example, let’s create a 1 mode CVQNN with 8 layers and a Fock-basis cutoff dimension of 6. We will train this QNN to output a desired target state: a single photon state.

# set the random seed
tf.random.set_seed(137)
np.random.seed(137)


# define width and depth of CV quantum neural network
modes = 1
layers = 8
cutoff_dim = 6


# defining desired state (single photon state)
target_state = np.zeros(cutoff_dim)
target_state[1] = 1
target_state = tf.constant(target_state, dtype=tf.complex64)

Now, let’s initialize an engine with the TensorFlow "tf" backend, and begin constructing our QNN program.

# initialize engine and program
eng = sf.Engine(backend="tf", backend_options={"cutoff_dim": cutoff_dim})
qnn = sf.Program(modes)

# initialize QNN weights
weights = init_weights(modes, layers) # our TensorFlow weights
num_params = np.prod(weights.shape)   # total number of parameters in our model

To construct the program, we must create and use Strawberry Fields symbolic gate arguments. These will be mapped to the TensorFlow variables on engine execution.

# Create array of Strawberry Fields symbolic gate arguments, matching
# the size of the weights Variable.
sf_params = np.arange(num_params).reshape(weights.shape).astype(str)
sf_params = np.array([qnn.params(*i) for i in sf_params])


# Construct the symbolic Strawberry Fields program by
# looping and applying layers to the program.
with qnn.context as q:
    for k in range(layers):
        layer(sf_params[k], q)

where sf_params is a real array of size [layers, 2*(max(1, modes-1) + modes**2 + modes)] containing the symbolic gate arguments for the quantum neural network.

Now that our QNN program is defined, we can create our cost function. Our cost function simply executes the QNN on our engine using the values of the input weights.

Since we want to maximize the fidelity \(f(w) = |\langle \psi_t | \psi(w)\rangle|^2\) between our QNN output state \(|\psi(w)\rangle\) and our target state \(|\psi_t\rangle\), we compute the overlap between the two statevectors, as well as the \(\ell_1\) distance \(\left\lVert \psi(w) - \psi_t\right\rVert_1\), which serves as the cost function to be minimized.

Finally, we also return the trace of the output QNN state. This should always have a value close to 1. If it deviates significantly from 1, this is an indication that we need to increase our Fock-basis cutoff.

def cost(weights):
    # Create a dictionary mapping from the names of the Strawberry Fields
    # symbolic gate parameters to the TensorFlow weight values.
    mapping = {p.name: w for p, w in zip(sf_params.flatten(), tf.reshape(weights, [-1]))}

    # run the engine
    state = eng.run(qnn, args=mapping).state
    ket = state.ket()

    difference = tf.reduce_sum(tf.abs(ket - target_state))
    fidelity = tf.abs(tf.reduce_sum(tf.math.conj(ket) * target_state)) ** 2
    return difference, fidelity, ket, tf.math.real(state.trace())

We are now ready to minimize our cost function using TensorFlow:

# set up the optimizer
opt = tf.keras.optimizers.Adam()
cost_before, fidelity_before, _, _ = cost(weights)

# Perform the optimization
for i in range(1000):
    # reset the engine if it has already been executed
    if eng.run_progs:
        eng.reset()

    with tf.GradientTape() as tape:
        loss, fid, ket, trace = cost(weights)

    # one repetition of the optimization
    gradients = tape.gradient(loss, weights)
    opt.apply_gradients(zip([gradients], [weights]))

    # Prints progress at every rep
    if i % 1 == 0:
        print("Rep: {} Cost: {:.4f} Fidelity: {:.4f} Trace: {:.4f}".format(i, loss, fid, trace))


print("\nFidelity before optimization: ", fidelity_before.numpy())
print("Fidelity after optimization: ", fid.numpy())
print("\nTarget state: ", target_state.numpy())
print("Output state: ", np.round(ket.numpy(), decimals=3))

Out:

Rep: 0 Cost: 2.0001 Fidelity: 0.0000 Trace: 1.0000
Rep: 1 Cost: 1.9978 Fidelity: 0.0001 Trace: 1.0000
Rep: 2 Cost: 1.9897 Fidelity: 0.0002 Trace: 1.0000
Rep: 3 Cost: 1.9794 Fidelity: 0.0006 Trace: 1.0000
Rep: 4 Cost: 1.9681 Fidelity: 0.0010 Trace: 1.0000
Rep: 5 Cost: 1.9632 Fidelity: 0.0016 Trace: 1.0000
Rep: 6 Cost: 1.9563 Fidelity: 0.0023 Trace: 1.0000
Rep: 7 Cost: 1.9476 Fidelity: 0.0031 Trace: 1.0000
Rep: 8 Cost: 1.9377 Fidelity: 0.0041 Trace: 1.0000
Rep: 9 Cost: 1.9268 Fidelity: 0.0052 Trace: 1.0000
Rep: 10 Cost: 1.9196 Fidelity: 0.0064 Trace: 1.0000
Rep: 11 Cost: 1.9130 Fidelity: 0.0077 Trace: 1.0000
Rep: 12 Cost: 1.9055 Fidelity: 0.0091 Trace: 1.0000
Rep: 13 Cost: 1.8971 Fidelity: 0.0107 Trace: 1.0000
Rep: 14 Cost: 1.8880 Fidelity: 0.0124 Trace: 1.0000
Rep: 15 Cost: 1.8789 Fidelity: 0.0142 Trace: 1.0000
Rep: 16 Cost: 1.8695 Fidelity: 0.0162 Trace: 1.0000
Rep: 17 Cost: 1.8601 Fidelity: 0.0183 Trace: 1.0000
Rep: 18 Cost: 1.8505 Fidelity: 0.0205 Trace: 1.0000
Rep: 19 Cost: 1.8410 Fidelity: 0.0229 Trace: 1.0000
Rep: 20 Cost: 1.8327 Fidelity: 0.0254 Trace: 1.0000
Rep: 21 Cost: 1.8241 Fidelity: 0.0280 Trace: 1.0000
Rep: 22 Cost: 1.8145 Fidelity: 0.0308 Trace: 1.0000
Rep: 23 Cost: 1.8060 Fidelity: 0.0337 Trace: 1.0000
Rep: 24 Cost: 1.7979 Fidelity: 0.0367 Trace: 1.0000
Rep: 25 Cost: 1.7897 Fidelity: 0.0398 Trace: 1.0000
Rep: 26 Cost: 1.7815 Fidelity: 0.0431 Trace: 1.0000
Rep: 27 Cost: 1.7732 Fidelity: 0.0464 Trace: 1.0000
Rep: 28 Cost: 1.7649 Fidelity: 0.0498 Trace: 1.0000
Rep: 29 Cost: 1.7566 Fidelity: 0.0533 Trace: 1.0000
Rep: 30 Cost: 1.7484 Fidelity: 0.0569 Trace: 1.0000
Rep: 31 Cost: 1.7403 Fidelity: 0.0606 Trace: 1.0000
Rep: 32 Cost: 1.7322 Fidelity: 0.0644 Trace: 1.0000
Rep: 33 Cost: 1.7242 Fidelity: 0.0683 Trace: 1.0000
Rep: 34 Cost: 1.7164 Fidelity: 0.0723 Trace: 1.0000
Rep: 35 Cost: 1.7087 Fidelity: 0.0763 Trace: 1.0000
Rep: 36 Cost: 1.7012 Fidelity: 0.0804 Trace: 1.0000
Rep: 37 Cost: 1.6938 Fidelity: 0.0846 Trace: 1.0000
Rep: 38 Cost: 1.6866 Fidelity: 0.0888 Trace: 1.0000
Rep: 39 Cost: 1.6795 Fidelity: 0.0931 Trace: 1.0000
Rep: 40 Cost: 1.6726 Fidelity: 0.0975 Trace: 1.0000
Rep: 41 Cost: 1.6659 Fidelity: 0.1019 Trace: 1.0000
Rep: 42 Cost: 1.6593 Fidelity: 0.1063 Trace: 1.0000
Rep: 43 Cost: 1.6529 Fidelity: 0.1108 Trace: 1.0000
Rep: 44 Cost: 1.6467 Fidelity: 0.1154 Trace: 1.0000
Rep: 45 Cost: 1.6405 Fidelity: 0.1199 Trace: 1.0000
Rep: 46 Cost: 1.6346 Fidelity: 0.1245 Trace: 1.0000
Rep: 47 Cost: 1.6287 Fidelity: 0.1291 Trace: 1.0000
Rep: 48 Cost: 1.6230 Fidelity: 0.1337 Trace: 1.0000
Rep: 49 Cost: 1.6173 Fidelity: 0.1384 Trace: 1.0000
Rep: 50 Cost: 1.6117 Fidelity: 0.1430 Trace: 1.0000
Rep: 51 Cost: 1.6062 Fidelity: 0.1476 Trace: 1.0000
Rep: 52 Cost: 1.6007 Fidelity: 0.1523 Trace: 1.0000
Rep: 53 Cost: 1.5952 Fidelity: 0.1569 Trace: 1.0000
Rep: 54 Cost: 1.5897 Fidelity: 0.1616 Trace: 1.0000
Rep: 55 Cost: 1.5842 Fidelity: 0.1662 Trace: 1.0000
Rep: 56 Cost: 1.5786 Fidelity: 0.1708 Trace: 1.0000
Rep: 57 Cost: 1.5731 Fidelity: 0.1754 Trace: 1.0000
Rep: 58 Cost: 1.5674 Fidelity: 0.1800 Trace: 1.0000
Rep: 59 Cost: 1.5617 Fidelity: 0.1846 Trace: 1.0000
Rep: 60 Cost: 1.5560 Fidelity: 0.1892 Trace: 1.0000
Rep: 61 Cost: 1.5502 Fidelity: 0.1938 Trace: 1.0000
Rep: 62 Cost: 1.5445 Fidelity: 0.1984 Trace: 1.0000
Rep: 63 Cost: 1.5389 Fidelity: 0.2030 Trace: 1.0000
Rep: 64 Cost: 1.5333 Fidelity: 0.2076 Trace: 1.0000
Rep: 65 Cost: 1.5276 Fidelity: 0.2122 Trace: 1.0000
Rep: 66 Cost: 1.5219 Fidelity: 0.2168 Trace: 1.0000
Rep: 67 Cost: 1.5161 Fidelity: 0.2215 Trace: 1.0000
Rep: 68 Cost: 1.5101 Fidelity: 0.2261 Trace: 1.0000
Rep: 69 Cost: 1.5040 Fidelity: 0.2307 Trace: 1.0000
Rep: 70 Cost: 1.4977 Fidelity: 0.2354 Trace: 1.0000
Rep: 71 Cost: 1.4912 Fidelity: 0.2400 Trace: 1.0000
Rep: 72 Cost: 1.4845 Fidelity: 0.2446 Trace: 1.0000
Rep: 73 Cost: 1.4775 Fidelity: 0.2492 Trace: 1.0000
Rep: 74 Cost: 1.4703 Fidelity: 0.2538 Trace: 1.0000
Rep: 75 Cost: 1.4629 Fidelity: 0.2583 Trace: 1.0000
Rep: 76 Cost: 1.4553 Fidelity: 0.2630 Trace: 1.0000
Rep: 77 Cost: 1.4474 Fidelity: 0.2676 Trace: 1.0000
Rep: 78 Cost: 1.4392 Fidelity: 0.2724 Trace: 1.0000
Rep: 79 Cost: 1.4308 Fidelity: 0.2772 Trace: 1.0000
Rep: 80 Cost: 1.4222 Fidelity: 0.2822 Trace: 1.0000
Rep: 81 Cost: 1.4132 Fidelity: 0.2873 Trace: 1.0000
Rep: 82 Cost: 1.4040 Fidelity: 0.2926 Trace: 1.0000
Rep: 83 Cost: 1.3945 Fidelity: 0.2980 Trace: 1.0000
Rep: 84 Cost: 1.3848 Fidelity: 0.3036 Trace: 1.0000
Rep: 85 Cost: 1.3748 Fidelity: 0.3094 Trace: 1.0000
Rep: 86 Cost: 1.3646 Fidelity: 0.3153 Trace: 1.0000
Rep: 87 Cost: 1.3543 Fidelity: 0.3214 Trace: 1.0000
Rep: 88 Cost: 1.3438 Fidelity: 0.3276 Trace: 1.0000
Rep: 89 Cost: 1.3334 Fidelity: 0.3340 Trace: 1.0000
Rep: 90 Cost: 1.3231 Fidelity: 0.3406 Trace: 1.0000
Rep: 91 Cost: 1.3129 Fidelity: 0.3473 Trace: 1.0000
Rep: 92 Cost: 1.3028 Fidelity: 0.3543 Trace: 1.0000
Rep: 93 Cost: 1.2925 Fidelity: 0.3614 Trace: 1.0000
Rep: 94 Cost: 1.2821 Fidelity: 0.3686 Trace: 1.0000
Rep: 95 Cost: 1.2715 Fidelity: 0.3759 Trace: 1.0000
Rep: 96 Cost: 1.2606 Fidelity: 0.3832 Trace: 1.0000
Rep: 97 Cost: 1.2493 Fidelity: 0.3905 Trace: 1.0000
Rep: 98 Cost: 1.2376 Fidelity: 0.3978 Trace: 1.0000
Rep: 99 Cost: 1.2257 Fidelity: 0.4051 Trace: 1.0000
Rep: 100 Cost: 1.2152 Fidelity: 0.4123 Trace: 1.0000
Rep: 101 Cost: 1.2057 Fidelity: 0.4197 Trace: 1.0000
Rep: 102 Cost: 1.1951 Fidelity: 0.4272 Trace: 1.0000
Rep: 103 Cost: 1.1841 Fidelity: 0.4345 Trace: 1.0000
Rep: 104 Cost: 1.1739 Fidelity: 0.4417 Trace: 1.0000
Rep: 105 Cost: 1.1641 Fidelity: 0.4487 Trace: 1.0000
Rep: 106 Cost: 1.1538 Fidelity: 0.4554 Trace: 1.0000
Rep: 107 Cost: 1.1427 Fidelity: 0.4620 Trace: 1.0000
Rep: 108 Cost: 1.1325 Fidelity: 0.4685 Trace: 1.0000
Rep: 109 Cost: 1.1229 Fidelity: 0.4749 Trace: 1.0000
Rep: 110 Cost: 1.1116 Fidelity: 0.4812 Trace: 1.0000
Rep: 111 Cost: 1.1032 Fidelity: 0.4875 Trace: 0.9999
Rep: 112 Cost: 1.0936 Fidelity: 0.4937 Trace: 0.9999
Rep: 113 Cost: 1.0821 Fidelity: 0.4998 Trace: 0.9999
Rep: 114 Cost: 1.0717 Fidelity: 0.5058 Trace: 0.9999
Rep: 115 Cost: 1.0628 Fidelity: 0.5117 Trace: 0.9999
Rep: 116 Cost: 1.0528 Fidelity: 0.5175 Trace: 0.9999
Rep: 117 Cost: 1.0420 Fidelity: 0.5233 Trace: 0.9999
Rep: 118 Cost: 1.0329 Fidelity: 0.5289 Trace: 0.9999
Rep: 119 Cost: 1.0234 Fidelity: 0.5345 Trace: 0.9999
Rep: 120 Cost: 1.0138 Fidelity: 0.5402 Trace: 0.9999
Rep: 121 Cost: 1.0055 Fidelity: 0.5458 Trace: 0.9999
Rep: 122 Cost: 0.9962 Fidelity: 0.5514 Trace: 0.9999
Rep: 123 Cost: 0.9864 Fidelity: 0.5570 Trace: 0.9998
Rep: 124 Cost: 0.9781 Fidelity: 0.5626 Trace: 0.9998
Rep: 125 Cost: 0.9695 Fidelity: 0.5682 Trace: 0.9998
Rep: 126 Cost: 0.9607 Fidelity: 0.5736 Trace: 0.9998
Rep: 127 Cost: 0.9518 Fidelity: 0.5790 Trace: 0.9998
Rep: 128 Cost: 0.9445 Fidelity: 0.5844 Trace: 0.9998
Rep: 129 Cost: 0.9367 Fidelity: 0.5898 Trace: 0.9998
Rep: 130 Cost: 0.9276 Fidelity: 0.5952 Trace: 0.9997
Rep: 131 Cost: 0.9177 Fidelity: 0.6005 Trace: 0.9997
Rep: 132 Cost: 0.9120 Fidelity: 0.6058 Trace: 0.9997
Rep: 133 Cost: 0.9034 Fidelity: 0.6111 Trace: 0.9997
Rep: 134 Cost: 0.8945 Fidelity: 0.6163 Trace: 0.9996
Rep: 135 Cost: 0.8868 Fidelity: 0.6214 Trace: 0.9996
Rep: 136 Cost: 0.8785 Fidelity: 0.6265 Trace: 0.9996
Rep: 137 Cost: 0.8690 Fidelity: 0.6314 Trace: 0.9996
Rep: 138 Cost: 0.8621 Fidelity: 0.6364 Trace: 0.9995
Rep: 139 Cost: 0.8545 Fidelity: 0.6413 Trace: 0.9995
Rep: 140 Cost: 0.8445 Fidelity: 0.6463 Trace: 0.9995
Rep: 141 Cost: 0.8374 Fidelity: 0.6513 Trace: 0.9995
Rep: 142 Cost: 0.8296 Fidelity: 0.6563 Trace: 0.9994
Rep: 143 Cost: 0.8215 Fidelity: 0.6611 Trace: 0.9994
Rep: 144 Cost: 0.8139 Fidelity: 0.6658 Trace: 0.9994
Rep: 145 Cost: 0.8045 Fidelity: 0.6705 Trace: 0.9993
Rep: 146 Cost: 0.7999 Fidelity: 0.6752 Trace: 0.9993
Rep: 147 Cost: 0.7935 Fidelity: 0.6799 Trace: 0.9993
Rep: 148 Cost: 0.7846 Fidelity: 0.6845 Trace: 0.9992
Rep: 149 Cost: 0.7760 Fidelity: 0.6891 Trace: 0.9992
Rep: 150 Cost: 0.7691 Fidelity: 0.6937 Trace: 0.9991
Rep: 151 Cost: 0.7605 Fidelity: 0.6984 Trace: 0.9991
Rep: 152 Cost: 0.7540 Fidelity: 0.7029 Trace: 0.9990
Rep: 153 Cost: 0.7468 Fidelity: 0.7074 Trace: 0.9990
Rep: 154 Cost: 0.7374 Fidelity: 0.7117 Trace: 0.9989
Rep: 155 Cost: 0.7331 Fidelity: 0.7159 Trace: 0.9989
Rep: 156 Cost: 0.7267 Fidelity: 0.7200 Trace: 0.9988
Rep: 157 Cost: 0.7173 Fidelity: 0.7243 Trace: 0.9988
Rep: 158 Cost: 0.7104 Fidelity: 0.7285 Trace: 0.9987
Rep: 159 Cost: 0.7039 Fidelity: 0.7326 Trace: 0.9987
Rep: 160 Cost: 0.6953 Fidelity: 0.7366 Trace: 0.9986
Rep: 161 Cost: 0.6890 Fidelity: 0.7403 Trace: 0.9985
Rep: 162 Cost: 0.6827 Fidelity: 0.7441 Trace: 0.9984
Rep: 163 Cost: 0.6732 Fidelity: 0.7480 Trace: 0.9984
Rep: 164 Cost: 0.6688 Fidelity: 0.7519 Trace: 0.9983
Rep: 165 Cost: 0.6632 Fidelity: 0.7557 Trace: 0.9983
Rep: 166 Cost: 0.6546 Fidelity: 0.7593 Trace: 0.9982
Rep: 167 Cost: 0.6458 Fidelity: 0.7627 Trace: 0.9981
Rep: 168 Cost: 0.6402 Fidelity: 0.7661 Trace: 0.9980
Rep: 169 Cost: 0.6324 Fidelity: 0.7697 Trace: 0.9979
Rep: 170 Cost: 0.6264 Fidelity: 0.7733 Trace: 0.9979
Rep: 171 Cost: 0.6196 Fidelity: 0.7767 Trace: 0.9978
Rep: 172 Cost: 0.6113 Fidelity: 0.7799 Trace: 0.9977
Rep: 173 Cost: 0.6057 Fidelity: 0.7830 Trace: 0.9976
Rep: 174 Cost: 0.5977 Fidelity: 0.7863 Trace: 0.9975
Rep: 175 Cost: 0.5927 Fidelity: 0.7896 Trace: 0.9974
Rep: 176 Cost: 0.5852 Fidelity: 0.7926 Trace: 0.9973
Rep: 177 Cost: 0.5806 Fidelity: 0.7954 Trace: 0.9972
Rep: 178 Cost: 0.5741 Fidelity: 0.7984 Trace: 0.9971
Rep: 179 Cost: 0.5667 Fidelity: 0.8016 Trace: 0.9970
Rep: 180 Cost: 0.5610 Fidelity: 0.8046 Trace: 0.9969
Rep: 181 Cost: 0.5544 Fidelity: 0.8073 Trace: 0.9968
Rep: 182 Cost: 0.5488 Fidelity: 0.8100 Trace: 0.9966
Rep: 183 Cost: 0.5430 Fidelity: 0.8129 Trace: 0.9965
Rep: 184 Cost: 0.5380 Fidelity: 0.8157 Trace: 0.9964
Rep: 185 Cost: 0.5319 Fidelity: 0.8182 Trace: 0.9963
Rep: 186 Cost: 0.5278 Fidelity: 0.8208 Trace: 0.9961
Rep: 187 Cost: 0.5222 Fidelity: 0.8233 Trace: 0.9960
Rep: 188 Cost: 0.5171 Fidelity: 0.8255 Trace: 0.9958
Rep: 189 Cost: 0.5125 Fidelity: 0.8276 Trace: 0.9957
Rep: 190 Cost: 0.5071 Fidelity: 0.8299 Trace: 0.9955
Rep: 191 Cost: 0.5012 Fidelity: 0.8320 Trace: 0.9954
Rep: 192 Cost: 0.4982 Fidelity: 0.8341 Trace: 0.9952
Rep: 193 Cost: 0.4925 Fidelity: 0.8359 Trace: 0.9951
Rep: 194 Cost: 0.4873 Fidelity: 0.8375 Trace: 0.9949
Rep: 195 Cost: 0.4833 Fidelity: 0.8393 Trace: 0.9947
Rep: 196 Cost: 0.4782 Fidelity: 0.8413 Trace: 0.9946
Rep: 197 Cost: 0.4743 Fidelity: 0.8433 Trace: 0.9944
Rep: 198 Cost: 0.4697 Fidelity: 0.8449 Trace: 0.9943
Rep: 199 Cost: 0.4643 Fidelity: 0.8474 Trace: 0.9941
Rep: 200 Cost: 0.4621 Fidelity: 0.8495 Trace: 0.9939
Rep: 201 Cost: 0.4585 Fidelity: 0.8522 Trace: 0.9938
Rep: 202 Cost: 0.4534 Fidelity: 0.8550 Trace: 0.9937
Rep: 203 Cost: 0.4496 Fidelity: 0.8572 Trace: 0.9935
Rep: 204 Cost: 0.4456 Fidelity: 0.8595 Trace: 0.9933
Rep: 205 Cost: 0.4418 Fidelity: 0.8612 Trace: 0.9931
Rep: 206 Cost: 0.4391 Fidelity: 0.8629 Trace: 0.9930
Rep: 207 Cost: 0.4339 Fidelity: 0.8649 Trace: 0.9928
Rep: 208 Cost: 0.4310 Fidelity: 0.8667 Trace: 0.9927
Rep: 209 Cost: 0.4274 Fidelity: 0.8685 Trace: 0.9925
Rep: 210 Cost: 0.4229 Fidelity: 0.8699 Trace: 0.9923
Rep: 211 Cost: 0.4201 Fidelity: 0.8712 Trace: 0.9921
Rep: 212 Cost: 0.4161 Fidelity: 0.8731 Trace: 0.9920
Rep: 213 Cost: 0.4129 Fidelity: 0.8745 Trace: 0.9918
Rep: 214 Cost: 0.4096 Fidelity: 0.8755 Trace: 0.9915
Rep: 215 Cost: 0.4067 Fidelity: 0.8772 Trace: 0.9914
Rep: 216 Cost: 0.4022 Fidelity: 0.8786 Trace: 0.9912
Rep: 217 Cost: 0.4024 Fidelity: 0.8793 Trace: 0.9910
Rep: 218 Cost: 0.3963 Fidelity: 0.8809 Trace: 0.9908
Rep: 219 Cost: 0.3970 Fidelity: 0.8827 Trace: 0.9907
Rep: 220 Cost: 0.3930 Fidelity: 0.8840 Trace: 0.9905
Rep: 221 Cost: 0.3882 Fidelity: 0.8848 Trace: 0.9902
Rep: 222 Cost: 0.3851 Fidelity: 0.8863 Trace: 0.9901
Rep: 223 Cost: 0.3837 Fidelity: 0.8885 Trace: 0.9900
Rep: 224 Cost: 0.3817 Fidelity: 0.8897 Trace: 0.9898
Rep: 225 Cost: 0.3754 Fidelity: 0.8903 Trace: 0.9895
Rep: 226 Cost: 0.3776 Fidelity: 0.8908 Trace: 0.9892
Rep: 227 Cost: 0.3729 Fidelity: 0.8924 Trace: 0.9891
Rep: 228 Cost: 0.3682 Fidelity: 0.8945 Trace: 0.9891
Rep: 229 Cost: 0.3672 Fidelity: 0.8959 Trace: 0.9889
Rep: 230 Cost: 0.3618 Fidelity: 0.8967 Trace: 0.9887
Rep: 231 Cost: 0.3608 Fidelity: 0.8976 Trace: 0.9884
Rep: 232 Cost: 0.3573 Fidelity: 0.8992 Trace: 0.9883
Rep: 233 Cost: 0.3553 Fidelity: 0.9010 Trace: 0.9882
Rep: 234 Cost: 0.3532 Fidelity: 0.9019 Trace: 0.9880
Rep: 235 Cost: 0.3491 Fidelity: 0.9023 Trace: 0.9877
Rep: 236 Cost: 0.3471 Fidelity: 0.9028 Trace: 0.9875
Rep: 237 Cost: 0.3437 Fidelity: 0.9042 Trace: 0.9874
Rep: 238 Cost: 0.3419 Fidelity: 0.9057 Trace: 0.9873
Rep: 239 Cost: 0.3389 Fidelity: 0.9063 Trace: 0.9871
Rep: 240 Cost: 0.3360 Fidelity: 0.9063 Trace: 0.9868
Rep: 241 Cost: 0.3337 Fidelity: 0.9072 Trace: 0.9866
Rep: 242 Cost: 0.3311 Fidelity: 0.9089 Trace: 0.9866
Rep: 243 Cost: 0.3292 Fidelity: 0.9098 Trace: 0.9864
Rep: 244 Cost: 0.3247 Fidelity: 0.9099 Trace: 0.9861
Rep: 245 Cost: 0.3264 Fidelity: 0.9101 Trace: 0.9858
Rep: 246 Cost: 0.3199 Fidelity: 0.9116 Trace: 0.9857
Rep: 247 Cost: 0.3208 Fidelity: 0.9135 Trace: 0.9857
Rep: 248 Cost: 0.3189 Fidelity: 0.9147 Trace: 0.9856
Rep: 249 Cost: 0.3134 Fidelity: 0.9152 Trace: 0.9853
Rep: 250 Cost: 0.3147 Fidelity: 0.9151 Trace: 0.9849
Rep: 251 Cost: 0.3108 Fidelity: 0.9162 Trace: 0.9848
Rep: 252 Cost: 0.3063 Fidelity: 0.9180 Trace: 0.9848
Rep: 253 Cost: 0.3057 Fidelity: 0.9191 Trace: 0.9847
Rep: 254 Cost: 0.3008 Fidelity: 0.9196 Trace: 0.9845
Rep: 255 Cost: 0.3006 Fidelity: 0.9198 Trace: 0.9841
Rep: 256 Cost: 0.2975 Fidelity: 0.9209 Trace: 0.9840
Rep: 257 Cost: 0.2950 Fidelity: 0.9226 Trace: 0.9840
Rep: 258 Cost: 0.2933 Fidelity: 0.9236 Trace: 0.9839
Rep: 259 Cost: 0.2886 Fidelity: 0.9238 Trace: 0.9836
Rep: 260 Cost: 0.2888 Fidelity: 0.9236 Trace: 0.9833
Rep: 261 Cost: 0.2861 Fidelity: 0.9243 Trace: 0.9832
Rep: 262 Cost: 0.2823 Fidelity: 0.9259 Trace: 0.9832
Rep: 263 Cost: 0.2807 Fidelity: 0.9269 Trace: 0.9831
Rep: 264 Cost: 0.2766 Fidelity: 0.9271 Trace: 0.9828
Rep: 265 Cost: 0.2759 Fidelity: 0.9270 Trace: 0.9825
Rep: 266 Cost: 0.2714 Fidelity: 0.9283 Trace: 0.9825
Rep: 267 Cost: 0.2712 Fidelity: 0.9297 Trace: 0.9824
Rep: 268 Cost: 0.2692 Fidelity: 0.9305 Trace: 0.9823
Rep: 269 Cost: 0.2647 Fidelity: 0.9308 Trace: 0.9820
Rep: 270 Cost: 0.2648 Fidelity: 0.9312 Trace: 0.9817
Rep: 271 Cost: 0.2611 Fidelity: 0.9323 Trace: 0.9816
Rep: 272 Cost: 0.2585 Fidelity: 0.9335 Trace: 0.9816
Rep: 273 Cost: 0.2568 Fidelity: 0.9344 Trace: 0.9815
Rep: 274 Cost: 0.2519 Fidelity: 0.9351 Trace: 0.9813
Rep: 275 Cost: 0.2542 Fidelity: 0.9353 Trace: 0.9811
Rep: 276 Cost: 0.2498 Fidelity: 0.9364 Trace: 0.9810
Rep: 277 Cost: 0.2463 Fidelity: 0.9379 Trace: 0.9810
Rep: 278 Cost: 0.2458 Fidelity: 0.9388 Trace: 0.9809
Rep: 279 Cost: 0.2412 Fidelity: 0.9394 Trace: 0.9807
Rep: 280 Cost: 0.2393 Fidelity: 0.9396 Trace: 0.9804
Rep: 281 Cost: 0.2390 Fidelity: 0.9402 Trace: 0.9802
Rep: 282 Cost: 0.2341 Fidelity: 0.9413 Trace: 0.9802
Rep: 283 Cost: 0.2297 Fidelity: 0.9423 Trace: 0.9802
Rep: 284 Cost: 0.2294 Fidelity: 0.9428 Trace: 0.9800
Rep: 285 Cost: 0.2250 Fidelity: 0.9433 Trace: 0.9798
Rep: 286 Cost: 0.2230 Fidelity: 0.9440 Trace: 0.9797
Rep: 287 Cost: 0.2209 Fidelity: 0.9446 Trace: 0.9796
Rep: 288 Cost: 0.2162 Fidelity: 0.9455 Trace: 0.9795
Rep: 289 Cost: 0.2146 Fidelity: 0.9460 Trace: 0.9793
Rep: 290 Cost: 0.2103 Fidelity: 0.9467 Trace: 0.9791
Rep: 291 Cost: 0.2092 Fidelity: 0.9475 Trace: 0.9791
Rep: 292 Cost: 0.2041 Fidelity: 0.9482 Trace: 0.9789
Rep: 293 Cost: 0.2046 Fidelity: 0.9485 Trace: 0.9787
Rep: 294 Cost: 0.1996 Fidelity: 0.9494 Trace: 0.9787
Rep: 295 Cost: 0.1979 Fidelity: 0.9508 Trace: 0.9787
Rep: 296 Cost: 0.1945 Fidelity: 0.9516 Trace: 0.9786
Rep: 297 Cost: 0.1907 Fidelity: 0.9519 Trace: 0.9783
Rep: 298 Cost: 0.1875 Fidelity: 0.9527 Trace: 0.9782
Rep: 299 Cost: 0.1860 Fidelity: 0.9538 Trace: 0.9782
Rep: 300 Cost: 0.1829 Fidelity: 0.9545 Trace: 0.9781
Rep: 301 Cost: 0.1778 Fidelity: 0.9547 Trace: 0.9779
Rep: 302 Cost: 0.1755 Fidelity: 0.9553 Trace: 0.9777
Rep: 303 Cost: 0.1719 Fidelity: 0.9562 Trace: 0.9777
Rep: 304 Cost: 0.1684 Fidelity: 0.9570 Trace: 0.9777
Rep: 305 Cost: 0.1652 Fidelity: 0.9573 Trace: 0.9774
Rep: 306 Cost: 0.1620 Fidelity: 0.9583 Trace: 0.9775
Rep: 307 Cost: 0.1581 Fidelity: 0.9589 Trace: 0.9773
Rep: 308 Cost: 0.1561 Fidelity: 0.9594 Trace: 0.9771
Rep: 309 Cost: 0.1519 Fidelity: 0.9603 Trace: 0.9770
Rep: 310 Cost: 0.1513 Fidelity: 0.9614 Trace: 0.9771
Rep: 311 Cost: 0.1470 Fidelity: 0.9620 Trace: 0.9770
Rep: 312 Cost: 0.1429 Fidelity: 0.9622 Trace: 0.9768
Rep: 313 Cost: 0.1389 Fidelity: 0.9629 Trace: 0.9767
Rep: 314 Cost: 0.1351 Fidelity: 0.9636 Trace: 0.9766
Rep: 315 Cost: 0.1326 Fidelity: 0.9641 Trace: 0.9765
Rep: 316 Cost: 0.1286 Fidelity: 0.9646 Trace: 0.9764
Rep: 317 Cost: 0.1268 Fidelity: 0.9655 Trace: 0.9764
Rep: 318 Cost: 0.1218 Fidelity: 0.9661 Trace: 0.9764
Rep: 319 Cost: 0.1198 Fidelity: 0.9663 Trace: 0.9761
Rep: 320 Cost: 0.1146 Fidelity: 0.9670 Trace: 0.9760
Rep: 321 Cost: 0.1111 Fidelity: 0.9678 Trace: 0.9761
Rep: 322 Cost: 0.1073 Fidelity: 0.9682 Trace: 0.9759
Rep: 323 Cost: 0.1044 Fidelity: 0.9684 Trace: 0.9757
Rep: 324 Cost: 0.1003 Fidelity: 0.9689 Trace: 0.9756
Rep: 325 Cost: 0.0971 Fidelity: 0.9695 Trace: 0.9757
Rep: 326 Cost: 0.0928 Fidelity: 0.9699 Trace: 0.9756
Rep: 327 Cost: 0.0906 Fidelity: 0.9700 Trace: 0.9753
Rep: 328 Cost: 0.0852 Fidelity: 0.9706 Trace: 0.9752
Rep: 329 Cost: 0.0829 Fidelity: 0.9710 Trace: 0.9752
Rep: 330 Cost: 0.0782 Fidelity: 0.9712 Trace: 0.9750
Rep: 331 Cost: 0.0750 Fidelity: 0.9715 Trace: 0.9749
Rep: 332 Cost: 0.0719 Fidelity: 0.9719 Trace: 0.9749
Rep: 333 Cost: 0.0674 Fidelity: 0.9721 Trace: 0.9747
Rep: 334 Cost: 0.0663 Fidelity: 0.9721 Trace: 0.9745
Rep: 335 Cost: 0.0605 Fidelity: 0.9724 Trace: 0.9744
Rep: 336 Cost: 0.0593 Fidelity: 0.9727 Trace: 0.9743
Rep: 337 Cost: 0.0535 Fidelity: 0.9728 Trace: 0.9742
Rep: 338 Cost: 0.0522 Fidelity: 0.9728 Trace: 0.9739
Rep: 339 Cost: 0.0480 Fidelity: 0.9728 Trace: 0.9737
Rep: 340 Cost: 0.0443 Fidelity: 0.9729 Trace: 0.9735
Rep: 341 Cost: 0.0398 Fidelity: 0.9730 Trace: 0.9735
Rep: 342 Cost: 0.0380 Fidelity: 0.9730 Trace: 0.9734
Rep: 343 Cost: 0.0327 Fidelity: 0.9728 Trace: 0.9730
Rep: 344 Cost: 0.0307 Fidelity: 0.9724 Trace: 0.9726
Rep: 345 Cost: 0.0267 Fidelity: 0.9723 Trace: 0.9724
Rep: 346 Cost: 0.0229 Fidelity: 0.9723 Trace: 0.9723
Rep: 347 Cost: 0.0203 Fidelity: 0.9721 Trace: 0.9721
Rep: 348 Cost: 0.0248 Fidelity: 0.9715 Trace: 0.9715
Rep: 349 Cost: 0.0236 Fidelity: 0.9713 Trace: 0.9714
Rep: 350 Cost: 0.0254 Fidelity: 0.9715 Trace: 0.9715
Rep: 351 Cost: 0.0261 Fidelity: 0.9714 Trace: 0.9715
Rep: 352 Cost: 0.0290 Fidelity: 0.9713 Trace: 0.9714
Rep: 353 Cost: 0.0268 Fidelity: 0.9712 Trace: 0.9713
Rep: 354 Cost: 0.0284 Fidelity: 0.9712 Trace: 0.9713
Rep: 355 Cost: 0.0280 Fidelity: 0.9712 Trace: 0.9713
Rep: 356 Cost: 0.0251 Fidelity: 0.9713 Trace: 0.9713
Rep: 357 Cost: 0.0242 Fidelity: 0.9715 Trace: 0.9715
Rep: 358 Cost: 0.0246 Fidelity: 0.9718 Trace: 0.9718
Rep: 359 Cost: 0.0216 Fidelity: 0.9719 Trace: 0.9720
Rep: 360 Cost: 0.0222 Fidelity: 0.9718 Trace: 0.9718
Rep: 361 Cost: 0.0177 Fidelity: 0.9721 Trace: 0.9721
Rep: 362 Cost: 0.0191 Fidelity: 0.9723 Trace: 0.9723
Rep: 363 Cost: 0.0204 Fidelity: 0.9722 Trace: 0.9722
Rep: 364 Cost: 0.0201 Fidelity: 0.9723 Trace: 0.9723
Rep: 365 Cost: 0.0214 Fidelity: 0.9725 Trace: 0.9725
Rep: 366 Cost: 0.0214 Fidelity: 0.9726 Trace: 0.9726
Rep: 367 Cost: 0.0251 Fidelity: 0.9724 Trace: 0.9725
Rep: 368 Cost: 0.0217 Fidelity: 0.9725 Trace: 0.9725
Rep: 369 Cost: 0.0229 Fidelity: 0.9726 Trace: 0.9726
Rep: 370 Cost: 0.0202 Fidelity: 0.9724 Trace: 0.9724
Rep: 371 Cost: 0.0248 Fidelity: 0.9720 Trace: 0.9721
Rep: 372 Cost: 0.0217 Fidelity: 0.9721 Trace: 0.9722
Rep: 373 Cost: 0.0225 Fidelity: 0.9725 Trace: 0.9726
Rep: 374 Cost: 0.0216 Fidelity: 0.9725 Trace: 0.9725
Rep: 375 Cost: 0.0231 Fidelity: 0.9720 Trace: 0.9721
Rep: 376 Cost: 0.0217 Fidelity: 0.9720 Trace: 0.9720
Rep: 377 Cost: 0.0234 Fidelity: 0.9724 Trace: 0.9725
Rep: 378 Cost: 0.0239 Fidelity: 0.9725 Trace: 0.9725
Rep: 379 Cost: 0.0217 Fidelity: 0.9723 Trace: 0.9723
Rep: 380 Cost: 0.0214 Fidelity: 0.9722 Trace: 0.9723
Rep: 381 Cost: 0.0210 Fidelity: 0.9726 Trace: 0.9727
Rep: 382 Cost: 0.0220 Fidelity: 0.9728 Trace: 0.9728
Rep: 383 Cost: 0.0192 Fidelity: 0.9727 Trace: 0.9727
Rep: 384 Cost: 0.0200 Fidelity: 0.9724 Trace: 0.9725
Rep: 385 Cost: 0.0192 Fidelity: 0.9726 Trace: 0.9726
Rep: 386 Cost: 0.0197 Fidelity: 0.9725 Trace: 0.9725
Rep: 387 Cost: 0.0165 Fidelity: 0.9727 Trace: 0.9727
Rep: 388 Cost: 0.0206 Fidelity: 0.9728 Trace: 0.9728
Rep: 389 Cost: 0.0196 Fidelity: 0.9728 Trace: 0.9728
Rep: 390 Cost: 0.0189 Fidelity: 0.9726 Trace: 0.9726
Rep: 391 Cost: 0.0199 Fidelity: 0.9728 Trace: 0.9728
Rep: 392 Cost: 0.0207 Fidelity: 0.9728 Trace: 0.9728
Rep: 393 Cost: 0.0188 Fidelity: 0.9726 Trace: 0.9726
Rep: 394 Cost: 0.0179 Fidelity: 0.9727 Trace: 0.9727
Rep: 395 Cost: 0.0220 Fidelity: 0.9731 Trace: 0.9731
Rep: 396 Cost: 0.0215 Fidelity: 0.9731 Trace: 0.9731
Rep: 397 Cost: 0.0170 Fidelity: 0.9728 Trace: 0.9728
Rep: 398 Cost: 0.0178 Fidelity: 0.9727 Trace: 0.9727
Rep: 399 Cost: 0.0215 Fidelity: 0.9729 Trace: 0.9729
Rep: 400 Cost: 0.0176 Fidelity: 0.9730 Trace: 0.9730
Rep: 401 Cost: 0.0251 Fidelity: 0.9726 Trace: 0.9727
Rep: 402 Cost: 0.0239 Fidelity: 0.9727 Trace: 0.9727
Rep: 403 Cost: 0.0198 Fidelity: 0.9731 Trace: 0.9731
Rep: 404 Cost: 0.0227 Fidelity: 0.9731 Trace: 0.9731
Rep: 405 Cost: 0.0173 Fidelity: 0.9729 Trace: 0.9729
Rep: 406 Cost: 0.0182 Fidelity: 0.9729 Trace: 0.9730
Rep: 407 Cost: 0.0171 Fidelity: 0.9731 Trace: 0.9731
Rep: 408 Cost: 0.0189 Fidelity: 0.9729 Trace: 0.9729
Rep: 409 Cost: 0.0179 Fidelity: 0.9730 Trace: 0.9730
Rep: 410 Cost: 0.0198 Fidelity: 0.9733 Trace: 0.9733
Rep: 411 Cost: 0.0184 Fidelity: 0.9732 Trace: 0.9732
Rep: 412 Cost: 0.0210 Fidelity: 0.9729 Trace: 0.9729
Rep: 413 Cost: 0.0193 Fidelity: 0.9729 Trace: 0.9729
Rep: 414 Cost: 0.0216 Fidelity: 0.9733 Trace: 0.9733
Rep: 415 Cost: 0.0209 Fidelity: 0.9734 Trace: 0.9734
Rep: 416 Cost: 0.0200 Fidelity: 0.9731 Trace: 0.9731
Rep: 417 Cost: 0.0199 Fidelity: 0.9730 Trace: 0.9730
Rep: 418 Cost: 0.0191 Fidelity: 0.9732 Trace: 0.9732
Rep: 419 Cost: 0.0174 Fidelity: 0.9733 Trace: 0.9733
Rep: 420 Cost: 0.0226 Fidelity: 0.9730 Trace: 0.9730
Rep: 421 Cost: 0.0209 Fidelity: 0.9731 Trace: 0.9731
Rep: 422 Cost: 0.0202 Fidelity: 0.9734 Trace: 0.9734
Rep: 423 Cost: 0.0213 Fidelity: 0.9734 Trace: 0.9734
Rep: 424 Cost: 0.0165 Fidelity: 0.9732 Trace: 0.9732
Rep: 425 Cost: 0.0163 Fidelity: 0.9734 Trace: 0.9734
Rep: 426 Cost: 0.0164 Fidelity: 0.9733 Trace: 0.9733
Rep: 427 Cost: 0.0161 Fidelity: 0.9733 Trace: 0.9733
Rep: 428 Cost: 0.0152 Fidelity: 0.9734 Trace: 0.9734
Rep: 429 Cost: 0.0182 Fidelity: 0.9733 Trace: 0.9733
Rep: 430 Cost: 0.0160 Fidelity: 0.9733 Trace: 0.9733
Rep: 431 Cost: 0.0192 Fidelity: 0.9734 Trace: 0.9734
Rep: 432 Cost: 0.0188 Fidelity: 0.9734 Trace: 0.9734
Rep: 433 Cost: 0.0168 Fidelity: 0.9733 Trace: 0.9733
Rep: 434 Cost: 0.0175 Fidelity: 0.9734 Trace: 0.9735
Rep: 435 Cost: 0.0168 Fidelity: 0.9735 Trace: 0.9735
Rep: 436 Cost: 0.0163 Fidelity: 0.9734 Trace: 0.9734
Rep: 437 Cost: 0.0177 Fidelity: 0.9734 Trace: 0.9734
Rep: 438 Cost: 0.0175 Fidelity: 0.9735 Trace: 0.9735
Rep: 439 Cost: 0.0196 Fidelity: 0.9736 Trace: 0.9736
Rep: 440 Cost: 0.0195 Fidelity: 0.9736 Trace: 0.9736
Rep: 441 Cost: 0.0177 Fidelity: 0.9737 Trace: 0.9737
Rep: 442 Cost: 0.0203 Fidelity: 0.9737 Trace: 0.9737
Rep: 443 Cost: 0.0184 Fidelity: 0.9735 Trace: 0.9735
Rep: 444 Cost: 0.0187 Fidelity: 0.9735 Trace: 0.9735
Rep: 445 Cost: 0.0153 Fidelity: 0.9737 Trace: 0.9737
Rep: 446 Cost: 0.0201 Fidelity: 0.9737 Trace: 0.9738
Rep: 447 Cost: 0.0188 Fidelity: 0.9737 Trace: 0.9737
Rep: 448 Cost: 0.0178 Fidelity: 0.9735 Trace: 0.9735
Rep: 449 Cost: 0.0177 Fidelity: 0.9735 Trace: 0.9735
Rep: 450 Cost: 0.0155 Fidelity: 0.9735 Trace: 0.9735
Rep: 451 Cost: 0.0187 Fidelity: 0.9738 Trace: 0.9738
Rep: 452 Cost: 0.0194 Fidelity: 0.9739 Trace: 0.9739
Rep: 453 Cost: 0.0160 Fidelity: 0.9737 Trace: 0.9737
Rep: 454 Cost: 0.0174 Fidelity: 0.9736 Trace: 0.9736
Rep: 455 Cost: 0.0186 Fidelity: 0.9737 Trace: 0.9737
Rep: 456 Cost: 0.0164 Fidelity: 0.9737 Trace: 0.9737
Rep: 457 Cost: 0.0171 Fidelity: 0.9739 Trace: 0.9739
Rep: 458 Cost: 0.0186 Fidelity: 0.9740 Trace: 0.9740
Rep: 459 Cost: 0.0161 Fidelity: 0.9739 Trace: 0.9739
Rep: 460 Cost: 0.0191 Fidelity: 0.9736 Trace: 0.9736
Rep: 461 Cost: 0.0196 Fidelity: 0.9737 Trace: 0.9737
Rep: 462 Cost: 0.0173 Fidelity: 0.9739 Trace: 0.9739
Rep: 463 Cost: 0.0160 Fidelity: 0.9740 Trace: 0.9740
Rep: 464 Cost: 0.0190 Fidelity: 0.9738 Trace: 0.9738
Rep: 465 Cost: 0.0178 Fidelity: 0.9739 Trace: 0.9739
Rep: 466 Cost: 0.0177 Fidelity: 0.9740 Trace: 0.9740
Rep: 467 Cost: 0.0201 Fidelity: 0.9739 Trace: 0.9739
Rep: 468 Cost: 0.0187 Fidelity: 0.9738 Trace: 0.9738
Rep: 469 Cost: 0.0171 Fidelity: 0.9738 Trace: 0.9738
Rep: 470 Cost: 0.0202 Fidelity: 0.9741 Trace: 0.9742
Rep: 471 Cost: 0.0209 Fidelity: 0.9743 Trace: 0.9743
Rep: 472 Cost: 0.0180 Fidelity: 0.9741 Trace: 0.9741
Rep: 473 Cost: 0.0208 Fidelity: 0.9738 Trace: 0.9738
Rep: 474 Cost: 0.0188 Fidelity: 0.9739 Trace: 0.9739
Rep: 475 Cost: 0.0212 Fidelity: 0.9742 Trace: 0.9742
Rep: 476 Cost: 0.0195 Fidelity: 0.9743 Trace: 0.9743
Rep: 477 Cost: 0.0183 Fidelity: 0.9741 Trace: 0.9741
Rep: 478 Cost: 0.0176 Fidelity: 0.9741 Trace: 0.9741
Rep: 479 Cost: 0.0189 Fidelity: 0.9742 Trace: 0.9742
Rep: 480 Cost: 0.0167 Fidelity: 0.9742 Trace: 0.9742
Rep: 481 Cost: 0.0211 Fidelity: 0.9740 Trace: 0.9740
Rep: 482 Cost: 0.0202 Fidelity: 0.9740 Trace: 0.9740
Rep: 483 Cost: 0.0186 Fidelity: 0.9742 Trace: 0.9742
Rep: 484 Cost: 0.0193 Fidelity: 0.9742 Trace: 0.9743
Rep: 485 Cost: 0.0169 Fidelity: 0.9741 Trace: 0.9741
Rep: 486 Cost: 0.0175 Fidelity: 0.9742 Trace: 0.9742
Rep: 487 Cost: 0.0188 Fidelity: 0.9744 Trace: 0.9744
Rep: 488 Cost: 0.0166 Fidelity: 0.9744 Trace: 0.9744
Rep: 489 Cost: 0.0206 Fidelity: 0.9741 Trace: 0.9741
Rep: 490 Cost: 0.0197 Fidelity: 0.9741 Trace: 0.9741
Rep: 491 Cost: 0.0190 Fidelity: 0.9744 Trace: 0.9744
Rep: 492 Cost: 0.0199 Fidelity: 0.9745 Trace: 0.9745
Rep: 493 Cost: 0.0162 Fidelity: 0.9744 Trace: 0.9744
Rep: 494 Cost: 0.0197 Fidelity: 0.9741 Trace: 0.9742
Rep: 495 Cost: 0.0179 Fidelity: 0.9742 Trace: 0.9742
Rep: 496 Cost: 0.0192 Fidelity: 0.9745 Trace: 0.9745
Rep: 497 Cost: 0.0189 Fidelity: 0.9745 Trace: 0.9745
Rep: 498 Cost: 0.0203 Fidelity: 0.9743 Trace: 0.9743
Rep: 499 Cost: 0.0189 Fidelity: 0.9742 Trace: 0.9742
Rep: 500 Cost: 0.0182 Fidelity: 0.9744 Trace: 0.9744
Rep: 501 Cost: 0.0168 Fidelity: 0.9745 Trace: 0.9745
Rep: 502 Cost: 0.0196 Fidelity: 0.9743 Trace: 0.9744
Rep: 503 Cost: 0.0189 Fidelity: 0.9744 Trace: 0.9744
Rep: 504 Cost: 0.0179 Fidelity: 0.9745 Trace: 0.9746
Rep: 505 Cost: 0.0195 Fidelity: 0.9746 Trace: 0.9746
Rep: 506 Cost: 0.0153 Fidelity: 0.9745 Trace: 0.9745
Rep: 507 Cost: 0.0165 Fidelity: 0.9744 Trace: 0.9744
Rep: 508 Cost: 0.0167 Fidelity: 0.9746 Trace: 0.9746
Rep: 509 Cost: 0.0145 Fidelity: 0.9746 Trace: 0.9746
Rep: 510 Cost: 0.0192 Fidelity: 0.9744 Trace: 0.9744
Rep: 511 Cost: 0.0171 Fidelity: 0.9744 Trace: 0.9744
Rep: 512 Cost: 0.0190 Fidelity: 0.9748 Trace: 0.9748
Rep: 513 Cost: 0.0194 Fidelity: 0.9748 Trace: 0.9748
Rep: 514 Cost: 0.0165 Fidelity: 0.9746 Trace: 0.9746
Rep: 515 Cost: 0.0182 Fidelity: 0.9745 Trace: 0.9745
Rep: 516 Cost: 0.0176 Fidelity: 0.9746 Trace: 0.9746
Rep: 517 Cost: 0.0162 Fidelity: 0.9748 Trace: 0.9748
Rep: 518 Cost: 0.0175 Fidelity: 0.9746 Trace: 0.9746
Rep: 519 Cost: 0.0152 Fidelity: 0.9746 Trace: 0.9746
Rep: 520 Cost: 0.0166 Fidelity: 0.9748 Trace: 0.9748
Rep: 521 Cost: 0.0156 Fidelity: 0.9747 Trace: 0.9747
Rep: 522 Cost: 0.0174 Fidelity: 0.9746 Trace: 0.9746
Rep: 523 Cost: 0.0160 Fidelity: 0.9747 Trace: 0.9747
Rep: 524 Cost: 0.0160 Fidelity: 0.9747 Trace: 0.9747
Rep: 525 Cost: 0.0152 Fidelity: 0.9747 Trace: 0.9747
Rep: 526 Cost: 0.0171 Fidelity: 0.9748 Trace: 0.9748
Rep: 527 Cost: 0.0161 Fidelity: 0.9748 Trace: 0.9749
Rep: 528 Cost: 0.0157 Fidelity: 0.9748 Trace: 0.9748
Rep: 529 Cost: 0.0158 Fidelity: 0.9749 Trace: 0.9749
Rep: 530 Cost: 0.0153 Fidelity: 0.9748 Trace: 0.9748
Rep: 531 Cost: 0.0150 Fidelity: 0.9748 Trace: 0.9748
Rep: 532 Cost: 0.0144 Fidelity: 0.9749 Trace: 0.9749
Rep: 533 Cost: 0.0143 Fidelity: 0.9749 Trace: 0.9749
Rep: 534 Cost: 0.0158 Fidelity: 0.9748 Trace: 0.9748
Rep: 535 Cost: 0.0160 Fidelity: 0.9748 Trace: 0.9748
Rep: 536 Cost: 0.0148 Fidelity: 0.9749 Trace: 0.9749
Rep: 537 Cost: 0.0166 Fidelity: 0.9747 Trace: 0.9747
Rep: 538 Cost: 0.0153 Fidelity: 0.9748 Trace: 0.9748
Rep: 539 Cost: 0.0156 Fidelity: 0.9750 Trace: 0.9750
Rep: 540 Cost: 0.0172 Fidelity: 0.9749 Trace: 0.9749
Rep: 541 Cost: 0.0158 Fidelity: 0.9750 Trace: 0.9750
Rep: 542 Cost: 0.0169 Fidelity: 0.9751 Trace: 0.9751
Rep: 543 Cost: 0.0178 Fidelity: 0.9750 Trace: 0.9750
Rep: 544 Cost: 0.0173 Fidelity: 0.9750 Trace: 0.9750
Rep: 545 Cost: 0.0150 Fidelity: 0.9751 Trace: 0.9751
Rep: 546 Cost: 0.0183 Fidelity: 0.9750 Trace: 0.9751
Rep: 547 Cost: 0.0188 Fidelity: 0.9750 Trace: 0.9751
Rep: 548 Cost: 0.0163 Fidelity: 0.9751 Trace: 0.9751
Rep: 549 Cost: 0.0155 Fidelity: 0.9750 Trace: 0.9750
Rep: 550 Cost: 0.0185 Fidelity: 0.9749 Trace: 0.9749
Rep: 551 Cost: 0.0181 Fidelity: 0.9750 Trace: 0.9750
Rep: 552 Cost: 0.0154 Fidelity: 0.9752 Trace: 0.9752
Rep: 553 Cost: 0.0174 Fidelity: 0.9752 Trace: 0.9752
Rep: 554 Cost: 0.0172 Fidelity: 0.9752 Trace: 0.9752
Rep: 555 Cost: 0.0159 Fidelity: 0.9752 Trace: 0.9752
Rep: 556 Cost: 0.0164 Fidelity: 0.9752 Trace: 0.9752
Rep: 557 Cost: 0.0176 Fidelity: 0.9751 Trace: 0.9751
Rep: 558 Cost: 0.0164 Fidelity: 0.9751 Trace: 0.9751
Rep: 559 Cost: 0.0156 Fidelity: 0.9752 Trace: 0.9753
Rep: 560 Cost: 0.0173 Fidelity: 0.9752 Trace: 0.9752
Rep: 561 Cost: 0.0149 Fidelity: 0.9751 Trace: 0.9751
Rep: 562 Cost: 0.0164 Fidelity: 0.9753 Trace: 0.9753
Rep: 563 Cost: 0.0159 Fidelity: 0.9753 Trace: 0.9753
Rep: 564 Cost: 0.0173 Fidelity: 0.9751 Trace: 0.9751
Rep: 565 Cost: 0.0167 Fidelity: 0.9752 Trace: 0.9752
Rep: 566 Cost: 0.0175 Fidelity: 0.9755 Trace: 0.9755
Rep: 567 Cost: 0.0170 Fidelity: 0.9755 Trace: 0.9755
Rep: 568 Cost: 0.0160 Fidelity: 0.9752 Trace: 0.9752
Rep: 569 Cost: 0.0153 Fidelity: 0.9753 Trace: 0.9753
Rep: 570 Cost: 0.0157 Fidelity: 0.9754 Trace: 0.9754
Rep: 571 Cost: 0.0160 Fidelity: 0.9753 Trace: 0.9753
Rep: 572 Cost: 0.0142 Fidelity: 0.9753 Trace: 0.9753
Rep: 573 Cost: 0.0138 Fidelity: 0.9753 Trace: 0.9753
Rep: 574 Cost: 0.0154 Fidelity: 0.9754 Trace: 0.9754
Rep: 575 Cost: 0.0141 Fidelity: 0.9754 Trace: 0.9754
Rep: 576 Cost: 0.0147 Fidelity: 0.9754 Trace: 0.9754
Rep: 577 Cost: 0.0150 Fidelity: 0.9754 Trace: 0.9754
Rep: 578 Cost: 0.0153 Fidelity: 0.9755 Trace: 0.9755
Rep: 579 Cost: 0.0147 Fidelity: 0.9754 Trace: 0.9754
Rep: 580 Cost: 0.0144 Fidelity: 0.9755 Trace: 0.9755
Rep: 581 Cost: 0.0165 Fidelity: 0.9755 Trace: 0.9755
Rep: 582 Cost: 0.0163 Fidelity: 0.9754 Trace: 0.9754
Rep: 583 Cost: 0.0170 Fidelity: 0.9754 Trace: 0.9754
Rep: 584 Cost: 0.0166 Fidelity: 0.9755 Trace: 0.9755
Rep: 585 Cost: 0.0159 Fidelity: 0.9756 Trace: 0.9756
Rep: 586 Cost: 0.0173 Fidelity: 0.9755 Trace: 0.9755
Rep: 587 Cost: 0.0144 Fidelity: 0.9755 Trace: 0.9755
Rep: 588 Cost: 0.0181 Fidelity: 0.9754 Trace: 0.9754
Rep: 589 Cost: 0.0169 Fidelity: 0.9754 Trace: 0.9754
Rep: 590 Cost: 0.0159 Fidelity: 0.9756 Trace: 0.9756
Rep: 591 Cost: 0.0162 Fidelity: 0.9757 Trace: 0.9757
Rep: 592 Cost: 0.0162 Fidelity: 0.9756 Trace: 0.9756
Rep: 593 Cost: 0.0159 Fidelity: 0.9756 Trace: 0.9756
Rep: 594 Cost: 0.0154 Fidelity: 0.9757 Trace: 0.9757
Rep: 595 Cost: 0.0157 Fidelity: 0.9757 Trace: 0.9757
Rep: 596 Cost: 0.0143 Fidelity: 0.9757 Trace: 0.9757
Rep: 597 Cost: 0.0150 Fidelity: 0.9757 Trace: 0.9757
Rep: 598 Cost: 0.0149 Fidelity: 0.9758 Trace: 0.9758
Rep: 599 Cost: 0.0146 Fidelity: 0.9756 Trace: 0.9756
Rep: 600 Cost: 0.0144 Fidelity: 0.9756 Trace: 0.9756
Rep: 601 Cost: 0.0161 Fidelity: 0.9758 Trace: 0.9758
Rep: 602 Cost: 0.0168 Fidelity: 0.9757 Trace: 0.9757
Rep: 603 Cost: 0.0153 Fidelity: 0.9756 Trace: 0.9756
Rep: 604 Cost: 0.0180 Fidelity: 0.9757 Trace: 0.9757
Rep: 605 Cost: 0.0166 Fidelity: 0.9757 Trace: 0.9757
Rep: 606 Cost: 0.0150 Fidelity: 0.9758 Trace: 0.9758
Rep: 607 Cost: 0.0160 Fidelity: 0.9759 Trace: 0.9759
Rep: 608 Cost: 0.0165 Fidelity: 0.9759 Trace: 0.9759
Rep: 609 Cost: 0.0153 Fidelity: 0.9758 Trace: 0.9758
Rep: 610 Cost: 0.0156 Fidelity: 0.9758 Trace: 0.9758
Rep: 611 Cost: 0.0166 Fidelity: 0.9759 Trace: 0.9759
Rep: 612 Cost: 0.0137 Fidelity: 0.9759 Trace: 0.9759
Rep: 613 Cost: 0.0171 Fidelity: 0.9757 Trace: 0.9758
Rep: 614 Cost: 0.0157 Fidelity: 0.9758 Trace: 0.9758
Rep: 615 Cost: 0.0166 Fidelity: 0.9760 Trace: 0.9760
Rep: 616 Cost: 0.0159 Fidelity: 0.9759 Trace: 0.9759
Rep: 617 Cost: 0.0159 Fidelity: 0.9757 Trace: 0.9758
Rep: 618 Cost: 0.0146 Fidelity: 0.9759 Trace: 0.9759
Rep: 619 Cost: 0.0159 Fidelity: 0.9760 Trace: 0.9760
Rep: 620 Cost: 0.0154 Fidelity: 0.9759 Trace: 0.9759
Rep: 621 Cost: 0.0152 Fidelity: 0.9758 Trace: 0.9758
Rep: 622 Cost: 0.0151 Fidelity: 0.9759 Trace: 0.9759
Rep: 623 Cost: 0.0163 Fidelity: 0.9759 Trace: 0.9759
Rep: 624 Cost: 0.0147 Fidelity: 0.9760 Trace: 0.9760
Rep: 625 Cost: 0.0162 Fidelity: 0.9759 Trace: 0.9759
Rep: 626 Cost: 0.0161 Fidelity: 0.9759 Trace: 0.9759
Rep: 627 Cost: 0.0143 Fidelity: 0.9760 Trace: 0.9760
Rep: 628 Cost: 0.0142 Fidelity: 0.9760 Trace: 0.9760
Rep: 629 Cost: 0.0162 Fidelity: 0.9759 Trace: 0.9759
Rep: 630 Cost: 0.0151 Fidelity: 0.9760 Trace: 0.9760
Rep: 631 Cost: 0.0156 Fidelity: 0.9761 Trace: 0.9761
Rep: 632 Cost: 0.0155 Fidelity: 0.9761 Trace: 0.9761
Rep: 633 Cost: 0.0151 Fidelity: 0.9761 Trace: 0.9761
Rep: 634 Cost: 0.0147 Fidelity: 0.9760 Trace: 0.9760
Rep: 635 Cost: 0.0155 Fidelity: 0.9762 Trace: 0.9762
Rep: 636 Cost: 0.0151 Fidelity: 0.9762 Trace: 0.9762
Rep: 637 Cost: 0.0161 Fidelity: 0.9762 Trace: 0.9762
Rep: 638 Cost: 0.0154 Fidelity: 0.9761 Trace: 0.9761
Rep: 639 Cost: 0.0158 Fidelity: 0.9761 Trace: 0.9762
Rep: 640 Cost: 0.0167 Fidelity: 0.9763 Trace: 0.9763
Rep: 641 Cost: 0.0148 Fidelity: 0.9762 Trace: 0.9762
Rep: 642 Cost: 0.0187 Fidelity: 0.9760 Trace: 0.9760
Rep: 643 Cost: 0.0182 Fidelity: 0.9760 Trace: 0.9760
Rep: 644 Cost: 0.0151 Fidelity: 0.9763 Trace: 0.9763
Rep: 645 Cost: 0.0159 Fidelity: 0.9763 Trace: 0.9763
Rep: 646 Cost: 0.0150 Fidelity: 0.9761 Trace: 0.9761
Rep: 647 Cost: 0.0154 Fidelity: 0.9762 Trace: 0.9762
Rep: 648 Cost: 0.0147 Fidelity: 0.9762 Trace: 0.9762
Rep: 649 Cost: 0.0175 Fidelity: 0.9761 Trace: 0.9761
Rep: 650 Cost: 0.0157 Fidelity: 0.9762 Trace: 0.9762
Rep: 651 Cost: 0.0181 Fidelity: 0.9764 Trace: 0.9764
Rep: 652 Cost: 0.0177 Fidelity: 0.9763 Trace: 0.9764
Rep: 653 Cost: 0.0159 Fidelity: 0.9763 Trace: 0.9763
...
Rep: 997 Cost: 0.0143 Fidelity: 0.9796 Trace: 0.9796
Rep: 998 Cost: 0.0137 Fidelity: 0.9796 Trace: 0.9796
Rep: 999 Cost: 0.0146 Fidelity: 0.9795 Trace: 0.9795

Fidelity before optimization:  3.8496597e-09
Fidelity after optimization:  0.9795164

Target state:  [0.+0.j 1.+0.j 0.+0.j 0.+0.j 0.+0.j 0.+0.j]
Output state:  [ 0.   +0.001j  0.99 -0.001j -0.   +0.002j  0.001-0.001j -0.   -0.001j
  0.   +0.j   ]
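
The fidelity and trace reported during training can be reproduced directly from the output ket. Below is a minimal NumPy sketch under the assumption that the optimized ket is available as an array; the names ket and target_state, and the numerical values, are illustrative placeholders taken from the printed results rather than variables from the script itself, and the fidelity shown is the squared overlap with the target state.

import numpy as np

# Illustrative values only: the target is the single-photon Fock state |1>,
# and ket approximates the printed output state above.
target_state = np.array([0, 1, 0, 0, 0, 0], dtype=complex)
ket = np.array([0.000 + 0.001j, 0.990 - 0.001j, -0.000 + 0.002j,
                0.001 - 0.001j, -0.000 - 0.001j, 0.000 + 0.000j])

# Trace: probability remaining inside the truncated Fock space (squared norm of the ket).
trace = np.sum(np.abs(ket) ** 2)

# Fidelity: squared overlap between the (unnormalized) output ket and the target state.
fidelity = np.abs(np.vdot(target_state, ket)) ** 2

print("Trace:", round(float(trace), 4))
print("Fidelity:", round(float(fidelity), 4))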

For more applications of CV quantum neural networks, see the state learning and gate synthesis demonstrations.

References

[1] Nathan Killoran, Thomas R Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, and Seth Lloyd. Continuous-variable quantum neural networks. arXiv preprint arXiv:1806.06871, 2018.

[2] Maria Schuld, Ville Bergholm, Christian Gogolin, Josh Izaac, and Nathan Killoran. Evaluating analytic gradients on quantum hardware. Physical Review A, 99(3):032331, 2019.

[3] William R Clements, Peter C Humphreys, Benjamin J Metcalf, W Steven Kolthammer, and Ian A Walmsley. Optimal design for universal multiport interferometers. Optica, 3(12):1460–1465, 2016. doi:10.1364/OPTICA.3.001460.

Total running time of the script: (1 minute 56.395 seconds)
