Quantum neural network
“Neural networks are not black boxes. They are a big pile of linear algebra.” - Randall Munroe, xkcd
Machine learning has a wide range of models for tasks such as classification, regression, and clustering. Neural networks are one of the most successful models, having experienced a resurgence in use over the past decade due to improvements in computational power and advanced software libraries. The typical structure of a neural network consists of a series of interacting layers that perform transformations on data passing through the network. An archetypal neural network structure is the feedforward neural network, visualized by the following example:
Here, the neural network depth is determined by the number of layers, while the maximum width is given by the layer with the greatest number of neurons. The network begins with an input layer of real-valued neurons, which feed forward onto a series of one or more hidden layers. Following the notation of [1], if the \(n\) neurons at one layer are given by the vector \(\mathbf{x} \in \mathbb{R}^{n}\), the \(m\) neurons of the next layer take the values

\[\mathcal{L}(\mathbf{x}) = \varphi(W \mathbf{x} + \mathbf{b}),\]

where
\(W \in \mathbb{R}^{m \times n}\) is a matrix,
\(\mathbf{b} \in \mathbb{R}^{m}\) is a vector, and
\(\varphi\) is a nonlinear function (also known as the activation function).
The matrix multiplication \(W \mathbf{x}\) is a linear transformation on \(\mathbf{x}\), while \(W \mathbf{x} + \mathbf{b}\) represents an affine transformation. In principle, any nonlinear function can be chosen for \(\varphi\), but often the choice is fixed from a standard set of activations that include the rectified linear unit (ReLU) and the sigmoid function acting on each neuron. Finally, the output layer enacts an affine transformation on the last hidden layer, but the activation function may be linear (including the identity), or a different nonlinear function such as softmax (for classification).
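To make this concrete, here is a minimal NumPy sketch of a single fully connected layer \(\varphi(W \mathbf{x} + \mathbf{b})\) with a ReLU activation. It is not part of the original example; the dimensions and values are arbitrary.

import numpy as np

def classical_layer(x, W, b):
    # affine transformation followed by the ReLU activation
    return np.maximum(W @ x + b, 0)

x = np.array([0.5, -1.0, 2.0])        # 3 input neurons
W = np.random.normal(size=(2, 3))     # maps 3 neurons to 2 neurons
b = np.random.normal(size=2)
print(classical_layer(x, W, b))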
Layers in the feedforward neural network above are called fully connected as every neuron in a given hidden layer or output layer can be connected to all neurons in the previous layer through the matrix \(W\). Over time, specialized versions of layers have been developed to focus on different problems. For example, convolutional layers have a restricted form of connectivity and are suited to machine learning with images. We focus here on fully connected layers as the most general type.
Training of neural networks uses variations of the gradient descent algorithm on a cost function characterizing the similarity between outputs of the neural network and training data. The gradient of the cost function can be calculated using automatic differentiation, with knowledge of the feedforward network structure.
Quantum neural networks aim to encode neural networks into a quantum system, with the intention of benefiting from quantum information processing. There have been numerous attempts to define a quantum neural network, each with varying advantages and disadvantages. The quantum neural network detailed below, following the work of [1], has a CV architecture and is realized using standard CV gates from Strawberry Fields. One advantage of this CV architecture is that it naturally accommodates the continuous nature of neural networks. Additionally, the CV model is able to apply nonlinear transformations directly using the phase space picture, a task which qubit-based models struggle with, often relying on measurement postselection, which has a probability of failure.
Implementation
A CV quantum neural network layer can be defined as

\[\mathcal{L} = \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1},\]

where
\(\mathcal{U}_{k}=U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) is an \(N\) mode interferometer,
\(\mathcal{D}=\otimes_{i=1}^{N}D(\alpha_{i})\) is a single mode displacement gate (Dgate) with complex displacement \(\alpha_{i} \in \mathbb{C}\),
\(\mathcal{S}=\otimes_{i=1}^{N}S(r_{i})\) is a single mode squeezing gate (Sgate) acting on each mode with squeezing parameter \(r_{i} \in \mathbb{R}\), and
\(\Phi=\otimes_{i=1}^{N}\Phi(\lambda_{i})\) is a non-Gaussian gate on each mode with parameter \(\lambda_{i} \in \mathbb{R}\).
Note
Any non-Gaussian gate, such as the cubic phase gate (Vgate), represents a valid choice, but we recommend the Kerr gate (Kgate) for simulations in Strawberry Fields. The Kerr gate is more accurate numerically because it is diagonal in the Fock basis.
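As a quick illustration of this point, the Kerr gate acts on Fock states as \(K(\kappa)\ket{n} = e^{i \kappa n^{2}}\ket{n}\), so its truncated matrix is exactly diagonal. A minimal sketch (the value of \(\kappa\) and the cutoff are arbitrary, and this snippet is not part of the original example):

import numpy as np

kappa = 0.1
cutoff = 5
n = np.arange(cutoff)

# Kerr gate in the truncated Fock basis: a diagonal matrix of phases exp(i*kappa*n^2)
K = np.diag(np.exp(1j * kappa * n ** 2))
print(np.round(K, 3))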
The layer is shown below as a circuit:
These layers can then be composed to form a quantum neural network. The width of the network can also be varied between layers [1].
Reproducing classical neural networks
Let's see how the quantum layer can embed the transformation \(\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b})\) of a classical neural network layer. Suppose \(N\)-dimensional data is encoded in position eigenstates so that

\[\mathbf{x} \Leftrightarrow \ket{\mathbf{x}} := \ket{x_{1}} \otimes \cdots \otimes \ket{x_{N}}.\]

We want to perform the transformation

\[\ket{\mathbf{x}} \Rightarrow \ket{\varphi(W \mathbf{x} + \mathbf{b})}.\]
It turns out that the quantum circuit above can do precisely this! Consider first the affine transformation \(W \mathbf{x} + \mathbf{b}\). Leveraging the singular value decomposition, we can always write \(W = O_{2} \Sigma O_{1}\) with \(O_{k}\) orthogonal matrices and \(\Sigma\) a positive diagonal matrix. These orthogonal transformations can be carried out using interferometers without access to phase, i.e., with \(\boldsymbol{\phi}_{k} = 0\):

\[U_{k}(\boldsymbol{\theta}_{k}, \mathbf{0}) \ket{\mathbf{x}} = \ket{O_{k} \mathbf{x}}.\]
On the other hand, the diagonal matrix \(\Sigma = {\rm diag}\left(\{c_{i}\}_{i=1}^{N}\right)\) can be achieved through squeezing:

\[\otimes_{i=1}^{N} S(r_{i}) \ket{\mathbf{x}} \propto \ket{\Sigma \mathbf{x}},\]
with \(r_{i} = \log (c_{i})\). Finally, the addition of a bias vector \(\mathbf{b}\) is done using position displacement gates:

\[\otimes_{i=1}^{N} D(\alpha_{i}) \ket{\mathbf{x}} = \ket{\mathbf{x} + \mathbf{b}},\]
with \(\mathbf{b} = \{\alpha_{i}\}_{i=1}^{N}\) and \(\alpha_{i} \in \mathbb{R}\). Putting this all together, we see that the operation \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers and position displacement performs the transformation \(\ket{\mathbf{x}} \Rightarrow \ket{W \mathbf{x} + \mathbf{b}}\) on position eigenstates.
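The following NumPy sketch (not part of the original example; the matrix and bias are arbitrary) illustrates this decomposition: the weight matrix is split by the SVD into two orthogonal matrices and a positive diagonal matrix, from which the squeezing parameters \(r_{i} = \log(c_{i})\) and displacement parameters \(\alpha_{i} = b_{i}\) can be read off.

import numpy as np

W = np.random.normal(size=(3, 3))   # an arbitrary real weight matrix
b = np.random.normal(size=3)        # an arbitrary bias vector

# singular value decomposition: W = O2 @ Sigma @ O1, with O1 and O2 orthogonal
O2, c, O1 = np.linalg.svd(W)
assert np.allclose(W, O2 @ np.diag(c) @ O1)

r = np.log(c)   # squeezing parameters implementing Sigma
alpha = b       # position displacements implementing the bias
print("squeezing parameters:", r)
print("displacement parameters:", alpha)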
Warning
The TensorFlow backend is the natural simulator for quantum neural networks in Strawberry
Fields, but this backend cannot naturally accommodate position eigenstates, which require
infinite squeezing. For simulation of position eigenstates in this backend, the best approach is
to use a displaced squeezed state (prepare_displaced_squeezed_state) with high
squeezing value r. However, to avoid significant numerical error, it is important to make sure
that all initial states have negligible amplitude for Fock states \(\ket{n}\) with
\(n\geq \texttt{cutoff_dim}\), where \(\texttt{cutoff_dim}\) is the cutoff dimension.
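As a rough illustration of this warning, the sketch below (not from the original example; the squeezing value, displacement, backend, and cutoff are arbitrary choices) prepares a finitely squeezed, displaced state as a stand-in for a position eigenstate and checks that the amplitude of the highest retained Fock state is negligible:

import numpy as np
import strawberryfields as sf
from strawberryfields import ops

prog = sf.Program(1)
with prog.context as q:
    ops.Squeezed(r=1.0) | q[0]   # finite squeezing approximating infinite squeezing
    ops.Xgate(0.3) | q[0]        # displace the approximate eigenstate in position

eng = sf.Engine("fock", backend_options={"cutoff_dim": 20})
ket = eng.run(prog).state.ket()

# the amplitude at the cutoff should be negligible for the simulation to be trustworthy
print(np.abs(ket[-1]))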
Finally, the nonlinear function \(\varphi\) can be achieved through a restricted type of non-Gaussian gates \(\otimes_{i=1}^{N}\Phi(\lambda_{i})\) acting on each mode (see [1] for more details), resulting in the transformation

\[\Phi \ket{W \mathbf{x} + \mathbf{b}} = \ket{\varphi(W \mathbf{x} + \mathbf{b})}.\]
The operation \(\mathcal{L} = \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers, position displacements, and restricted non-Gaussian gates can hence be seen as enacting a classical neural network layer \(\ket{\mathbf{x}} \Rightarrow \ket{\varphi(W \mathbf{x} + \mathbf{b})}\) on position eigenstates.
Extending to quantum neural networks
In fact, CV quantum neural network layers can be made more expressive than their classical counterparts. We can do this by lifting the above restrictions on \(\mathcal{L}\), i.e.:
Using arbitrary interferometers \(U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) with access to phase and general displacement gates (i.e., not necessarily position displacement). This allows \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) to represent a general Gaussian operation.
Using arbitrary non-Gaussian gates \(\Phi(\lambda_{i})\), such as the Kerr gate.
Encoding data outside of the position eigenbasis, for example using instead the Fock basis.
Moreover, the gates in a single layer form a universal gate set, making the CV quantum neural network a model for universal quantum computing, i.e., a sufficient number of layers can carry out any quantum algorithm implementable on a CV quantum computer.
CV quantum neural networks can be trained both through classical simulation and directly on quantum hardware. Strawberry Fields relies on classical simulation to evaluate cost functions of the CV quantum neural network and the resultant gradients with respect to parameters of each layer. However, this becomes an intractable task with increasing network depth and width. Ultimately, direct evaluation on hardware will likely be necessary for large-scale networks; an approach for hardware-based training is mapped out in [2]. The PennyLane library provides tools for training hybrid quantum-classical machine learning models, using both simulators and real-world quantum hardware.
Example CV quantum neural network layers are shown, for one to four modes, below:
Here, the multimode linear interferometers \(U_{1}\) and \(U_{2}\) have been decomposed into two-mode phaseless beamsplitters (BSgate) and single-mode phase shifters (Rgate) using the Clements decomposition [3]. The Kerr gate is used as the non-Gaussian gate.
Code
First, we import Strawberry Fields, TensorFlow, and NumPy:
import numpy as np
import tensorflow as tf
import strawberryfields as sf
from strawberryfields import ops
Before we begin defining our optimization problem, let’s first create some convenient utility functions.
Utility functions
The first step to writing a CV quantum neural network layer in Strawberry Fields is to define a function for the two interferometers:
def interferometer(params, q):
    """Parameterised interferometer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``max(1, N-1) + (N-1)*N`` parameters.

            * The first ``N(N-1)/2`` parameters correspond to the beamsplitter angles
            * The second ``N(N-1)/2`` parameters correspond to the beamsplitter phases
            * The final ``N-1`` parameters correspond to local rotation on the first N-1 modes

        q (list[RegRef]): list of Strawberry Fields quantum registers the interferometer
            is to be applied to
    """
    N = len(q)
    theta = params[:N*(N-1)//2]
    phi = params[N*(N-1)//2:N*(N-1)]
    rphi = params[-N+1:]

    if N == 1:
        # the interferometer is a single rotation
        ops.Rgate(rphi[0]) | q[0]
        return

    n = 0  # keep track of free parameters

    # Apply the rectangular beamsplitter array
    # The array depth is N
    for l in range(N):
        for k, (q1, q2) in enumerate(zip(q[:-1], q[1:])):
            # skip even or odd pairs depending on layer
            if (l + k) % 2 != 1:
                ops.BSgate(theta[n], phi[n]) | (q1, q2)
                n += 1

    # apply the final local phase shifts to all modes except the last one
    for i in range(max(1, N - 1)):
        ops.Rgate(rphi[i]) | q[i]
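As a quick sanity check (hypothetical, not part of the original example), we can apply the function above to randomly chosen parameters on a small program and print the resulting circuit; the mode number and parameter values are arbitrary:

N = 4
M = int(N * (N - 1)) + max(1, N - 1)   # number of interferometer parameters

prog = sf.Program(N)
params = np.random.normal(size=M)

with prog.context as q:
    interferometer(params, q)

prog.print()   # lists the BSgate and Rgate operations applied to the 4 modes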
Warning
The Interferometer
class in Strawberry Fields does not reproduce
the functionality above. Instead, Interferometer
applies a given
input unitary matrix according to the Clements decomposition.
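For contrast, here is a minimal sketch (not part of the original example) of the built-in Interferometer operation, which takes a unitary matrix and decomposes it for you; the random unitary is an arbitrary illustration:

from scipy.stats import unitary_group

U = unitary_group.rvs(4)   # a random 4x4 unitary matrix

prog = sf.Program(4)
with prog.context as q:
    # decompose U into beamsplitters and rotations via the Clements decomposition
    ops.Interferometer(U) | q

prog.print()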
Using the above interferometer
function, an \(N\) mode CV quantum neural network layer is
given by the function:
def layer(params, q):
    """CV quantum neural network layer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``2*(max(1, N-1) + N**2 + N)`` containing
            the parameters for the layer
        q (list[RegRef]): list of Strawberry Fields quantum registers the layer
            is to be applied to
    """
    N = len(q)
    M = int(N * (N - 1)) + max(1, N - 1)

    int1 = params[:M]
    s = params[M:M+N]
    int2 = params[M+N:2*M+N]
    dr = params[2*M+N:2*M+2*N]
    dp = params[2*M+2*N:2*M+3*N]
    k = params[2*M+3*N:2*M+4*N]

    # begin layer
    interferometer(int1, q)

    for i in range(N):
        ops.Sgate(s[i]) | q[i]

    interferometer(int2, q)

    for i in range(N):
        ops.Dgate(dr[i], dp[i]) | q[i]
        ops.Kgate(k[i]) | q[i]
Finally, we define one more utility function to help us initialize the TensorFlow weights for our quantum neural network layers:
def init_weights(modes, layers, active_sd=0.0001, passive_sd=0.1):
    """Initialize a 2D TensorFlow Variable containing normally-distributed
    random weights for an ``N`` mode quantum neural network with ``L`` layers.

    Args:
        modes (int): the number of modes in the quantum neural network
        layers (int): the number of layers in the quantum neural network
        active_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the active parameters
            (displacement, squeezing, and Kerr magnitude)
        passive_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the passive parameters
            (beamsplitter angles and all gate phases)

    Returns:
        tf.Variable[tf.float32]: A TensorFlow Variable of shape
        ``[layers, 2*(max(1, modes-1) + modes**2 + modes)]``, where the Lth
        row represents the layer parameters for the Lth layer.
    """
    # Number of interferometer parameters:
    M = int(modes * (modes - 1)) + max(1, modes - 1)

    # Create the TensorFlow variables
    int1_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    s_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    int2_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    dr_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    dp_weights = tf.random.normal(shape=[layers, modes], stddev=passive_sd)
    k_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)

    weights = tf.concat(
        [int1_weights, s_weights, int2_weights, dr_weights, dp_weights, k_weights], axis=1
    )
    weights = tf.Variable(weights)

    return weights
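A quick check of the resulting shape, with a hypothetical choice of two modes and four layers:

w = init_weights(modes=2, layers=4)
print(w.shape)   # (4, 14), since 2*(max(1, 1) + 2**2 + 2) = 14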
Optimization
Now that we have our utility functions, let's begin defining our optimization problem. In this particular example, let's create a one-mode CV QNN with 8 layers and a Fock-basis cutoff dimension of 6. We will train this QNN to output a desired target state: the single photon state.
# set the random seed
tf.random.set_seed(137)
np.random.seed(137)
# define width and depth of CV quantum neural network
modes = 1
layers = 8
cutoff_dim = 6
# defining desired state (single photon state)
target_state = np.zeros(cutoff_dim)
target_state[1] = 1
target_state = tf.constant(target_state, dtype=tf.complex64)
Now, let's initialize an engine with the TensorFlow "tf" backend, and begin constructing our QNN program.
# initialize engine and program
eng = sf.Engine(backend="tf", backend_options={"cutoff_dim": cutoff_dim})
qnn = sf.Program(modes)
# initialize QNN weights
weights = init_weights(modes, layers) # our TensorFlow weights
num_params = np.prod(weights.shape) # total number of parameters in our model
To construct the program, we must create and use Strawberry Fields symbolic gate arguments. These will be mapped to the TensorFlow variables on engine execution.
# Create array of Strawberry Fields symbolic gate arguments, matching
# the size of the weights Variable.
sf_params = np.arange(num_params).reshape(weights.shape).astype(str)
sf_params = np.array([qnn.params(*i) for i in sf_params])

# Construct the symbolic Strawberry Fields program by
# looping and applying layers to the program.
with qnn.context as q:
    for k in range(layers):
        layer(sf_params[k], q)
where sf_params is an array of shape [layers, 2*(max(1, modes-1) + modes**2 + modes)] containing the symbolic gate arguments for the quantum neural network.
Now that our QNN program is defined, we can create our cost function. Our cost function simply executes the QNN on our engine using the values of the input weights.
Since we want to maximize the fidelity \(f(w) = |\langle \psi_t | \psi(w) \rangle|^2\) between our QNN output state \(|\psi(w)\rangle\) and our target state \(|\psi_t\rangle\), we compute the inner product between the two statevectors, as well as the norm of their difference \(\left\lVert \psi(w) - \psi_t\right\rVert_1\), which serves as the cost to be minimized.
Finally, we also return the trace of the output QNN state. This should always have a value close to 1. If it deviates significantly from 1, this is an indication that we need to increase our Fock-basis cutoff.
def cost(weights):
    # Create a dictionary mapping from the names of the Strawberry Fields
    # symbolic gate parameters to the TensorFlow weight values.
    mapping = {p.name: w for p, w in zip(sf_params.flatten(), tf.reshape(weights, [-1]))}

    # run the engine
    state = eng.run(qnn, args=mapping).state
    ket = state.ket()

    difference = tf.reduce_sum(tf.abs(ket - target_state))
    fidelity = tf.abs(tf.reduce_sum(tf.math.conj(ket) * target_state)) ** 2
    return difference, fidelity, ket, tf.math.real(state.trace())
We are now ready to minimize our cost function using TensorFlow:
# set up the optimizer
opt = tf.keras.optimizers.Adam()
cost_before, fidelity_before, _, _ = cost(weights)

# Perform the optimization
for i in range(1000):
    # reset the engine if it has already been executed
    if eng.run_progs:
        eng.reset()

    with tf.GradientTape() as tape:
        loss, fid, ket, trace = cost(weights)

    # one repetition of the optimization
    gradients = tape.gradient(loss, weights)
    opt.apply_gradients(zip([gradients], [weights]))

    # Prints progress at every rep
    if i % 1 == 0:
        print("Rep: {} Cost: {:.4f} Fidelity: {:.4f} Trace: {:.4f}".format(i, loss, fid, trace))

print("\nFidelity before optimization: ", fidelity_before.numpy())
print("Fidelity after optimization: ", fid.numpy())
print("\nTarget state: ", target_state.numpy())
print("Output state: ", np.round(ket.numpy(), decimals=3))
Out:
Rep: 0 Cost: 2.0001 Fidelity: 0.0000 Trace: 1.0000
Rep: 1 Cost: 1.9978 Fidelity: 0.0001 Trace: 1.0000
Rep: 2 Cost: 1.9897 Fidelity: 0.0002 Trace: 1.0000
Rep: 3 Cost: 1.9794 Fidelity: 0.0006 Trace: 1.0000
Rep: 4 Cost: 1.9681 Fidelity: 0.0010 Trace: 1.0000
Rep: 5 Cost: 1.9632 Fidelity: 0.0016 Trace: 1.0000
Rep: 6 Cost: 1.9563 Fidelity: 0.0023 Trace: 1.0000
Rep: 7 Cost: 1.9476 Fidelity: 0.0031 Trace: 1.0000
Rep: 8 Cost: 1.9377 Fidelity: 0.0041 Trace: 1.0000
Rep: 9 Cost: 1.9268 Fidelity: 0.0052 Trace: 1.0000
Rep: 10 Cost: 1.9196 Fidelity: 0.0064 Trace: 1.0000
Rep: 11 Cost: 1.9130 Fidelity: 0.0077 Trace: 1.0000
Rep: 12 Cost: 1.9055 Fidelity: 0.0091 Trace: 1.0000
Rep: 13 Cost: 1.8971 Fidelity: 0.0107 Trace: 1.0000
Rep: 14 Cost: 1.8880 Fidelity: 0.0124 Trace: 1.0000
Rep: 15 Cost: 1.8789 Fidelity: 0.0142 Trace: 1.0000
Rep: 16 Cost: 1.8695 Fidelity: 0.0162 Trace: 1.0000
Rep: 17 Cost: 1.8601 Fidelity: 0.0183 Trace: 1.0000
Rep: 18 Cost: 1.8505 Fidelity: 0.0205 Trace: 1.0000
Rep: 19 Cost: 1.8410 Fidelity: 0.0229 Trace: 1.0000
Rep: 20 Cost: 1.8327 Fidelity: 0.0254 Trace: 1.0000
Rep: 21 Cost: 1.8241 Fidelity: 0.0280 Trace: 1.0000
Rep: 22 Cost: 1.8145 Fidelity: 0.0308 Trace: 1.0000
Rep: 23 Cost: 1.8060 Fidelity: 0.0337 Trace: 1.0000
Rep: 24 Cost: 1.7979 Fidelity: 0.0367 Trace: 1.0000
Rep: 25 Cost: 1.7897 Fidelity: 0.0398 Trace: 1.0000
Rep: 26 Cost: 1.7815 Fidelity: 0.0431 Trace: 1.0000
Rep: 27 Cost: 1.7732 Fidelity: 0.0464 Trace: 1.0000
Rep: 28 Cost: 1.7649 Fidelity: 0.0498 Trace: 1.0000
Rep: 29 Cost: 1.7566 Fidelity: 0.0533 Trace: 1.0000
Rep: 30 Cost: 1.7484 Fidelity: 0.0569 Trace: 1.0000
Rep: 31 Cost: 1.7403 Fidelity: 0.0606 Trace: 1.0000
Rep: 32 Cost: 1.7322 Fidelity: 0.0644 Trace: 1.0000
Rep: 33 Cost: 1.7242 Fidelity: 0.0683 Trace: 1.0000
Rep: 34 Cost: 1.7164 Fidelity: 0.0723 Trace: 1.0000
Rep: 35 Cost: 1.7087 Fidelity: 0.0763 Trace: 1.0000
Rep: 36 Cost: 1.7012 Fidelity: 0.0804 Trace: 1.0000
Rep: 37 Cost: 1.6938 Fidelity: 0.0846 Trace: 1.0000
Rep: 38 Cost: 1.6866 Fidelity: 0.0888 Trace: 1.0000
Rep: 39 Cost: 1.6795 Fidelity: 0.0931 Trace: 1.0000
Rep: 40 Cost: 1.6726 Fidelity: 0.0975 Trace: 1.0000
Rep: 41 Cost: 1.6659 Fidelity: 0.1019 Trace: 1.0000
Rep: 42 Cost: 1.6593 Fidelity: 0.1063 Trace: 1.0000
Rep: 43 Cost: 1.6529 Fidelity: 0.1108 Trace: 1.0000
Rep: 44 Cost: 1.6467 Fidelity: 0.1154 Trace: 1.0000
Rep: 45 Cost: 1.6405 Fidelity: 0.1199 Trace: 1.0000
Rep: 46 Cost: 1.6346 Fidelity: 0.1245 Trace: 1.0000
Rep: 47 Cost: 1.6287 Fidelity: 0.1291 Trace: 1.0000
Rep: 48 Cost: 1.6230 Fidelity: 0.1337 Trace: 1.0000
Rep: 49 Cost: 1.6173 Fidelity: 0.1384 Trace: 1.0000
Rep: 50 Cost: 1.6117 Fidelity: 0.1430 Trace: 1.0000
Rep: 51 Cost: 1.6062 Fidelity: 0.1476 Trace: 1.0000
Rep: 52 Cost: 1.6007 Fidelity: 0.1523 Trace: 1.0000
Rep: 53 Cost: 1.5952 Fidelity: 0.1569 Trace: 1.0000
Rep: 54 Cost: 1.5897 Fidelity: 0.1616 Trace: 1.0000
Rep: 55 Cost: 1.5842 Fidelity: 0.1662 Trace: 1.0000
Rep: 56 Cost: 1.5786 Fidelity: 0.1708 Trace: 1.0000
Rep: 57 Cost: 1.5731 Fidelity: 0.1754 Trace: 1.0000
Rep: 58 Cost: 1.5674 Fidelity: 0.1800 Trace: 1.0000
Rep: 59 Cost: 1.5617 Fidelity: 0.1846 Trace: 1.0000
Rep: 60 Cost: 1.5560 Fidelity: 0.1892 Trace: 1.0000
Rep: 61 Cost: 1.5502 Fidelity: 0.1938 Trace: 1.0000
Rep: 62 Cost: 1.5445 Fidelity: 0.1984 Trace: 1.0000
Rep: 63 Cost: 1.5389 Fidelity: 0.2030 Trace: 1.0000
Rep: 64 Cost: 1.5333 Fidelity: 0.2076 Trace: 1.0000
Rep: 65 Cost: 1.5276 Fidelity: 0.2122 Trace: 1.0000
Rep: 66 Cost: 1.5219 Fidelity: 0.2168 Trace: 1.0000
Rep: 67 Cost: 1.5161 Fidelity: 0.2215 Trace: 1.0000
Rep: 68 Cost: 1.5101 Fidelity: 0.2261 Trace: 1.0000
Rep: 69 Cost: 1.5040 Fidelity: 0.2307 Trace: 1.0000
Rep: 70 Cost: 1.4977 Fidelity: 0.2354 Trace: 1.0000
Rep: 71 Cost: 1.4912 Fidelity: 0.2400 Trace: 1.0000
Rep: 72 Cost: 1.4845 Fidelity: 0.2446 Trace: 1.0000
Rep: 73 Cost: 1.4775 Fidelity: 0.2492 Trace: 1.0000
Rep: 74 Cost: 1.4703 Fidelity: 0.2538 Trace: 1.0000
Rep: 75 Cost: 1.4629 Fidelity: 0.2583 Trace: 1.0000
Rep: 76 Cost: 1.4553 Fidelity: 0.2630 Trace: 1.0000
Rep: 77 Cost: 1.4474 Fidelity: 0.2676 Trace: 1.0000
Rep: 78 Cost: 1.4392 Fidelity: 0.2724 Trace: 1.0000
Rep: 79 Cost: 1.4308 Fidelity: 0.2772 Trace: 1.0000
Rep: 80 Cost: 1.4222 Fidelity: 0.2822 Trace: 1.0000
Rep: 81 Cost: 1.4132 Fidelity: 0.2873 Trace: 1.0000
Rep: 82 Cost: 1.4040 Fidelity: 0.2926 Trace: 1.0000
Rep: 83 Cost: 1.3945 Fidelity: 0.2980 Trace: 1.0000
Rep: 84 Cost: 1.3848 Fidelity: 0.3036 Trace: 1.0000
Rep: 85 Cost: 1.3748 Fidelity: 0.3094 Trace: 1.0000
Rep: 86 Cost: 1.3646 Fidelity: 0.3153 Trace: 1.0000
Rep: 87 Cost: 1.3543 Fidelity: 0.3214 Trace: 1.0000
Rep: 88 Cost: 1.3438 Fidelity: 0.3276 Trace: 1.0000
Rep: 89 Cost: 1.3334 Fidelity: 0.3340 Trace: 1.0000
Rep: 90 Cost: 1.3231 Fidelity: 0.3406 Trace: 1.0000
Rep: 91 Cost: 1.3129 Fidelity: 0.3473 Trace: 1.0000
Rep: 92 Cost: 1.3028 Fidelity: 0.3543 Trace: 1.0000
Rep: 93 Cost: 1.2925 Fidelity: 0.3614 Trace: 1.0000
Rep: 94 Cost: 1.2821 Fidelity: 0.3686 Trace: 1.0000
Rep: 95 Cost: 1.2715 Fidelity: 0.3759 Trace: 1.0000
Rep: 96 Cost: 1.2606 Fidelity: 0.3832 Trace: 1.0000
Rep: 97 Cost: 1.2493 Fidelity: 0.3905 Trace: 1.0000
Rep: 98 Cost: 1.2376 Fidelity: 0.3978 Trace: 1.0000
Rep: 99 Cost: 1.2257 Fidelity: 0.4051 Trace: 1.0000
Rep: 100 Cost: 1.2152 Fidelity: 0.4123 Trace: 1.0000
Rep: 101 Cost: 1.2057 Fidelity: 0.4197 Trace: 1.0000
Rep: 102 Cost: 1.1951 Fidelity: 0.4272 Trace: 1.0000
Rep: 103 Cost: 1.1841 Fidelity: 0.4345 Trace: 1.0000
Rep: 104 Cost: 1.1739 Fidelity: 0.4417 Trace: 1.0000
Rep: 105 Cost: 1.1641 Fidelity: 0.4487 Trace: 1.0000
Rep: 106 Cost: 1.1538 Fidelity: 0.4554 Trace: 1.0000
Rep: 107 Cost: 1.1427 Fidelity: 0.4620 Trace: 1.0000
Rep: 108 Cost: 1.1325 Fidelity: 0.4685 Trace: 1.0000
Rep: 109 Cost: 1.1229 Fidelity: 0.4749 Trace: 1.0000
Rep: 110 Cost: 1.1116 Fidelity: 0.4812 Trace: 1.0000
Rep: 111 Cost: 1.1032 Fidelity: 0.4875 Trace: 0.9999
Rep: 112 Cost: 1.0936 Fidelity: 0.4937 Trace: 0.9999
Rep: 113 Cost: 1.0821 Fidelity: 0.4998 Trace: 0.9999
Rep: 114 Cost: 1.0717 Fidelity: 0.5058 Trace: 0.9999
Rep: 115 Cost: 1.0628 Fidelity: 0.5117 Trace: 0.9999
Rep: 116 Cost: 1.0528 Fidelity: 0.5175 Trace: 0.9999
Rep: 117 Cost: 1.0420 Fidelity: 0.5233 Trace: 0.9999
Rep: 118 Cost: 1.0329 Fidelity: 0.5289 Trace: 0.9999
Rep: 119 Cost: 1.0234 Fidelity: 0.5345 Trace: 0.9999
Rep: 120 Cost: 1.0138 Fidelity: 0.5402 Trace: 0.9999
Rep: 121 Cost: 1.0055 Fidelity: 0.5458 Trace: 0.9999
Rep: 122 Cost: 0.9962 Fidelity: 0.5514 Trace: 0.9999
Rep: 123 Cost: 0.9864 Fidelity: 0.5570 Trace: 0.9998
Rep: 124 Cost: 0.9781 Fidelity: 0.5626 Trace: 0.9998
Rep: 125 Cost: 0.9695 Fidelity: 0.5682 Trace: 0.9998
Rep: 126 Cost: 0.9607 Fidelity: 0.5736 Trace: 0.9998
Rep: 127 Cost: 0.9518 Fidelity: 0.5790 Trace: 0.9998
Rep: 128 Cost: 0.9445 Fidelity: 0.5844 Trace: 0.9998
Rep: 129 Cost: 0.9367 Fidelity: 0.5898 Trace: 0.9998
Rep: 130 Cost: 0.9276 Fidelity: 0.5952 Trace: 0.9997
Rep: 131 Cost: 0.9177 Fidelity: 0.6005 Trace: 0.9997
Rep: 132 Cost: 0.9120 Fidelity: 0.6058 Trace: 0.9997
Rep: 133 Cost: 0.9034 Fidelity: 0.6111 Trace: 0.9997
Rep: 134 Cost: 0.8945 Fidelity: 0.6163 Trace: 0.9996
Rep: 135 Cost: 0.8868 Fidelity: 0.6214 Trace: 0.9996
Rep: 136 Cost: 0.8785 Fidelity: 0.6265 Trace: 0.9996
Rep: 137 Cost: 0.8690 Fidelity: 0.6314 Trace: 0.9996
Rep: 138 Cost: 0.8621 Fidelity: 0.6364 Trace: 0.9995
Rep: 139 Cost: 0.8545 Fidelity: 0.6413 Trace: 0.9995
Rep: 140 Cost: 0.8445 Fidelity: 0.6463 Trace: 0.9995
Rep: 141 Cost: 0.8375 Fidelity: 0.6513 Trace: 0.9995
Rep: 142 Cost: 0.8297 Fidelity: 0.6563 Trace: 0.9994
Rep: 143 Cost: 0.8215 Fidelity: 0.6611 Trace: 0.9994
Rep: 144 Cost: 0.8139 Fidelity: 0.6658 Trace: 0.9994
Rep: 145 Cost: 0.8047 Fidelity: 0.6705 Trace: 0.9993
Rep: 146 Cost: 0.7996 Fidelity: 0.6752 Trace: 0.9993
Rep: 147 Cost: 0.7931 Fidelity: 0.6799 Trace: 0.9993
Rep: 148 Cost: 0.7842 Fidelity: 0.6845 Trace: 0.9992
Rep: 149 Cost: 0.7761 Fidelity: 0.6891 Trace: 0.9992
Rep: 150 Cost: 0.7690 Fidelity: 0.6937 Trace: 0.9991
Rep: 151 Cost: 0.7603 Fidelity: 0.6984 Trace: 0.9991
Rep: 152 Cost: 0.7539 Fidelity: 0.7029 Trace: 0.9990
Rep: 153 Cost: 0.7462 Fidelity: 0.7074 Trace: 0.9990
Rep: 154 Cost: 0.7371 Fidelity: 0.7117 Trace: 0.9989
Rep: 155 Cost: 0.7304 Fidelity: 0.7160 Trace: 0.9989
Rep: 156 Cost: 0.7227 Fidelity: 0.7203 Trace: 0.9988
Rep: 157 Cost: 0.7161 Fidelity: 0.7245 Trace: 0.9988
Rep: 158 Cost: 0.7089 Fidelity: 0.7287 Trace: 0.9987
Rep: 159 Cost: 0.7011 Fidelity: 0.7327 Trace: 0.9986
Rep: 160 Cost: 0.6943 Fidelity: 0.7366 Trace: 0.9986
Rep: 161 Cost: 0.6873 Fidelity: 0.7406 Trace: 0.9985
Rep: 162 Cost: 0.6797 Fidelity: 0.7445 Trace: 0.9985
Rep: 163 Cost: 0.6736 Fidelity: 0.7482 Trace: 0.9984
Rep: 164 Cost: 0.6665 Fidelity: 0.7519 Trace: 0.9983
Rep: 165 Cost: 0.6591 Fidelity: 0.7557 Trace: 0.9983
Rep: 166 Cost: 0.6517 Fidelity: 0.7592 Trace: 0.9982
Rep: 167 Cost: 0.6459 Fidelity: 0.7626 Trace: 0.9981
Rep: 168 Cost: 0.6380 Fidelity: 0.7662 Trace: 0.9980
Rep: 169 Cost: 0.6330 Fidelity: 0.7698 Trace: 0.9980
Rep: 170 Cost: 0.6263 Fidelity: 0.7732 Trace: 0.9979
Rep: 171 Cost: 0.6186 Fidelity: 0.7764 Trace: 0.9978
Rep: 172 Cost: 0.6124 Fidelity: 0.7797 Trace: 0.9977
Rep: 173 Cost: 0.6049 Fidelity: 0.7832 Trace: 0.9976
Rep: 174 Cost: 0.5992 Fidelity: 0.7866 Trace: 0.9975
Rep: 175 Cost: 0.5915 Fidelity: 0.7898 Trace: 0.9974
Rep: 176 Cost: 0.5872 Fidelity: 0.7926 Trace: 0.9973
Rep: 177 Cost: 0.5812 Fidelity: 0.7956 Trace: 0.9972
Rep: 178 Cost: 0.5726 Fidelity: 0.7989 Trace: 0.9971
Rep: 179 Cost: 0.5680 Fidelity: 0.8020 Trace: 0.9970
Rep: 180 Cost: 0.5620 Fidelity: 0.8049 Trace: 0.9969
Rep: 181 Cost: 0.5553 Fidelity: 0.8076 Trace: 0.9968
Rep: 182 Cost: 0.5496 Fidelity: 0.8102 Trace: 0.9966
Rep: 183 Cost: 0.5431 Fidelity: 0.8130 Trace: 0.9965
Rep: 184 Cost: 0.5386 Fidelity: 0.8158 Trace: 0.9964
Rep: 185 Cost: 0.5333 Fidelity: 0.8185 Trace: 0.9963
Rep: 186 Cost: 0.5267 Fidelity: 0.8209 Trace: 0.9961
Rep: 187 Cost: 0.5227 Fidelity: 0.8234 Trace: 0.9960
Rep: 188 Cost: 0.5171 Fidelity: 0.8259 Trace: 0.9959
Rep: 189 Cost: 0.5118 Fidelity: 0.8279 Trace: 0.9957
Rep: 190 Cost: 0.5071 Fidelity: 0.8300 Trace: 0.9955
Rep: 191 Cost: 0.5018 Fidelity: 0.8322 Trace: 0.9954
Rep: 192 Cost: 0.4961 Fidelity: 0.8343 Trace: 0.9952
Rep: 193 Cost: 0.4931 Fidelity: 0.8362 Trace: 0.9951
Rep: 194 Cost: 0.4875 Fidelity: 0.8379 Trace: 0.9949
Rep: 195 Cost: 0.4830 Fidelity: 0.8395 Trace: 0.9947
Rep: 196 Cost: 0.4786 Fidelity: 0.8413 Trace: 0.9946
Rep: 197 Cost: 0.4739 Fidelity: 0.8431 Trace: 0.9944
Rep: 198 Cost: 0.4710 Fidelity: 0.8455 Trace: 0.9943
Rep: 199 Cost: 0.4683 Fidelity: 0.8471 Trace: 0.9942
Rep: 200 Cost: 0.4633 Fidelity: 0.8491 Trace: 0.9940
Rep: 201 Cost: 0.4559 Fidelity: 0.8517 Trace: 0.9938
Rep: 202 Cost: 0.4539 Fidelity: 0.8541 Trace: 0.9936
Rep: 203 Cost: 0.4502 Fidelity: 0.8568 Trace: 0.9935
Rep: 204 Cost: 0.4465 Fidelity: 0.8596 Trace: 0.9933
Rep: 205 Cost: 0.4420 Fidelity: 0.8618 Trace: 0.9932
Rep: 206 Cost: 0.4418 Fidelity: 0.8631 Trace: 0.9930
Rep: 207 Cost: 0.4372 Fidelity: 0.8649 Trace: 0.9928
Rep: 208 Cost: 0.4318 Fidelity: 0.8669 Trace: 0.9927
Rep: 209 Cost: 0.4315 Fidelity: 0.8691 Trace: 0.9926
Rep: 210 Cost: 0.4248 Fidelity: 0.8708 Trace: 0.9924
Rep: 211 Cost: 0.4234 Fidelity: 0.8717 Trace: 0.9921
Rep: 212 Cost: 0.4221 Fidelity: 0.8731 Trace: 0.9918
Rep: 213 Cost: 0.4142 Fidelity: 0.8753 Trace: 0.9917
Rep: 214 Cost: 0.4144 Fidelity: 0.8778 Trace: 0.9917
Rep: 215 Cost: 0.4138 Fidelity: 0.8795 Trace: 0.9916
Rep: 216 Cost: 0.4073 Fidelity: 0.8806 Trace: 0.9914
Rep: 217 Cost: 0.4005 Fidelity: 0.8809 Trace: 0.9910
Rep: 218 Cost: 0.4016 Fidelity: 0.8816 Trace: 0.9908
Rep: 219 Cost: 0.3955 Fidelity: 0.8832 Trace: 0.9907
Rep: 220 Cost: 0.3927 Fidelity: 0.8850 Trace: 0.9906
Rep: 221 Cost: 0.3923 Fidelity: 0.8863 Trace: 0.9905
Rep: 222 Cost: 0.3862 Fidelity: 0.8873 Trace: 0.9902
Rep: 223 Cost: 0.3826 Fidelity: 0.8883 Trace: 0.9899
Rep: 224 Cost: 0.3804 Fidelity: 0.8899 Trace: 0.9898
Rep: 225 Cost: 0.3774 Fidelity: 0.8918 Trace: 0.9897
Rep: 226 Cost: 0.3748 Fidelity: 0.8928 Trace: 0.9895
Rep: 227 Cost: 0.3712 Fidelity: 0.8933 Trace: 0.9892
Rep: 228 Cost: 0.3684 Fidelity: 0.8939 Trace: 0.9890
Rep: 229 Cost: 0.3641 Fidelity: 0.8953 Trace: 0.9888
Rep: 230 Cost: 0.3635 Fidelity: 0.8971 Trace: 0.9888
Rep: 231 Cost: 0.3605 Fidelity: 0.8979 Trace: 0.9886
Rep: 232 Cost: 0.3557 Fidelity: 0.8980 Trace: 0.9882
Rep: 233 Cost: 0.3534 Fidelity: 0.8988 Trace: 0.9880
Rep: 234 Cost: 0.3514 Fidelity: 0.9003 Trace: 0.9880
Rep: 235 Cost: 0.3480 Fidelity: 0.9012 Trace: 0.9878
Rep: 236 Cost: 0.3475 Fidelity: 0.9014 Trace: 0.9875
Rep: 237 Cost: 0.3428 Fidelity: 0.9032 Trace: 0.9874
Rep: 238 Cost: 0.3401 Fidelity: 0.9047 Trace: 0.9872
Rep: 239 Cost: 0.3373 Fidelity: 0.9055 Trace: 0.9870
Rep: 240 Cost: 0.3348 Fidelity: 0.9069 Trace: 0.9869
Rep: 241 Cost: 0.3321 Fidelity: 0.9077 Trace: 0.9867
Rep: 242 Cost: 0.3311 Fidelity: 0.9091 Trace: 0.9866
Rep: 243 Cost: 0.3282 Fidelity: 0.9100 Trace: 0.9864
Rep: 244 Cost: 0.3254 Fidelity: 0.9106 Trace: 0.9862
Rep: 245 Cost: 0.3222 Fidelity: 0.9116 Trace: 0.9860
Rep: 246 Cost: 0.3208 Fidelity: 0.9128 Trace: 0.9858
Rep: 247 Cost: 0.3177 Fidelity: 0.9135 Trace: 0.9856
Rep: 248 Cost: 0.3161 Fidelity: 0.9138 Trace: 0.9854
Rep: 249 Cost: 0.3135 Fidelity: 0.9148 Trace: 0.9852
Rep: 250 Cost: 0.3100 Fidelity: 0.9158 Trace: 0.9851
Rep: 251 Cost: 0.3077 Fidelity: 0.9163 Trace: 0.9849
Rep: 252 Cost: 0.3065 Fidelity: 0.9173 Trace: 0.9848
Rep: 253 Cost: 0.3026 Fidelity: 0.9180 Trace: 0.9846
Rep: 254 Cost: 0.3037 Fidelity: 0.9184 Trace: 0.9843
Rep: 255 Cost: 0.2985 Fidelity: 0.9197 Trace: 0.9843
Rep: 256 Cost: 0.2988 Fidelity: 0.9207 Trace: 0.9841
Rep: 257 Cost: 0.2954 Fidelity: 0.9217 Trace: 0.9840
Rep: 258 Cost: 0.2913 Fidelity: 0.9225 Trace: 0.9837
Rep: 259 Cost: 0.2896 Fidelity: 0.9235 Trace: 0.9836
Rep: 260 Cost: 0.2876 Fidelity: 0.9249 Trace: 0.9835
Rep: 261 Cost: 0.2860 Fidelity: 0.9257 Trace: 0.9833
Rep: 262 Cost: 0.2824 Fidelity: 0.9262 Trace: 0.9831
Rep: 263 Cost: 0.2792 Fidelity: 0.9268 Trace: 0.9830
Rep: 264 Cost: 0.2784 Fidelity: 0.9281 Trace: 0.9829
Rep: 265 Cost: 0.2757 Fidelity: 0.9288 Trace: 0.9828
Rep: 266 Cost: 0.2724 Fidelity: 0.9289 Trace: 0.9825
Rep: 267 Cost: 0.2708 Fidelity: 0.9292 Trace: 0.9823
Rep: 268 Cost: 0.2671 Fidelity: 0.9303 Trace: 0.9822
Rep: 269 Cost: 0.2649 Fidelity: 0.9313 Trace: 0.9821
Rep: 270 Cost: 0.2618 Fidelity: 0.9316 Trace: 0.9819
Rep: 271 Cost: 0.2610 Fidelity: 0.9320 Trace: 0.9817
Rep: 272 Cost: 0.2587 Fidelity: 0.9336 Trace: 0.9817
Rep: 273 Cost: 0.2568 Fidelity: 0.9344 Trace: 0.9816
Rep: 274 Cost: 0.2524 Fidelity: 0.9348 Trace: 0.9813
Rep: 275 Cost: 0.2535 Fidelity: 0.9348 Trace: 0.9809
Rep: 276 Cost: 0.2491 Fidelity: 0.9359 Trace: 0.9808
Rep: 277 Cost: 0.2451 Fidelity: 0.9376 Trace: 0.9809
Rep: 278 Cost: 0.2435 Fidelity: 0.9386 Trace: 0.9808
Rep: 279 Cost: 0.2392 Fidelity: 0.9391 Trace: 0.9806
Rep: 280 Cost: 0.2369 Fidelity: 0.9398 Trace: 0.9804
Rep: 281 Cost: 0.2356 Fidelity: 0.9412 Trace: 0.9804
Rep: 282 Cost: 0.2325 Fidelity: 0.9419 Trace: 0.9803
Rep: 283 Cost: 0.2289 Fidelity: 0.9419 Trace: 0.9800
Rep: 284 Cost: 0.2259 Fidelity: 0.9425 Trace: 0.9799
Rep: 285 Cost: 0.2235 Fidelity: 0.9436 Trace: 0.9799
Rep: 286 Cost: 0.2205 Fidelity: 0.9442 Trace: 0.9798
Rep: 287 Cost: 0.2181 Fidelity: 0.9444 Trace: 0.9795
Rep: 288 Cost: 0.2149 Fidelity: 0.9455 Trace: 0.9795
Rep: 289 Cost: 0.2130 Fidelity: 0.9463 Trace: 0.9793
Rep: 290 Cost: 0.2083 Fidelity: 0.9470 Trace: 0.9791
Rep: 291 Cost: 0.2091 Fidelity: 0.9481 Trace: 0.9790
Rep: 292 Cost: 0.2057 Fidelity: 0.9491 Trace: 0.9790
Rep: 293 Cost: 0.2011 Fidelity: 0.9499 Trace: 0.9788
Rep: 294 Cost: 0.1996 Fidelity: 0.9503 Trace: 0.9786
Rep: 295 Cost: 0.1949 Fidelity: 0.9512 Trace: 0.9785
Rep: 296 Cost: 0.1930 Fidelity: 0.9524 Trace: 0.9786
Rep: 297 Cost: 0.1895 Fidelity: 0.9531 Trace: 0.9785
Rep: 298 Cost: 0.1858 Fidelity: 0.9532 Trace: 0.9782
Rep: 299 Cost: 0.1834 Fidelity: 0.9535 Trace: 0.9779
Rep: 300 Cost: 0.1793 Fidelity: 0.9543 Trace: 0.9779
Rep: 301 Cost: 0.1781 Fidelity: 0.9556 Trace: 0.9781
Rep: 302 Cost: 0.1754 Fidelity: 0.9564 Trace: 0.9780
Rep: 303 Cost: 0.1711 Fidelity: 0.9567 Trace: 0.9778
Rep: 304 Cost: 0.1678 Fidelity: 0.9569 Trace: 0.9775
Rep: 305 Cost: 0.1626 Fidelity: 0.9582 Trace: 0.9775
Rep: 306 Cost: 0.1599 Fidelity: 0.9591 Trace: 0.9774
Rep: 307 Cost: 0.1565 Fidelity: 0.9597 Trace: 0.9772
Rep: 308 Cost: 0.1525 Fidelity: 0.9607 Trace: 0.9772
Rep: 309 Cost: 0.1498 Fidelity: 0.9612 Trace: 0.9771
Rep: 310 Cost: 0.1463 Fidelity: 0.9620 Trace: 0.9770
Rep: 311 Cost: 0.1427 Fidelity: 0.9627 Trace: 0.9770
Rep: 312 Cost: 0.1397 Fidelity: 0.9631 Trace: 0.9768
Rep: 313 Cost: 0.1365 Fidelity: 0.9635 Trace: 0.9766
Rep: 314 Cost: 0.1316 Fidelity: 0.9643 Trace: 0.9767
Rep: 315 Cost: 0.1288 Fidelity: 0.9648 Trace: 0.9766
Rep: 316 Cost: 0.1261 Fidelity: 0.9653 Trace: 0.9764
Rep: 317 Cost: 0.1209 Fidelity: 0.9659 Trace: 0.9763
Rep: 318 Cost: 0.1204 Fidelity: 0.9665 Trace: 0.9762
Rep: 319 Cost: 0.1153 Fidelity: 0.9671 Trace: 0.9761
Rep: 320 Cost: 0.1135 Fidelity: 0.9676 Trace: 0.9759
Rep: 321 Cost: 0.1078 Fidelity: 0.9682 Trace: 0.9759
Rep: 322 Cost: 0.1063 Fidelity: 0.9687 Trace: 0.9759
Rep: 323 Cost: 0.1008 Fidelity: 0.9693 Trace: 0.9759
Rep: 324 Cost: 0.1001 Fidelity: 0.9697 Trace: 0.9757
Rep: 325 Cost: 0.0977 Fidelity: 0.9701 Trace: 0.9756
Rep: 326 Cost: 0.0896 Fidelity: 0.9705 Trace: 0.9755
Rep: 327 Cost: 0.0916 Fidelity: 0.9708 Trace: 0.9754
Rep: 328 Cost: 0.0872 Fidelity: 0.9711 Trace: 0.9752
Rep: 329 Cost: 0.0789 Fidelity: 0.9713 Trace: 0.9750
Rep: 330 Cost: 0.0821 Fidelity: 0.9717 Trace: 0.9749
Rep: 331 Cost: 0.0772 Fidelity: 0.9721 Trace: 0.9749
Rep: 332 Cost: 0.0690 Fidelity: 0.9724 Trace: 0.9749
Rep: 333 Cost: 0.0704 Fidelity: 0.9724 Trace: 0.9746
Rep: 334 Cost: 0.0654 Fidelity: 0.9724 Trace: 0.9744
Rep: 335 Cost: 0.0597 Fidelity: 0.9726 Trace: 0.9742
Rep: 336 Cost: 0.0578 Fidelity: 0.9729 Trace: 0.9741
Rep: 337 Cost: 0.0534 Fidelity: 0.9731 Trace: 0.9741
Rep: 338 Cost: 0.0497 Fidelity: 0.9731 Trace: 0.9739
Rep: 339 Cost: 0.0454 Fidelity: 0.9730 Trace: 0.9736
Rep: 340 Cost: 0.0407 Fidelity: 0.9728 Trace: 0.9732
Rep: 341 Cost: 0.0377 Fidelity: 0.9726 Trace: 0.9730
Rep: 342 Cost: 0.0342 Fidelity: 0.9727 Trace: 0.9729
Rep: 343 Cost: 0.0300 Fidelity: 0.9728 Trace: 0.9729
Rep: 344 Cost: 0.0264 Fidelity: 0.9727 Trace: 0.9727
Rep: 345 Cost: 0.0205 Fidelity: 0.9722 Trace: 0.9723
Rep: 346 Cost: 0.0219 Fidelity: 0.9717 Trace: 0.9717
Rep: 347 Cost: 0.0203 Fidelity: 0.9716 Trace: 0.9716
Rep: 348 Cost: 0.0231 Fidelity: 0.9718 Trace: 0.9718
Rep: 349 Cost: 0.0269 Fidelity: 0.9716 Trace: 0.9716
Rep: 350 Cost: 0.0264 Fidelity: 0.9714 Trace: 0.9715
Rep: 351 Cost: 0.0288 Fidelity: 0.9713 Trace: 0.9714
Rep: 352 Cost: 0.0281 Fidelity: 0.9713 Trace: 0.9714
Rep: 353 Cost: 0.0286 Fidelity: 0.9713 Trace: 0.9714
Rep: 354 Cost: 0.0273 Fidelity: 0.9714 Trace: 0.9714
Rep: 355 Cost: 0.0270 Fidelity: 0.9715 Trace: 0.9716
Rep: 356 Cost: 0.0255 Fidelity: 0.9715 Trace: 0.9716
Rep: 357 Cost: 0.0248 Fidelity: 0.9716 Trace: 0.9716
Rep: 358 Cost: 0.0234 Fidelity: 0.9718 Trace: 0.9718
Rep: 359 Cost: 0.0203 Fidelity: 0.9720 Trace: 0.9720
Rep: 360 Cost: 0.0177 Fidelity: 0.9720 Trace: 0.9720
Rep: 361 Cost: 0.0173 Fidelity: 0.9722 Trace: 0.9722
Rep: 362 Cost: 0.0212 Fidelity: 0.9726 Trace: 0.9726
Rep: 363 Cost: 0.0208 Fidelity: 0.9726 Trace: 0.9726
Rep: 364 Cost: 0.0217 Fidelity: 0.9726 Trace: 0.9726
Rep: 365 Cost: 0.0210 Fidelity: 0.9727 Trace: 0.9727
Rep: 366 Cost: 0.0200 Fidelity: 0.9726 Trace: 0.9726
Rep: 367 Cost: 0.0213 Fidelity: 0.9725 Trace: 0.9725
Rep: 368 Cost: 0.0189 Fidelity: 0.9725 Trace: 0.9725
Rep: 369 Cost: 0.0201 Fidelity: 0.9724 Trace: 0.9724
Rep: 370 Cost: 0.0176 Fidelity: 0.9723 Trace: 0.9723
Rep: 371 Cost: 0.0165 Fidelity: 0.9723 Trace: 0.9723
Rep: 372 Cost: 0.0208 Fidelity: 0.9726 Trace: 0.9726
Rep: 373 Cost: 0.0215 Fidelity: 0.9724 Trace: 0.9724
Rep: 374 Cost: 0.0224 Fidelity: 0.9722 Trace: 0.9722
Rep: 375 Cost: 0.0206 Fidelity: 0.9723 Trace: 0.9724
Rep: 376 Cost: 0.0196 Fidelity: 0.9724 Trace: 0.9724
Rep: 377 Cost: 0.0219 Fidelity: 0.9723 Trace: 0.9724
Rep: 378 Cost: 0.0186 Fidelity: 0.9724 Trace: 0.9724
Rep: 379 Cost: 0.0242 Fidelity: 0.9726 Trace: 0.9727
Rep: 380 Cost: 0.0253 Fidelity: 0.9726 Trace: 0.9726
Rep: 381 Cost: 0.0227 Fidelity: 0.9725 Trace: 0.9725
Rep: 382 Cost: 0.0182 Fidelity: 0.9725 Trace: 0.9725
Rep: 383 Cost: 0.0198 Fidelity: 0.9726 Trace: 0.9726
Rep: 384 Cost: 0.0219 Fidelity: 0.9728 Trace: 0.9728
Rep: 385 Cost: 0.0189 Fidelity: 0.9727 Trace: 0.9727
Rep: 386 Cost: 0.0193 Fidelity: 0.9728 Trace: 0.9728
Rep: 387 Cost: 0.0192 Fidelity: 0.9728 Trace: 0.9728
Rep: 388 Cost: 0.0164 Fidelity: 0.9727 Trace: 0.9727
Rep: 389 Cost: 0.0188 Fidelity: 0.9727 Trace: 0.9727
Rep: 390 Cost: 0.0182 Fidelity: 0.9726 Trace: 0.9726
Rep: 391 Cost: 0.0191 Fidelity: 0.9727 Trace: 0.9727
Rep: 392 Cost: 0.0213 Fidelity: 0.9729 Trace: 0.9729
Rep: 393 Cost: 0.0196 Fidelity: 0.9728 Trace: 0.9728
Rep: 394 Cost: 0.0199 Fidelity: 0.9726 Trace: 0.9726
Rep: 395 Cost: 0.0187 Fidelity: 0.9728 Trace: 0.9728
Rep: 396 Cost: 0.0224 Fidelity: 0.9730 Trace: 0.9730
Rep: 397 Cost: 0.0189 Fidelity: 0.9728 Trace: 0.9728
Rep: 398 Cost: 0.0178 Fidelity: 0.9729 Trace: 0.9729
Rep: 399 Cost: 0.0204 Fidelity: 0.9732 Trace: 0.9732
Rep: 400 Cost: 0.0186 Fidelity: 0.9731 Trace: 0.9731
Rep: 401 Cost: 0.0218 Fidelity: 0.9727 Trace: 0.9727
Rep: 402 Cost: 0.0215 Fidelity: 0.9727 Trace: 0.9727
Rep: 403 Cost: 0.0220 Fidelity: 0.9731 Trace: 0.9731
Rep: 404 Cost: 0.0204 Fidelity: 0.9733 Trace: 0.9733
Rep: 405 Cost: 0.0187 Fidelity: 0.9731 Trace: 0.9731
Rep: 406 Cost: 0.0207 Fidelity: 0.9729 Trace: 0.9729
Rep: 407 Cost: 0.0211 Fidelity: 0.9731 Trace: 0.9731
Rep: 408 Cost: 0.0203 Fidelity: 0.9733 Trace: 0.9733
Rep: 409 Cost: 0.0231 Fidelity: 0.9732 Trace: 0.9732
Rep: 410 Cost: 0.0219 Fidelity: 0.9732 Trace: 0.9732
Rep: 411 Cost: 0.0183 Fidelity: 0.9733 Trace: 0.9734
Rep: 412 Cost: 0.0209 Fidelity: 0.9731 Trace: 0.9731
Rep: 413 Cost: 0.0233 Fidelity: 0.9729 Trace: 0.9729
Rep: 414 Cost: 0.0202 Fidelity: 0.9731 Trace: 0.9731
Rep: 415 Cost: 0.0216 Fidelity: 0.9734 Trace: 0.9734
Rep: 416 Cost: 0.0212 Fidelity: 0.9734 Trace: 0.9735
Rep: 417 Cost: 0.0177 Fidelity: 0.9731 Trace: 0.9732
Rep: 418 Cost: 0.0183 Fidelity: 0.9732 Trace: 0.9732
Rep: 419 Cost: 0.0193 Fidelity: 0.9733 Trace: 0.9734
Rep: 420 Cost: 0.0187 Fidelity: 0.9733 Trace: 0.9733
Rep: 421 Cost: 0.0169 Fidelity: 0.9733 Trace: 0.9733
Rep: 422 Cost: 0.0190 Fidelity: 0.9731 Trace: 0.9731
Rep: 423 Cost: 0.0183 Fidelity: 0.9732 Trace: 0.9732
Rep: 424 Cost: 0.0164 Fidelity: 0.9733 Trace: 0.9733
Rep: 425 Cost: 0.0157 Fidelity: 0.9733 Trace: 0.9733
Rep: 426 Cost: 0.0211 Fidelity: 0.9735 Trace: 0.9735
Rep: 427 Cost: 0.0180 Fidelity: 0.9735 Trace: 0.9735
Rep: 428 Cost: 0.0222 Fidelity: 0.9733 Trace: 0.9733
Rep: 429 Cost: 0.0200 Fidelity: 0.9734 Trace: 0.9735
Rep: 430 Cost: 0.0207 Fidelity: 0.9737 Trace: 0.9737
Rep: 431 Cost: 0.0209 Fidelity: 0.9737 Trace: 0.9737
Rep: 432 Cost: 0.0190 Fidelity: 0.9735 Trace: 0.9735
Rep: 433 Cost: 0.0192 Fidelity: 0.9736 Trace: 0.9736
Rep: 434 Cost: 0.0173 Fidelity: 0.9737 Trace: 0.9737
Rep: 435 Cost: 0.0182 Fidelity: 0.9735 Trace: 0.9735
Rep: 436 Cost: 0.0198 Fidelity: 0.9733 Trace: 0.9733
Rep: 437 Cost: 0.0210 Fidelity: 0.9734 Trace: 0.9734
Rep: 438 Cost: 0.0178 Fidelity: 0.9736 Trace: 0.9736
Rep: 439 Cost: 0.0218 Fidelity: 0.9737 Trace: 0.9737
Rep: 440 Cost: 0.0247 Fidelity: 0.9736 Trace: 0.9736
Rep: 441 Cost: 0.0226 Fidelity: 0.9734 Trace: 0.9734
Rep: 442 Cost: 0.0192 Fidelity: 0.9735 Trace: 0.9735
Rep: 443 Cost: 0.0213 Fidelity: 0.9738 Trace: 0.9738
Rep: 444 Cost: 0.0225 Fidelity: 0.9738 Trace: 0.9738
Rep: 445 Cost: 0.0178 Fidelity: 0.9736 Trace: 0.9736
Rep: 446 Cost: 0.0201 Fidelity: 0.9735 Trace: 0.9735
Rep: 447 Cost: 0.0184 Fidelity: 0.9738 Trace: 0.9738
Rep: 448 Cost: 0.0195 Fidelity: 0.9740 Trace: 0.9740
Rep: 449 Cost: 0.0198 Fidelity: 0.9740 Trace: 0.9740
Rep: 450 Cost: 0.0192 Fidelity: 0.9737 Trace: 0.9737
Rep: 451 Cost: 0.0202 Fidelity: 0.9735 Trace: 0.9735
Rep: 452 Cost: 0.0178 Fidelity: 0.9738 Trace: 0.9738
Rep: 453 Cost: 0.0198 Fidelity: 0.9740 Trace: 0.9740
Rep: 454 Cost: 0.0192 Fidelity: 0.9739 Trace: 0.9739
Rep: 455 Cost: 0.0190 Fidelity: 0.9737 Trace: 0.9737
Rep: 456 Cost: 0.0195 Fidelity: 0.9738 Trace: 0.9738
Rep: 457 Cost: 0.0202 Fidelity: 0.9739 Trace: 0.9739
Rep: 458 Cost: 0.0170 Fidelity: 0.9738 Trace: 0.9738
Rep: 459 Cost: 0.0182 Fidelity: 0.9737 Trace: 0.9737
Rep: 460 Cost: 0.0170 Fidelity: 0.9739 Trace: 0.9739
Rep: 461 Cost: 0.0161 Fidelity: 0.9739 Trace: 0.9739
Rep: 462 Cost: 0.0177 Fidelity: 0.9741 Trace: 0.9741
Rep: 463 Cost: 0.0168 Fidelity: 0.9740 Trace: 0.9740
Rep: 464 Cost: 0.0172 Fidelity: 0.9739 Trace: 0.9739
Rep: 465 Cost: 0.0172 Fidelity: 0.9740 Trace: 0.9740
Rep: 466 Cost: 0.0173 Fidelity: 0.9741 Trace: 0.9741
Rep: 467 Cost: 0.0174 Fidelity: 0.9742 Trace: 0.9742
Rep: 468 Cost: 0.0163 Fidelity: 0.9740 Trace: 0.9740
Rep: 469 Cost: 0.0170 Fidelity: 0.9740 Trace: 0.9740
Rep: 470 Cost: 0.0185 Fidelity: 0.9742 Trace: 0.9742
Rep: 471 Cost: 0.0172 Fidelity: 0.9742 Trace: 0.9743
Rep: 472 Cost: 0.0176 Fidelity: 0.9740 Trace: 0.9740
Rep: 473 Cost: 0.0161 Fidelity: 0.9741 Trace: 0.9741
Rep: 474 Cost: 0.0155 Fidelity: 0.9741 Trace: 0.9741
Rep: 475 Cost: 0.0160 Fidelity: 0.9740 Trace: 0.9740
Rep: 476 Cost: 0.0155 Fidelity: 0.9741 Trace: 0.9741
Rep: 477 Cost: 0.0162 Fidelity: 0.9742 Trace: 0.9742
Rep: 478 Cost: 0.0169 Fidelity: 0.9741 Trace: 0.9741
Rep: 479 Cost: 0.0161 Fidelity: 0.9742 Trace: 0.9742
Rep: 480 Cost: 0.0154 Fidelity: 0.9742 Trace: 0.9742
Rep: 481 Cost: 0.0163 Fidelity: 0.9744 Trace: 0.9744
Rep: 482 Cost: 0.0152 Fidelity: 0.9743 Trace: 0.9743
Rep: 483 Cost: 0.0163 Fidelity: 0.9741 Trace: 0.9741
Rep: 484 Cost: 0.0169 Fidelity: 0.9743 Trace: 0.9743
Rep: 485 Cost: 0.0161 Fidelity: 0.9743 Trace: 0.9743
Rep: 486 Cost: 0.0174 Fidelity: 0.9742 Trace: 0.9742
Rep: 487 Cost: 0.0166 Fidelity: 0.9744 Trace: 0.9744
Rep: 488 Cost: 0.0163 Fidelity: 0.9745 Trace: 0.9745
Rep: 489 Cost: 0.0176 Fidelity: 0.9743 Trace: 0.9743
Rep: 490 Cost: 0.0161 Fidelity: 0.9743 Trace: 0.9743
Rep: 491 Cost: 0.0190 Fidelity: 0.9746 Trace: 0.9746
Rep: 492 Cost: 0.0182 Fidelity: 0.9746 Trace: 0.9746
Rep: 493 Cost: 0.0182 Fidelity: 0.9744 Trace: 0.9744
Rep: 494 Cost: 0.0145 Fidelity: 0.9744 Trace: 0.9744
Rep: 495 Cost: 0.0200 Fidelity: 0.9746 Trace: 0.9746
Rep: 496 Cost: 0.0168 Fidelity: 0.9745 Trace: 0.9745
Rep: 497 Cost: 0.0210 Fidelity: 0.9743 Trace: 0.9743
Rep: 498 Cost: 0.0192 Fidelity: 0.9744 Trace: 0.9744
Rep: 499 Cost: 0.0181 Fidelity: 0.9746 Trace: 0.9746
Rep: 500 Cost: 0.0182 Fidelity: 0.9746 Trace: 0.9746
Rep: 501 Cost: 0.0182 Fidelity: 0.9744 Trace: 0.9745
Rep: 502 Cost: 0.0182 Fidelity: 0.9745 Trace: 0.9746
Rep: 503 Cost: 0.0173 Fidelity: 0.9747 Trace: 0.9747
Rep: 504 Cost: 0.0164 Fidelity: 0.9746 Trace: 0.9746
Rep: 505 Cost: 0.0192 Fidelity: 0.9744 Trace: 0.9744
Rep: 506 Cost: 0.0159 Fidelity: 0.9746 Trace: 0.9746
Rep: 507 Cost: 0.0218 Fidelity: 0.9748 Trace: 0.9748
Rep: 508 Cost: 0.0206 Fidelity: 0.9748 Trace: 0.9748
Rep: 509 Cost: 0.0173 Fidelity: 0.9745 Trace: 0.9745
Rep: 510 Cost: 0.0185 Fidelity: 0.9746 Trace: 0.9746
Rep: 511 Cost: 0.0178 Fidelity: 0.9748 Trace: 0.9748
Rep: 512 Cost: 0.0181 Fidelity: 0.9748 Trace: 0.9748
Rep: 513 Cost: 0.0169 Fidelity: 0.9746 Trace: 0.9746
Rep: 514 Cost: 0.0146 Fidelity: 0.9747 Trace: 0.9747
Rep: 515 Cost: 0.0228 Fidelity: 0.9750 Trace: 0.9750
Rep: 516 Cost: 0.0223 Fidelity: 0.9750 Trace: 0.9751
Rep: 517 Cost: 0.0157 Fidelity: 0.9748 Trace: 0.9748
Rep: 518 Cost: 0.0189 Fidelity: 0.9746 Trace: 0.9746
Rep: 519 Cost: 0.0160 Fidelity: 0.9747 Trace: 0.9747
Rep: 520 Cost: 0.0160 Fidelity: 0.9748 Trace: 0.9748
Rep: 521 Cost: 0.0171 Fidelity: 0.9748 Trace: 0.9748
Rep: 522 Cost: 0.0151 Fidelity: 0.9748 Trace: 0.9748
Rep: 523 Cost: 0.0145 Fidelity: 0.9748 Trace: 0.9748
Rep: 524 Cost: 0.0187 Fidelity: 0.9747 Trace: 0.9747
Rep: 525 Cost: 0.0171 Fidelity: 0.9748 Trace: 0.9748
Rep: 526 Cost: 0.0169 Fidelity: 0.9749 Trace: 0.9749
Rep: 527 Cost: 0.0158 Fidelity: 0.9749 Trace: 0.9749
Rep: 528 Cost: 0.0183 Fidelity: 0.9748 Trace: 0.9749
Rep: 529 Cost: 0.0173 Fidelity: 0.9749 Trace: 0.9750
Rep: 530 Cost: 0.0174 Fidelity: 0.9750 Trace: 0.9750
Rep: 531 Cost: 0.0174 Fidelity: 0.9748 Trace: 0.9748
Rep: 532 Cost: 0.0165 Fidelity: 0.9749 Trace: 0.9749
Rep: 533 Cost: 0.0180 Fidelity: 0.9752 Trace: 0.9752
Rep: 534 Cost: 0.0172 Fidelity: 0.9752 Trace: 0.9752
Rep: 535 Cost: 0.0172 Fidelity: 0.9749 Trace: 0.9749
Rep: 536 Cost: 0.0169 Fidelity: 0.9749 Trace: 0.9749
Rep: 537 Cost: 0.0172 Fidelity: 0.9751 Trace: 0.9752
Rep: 538 Cost: 0.0168 Fidelity: 0.9751 Trace: 0.9751
Rep: 539 Cost: 0.0172 Fidelity: 0.9749 Trace: 0.9749
Rep: 540 Cost: 0.0161 Fidelity: 0.9750 Trace: 0.9750
Rep: 541 Cost: 0.0179 Fidelity: 0.9752 Trace: 0.9752
Rep: 542 Cost: 0.0170 Fidelity: 0.9752 Trace: 0.9752
Rep: 543 Cost: 0.0176 Fidelity: 0.9750 Trace: 0.9750
Rep: 544 Cost: 0.0160 Fidelity: 0.9750 Trace: 0.9750
Rep: 545 Cost: 0.0178 Fidelity: 0.9752 Trace: 0.9752
Rep: 546 Cost: 0.0164 Fidelity: 0.9752 Trace: 0.9752
Rep: 547 Cost: 0.0177 Fidelity: 0.9750 Trace: 0.9750
Rep: 548 Cost: 0.0154 Fidelity: 0.9751 Trace: 0.9751
Rep: 549 Cost: 0.0184 Fidelity: 0.9753 Trace: 0.9753
Rep: 550 Cost: 0.0165 Fidelity: 0.9753 Trace: 0.9753
Rep: 551 Cost: 0.0186 Fidelity: 0.9750 Trace: 0.9751
Rep: 552 Cost: 0.0171 Fidelity: 0.9751 Trace: 0.9751
Rep: 553 Cost: 0.0179 Fidelity: 0.9753 Trace: 0.9753
Rep: 554 Cost: 0.0170 Fidelity: 0.9754 Trace: 0.9754
Rep: 555 Cost: 0.0185 Fidelity: 0.9752 Trace: 0.9752
Rep: 556 Cost: 0.0166 Fidelity: 0.9751 Trace: 0.9751
Rep: 557 Cost: 0.0193 Fidelity: 0.9754 Trace: 0.9754
Rep: 558 Cost: 0.0184 Fidelity: 0.9754 Trace: 0.9754
Rep: 559 Cost: 0.0171 Fidelity: 0.9753 Trace: 0.9753
Rep: 560 Cost: 0.0169 Fidelity: 0.9752 Trace: 0.9753
Rep: 561 Cost: 0.0176 Fidelity: 0.9754 Trace: 0.9754
Rep: 562 Cost: 0.0167 Fidelity: 0.9755 Trace: 0.9755
Rep: 563 Cost: 0.0180 Fidelity: 0.9753 Trace: 0.9753
Rep: 564 Cost: 0.0170 Fidelity: 0.9753 Trace: 0.9753
Rep: 565 Cost: 0.0180 Fidelity: 0.9755 Trace: 0.9755
Rep: 566 Cost: 0.0177 Fidelity: 0.9755 Trace: 0.9755
Rep: 567 Cost: 0.0167 Fidelity: 0.9753 Trace: 0.9753
Rep: 568 Cost: 0.0168 Fidelity: 0.9754 Trace: 0.9754
Rep: 569 Cost: 0.0174 Fidelity: 0.9756 Trace: 0.9756
Rep: 570 Cost: 0.0164 Fidelity: 0.9756 Trace: 0.9756
Rep: 571 Cost: 0.0173 Fidelity: 0.9754 Trace: 0.9754
Rep: 572 Cost: 0.0167 Fidelity: 0.9755 Trace: 0.9755
Rep: 573 Cost: 0.0148 Fidelity: 0.9755 Trace: 0.9756
Rep: 574 Cost: 0.0145 Fidelity: 0.9755 Trace: 0.9755
Rep: 575 Cost: 0.0160 Fidelity: 0.9755 Trace: 0.9755
Rep: 576 Cost: 0.0155 Fidelity: 0.9756 Trace: 0.9756
Rep: 577 Cost: 0.0165 Fidelity: 0.9755 Trace: 0.9755
Rep: 578 Cost: 0.0165 Fidelity: 0.9754 Trace: 0.9754
Rep: 579 Cost: 0.0152 Fidelity: 0.9755 Trace: 0.9755
Rep: 580 Cost: 0.0156 Fidelity: 0.9756 Trace: 0.9756
Rep: 581 Cost: 0.0149 Fidelity: 0.9755 Trace: 0.9755
Rep: 582 Cost: 0.0139 Fidelity: 0.9756 Trace: 0.9756
Rep: 583 Cost: 0.0161 Fidelity: 0.9757 Trace: 0.9757
Rep: 584 Cost: 0.0151 Fidelity: 0.9756 Trace: 0.9756
Rep: 585 Cost: 0.0181 Fidelity: 0.9757 Trace: 0.9757
Rep: 586 Cost: 0.0173 Fidelity: 0.9757 Trace: 0.9757
Rep: 587 Cost: 0.0166 Fidelity: 0.9756 Trace: 0.9756
Rep: 588 Cost: 0.0171 Fidelity: 0.9758 Trace: 0.9758
Rep: 589 Cost: 0.0182 Fidelity: 0.9759 Trace: 0.9759
Rep: 590 Cost: 0.0173 Fidelity: 0.9758 Trace: 0.9758
Rep: 591 Cost: 0.0170 Fidelity: 0.9757 Trace: 0.9757
Rep: 592 Cost: 0.0159 Fidelity: 0.9756 Trace: 0.9756
Rep: 593 Cost: 0.0182 Fidelity: 0.9758 Trace: 0.9758
Rep: 594 Cost: 0.0186 Fidelity: 0.9758 Trace: 0.9758
Rep: 595 Cost: 0.0159 Fidelity: 0.9756 Trace: 0.9756
Rep: 596 Cost: 0.0160 Fidelity: 0.9756 Trace: 0.9756
Rep: 597 Cost: 0.0163 Fidelity: 0.9758 Trace: 0.9758
Rep: 598 Cost: 0.0167 Fidelity: 0.9759 Trace: 0.9759
Rep: 599 Cost: 0.0146 Fidelity: 0.9757 Trace: 0.9757
Rep: 600 Cost: 0.0169 Fidelity: 0.9758 Trace: 0.9758
Rep: 601 Cost: 0.0156 Fidelity: 0.9757 Trace: 0.9757
Rep: 602 Cost: 0.0149 Fidelity: 0.9758 Trace: 0.9758
Rep: 603 Cost: 0.0151 Fidelity: 0.9759 Trace: 0.9759
Rep: 604 Cost: 0.0161 Fidelity: 0.9759 Trace: 0.9759
Rep: 605 Cost: 0.0137 Fidelity: 0.9759 Trace: 0.9759
Rep: 606 Cost: 0.0178 Fidelity: 0.9759 Trace: 0.9759
Rep: 607 Cost: 0.0182 Fidelity: 0.9759 Trace: 0.9760
Rep: 608 Cost: 0.0139 Fidelity: 0.9759 Trace: 0.9759
Rep: 609 Cost: 0.0183 Fidelity: 0.9758 Trace: 0.9759
Rep: 610 Cost: 0.0188 Fidelity: 0.9759 Trace: 0.9759
Rep: 611 Cost: 0.0156 Fidelity: 0.9760 Trace: 0.9760
Rep: 612 Cost: 0.0172 Fidelity: 0.9761 Trace: 0.9761
Rep: 613 Cost: 0.0186 Fidelity: 0.9759 Trace: 0.9759
Rep: 614 Cost: 0.0177 Fidelity: 0.9758 Trace: 0.9758
Rep: 615 Cost: 0.0161 Fidelity: 0.9759 Trace: 0.9759
Rep: 616 Cost: 0.0164 Fidelity: 0.9761 Trace: 0.9761
Rep: 617 Cost: 0.0184 Fidelity: 0.9762 Trace: 0.9762
Rep: 618 Cost: 0.0168 Fidelity: 0.9761 Trace: 0.9761
Rep: 619 Cost: 0.0150 Fidelity: 0.9759 Trace: 0.9759
Rep: 620 Cost: 0.0181 Fidelity: 0.9760 Trace: 0.9760
Rep: 621 Cost: 0.0164 Fidelity: 0.9760 Trace: 0.9760
Rep: 622 Cost: 0.0149 Fidelity: 0.9760 Trace: 0.9760
Rep: 623 Cost: 0.0156 Fidelity: 0.9761 Trace: 0.9761
Rep: 624 Cost: 0.0162 Fidelity: 0.9761 Trace: 0.9761
Rep: 625 Cost: 0.0136 Fidelity: 0.9760 Trace: 0.9760
Rep: 626 Cost: 0.0158 Fidelity: 0.9762 Trace: 0.9762
Rep: 627 Cost: 0.0153 Fidelity: 0.9761 Trace: 0.9761
Rep: 628 Cost: 0.0149 Fidelity: 0.9761 Trace: 0.9761
Rep: 629 Cost: 0.0150 Fidelity: 0.9762 Trace: 0.9762
Rep: 630 Cost: 0.0148 Fidelity: 0.9762 Trace: 0.9762
Rep: 631 Cost: 0.0142 Fidelity: 0.9761 Trace: 0.9761
Rep: 632 Cost: 0.0165 Fidelity: 0.9760 Trace: 0.9760
Rep: 633 Cost: 0.0147 Fidelity: 0.9761 Trace: 0.9761
Rep: 634 Cost: 0.0150 Fidelity: 0.9762 Trace: 0.9762
Rep: 635 Cost: 0.0148 Fidelity: 0.9761 Trace: 0.9761
Rep: 636 Cost: 0.0159 Fidelity: 0.9761 Trace: 0.9761
Rep: 637 Cost: 0.0150 Fidelity: 0.9762 Trace: 0.9762
Rep: 638 Cost: 0.0164 Fidelity: 0.9762 Trace: 0.9762
Rep: 639 Cost: 0.0166 Fidelity: 0.9763 Trace: 0.9763
Rep: 640 Cost: 0.0138 Fidelity: 0.9763 Trace: 0.9763
Rep: 641 Cost: 0.0174 Fidelity: 0.9761 Trace: 0.9761
Rep: 642 Cost: 0.0151 Fidelity: 0.9762 Trace: 0.9762
Rep: 643 Cost: 0.0179 Fidelity: 0.9765 Trace: 0.9765
Rep: 644 Cost: 0.0187 Fidelity: 0.9765 Trace: 0.9765
Rep: 645 Cost: 0.0147 Fidelity: 0.9763 Trace: 0.9763
Rep: 646 Cost: 0.0175 Fidelity: 0.9762 Trace: 0.9762
Rep: 647 Cost: 0.0183 Fidelity: 0.9762 Trace: 0.9762
Rep: 648 Cost: 0.0138 Fidelity: 0.9763 Trace: 0.9763
Rep: 649 Cost: 0.0192 Fidelity: 0.9763 Trace: 0.9764
Rep: 650 Cost: 0.0198 Fidelity: 0.9764 Trace: 0.9764
Rep: 651 Cost: 0.0152 Fidelity: 0.9764 Trace: 0.9765
Rep: 652 Cost: 0.0190 Fidelity: 0.9763 Trace: 0.9763
Rep: 653 Cost: 0.0214 Fidelity: 0.9762 Trace: 0.9762
Rep: 654 Cost: 0.0175 Fidelity: 0.9763 Trace: 0.9763
Rep: 655 Cost: 0.0161 Fidelity: 0.9765 Trace: 0.9765
Rep: 656 Cost: 0.0187 Fidelity: 0.9765 Trace: 0.9765
Rep: 657 Cost: 0.0154 Fidelity: 0.9764 Trace: 0.9764
Rep: 658 Cost: 0.0172 Fidelity: 0.9763 Trace: 0.9764
Rep: 659 Cost: 0.0189 Fidelity: 0.9763 Trace: 0.9764
Rep: 660 Cost: 0.0144 Fidelity: 0.9764 Trace: 0.9764
Rep: 661 Cost: 0.0187 Fidelity: 0.9765 Trace: 0.9766
Rep: 662 Cost: 0.0210 Fidelity: 0.9766 Trace: 0.9766
Rep: 663 Cost: 0.0174 Fidelity: 0.9766 Trace: 0.9766
Rep: 664 Cost: 0.0157 Fidelity: 0.9764 Trace: 0.9764
Rep: 665 Cost: 0.0179 Fidelity: 0.9764 Trace: 0.9764
Rep: 666 Cost: 0.0145 Fidelity: 0.9765 Trace: 0.9765
Rep: 667 Cost: 0.0175 Fidelity: 0.9766 Trace: 0.9766
Rep: 668 Cost: 0.0184 Fidelity: 0.9766 Trace: 0.9767
Rep: 669 Cost: 0.0151 Fidelity: 0.9766 Trace: 0.9766
Rep: 670 Cost: 0.0173 Fidelity: 0.9765 Trace: 0.9765
Rep: 671 Cost: 0.0180 Fidelity: 0.9765 Trace: 0.9765
Rep: 672 Cost: 0.0148 Fidelity: 0.9767 Trace: 0.9767
Rep: 673 Cost: 0.0178 Fidelity: 0.9768 Trace: 0.9768
Rep: 674 Cost: 0.0178 Fidelity: 0.9767 Trace: 0.9767
Rep: 675 Cost: 0.0160 Fidelity: 0.9766 Trace: 0.9766
Rep: 676 Cost: 0.0160 Fidelity: 0.9766 Trace: 0.9766
Rep: 677 Cost: 0.0156 Fidelity: 0.9767 Trace: 0.9767
Rep: 678 Cost: 0.0161 Fidelity: 0.9768 Trace: 0.9768
Rep: 679 Cost: 0.0164 Fidelity: 0.9767 Trace: 0.9767
Rep: 680 Cost: 0.0142 Fidelity: 0.9767 Trace: 0.9767
Rep: 681 Cost: 0.0170 Fidelity: 0.9766 Trace: 0.9766
Rep: 682 Cost: 0.0157 Fidelity: 0.9766 Trace: 0.9766
Rep: 683 Cost: 0.0156 Fidelity: 0.9768 Trace: 0.9768
Rep: 684 Cost: 0.0166 Fidelity: 0.9768 Trace: 0.9769
Rep: 685 Cost: 0.0138 Fidelity: 0.9767 Trace: 0.9767
Rep: 686 Cost: 0.0187 Fidelity: 0.9766 Trace: 0.9766
Rep: 687 Cost: 0.0200 Fidelity: 0.9766 Trace: 0.9766
Rep: 688 Cost: 0.0166 Fidelity: 0.9766 Trace: 0.9766
Rep: 689 Cost: 0.0173 Fidelity: 0.9767 Trace: 0.9767
Rep: 690 Cost: 0.0185 Fidelity: 0.9768 Trace: 0.9768
Rep: 691 Cost: 0.0163 Fidelity: 0.9769 Trace: 0.9769
Rep: 692 Cost: 0.0147 Fidelity: 0.9767 Trace: 0.9767
Rep: 693 Cost: 0.0166 Fidelity: 0.9766 Trace: 0.9766
Rep: 694 Cost: 0.0159 Fidelity: 0.9768 Trace: 0.9768
Rep: 695 Cost: 0.0165 Fidelity: 0.9770 Trace: 0.9770
Rep: 696 Cost: 0.0137 Fidelity: 0.9769 Trace: 0.9769
Rep: 697 Cost: 0.0183 Fidelity: 0.9767 Trace: 0.9767
Rep: 698 Cost: 0.0165 Fidelity: 0.9768 Trace: 0.9768
Rep: 699 Cost: 0.0177 Fidelity: 0.9770 Trace: 0.9770
Rep: 700 Cost: 0.0185 Fidelity: 0.9771 Trace: 0.9771
Rep: 701 Cost: 0.0156 Fidelity: 0.9770 Trace: 0.9770
Rep: 702 Cost: 0.0184 Fidelity: 0.9768 Trace: 0.9768
Rep: 703 Cost: 0.0164 Fidelity: 0.9768 Trace: 0.9768
Rep: 704 Cost: 0.0167 Fidelity: 0.9770 Trace: 0.9771
Rep: 705 Cost: 0.0152 Fidelity: 0.9771 Trace: 0.9771
Rep: 706 Cost: 0.0168 Fidelity: 0.9769 Trace: 0.9769
Rep: 707 Cost: 0.0152 Fidelity: 0.9769 Trace: 0.9769
Rep: 708 Cost: 0.0173 Fidelity: 0.9770 Trace: 0.9770
Rep: 709 Cost: 0.0148 Fidelity: 0.9770 Trace: 0.9770
Rep: 710 Cost: 0.0198 Fidelity: 0.9769 Trace: 0.9769
Rep: 711 Cost: 0.0199 Fidelity: 0.9769 Trace: 0.9769
Rep: 712 Cost: 0.0145 Fidelity: 0.9771 Trace: 0.9771
Rep: 713 Cost: 0.0172 Fidelity: 0.9771 Trace: 0.9771
Rep: 714 Cost: 0.0141 Fidelity: 0.9770 Trace: 0.9770
Rep: 715 Cost: 0.0157 Fidelity: 0.9770 Trace: 0.9770
Rep: 716 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 717 Cost: 0.0146 Fidelity: 0.9771 Trace: 0.9771
Rep: 718 Cost: 0.0155 Fidelity: 0.9770 Trace: 0.9770
Rep: 719 Cost: 0.0133 Fidelity: 0.9771 Trace: 0.9771
Rep: 720 Cost: 0.0138 Fidelity: 0.9772 Trace: 0.9772
Rep: 721 Cost: 0.0148 Fidelity: 0.9770 Trace: 0.9770
Rep: 722 Cost: 0.0144 Fidelity: 0.9771 Trace: 0.9771
Rep: 723 Cost: 0.0146 Fidelity: 0.9772 Trace: 0.9772
Rep: 724 Cost: 0.0163 Fidelity: 0.9771 Trace: 0.9771
Rep: 725 Cost: 0.0138 Fidelity: 0.9771 Trace: 0.9771
Rep: 726 Cost: 0.0195 Fidelity: 0.9773 Trace: 0.9773
Rep: 727 Cost: 0.0179 Fidelity: 0.9773 Trace: 0.9773
Rep: 728 Cost: 0.0161 Fidelity: 0.9771 Trace: 0.9771
Rep: 729 Cost: 0.0171 Fidelity: 0.9771 Trace: 0.9772
Rep: 730 Cost: 0.0142 Fidelity: 0.9773 Trace: 0.9773
Rep: 731 Cost: 0.0157 Fidelity: 0.9773 Trace: 0.9773
Rep: 732 Cost: 0.0146 Fidelity: 0.9772 Trace: 0.9772
Rep: 733 Cost: 0.0144 Fidelity: 0.9773 Trace: 0.9773
Rep: 734 Cost: 0.0161 Fidelity: 0.9773 Trace: 0.9774
Rep: 735 Cost: 0.0133 Fidelity: 0.9772 Trace: 0.9772
Rep: 736 Cost: 0.0194 Fidelity: 0.9770 Trace: 0.9770
Rep: 737 Cost: 0.0179 Fidelity: 0.9771 Trace: 0.9771
Rep: 738 Cost: 0.0159 Fidelity: 0.9774 Trace: 0.9774
Rep: 739 Cost: 0.0188 Fidelity: 0.9774 Trace: 0.9775
Rep: 740 Cost: 0.0133 Fidelity: 0.9773 Trace: 0.9773
Rep: 741 Cost: 0.0209 Fidelity: 0.9771 Trace: 0.9771
Rep: 742 Cost: 0.0207 Fidelity: 0.9771 Trace: 0.9771
Rep: 743 Cost: 0.0132 Fidelity: 0.9774 Trace: 0.9774
Rep: 744 Cost: 0.0180 Fidelity: 0.9775 Trace: 0.9775
Rep: 745 Cost: 0.0141 Fidelity: 0.9774 Trace: 0.9775
Rep: 746 Cost: 0.0205 Fidelity: 0.9772 Trace: 0.9772
Rep: 747 Cost: 0.0218 Fidelity: 0.9771 Trace: 0.9772
Rep: 748 Cost: 0.0142 Fidelity: 0.9773 Trace: 0.9773
Rep: 749 Cost: 0.0204 Fidelity: 0.9776 Trace: 0.9777
Rep: 750 Cost: 0.0207 Fidelity: 0.9777 Trace: 0.9777
Rep: 751 Cost: 0.0152 Fidelity: 0.9776 Trace: 0.9776
Rep: 752 Cost: 0.0198 Fidelity: 0.9772 Trace: 0.9773
Rep: 753 Cost: 0.0197 Fidelity: 0.9772 Trace: 0.9772
Rep: 754 Cost: 0.0162 Fidelity: 0.9774 Trace: 0.9774
Rep: 755 Cost: 0.0187 Fidelity: 0.9776 Trace: 0.9776
Rep: 756 Cost: 0.0176 Fidelity: 0.9776 Trace: 0.9776
Rep: 757 Cost: 0.0174 Fidelity: 0.9775 Trace: 0.9775
Rep: 758 Cost: 0.0158 Fidelity: 0.9774 Trace: 0.9774
Rep: 759 Cost: 0.0176 Fidelity: 0.9775 Trace: 0.9775
Rep: 760 Cost: 0.0179 Fidelity: 0.9776 Trace: 0.9776
Rep: 761 Cost: 0.0137 Fidelity: 0.9775 Trace: 0.9775
Rep: 762 Cost: 0.0169 Fidelity: 0.9774 Trace: 0.9774
Rep: 763 Cost: 0.0146 Fidelity: 0.9774 Trace: 0.9774
Rep: 764 Cost: 0.0154 Fidelity: 0.9776 Trace: 0.9776
Rep: 765 Cost: 0.0130 Fidelity: 0.9776 Trace: 0.9776
Rep: 766 Cost: 0.0167 Fidelity: 0.9774 Trace: 0.9774
Rep: 767 Cost: 0.0152 Fidelity: 0.9774 Trace: 0.9775
Rep: 768 Cost: 0.0161 Fidelity: 0.9777 Trace: 0.9777
Rep: 769 Cost: 0.0155 Fidelity: 0.9777 Trace: 0.9777
Rep: 770 Cost: 0.0162 Fidelity: 0.9775 Trace: 0.9775
Rep: 771 Cost: 0.0153 Fidelity: 0.9775 Trace: 0.9775
Rep: 772 Cost: 0.0168 Fidelity: 0.9777 Trace: 0.9777
Rep: 773 Cost: 0.0148 Fidelity: 0.9777 Trace: 0.9777
Rep: 774 Cost: 0.0176 Fidelity: 0.9776 Trace: 0.9776
Rep: 775 Cost: 0.0175 Fidelity: 0.9776 Trace: 0.9776
Rep: 776 Cost: 0.0147 Fidelity: 0.9777 Trace: 0.9778
Rep: 777 Cost: 0.0176 Fidelity: 0.9778 Trace: 0.9778
Rep: 778 Cost: 0.0137 Fidelity: 0.9776 Trace: 0.9776
Rep: 779 Cost: 0.0158 Fidelity: 0.9776 Trace: 0.9776
Rep: 780 Cost: 0.0137 Fidelity: 0.9777 Trace: 0.9778
Rep: 781 Cost: 0.0136 Fidelity: 0.9777 Trace: 0.9777
Rep: 782 Cost: 0.0132 Fidelity: 0.9777 Trace: 0.9777
Rep: 783 Cost: 0.0144 Fidelity: 0.9778 Trace: 0.9778
Rep: 784 Cost: 0.0134 Fidelity: 0.9777 Trace: 0.9777
Rep: 785 Cost: 0.0156 Fidelity: 0.9776 Trace: 0.9776
Rep: 786 Cost: 0.0136 Fidelity: 0.9777 Trace: 0.9777
Rep: 787 Cost: 0.0165 Fidelity: 0.9779 Trace: 0.9779
Rep: 788 Cost: 0.0165 Fidelity: 0.9779 Trace: 0.9779
Rep: 789 Cost: 0.0144 Fidelity: 0.9778 Trace: 0.9778
Rep: 790 Cost: 0.0161 Fidelity: 0.9777 Trace: 0.9777
Rep: 791 Cost: 0.0161 Fidelity: 0.9778 Trace: 0.9778
Rep: 792 Cost: 0.0133 Fidelity: 0.9778 Trace: 0.9778
Rep: 793 Cost: 0.0162 Fidelity: 0.9778 Trace: 0.9778
Rep: 794 Cost: 0.0144 Fidelity: 0.9779 Trace: 0.9779
Rep: 795 Cost: 0.0155 Fidelity: 0.9778 Trace: 0.9778
Rep: 796 Cost: 0.0151 Fidelity: 0.9778 Trace: 0.9778
Rep: 797 Cost: 0.0141 Fidelity: 0.9778 Trace: 0.9778
Rep: 798 Cost: 0.0142 Fidelity: 0.9778 Trace: 0.9778
Rep: 799 Cost: 0.0157 Fidelity: 0.9779 Trace: 0.9779
Rep: 800 Cost: 0.0160 Fidelity: 0.9778 Trace: 0.9778
Rep: 801 Cost: 0.0159 Fidelity: 0.9777 Trace: 0.9777
Rep: 802 Cost: 0.0142 Fidelity: 0.9778 Trace: 0.9778
Rep: 803 Cost: 0.0167 Fidelity: 0.9780 Trace: 0.9780
Rep: 804 Cost: 0.0158 Fidelity: 0.9780 Trace: 0.9780
Rep: 805 Cost: 0.0153 Fidelity: 0.9778 Trace: 0.9778
Rep: 806 Cost: 0.0149 Fidelity: 0.9779 Trace: 0.9779
Rep: 807 Cost: 0.0154 Fidelity: 0.9781 Trace: 0.9781
Rep: 808 Cost: 0.0164 Fidelity: 0.9781 Trace: 0.9782
Rep: 809 Cost: 0.0123 Fidelity: 0.9780 Trace: 0.9780
Rep: 810 Cost: 0.0183 Fidelity: 0.9778 Trace: 0.9778
Rep: 811 Cost: 0.0176 Fidelity: 0.9778 Trace: 0.9778
Rep: 812 Cost: 0.0146 Fidelity: 0.9781 Trace: 0.9781
Rep: 813 Cost: 0.0184 Fidelity: 0.9782 Trace: 0.9782
Rep: 814 Cost: 0.0181 Fidelity: 0.9781 Trace: 0.9782
Rep: 815 Cost: 0.0173 Fidelity: 0.9780 Trace: 0.9780
Rep: 816 Cost: 0.0165 Fidelity: 0.9778 Trace: 0.9779
Rep: 817 Cost: 0.0172 Fidelity: 0.9779 Trace: 0.9780
Rep: 818 Cost: 0.0154 Fidelity: 0.9781 Trace: 0.9781
Rep: 819 Cost: 0.0153 Fidelity: 0.9781 Trace: 0.9781
Rep: 820 Cost: 0.0183 Fidelity: 0.9780 Trace: 0.9780
Rep: 821 Cost: 0.0141 Fidelity: 0.9781 Trace: 0.9781
Rep: 822 Cost: 0.0193 Fidelity: 0.9782 Trace: 0.9782
Rep: 823 Cost: 0.0201 Fidelity: 0.9782 Trace: 0.9782
Rep: 824 Cost: 0.0139 Fidelity: 0.9780 Trace: 0.9780
Rep: 825 Cost: 0.0194 Fidelity: 0.9780 Trace: 0.9781
Rep: 826 Cost: 0.0191 Fidelity: 0.9782 Trace: 0.9782
Rep: 827 Cost: 0.0175 Fidelity: 0.9784 Trace: 0.9784
Rep: 828 Cost: 0.0199 Fidelity: 0.9783 Trace: 0.9784
Rep: 829 Cost: 0.0197 Fidelity: 0.9782 Trace: 0.9782
Rep: 830 Cost: 0.0179 Fidelity: 0.9780 Trace: 0.9780
Rep: 831 Cost: 0.0175 Fidelity: 0.9780 Trace: 0.9780
Rep: 832 Cost: 0.0162 Fidelity: 0.9782 Trace: 0.9782
Rep: 833 Cost: 0.0181 Fidelity: 0.9783 Trace: 0.9783
Rep: 834 Cost: 0.0154 Fidelity: 0.9782 Trace: 0.9782
Rep: 835 Cost: 0.0167 Fidelity: 0.9780 Trace: 0.9780
Rep: 836 Cost: 0.0176 Fidelity: 0.9780 Trace: 0.9780
Rep: 837 Cost: 0.0137 Fidelity: 0.9783 Trace: 0.9783
Rep: 838 Cost: 0.0161 Fidelity: 0.9784 Trace: 0.9784
Rep: 839 Cost: 0.0148 Fidelity: 0.9783 Trace: 0.9783
Rep: 840 Cost: 0.0146 Fidelity: 0.9781 Trace: 0.9782
Rep: 841 Cost: 0.0141 Fidelity: 0.9782 Trace: 0.9782
Rep: 842 Cost: 0.0148 Fidelity: 0.9784 Trace: 0.9784
Rep: 843 Cost: 0.0146 Fidelity: 0.9784 Trace: 0.9784
Rep: 844 Cost: 0.0144 Fidelity: 0.9782 Trace: 0.9782
Rep: 845 Cost: 0.0150 Fidelity: 0.9782 Trace: 0.9782
Rep: 846 Cost: 0.0158 Fidelity: 0.9783 Trace: 0.9783
Rep: 847 Cost: 0.0132 Fidelity: 0.9783 Trace: 0.9783
Rep: 848 Cost: 0.0141 Fidelity: 0.9783 Trace: 0.9783
Rep: 849 Cost: 0.0163 Fidelity: 0.9785 Trace: 0.9785
Rep: 850 Cost: 0.0157 Fidelity: 0.9785 Trace: 0.9785
Rep: 851 Cost: 0.0137 Fidelity: 0.9784 Trace: 0.9784
Rep: 852 Cost: 0.0189 Fidelity: 0.9782 Trace: 0.9782
Rep: 853 Cost: 0.0196 Fidelity: 0.9782 Trace: 0.9782
Rep: 854 Cost: 0.0154 Fidelity: 0.9784 Trace: 0.9784
Rep: 855 Cost: 0.0160 Fidelity: 0.9785 Trace: 0.9785
Rep: 856 Cost: 0.0192 Fidelity: 0.9786 Trace: 0.9786
Rep: 857 Cost: 0.0169 Fidelity: 0.9785 Trace: 0.9785
Rep: 858 Cost: 0.0150 Fidelity: 0.9785 Trace: 0.9785
Rep: 859 Cost: 0.0145 Fidelity: 0.9785 Trace: 0.9785
Rep: 860 Cost: 0.0148 Fidelity: 0.9784 Trace: 0.9784
Rep: 861 Cost: 0.0155 Fidelity: 0.9784 Trace: 0.9784
Rep: 862 Cost: 0.0131 Fidelity: 0.9784 Trace: 0.9784
Rep: 863 Cost: 0.0149 Fidelity: 0.9785 Trace: 0.9785
Rep: 864 Cost: 0.0152 Fidelity: 0.9785 Trace: 0.9785
Rep: 865 Cost: 0.0123 Fidelity: 0.9784 Trace: 0.9784
Rep: 866 Cost: 0.0150 Fidelity: 0.9785 Trace: 0.9785
Rep: 867 Cost: 0.0135 Fidelity: 0.9784 Trace: 0.9784
Rep: 868 Cost: 0.0157 Fidelity: 0.9784 Trace: 0.9784
Rep: 869 Cost: 0.0134 Fidelity: 0.9785 Trace: 0.9785
Rep: 870 Cost: 0.0143 Fidelity: 0.9785 Trace: 0.9785
Rep: 871 Cost: 0.0129 Fidelity: 0.9785 Trace: 0.9785
Rep: 872 Cost: 0.0149 Fidelity: 0.9785 Trace: 0.9786
Rep: 873 Cost: 0.0149 Fidelity: 0.9786 Trace: 0.9786
Rep: 874 Cost: 0.0126 Fidelity: 0.9786 Trace: 0.9786
Rep: 875 Cost: 0.0150 Fidelity: 0.9785 Trace: 0.9785
Rep: 876 Cost: 0.0124 Fidelity: 0.9785 Trace: 0.9785
Rep: 877 Cost: 0.0174 Fidelity: 0.9787 Trace: 0.9788
Rep: 878 Cost: 0.0182 Fidelity: 0.9787 Trace: 0.9788
Rep: 879 Cost: 0.0149 Fidelity: 0.9786 Trace: 0.9786
Rep: 880 Cost: 0.0155 Fidelity: 0.9784 Trace: 0.9784
Rep: 881 Cost: 0.0162 Fidelity: 0.9785 Trace: 0.9785
Rep: 882 Cost: 0.0145 Fidelity: 0.9786 Trace: 0.9787
Rep: 883 Cost: 0.0147 Fidelity: 0.9787 Trace: 0.9787
Rep: 884 Cost: 0.0152 Fidelity: 0.9786 Trace: 0.9786
Rep: 885 Cost: 0.0138 Fidelity: 0.9785 Trace: 0.9785
Rep: 886 Cost: 0.0162 Fidelity: 0.9786 Trace: 0.9787
Rep: 887 Cost: 0.0141 Fidelity: 0.9787 Trace: 0.9787
Rep: 888 Cost: 0.0161 Fidelity: 0.9787 Trace: 0.9787
Rep: 889 Cost: 0.0162 Fidelity: 0.9787 Trace: 0.9788
Rep: 890 Cost: 0.0130 Fidelity: 0.9787 Trace: 0.9787
Rep: 891 Cost: 0.0172 Fidelity: 0.9786 Trace: 0.9786
Rep: 892 Cost: 0.0187 Fidelity: 0.9785 Trace: 0.9785
Rep: 893 Cost: 0.0156 Fidelity: 0.9786 Trace: 0.9786
Rep: 894 Cost: 0.0150 Fidelity: 0.9787 Trace: 0.9787
Rep: 895 Cost: 0.0168 Fidelity: 0.9788 Trace: 0.9788
Rep: 896 Cost: 0.0147 Fidelity: 0.9788 Trace: 0.9788
Rep: 897 Cost: 0.0147 Fidelity: 0.9787 Trace: 0.9787
Rep: 898 Cost: 0.0159 Fidelity: 0.9786 Trace: 0.9786
Rep: 899 Cost: 0.0147 Fidelity: 0.9786 Trace: 0.9786
Rep: 900 Cost: 0.0143 Fidelity: 0.9788 Trace: 0.9788
Rep: 901 Cost: 0.0143 Fidelity: 0.9789 Trace: 0.9789
Rep: 902 Cost: 0.0148 Fidelity: 0.9788 Trace: 0.9788
Rep: 903 Cost: 0.0149 Fidelity: 0.9787 Trace: 0.9788
Rep: 904 Cost: 0.0125 Fidelity: 0.9788 Trace: 0.9788
Rep: 905 Cost: 0.0138 Fidelity: 0.9788 Trace: 0.9788
Rep: 906 Cost: 0.0121 Fidelity: 0.9788 Trace: 0.9788
Rep: 907 Cost: 0.0146 Fidelity: 0.9788 Trace: 0.9788
Rep: 908 Cost: 0.0154 Fidelity: 0.9788 Trace: 0.9788
Rep: 909 Cost: 0.0135 Fidelity: 0.9788 Trace: 0.9788
Rep: 910 Cost: 0.0149 Fidelity: 0.9789 Trace: 0.9789
Rep: 911 Cost: 0.0150 Fidelity: 0.9788 Trace: 0.9788
Rep: 912 Cost: 0.0135 Fidelity: 0.9788 Trace: 0.9788
Rep: 913 Cost: 0.0142 Fidelity: 0.9788 Trace: 0.9788
Rep: 914 Cost: 0.0152 Fidelity: 0.9790 Trace: 0.9790
Rep: 915 Cost: 0.0151 Fidelity: 0.9791 Trace: 0.9791
Rep: 916 Cost: 0.0130 Fidelity: 0.9790 Trace: 0.9790
Rep: 917 Cost: 0.0170 Fidelity: 0.9787 Trace: 0.9787
Rep: 918 Cost: 0.0166 Fidelity: 0.9787 Trace: 0.9787
Rep: 919 Cost: 0.0135 Fidelity: 0.9789 Trace: 0.9789
Rep: 920 Cost: 0.0139 Fidelity: 0.9790 Trace: 0.9790
Rep: 921 Cost: 0.0149 Fidelity: 0.9788 Trace: 0.9789
Rep: 922 Cost: 0.0134 Fidelity: 0.9789 Trace: 0.9789
Rep: 923 Cost: 0.0155 Fidelity: 0.9790 Trace: 0.9790
Rep: 924 Cost: 0.0133 Fidelity: 0.9790 Trace: 0.9790
Rep: 925 Cost: 0.0177 Fidelity: 0.9788 Trace: 0.9788
Rep: 926 Cost: 0.0174 Fidelity: 0.9788 Trace: 0.9788
Rep: 927 Cost: 0.0127 Fidelity: 0.9790 Trace: 0.9790
Rep: 928 Cost: 0.0149 Fidelity: 0.9791 Trace: 0.9791
Rep: 929 Cost: 0.0133 Fidelity: 0.9790 Trace: 0.9790
Rep: 930 Cost: 0.0127 Fidelity: 0.9790 Trace: 0.9790
Rep: 931 Cost: 0.0151 Fidelity: 0.9791 Trace: 0.9791
Rep: 932 Cost: 0.0127 Fidelity: 0.9790 Trace: 0.9790
Rep: 933 Cost: 0.0180 Fidelity: 0.9788 Trace: 0.9789
Rep: 934 Cost: 0.0171 Fidelity: 0.9789 Trace: 0.9789
Rep: 935 Cost: 0.0138 Fidelity: 0.9791 Trace: 0.9791
Rep: 936 Cost: 0.0152 Fidelity: 0.9791 Trace: 0.9791
Rep: 937 Cost: 0.0127 Fidelity: 0.9790 Trace: 0.9790
Rep: 938 Cost: 0.0122 Fidelity: 0.9791 Trace: 0.9791
Rep: 939 Cost: 0.0151 Fidelity: 0.9791 Trace: 0.9791
Rep: 940 Cost: 0.0129 Fidelity: 0.9791 Trace: 0.9791
Rep: 941 Cost: 0.0175 Fidelity: 0.9790 Trace: 0.9790
Rep: 942 Cost: 0.0169 Fidelity: 0.9791 Trace: 0.9791
Rep: 943 Cost: 0.0141 Fidelity: 0.9792 Trace: 0.9793
Rep: 944 Cost: 0.0176 Fidelity: 0.9792 Trace: 0.9792
Rep: 945 Cost: 0.0147 Fidelity: 0.9791 Trace: 0.9791
Rep: 946 Cost: 0.0164 Fidelity: 0.9790 Trace: 0.9790
Rep: 947 Cost: 0.0160 Fidelity: 0.9791 Trace: 0.9791
Rep: 948 Cost: 0.0152 Fidelity: 0.9792 Trace: 0.9792
Rep: 949 Cost: 0.0155 Fidelity: 0.9792 Trace: 0.9792
Rep: 950 Cost: 0.0143 Fidelity: 0.9790 Trace: 0.9790
Rep: 951 Cost: 0.0150 Fidelity: 0.9791 Trace: 0.9791
Rep: 952 Cost: 0.0134 Fidelity: 0.9793 Trace: 0.9793
Rep: 953 Cost: 0.0137 Fidelity: 0.9793 Trace: 0.9793
Rep: 954 Cost: 0.0138 Fidelity: 0.9791 Trace: 0.9791
Rep: 955 Cost: 0.0126 Fidelity: 0.9792 Trace: 0.9792
Rep: 956 Cost: 0.0162 Fidelity: 0.9794 Trace: 0.9794
Rep: 957 Cost: 0.0159 Fidelity: 0.9794 Trace: 0.9794
Rep: 958 Cost: 0.0129 Fidelity: 0.9792 Trace: 0.9792
Rep: 959 Cost: 0.0134 Fidelity: 0.9792 Trace: 0.9792
Rep: 960 Cost: 0.0141 Fidelity: 0.9793 Trace: 0.9793
Rep: 961 Cost: 0.0132 Fidelity: 0.9793 Trace: 0.9793
Rep: 962 Cost: 0.0146 Fidelity: 0.9792 Trace: 0.9792
Rep: 963 Cost: 0.0126 Fidelity: 0.9792 Trace: 0.9792
Rep: 964 Cost: 0.0129 Fidelity: 0.9793 Trace: 0.9793
Rep: 965 Cost: 0.0134 Fidelity: 0.9792 Trace: 0.9792
Rep: 966 Cost: 0.0117 Fidelity: 0.9792 Trace: 0.9792
Rep: 967 Cost: 0.0148 Fidelity: 0.9794 Trace: 0.9794
Rep: 968 Cost: 0.0132 Fidelity: 0.9794 Trace: 0.9794
Rep: 969 Cost: 0.0153 Fidelity: 0.9793 Trace: 0.9793
Rep: 970 Cost: 0.0138 Fidelity: 0.9793 Trace: 0.9793
Rep: 971 Cost: 0.0158 Fidelity: 0.9795 Trace: 0.9795
Rep: 972 Cost: 0.0153 Fidelity: 0.9795 Trace: 0.9795
Rep: 973 Cost: 0.0137 Fidelity: 0.9794 Trace: 0.9794
Rep: 974 Cost: 0.0135 Fidelity: 0.9793 Trace: 0.9793
Rep: 975 Cost: 0.0149 Fidelity: 0.9794 Trace: 0.9795
Rep: 976 Cost: 0.0135 Fidelity: 0.9795 Trace: 0.9795
Rep: 977 Cost: 0.0158 Fidelity: 0.9793 Trace: 0.9793
Rep: 978 Cost: 0.0153 Fidelity: 0.9793 Trace: 0.9793
Rep: 979 Cost: 0.0141 Fidelity: 0.9795 Trace: 0.9795
Rep: 980 Cost: 0.0142 Fidelity: 0.9795 Trace: 0.9795
Rep: 981 Cost: 0.0140 Fidelity: 0.9794 Trace: 0.9794
Rep: 982 Cost: 0.0131 Fidelity: 0.9794 Trace: 0.9794
Rep: 983 Cost: 0.0157 Fidelity: 0.9796 Trace: 0.9796
Rep: 984 Cost: 0.0152 Fidelity: 0.9795 Trace: 0.9796
Rep: 985 Cost: 0.0136 Fidelity: 0.9794 Trace: 0.9794
Rep: 986 Cost: 0.0130 Fidelity: 0.9795 Trace: 0.9795
Rep: 987 Cost: 0.0159 Fidelity: 0.9796 Trace: 0.9796
Rep: 988 Cost: 0.0156 Fidelity: 0.9796 Trace: 0.9796
Rep: 989 Cost: 0.0129 Fidelity: 0.9795 Trace: 0.9795
Rep: 990 Cost: 0.0129 Fidelity: 0.9795 Trace: 0.9795
Rep: 991 Cost: 0.0136 Fidelity: 0.9796 Trace: 0.9796
Rep: 992 Cost: 0.0115 Fidelity: 0.9795 Trace: 0.9795
Rep: 993 Cost: 0.0183 Fidelity: 0.9794 Trace: 0.9794
Rep: 994 Cost: 0.0174 Fidelity: 0.9794 Trace: 0.9794
Rep: 995 Cost: 0.0126 Fidelity: 0.9796 Trace: 0.9796
Rep: 996 Cost: 0.0142 Fidelity: 0.9796 Trace: 0.9796
Rep: 997 Cost: 0.0131 Fidelity: 0.9795 Trace: 0.9795
Rep: 998 Cost: 0.0126 Fidelity: 0.9796 Trace: 0.9796
Rep: 999 Cost: 0.0149 Fidelity: 0.9797 Trace: 0.9797
Fidelity before optimization: 3.8496597e-09
Fidelity after optimization: 0.97966236
Target state: [0.+0.j 1.+0.j 0.+0.j 0.+0.j 0.+0.j 0.+0.j]
Output state: [-0. -0.002j 0.99 +0.003j 0. -0.001j -0.001-0.j -0. +0.001j
-0. +0.j ]
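To make the printed figures concrete, here is a short NumPy sketch (not part of the tutorial script; the array names and rounded amplitudes are illustrative) showing how the reported fidelity and trace relate to the two kets above: the fidelity is the squared overlap of the unnormalized output ket with the target state, while the trace is the norm of the truncated ket, i.e. the probability of remaining within the Fock cutoff. Because the printed amplitudes are rounded, the sketch reproduces the final values only approximately; in the tutorial itself these quantities are computed inside the TensorFlow graph during optimization.

import numpy as np

# Illustrative kets copied (rounded) from the output above -- assumptions, not tutorial variables.
target_state = np.array([0, 1, 0, 0, 0, 0], dtype=complex)   # the |1> Fock state
output_state = np.array([-0.000 - 0.002j, 0.990 + 0.003j, 0.000 - 0.001j,
                         -0.001 - 0.000j, -0.000 + 0.001j, -0.000 + 0.000j])

# Fidelity: squared overlap of the (unnormalized) learned ket with the target state.
fidelity = np.abs(np.vdot(target_state, output_state)) ** 2

# Trace: norm of the truncated ket, i.e. the probability of staying inside the cutoff.
trace = np.vdot(output_state, output_state).real

print(f"Fidelity: {fidelity:.4f}  Trace: {trace:.4f}")   # roughly 0.98 for both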
For more applications of CV quantum neural networks, see the state learning and gate synthesis demonstrations.
References¶
- 1
Nathan Killoran, Thomas R Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, and Seth Lloyd. Continuous-variable quantum neural networks. arXiv preprint arXiv:1806.06871, 2018.
- 2
Maria Schuld, Ville Bergholm, Christian Gogolin, Josh Izaac, and Nathan Killoran. Evaluating analytic gradients on quantum hardware. Physical Review A, 99(3):032331, 2019.
- 3
William R Clements, Peter C Humphreys, Benjamin J Metcalf, W Steven Kolthammer, and Ian A Walmsley. Optimal design for universal multiport interferometers. Optica, 3(12):1460–1465, 2016. doi:10.1364/OPTICA.3.001460.
Total running time of the script: (1 minute 50.504 seconds)