## April 18, 2019 Quantum Machine Learning Part 5

*Quantum Neural Networks (QNNs)*

In the last two articles, we looked at quantum algorithms such as the Variational Quantum Eigensolver (VQE), quantum classifiers, the variational quantum classifier, and quantum generative adversarial networks (QGANs). In this article, we look at Quantum Neural Networks (QNNs).

Classical neural networks are characterized by the processing performed by each neuron, the transformation the neuron applies, the interconnections between neurons, the network dynamics, and the learning rules. The learning rules govern how the connection strengths change during training. Neural networks are further differentiated by whether they are trained in a supervised or unsupervised manner. Because a single network can store data from different classes and classify a stimulus in a distributed way, neural networks are very useful for building classification systems.

Quantum computing is based on the theory of quantum mechanics. It is an evolving field, and quantum neural networks can be used for solving computationally challenging problems. Loose analogies can be drawn between the two models: the wave function in quantum mechanics plays a role similar to the neural network's state, the coherence of quantum states (referred to as superposition) corresponds to the neuron in a classical neural network, measurement in quantum computing corresponds to the interconnections in a neural network, and entanglement is analogous to the gain function of a neural network.

Quantum neural networks draw inspiration from brain function and are helpful in creating new information systems. They are being applied to classically challenging problems that require exponential capacity and memory. In a nutshell, quantum neural networks are the next natural step in the evolution of neurocomputing systems.

The code below shows a QNN implementation based on a photonic quantum neural net model. The QNN simulated is a continuous-variable quantum neural network.

**Prerequisites:**

- You need Python 3.5 or later to run the code samples below. You can download it from the python.org website.
- You can follow the instructions at the end of this article to set up PennyLane and Strawberry Fields.

**Problem**

A QNN is based on a quantum circuit that has trainable continuous parameters. Here, the quantum neural network is trained on sine-function data to fit a one-dimensional function. A quantum machine device is created first. The PennyLane and Strawberry Fields open-source frameworks are used for the quantum machine simulation.
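The training file `sin_data.txt` used below is not provided with the article; a minimal sketch that generates comparable sine data is shown here, assuming the file format is one whitespace-separated x y pair per line (the default format `numpy.loadtxt` reads):

```python
import numpy as np

# generate 50 evenly spaced inputs on [-1, 1] and their sine values
x = np.linspace(-1.0, 1.0, 50)
y = np.sin(np.pi * x)

# save as two whitespace-separated columns, the format np.loadtxt expects
np.savetxt("sin_data.txt", np.column_stack((x, y)))
```

The number of points and the input range are assumptions for illustration; any smooth sampling of a sine curve works for the fit.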

```python
import pennylane as quantumMachineL
from pennylane import numpy as nump
from pennylane.optimize import AdamOptimizer

# getting the quantum machine device
try:
    device = quantumMachineL.device('strawberryfields.fock', wires=1, cutoff_dim=10)
except:
    print("please install strawberryfields")
```

Each layer of the quantum neural network is built from a sequence of gates: a rotation, a squeezing operation, a second rotation, a displacement, and a Kerr nonlinearity.

```python
def GetLayer(varr):
    """One layer of the quantum neural network."""
    quantumMachineL.Rotation(varr[0], wires=0)
    quantumMachineL.Squeezing(varr[1], 0., wires=0)
    quantumMachineL.Rotation(varr[2], wires=0)
    quantumMachineL.Displacement(varr[3], 0., wires=0)
    quantumMachineL.Kerr(varr[4], wires=0)
```

The quantum neural network is created by stacking the layers and binding the circuit to the quantum machine device as a QNode.

```python
@quantumMachineL.qnode(device)
def GetQuantumNeuralNet(vars, xcoor=None):
    """Returns the quantum neural network circuit."""
    # encode the input value as a displacement
    quantumMachineL.Displacement(xcoor, 0., wires=0)
    for var in vars:
        GetLayer(var)
    # expectation value of the X quadrature
    return quantumMachineL.expval.X(0)
```

The square loss is calculated from the label and prediction values.

```python
def GetSquareLoss(labelValues, predictionValues):
    """Square loss (mean squared error) function."""
    lossValue = 0
    for label, prediction in zip(labelValues, predictionValues):
        lossValue = lossValue + (label - prediction) ** 2
    lossValue = lossValue / len(labelValues)
    return lossValue
```
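As a sanity check, the same square loss can be written in vectorized NumPy; this sketch (with a hypothetical helper name) is equivalent to the loop above and runs without any quantum dependencies:

```python
import numpy as np

def square_loss_vectorized(labelValues, predictionValues):
    """Mean squared error, equivalent to the loop-based GetSquareLoss."""
    labels = np.asarray(labelValues)
    predictions = np.asarray(predictionValues)
    return np.mean((labels - predictions) ** 2)

# hand-computed check: ((1.0 - 1.5)**2 + (2.0 - 2.0)**2) / 2 = 0.125
print(square_loss_vectorized([1.0, 2.0], [1.5, 2.0]))  # 0.125
```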

The cost function computes the square loss over the feature and label values: a prediction is calculated for each feature value, and the square loss is then evaluated between the labels and the associated predictions. This is the quantity the optimizer minimizes.

```python
def GetCost(variables, featureValues, labelValues):
    """Cost function to be minimized."""
    predictions = [GetQuantumNeuralNet(variables, xcoor=x) for x in featureValues]
    return GetSquareLoss(labelValues, predictions)
```

The sine data is loaded from a text file, and the x and y coordinates are read from sinedata. The nump random seed is initialized, and variable_initial holds the weights, initialized with small random values. The number of layers in the quantum neural network is set to 4, and the Adam optimizer is created.

```python
# loading the data from sin_data.txt
sinedata = nump.loadtxt("sin_data.txt")

# reading x and y coordinates
X_coord = sinedata[:, 0]
Y_coord = sinedata[:, 1]

# setting the seed for nump random
nump.random.seed(0)

# number of layers in QNN set as 4
number_of_layers = 4
variable_initial = 0.05 * nump.random.randn(number_of_layers, 5)

# getting the Adam optimizer
optimizer = AdamOptimizer(0.01, beta1=0.9, beta2=0.999)
```

The variable value is set to variable_initial. The optimizer then runs for 500 steps to train the model; each step returns the updated weights, which are fed into the next step. For every iteration, the cost value is printed.

```python
# variable set to variable_initial
variable = variable_initial

# iterating for 500 optimization steps
for iteration in range(500):
    # each step returns the updated weights, which must be carried forward
    variable = optimizer.step(lambda v: GetCost(v, X_coord, Y_coord), variable)
    # printing the iteration and cost value
    print("Iteration: {:5d} | Cost value: {:0.7f}".format(iteration + 1, GetCost(variable, X_coord, Y_coord)))
```

**Instructions for Running the Code**

```shell
# installing pennylane
pip install pennylane

# installing pennylane-sf
pip install pennylane-sf

# running the qnn python code
python qnn.py
```

**Output**

This article is part of the Quantum Machine Learning series; you can check out the other articles in the series.

**bhagvanarch**

Guest Blogger