Ch 1 INTRODUCTION
1.1 Motivation
1.2 Structure of the book
Ch 2 ADVANCES IN ML
2.1 Lifelong ML
2.1.1 Multi-Task Learning (MTL)
2.2 Lifelong Supervised Learning
2.3 Lifelong Neural Networks
2.4 Cumulative Learning
2.4.1 Open (World) Classification or Learning
2.4.2 Open World Learning for Unseen Class Detection
2.5 Efficient Lifelong Learning Algorithm: ELLA
2.6 Lifelong Sentiment Classification: LSC
2.7 Lifelong Unsupervised Learning
2.7.1 Lifelong Topic Modeling
2.7.1.1 Lifelong Topic Model for Small Data: AMC
2.7.2 Lifelong Information Extraction (LIE)
2.7.3 Lifelong Relaxation Labeling (Lifelong‐RL)
2.8 Lifelong Reinforcement Learning
2.8.1 LRL Through Multiple Environments
2.8.2 Hierarchical Bayesian LRL
2.8.3 Policy Gradient ELLA (PG‐ELLA)
REFERENCES
Ch 3 DEEP NN and FEDERATED LEARNING
3.1 Optimization Algorithm Approximation by DNN
3.1.1 Generic Optimization Problem
3.1.2 Algorithm Approximation Background
3.2 Spatial Scheduling by DNN
3.2.1 Wireless Link Scheduling
3.2.2 Scheduling by DNN
3.2.2.1 Learning Based on Geographic Location Information
3.2.2.2 DNN Structure
3.2.2.3 Training Process
3.2.2.4 Link Deactivation
3.3 Spatial Scheduling by DNN with Proportional Fairness
3.3.1. Proportional Fairness Scheduling
3.3.2. Learning to Maximize Weighted Sum Rate
3.3.3 Weighted Sum Rate Maximization via Binary Weights
3.3.4 Utility Analysis of Binary Reweighting Scheme
3.4 DNN in Vehicular Networks
3.4.1 System Model
3.4.1.1 Channel Model
3.4.1.2 Modeling AoI
3.4.1.3 Link Clustering
3.4.2 Network Optimization
3.4.2.1 AoI‐Aware RRM Objectives
3.4.2.2 Bellman’s Equation
3.4.2.3 DRL Algorithm
3.5 Federated Learning
3.5.1 Preliminaries
3.5.2 Classification of Algorithms
3.5.3 FLS Architecture
3.6 FL and Blockchains
3.6.1 Basics of Blockchains
3.6.2 Case Study: Blockchain‐Based Federated Learning (BFL) in Vehicular Networks
3.6.2.1 System Model
3.6.2.2 BFL Block Arrival Process
3.6.2.3 System Optimization
REFERENCES
Ch 4 QC WITH CONTINUOUS VARIABLES
4.1 Preliminaries
4.1.1. Position and momentum space
4.1.2 Momentum Operator
4.1.3 Translation Operator in Quantum Mechanics
4.1.4 Wave Function
4.1.5 Hamiltonian Operator
4.1.6 Schrödinger equation
4.1.7 Relativistic wave equations
4.2 Gaussian Quantum Information
4.2.1 Elements of Gaussian Quantum Information Theory
4.2.2 Distinguishability of Gaussian States
4.2.2.1 Measures of distinguishability
4.2.2.2 Distinguishing optical coherent states
4.2.3 Examples of Gaussian Quantum Protocols
4.2.3.1 Quantum teleportation and variants
4.2.3.2 Quantum cloning
4.2.4 Bosonic Gaussian Channels
4.2.4.1 Preliminaries
4.2.4.2 One‐mode Gaussian channels
4.2.4.3 Classical capacity of Gaussian channels
4.2.4.4 Quantum capacity of Gaussian channels
4.2.4.5 Quantum dense coding and entanglement‐assisted classical capacity
4.2.4.6. Entanglement distribution and secret‐key capacities
4.2.4.7. Gaussian channel discrimination and applications
Appendix
REFERENCES
Ch 5 ENTANGLEMENT
5.1 Quantum information with continuous variables
5.1.1 Continuous Variables in Quantum Optics
5.1.1.1 The quadratures of the quantized field
5.1.1.2 Phase‐space distribution
5.1.1.3 Gaussian states
5.1.1.4 Linear optics
5.1.1.5. Nonlinear optics
5.1.1.6. Polarization and spin
5.1.1.7 Phase reference
5.1.2 Continuous‐Variable Entanglement
5.1.2.1. Bipartite entanglement
5.1.2.2. Multipartite entanglement
5.1.2.3 Nonlocality
5.2 Remote Entanglement Distribution
Appendix A: Schmidt decomposition
Appendix B: Mermin‐Klyshko inequalities
Appendix C: The classification of tripartite three‐mode Gaussian states
Appendix C1: Points of Intersection
REFERENCES
Ch 6 ACHIEVABLE TRANSMISSION RATES
6.1 Bosonic Gaussian channels (BGCs)
6.1.1. Multi‐mode BGCs
6.1.1.1. Notation and preliminaries
6.1.1.2. BGCs
6.1.2. Unitary dilation theorem
6.1.2.1. General dilations
6.1.2.2. Reducing the number of environmental modes
6.1.2.3. Minimal noise channels
6.1.2.4. Additive classical noise channel
6.1.2.5. Canonical form for generic channels
6.1.3. Weak degradability
6.1.3.1. A criterion for weak degradability
6.1.4. Two‐mode BGCs
6.1.4.1. Weak‐degradability properties
6.2 Entanglement‐Assisted Classical Capacity of Noisy Quantum Channels
6.3 Entanglement-assisted classical capacity
6.3.1 The BSST Theorem
6.3.2 Entanglement‐Assisted vs Unassisted Capacities
6.4 Entanglement‐assisted capacity of quantum channel with additive constraint
6.4.1. Classical-Quantum Channel with Additive Constraint
6.4.2. Quantum-Quantum Channel
6.4.3. Entanglement‐Assisted Capacity
6.5 Entanglement in Quantum Channels with CV
6.5.1 The entanglement‐assisted classical capacity
6.5.2 Entanglement‐assisted versus unassisted classical capacities
6.5.3 On continuity of the entanglement‐assisted capacity
6.5.4. Coherent information and a measure of private classical information
6.6 Fundamental Limits of Quantum Communications: Summary
6.6.1 Adaptive protocols and two‐way capacities
6.6.2 General bounds for two‐way capacities
6.6.3 Simulation of quantum channels
6.6.4 Teleportation covariance
6.6.5 Teleportation stretching of adaptive protocols
6.6.6 REE as a single‐letter converse bound
6.6.7 Generalizations
6.6.8 Achievable rates in bosonic communications
6.6.9 Achievable rates in quantum optical communications
6.6.10 Achievable rates in quantum communications with Gaussian noise
6.6.11 Limits on achievable rates in qubit communications
6.7 Simplification of the main results
6.8 Summary of Analytical Tools
6.9 Performance Bounds
6.10 Simplified Models for Bosonic Gaussian Channels
6.11 Simplified Models for Discrete‐Variable Channels
6.12 Simplified Bounds for QKD Protocol Rates
6.13 Algorithm Upgrades
REFERENCES
Ch 7 QUANTUM NETWORK ROUTING
7.1 Routing over Virtual Quantum Network
7.1.1 Preliminaries
7.1.2 Ring Network
7.1.3 Sphere Network
7.1.3.1 Definition of the VQN architecture
7.1.3.2 Routing Algorithm
7.2 Minimum Cost Routing
7.2.1 Quantum Routing Parameters
7.2.2 Entanglement‐Gradient Routing
7.3 Entanglement Distribution
7.3.1 Preliminaries
7.3.2 The optimal RED protocol
7.3.3 Stationary Protocol
7.4 Quantum graph
7.5 Multipoint entanglement distribution (multi-partite entanglement)
REFERENCES
Ch 8 DYNAMIC QUANTUM NETWORK TOPOLOGY DESIGN
8.1 Quantum graph states
8.1.1. Interaction pattern
8.1.2. Stabilizer formalism
8.1.3 Local Clifford group and LC equivalence
8.1.4 Weyl operators and Heisenberg group
8.2. Evaluation of the Degree of Entanglement for Graph States
8.2.1 Schmidt measure
8.2.2. Generalization of the evaluation rules
8.3 Quantum State Graph Reconfiguration
8.3.1 Vertex‐deletions and local complementations
8.3.2 Circle graphs
8.3.3 Examples of vertices
8.3.4 Graph Reconfiguration Algorithms
REFERENCES
Ch 9 ELEMENTS of QUANTUM CODING THEORY
9.1 Quantum coding theorems
9.1.1 Preliminaries
9.1.2 Quantum coding theorem
9.1.3 Reliability function
9.1.4 Reliability Function for Different Quantum Channel Examples
9.2 Error Correction Limits for Quantum Metrology
9.2.1. Quantum Metrology in the Presence of Impairments
9.2.2 Error Correction Enhanced Quantum Metrology
9.2.2.1. Noiseless Ancilla and Perfect Error Correction
9.2.2.2. Noisy Ancilla and Perfect Error Correction
9.2.2.3. Noiseless Ancilla and Imperfect Error Correction
9.2.2.4. Limitations of Current Quantum Technologies
9.2.3. Other Error Correction Strategies
9.3 Stabilizer Codes
9.3.1 Stabilizer Coding
9.4 Quantum LDPC Codes
9.4.1 An Introduction to classical LDPC Codes
9.4.1.1 Representations of LDPC Codes
9.4.1.2 LDPC Code Design Techniques
9.4.1.3 Iterative Decoding Algorithms
9.4.2 Constructing regular quantum LDPC codes
9.5 Homological family of quantum LDPC codes
9.5.1 Code construction based on a Regular Tessellation of Hyperbolic Space
9.5.2 Hyperbolic 4‐space and its Regular Tessellation by Hypercubes
9.5.3 Compact Manifolds
9.5.4 Code Performance
9.5.5 Decoders
REFERENCES
Ch 10 QUANTUM MACHINE LEARNING
10.1. Quantum Neural Networks with DV
10.1.1 Error Backpropagation in Quantum ANN
10.1.2 Firing Pattern Selection
10.1.3 Representation of n-to-m Boolean Functions
10.1.4 General Architecture Networks
10.2 Quantum Neural Networks with CV
10.2.1 Continuous Quantum Registers
10.2.2 Discrete Simulation of Continuous Quantum Registers
10.2.3 Quantum Phase Estimation
10.2.4. Quantum Phase Kickback
10.2.5 Quantum Gradients
10.3 Quantum Parametric Optimization
10.3.1 Preliminaries
10.3.1.1. Quantum Feedforward and Backwards Propagation of Phase Errors
10.3.1.2 Full‐batch Effective Phase Kicks
10.3.1.3 Interpretation in classical Hamiltonian dynamics
10.3.2 Quantum Dynamical Descent
10.3.2.1. Basic Algorithm
10.3.2.2. Heisenberg picture update rule
10.3.2.3. Quantum Approximate Optimization Algorithm
10.3.2.4. Quantum Adiabatic Algorithm (QAA)
10.3.3 Extensions of Quantum Descent Methods
10.4 Quantum Neural Network Learning
10.4.1. Quantum‐Coherent Neural Networks
10.4.1.1. Classical‐to‐Quantum Computational Embedding
10.4.1.2. Classical Data Phase Kicking
10.4.1.3. Abstract Quantum Neuron
10.4.1.4. QFB Neural Network
10.4.2. Quantum Phase Error Backpropagation
10.4.2.1. Operator Chain Rule
10.4.3 Implementations of Quantum Coherent Neurons
10.4.3.1. Hybrid CV‐DV Neurons
10.4.3.2. CV‐only Neurons
10.5. Quantum Parametric Circuit Learning
10.5.1 Parametric Ansätze & Error Backpropagation
10.5.1.1 From Classical to Quantum Parametrization of Ansätze
10.5.1.2. Quantum Parametric Circuit Error Backpropagation
10.5.2. Quantum State Exponentiation
10.5.2.1. Single state exponentiation
10.5.2.2. Sequential Exponential Batching
10.5.2.3. Quantum Random Access Memory Batching
10.5.3. Quantum State Learning
10.5.3.1. Quantum Pure State Learning
10.5.3.2. Quantum Mixed State Learning
10.5.4. Quantum Unitary & Channel Learning
10.5.4.1. Supervised Unitary Learning
10.5.4.2. Supervised Channel Learning
10.5.4.3. Unsupervised Unitary Learning
10.5.4.4. Unsupervised Channel Learning
10.5.5 Quantum Basic Learning Algorithms
10.5.5.1. Preliminaries
10.5.5.2. Loss Functions
10.5.6. Estimation of Quantum Code
10.5.6.1. Estimation of Quantum Autoencoders: Compression Code
10.5.6.2. Denoising Quantum Autoencoder
10.5.6.3. Quantum Error Correcting Code Learning
10.5.7. Generative Adversarial Quantum Circuits
10.5.7.1. Classical Generative Adversarial Networks Review
10.5.7.2. Generative Adversarial Quantum Circuits
10.5.8. Parametric Hamiltonian Optimization
10.5.8.1. Hamiltonian‐Parallelized Gradient Accumulation
10.5.9. Hybrid Quantum Neural‐Circuit Networks
10.5.9.1. Fully Coherent Hybrid Networks
10.5.9.2. Hybrid Quantum‐Classical Networks
10.6 Quantum Deep Convolutional Neural Networks
10.6.1 Classical Convolutional Neural Network (CNN)
10.6.1.1 Tensor representation
10.6.1.2 Architecture
10.6.1.3 Mathematical Formulations
10.6.2 Forward Propagation in QCNN
10.6.2.1 Single Quantum Convolution Layer
10.6.2.2 Inner Product Estimation
10.6.2.3 Encoding the amplitude in a register
10.6.2.4 Conditional rotation
10.6.2.5. Amplitude Amplification
10.6.2.6. l_∞ tomography and probabilistic sampling
10.6.2.7 Quantum Pooling
10.6.3. Backward Propagation in QCNN
10.6.3.1 Classical Backpropagation
10.6.3.2 Quantum Algorithm for Backpropagation
10.6.3.3 Gradient Descent and Classical equivalence
REFERENCES
Ch 11 QUANTUM COMPUTING GATES LIBRARIES
11.1 Quantum Gates Library
11.1.1 Classical Logic Gates Library
11.1.2 Quantum Logic Gates Library
11.1.2.1 1-Qubit Gates
11.1.2.2 Rotations About the x‐, y‐, and z‐Axes
11.1.2.3 Controlled Quantum Gates
11.1.2.4 Selected 2‐Qubit Gates Libraries
11.1.2.5 Entangling Power of Quantum Gates
11.1.2.6 Arbitrary 2‐Qubit Gates
11.2 Depth‐Optimal Quantum Circuits
11.2.1 Meet-in-the-Middle (MITM) Search Algorithm
11.2.2 Search tree pruning
11.2.3 Implementation Aspects
11.3 Exact Minimization of Quantum Circuits
11.3.1 Preliminaries
11.4 Decomposing CV Operations into a Universal Gate Library
11.4.1 Exact Decompositions of Multi‐Mode Gates
REFERENCES
Index