### EITC/AI/TFQML TensorFlow Quantum Machine Learning is the European IT Certification programme on using Google's TensorFlow Quantum library for implementing machine learning on the Google Sycamore quantum processor architecture.

The curriculum of the EITC/AI/TFQML TensorFlow Quantum Machine Learning certification focuses on theoretical knowledge and practical skills in using Google's TensorFlow Quantum library for machine learning based on advanced quantum computational models on the Google Sycamore quantum processor architecture. It is organized within the following structure and encompasses comprehensive video didactic content as a reference for this EITC Certification.

TensorFlow Quantum (TFQ) is a quantum machine learning library for rapid prototyping of hybrid quantum-classical ML models. Research in quantum algorithms and applications can leverage Google’s quantum computing frameworks, all from within TensorFlow.

TensorFlow Quantum focuses on quantum data and building hybrid quantum-classical models. It integrates quantum computing algorithms and logic designed in Cirq (quantum programming framework based on quantum circuits model), and provides quantum computing primitives compatible with existing TensorFlow APIs, along with high-performance quantum circuit simulators. Read more in the TensorFlow Quantum white paper.

Quantum computing is the use of quantum phenomena such as superposition and entanglement to perform computation. Computers that perform quantum computations are known as quantum computers. Quantum computers are believed to be able to solve certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science.

Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things that a classical computer could not. In 1994, Peter Shor developed a quantum algorithm for factoring integers that had the potential to decrypt RSA-encrypted communications. Despite ongoing experimental progress since the late 1990s, most researchers believe that “fault-tolerant quantum computing is still a rather distant dream”. In recent years, investment in quantum computing research has increased in both the public and private sectors. On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space Administration (NASA), claimed to have performed a quantum computation that is infeasible on any classical computer (the so-called quantum supremacy result).

There are several models of quantum computers (or rather, quantum computing systems), including the quantum circuit model, quantum Turing machine, adiabatic quantum computer, one-way quantum computer, and various quantum cellular automata. The most widely used model is the quantum circuit. Quantum circuits are based on the quantum bit, or “qubit”, which is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured the result of the measurement is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that the qubits were in immediately prior to the measurement.
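The measurement behaviour described above can be made concrete with a small state-vector calculation. The sketch below uses plain numpy (independent of any quantum framework; variable names are illustrative): it prepares an equal superposition with a Hadamard gate and computes the outcome probabilities via the Born rule.

```python
import numpy as np

# Computational basis state |0> as a column vector
ket0 = np.array([1.0, 0.0])

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>) / sqrt(2)
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0  # qubit now in superposition of 0 and 1

# Born rule: probability of each measurement outcome is the squared amplitude
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- a measurement yields 0 or 1 with equal probability
```

Although the state carries both amplitudes simultaneously, each individual measurement still returns a single 0 or 1, with the probabilities computed above.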

Progress towards building a physical quantum computer focuses on technologies such as transmons, ion traps and topological quantum computers, which aim to create high-quality qubits. These qubits may be designed differently, depending on the full quantum computer’s computing model, whether quantum logic gates, quantum annealing, or adiabatic quantum computation. There are currently a number of significant obstacles to constructing useful quantum computers. In particular, it is difficult to maintain the quantum states of qubits, as they suffer from quantum decoherence and loss of state fidelity. Quantum computers therefore require error correction.

Any computational problem that can be solved by a classical computer can also be solved by a quantum computer. Conversely, any problem that can be solved by a quantum computer can also be solved by a classical computer, at least in principle, given enough time. In other words, quantum computers obey the Church–Turing thesis. While this means that quantum computers provide no additional advantage over classical computers in terms of computability, quantum algorithms for certain problems have significantly lower time complexities than the corresponding known classical algorithms. Notably, quantum computers are believed to be able to quickly solve certain problems that no classical computer could solve in any feasible amount of time, a feat known as “quantum supremacy”. The study of the computational complexity of problems with respect to quantum computers is known as quantum complexity theory.

Google Sycamore is a quantum processor created by Google Inc.’s Artificial Intelligence division. It comprises 53 qubits.

In 2019, Sycamore completed a task in 200 seconds that Google claimed, in a Nature paper, would take a state-of-the-art supercomputer 10,000 years to finish. Google thus claimed to have achieved quantum supremacy. To estimate the time the task would take on a classical supercomputer, Google ran portions of the quantum circuit simulation on Summit, at the time the most powerful classical supercomputer in the world. IBM later made a counter-argument, claiming that the task would take only 2.5 days on a classical system like Summit. If Google’s claim is upheld, it represents a dramatic leap in computing power.

In August 2020, quantum engineers working for Google reported the largest chemical simulation on a quantum computer to date: a Hartree–Fock approximation performed with Sycamore paired with a classical computer that analyzed results to provide new parameters for the 12-qubit system.

In December 2020, the Chinese photon-based Jiuzhang processor, developed by USTC, demonstrated sampling with up to 76 detected photons and was reported to be around 10 billion times faster than Sycamore on its specific sampling task, making it the second system claimed to attain quantum supremacy.

The Quantum Artificial Intelligence Lab (also called the Quantum AI Lab or QuAIL) is a joint initiative of NASA, Universities Space Research Association, and Google (specifically, Google Research) whose goal is to pioneer research on how quantum computing might help with machine learning and other difficult computer science problems. The lab is hosted at NASA’s Ames Research Center.

The Quantum AI Lab was announced by Google Research in a blog post on May 16, 2013. At the time of launch, the Lab was using the most advanced commercially available quantum computer, D-Wave Two from D-Wave Systems.

On May 20, 2013, it was announced that people could apply to use time on the D-Wave Two at the Lab. On October 10, 2013, Google released a short film describing the current state of the Quantum AI Lab. On October 18, 2013, Google announced that it had incorporated quantum physics into Minecraft.

In January 2014, Google reported results comparing the performance of the D-Wave Two in the lab with that of classical computers. The results were ambiguous and provoked heated discussion on the Internet. On 2 September 2014, it was announced that the Quantum AI Lab, in partnership with UC Santa Barbara, would be launching an initiative to create quantum information processors based on superconducting electronics.

On 23 October 2019, the Quantum AI Lab announced in a paper that it had achieved quantum supremacy.

Google AI Quantum is advancing quantum computing by developing quantum processors and novel quantum algorithms to help researchers and developers solve near-term problems both theoretical and practical.

Quantum computing is expected to help drive the innovations of tomorrow, including AI. That is why Google commits significant resources to building dedicated quantum hardware and software.

Quantum computing is a new paradigm that is expected to play a big role in accelerating tasks for AI. Google aims to offer researchers and developers access to open-source frameworks and computing power that can operate beyond classical computational capabilities.

The main focus areas of Google AI Quantum are:

- Superconducting qubit processors: Superconducting qubits with chip-based scalable architecture targeting two-qubit gate error < 0.5%.
- Qubit metrology: Reducing two-qubit loss below 0.2% is critical for error correction. We are working on a quantum supremacy experiment, to approximately sample a quantum circuit beyond the capabilities of state-of-the-art classical computers and algorithms.
- Quantum simulation: Simulation of physical systems is among the most anticipated applications of quantum computing. We especially focus on quantum algorithms for modelling systems of interacting electrons with applications in chemistry and materials science.
- Quantum assisted optimization: We are developing hybrid quantum-classical solvers for approximate optimization. Thermal jumps in classical algorithms to overcome energy barriers could be enhanced by invoking quantum updates. We are in particular interested in coherent population transfer.
- Quantum neural networks: We are developing a framework to implement a quantum neural network on near-term processors. We are interested in understanding what advantages may arise from generating massive superposition states during operation of the network.
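The “massive superposition states” mentioned in the last focus area grow exponentially with qubit count: applying a Hadamard to each of n qubits places the register in an equal superposition over all 2^n basis states. The plain-numpy sketch below (framework-independent; names are illustrative) shows this scaling directly on the amplitude vector.

```python
import numpy as np

# Amplitudes of H|0>: a single qubit in equal superposition
h0 = np.array([1.0, 1.0]) / np.sqrt(2)

n = 4                      # number of qubits
state = np.array([1.0])    # amplitude vector of the empty register
for _ in range(n):
    state = np.kron(state, h0)   # tensor in one more superposed qubit

print(len(state))                              # 16 = 2**4 basis states
print(np.allclose(state, 1 / np.sqrt(2**n)))   # True: equal superposition
```

Doubling the amplitude vector with every added qubit is precisely why classical simulation becomes infeasible beyond a few dozen qubits.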

The main tools developed by Google AI Quantum are open-source frameworks specifically designed for developing novel quantum algorithms to help solve near-term applications for practical problems. These include:

- Cirq: an open-source quantum framework for building and experimenting with noisy intermediate scale quantum (NISQ) algorithms on near-term quantum processors
- OpenFermion: an open-source platform for translating problems in chemistry and materials science into quantum circuits that can be executed on existing platforms

Google AI Quantum near-term applications include:

Quantum Simulation

The design of new materials and elucidation of complex physics through accurate simulations of chemistry and condensed matter models are among the most promising applications of quantum computing.

Error Mitigation Techniques

We work to develop methods on the road to full quantum error correction that have the capability of dramatically reducing noise in current devices. While full-scale fault tolerant quantum computing may require considerable developments, we have developed the quantum subspace expansion technique to help utilize techniques from quantum error correction to improve performance of applications on near-term devices. Moreover, these techniques facilitate testing of complex quantum codes on near-term devices. We are actively pushing these techniques into new areas and leveraging them as a basis for design of near term experiments.

Quantum Machine Learning

We are developing hybrid quantum-classical machine learning techniques on near-term quantum devices. We are studying universal quantum circuit learning for classification and clustering of quantum and classical data. We are also interested in generative and discriminative quantum neural networks, that could be used as quantum repeaters and state purification units within quantum communication networks, or for verification of other quantum circuits.
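One common classical-quantum interface in such hybrid training is the parameter-shift rule, which recovers exact gradients of a gate’s expectation value from two shifted circuit evaluations, so a classical optimizer can update the circuit parameters. The numpy sketch below is an illustrative state-vector simulation of this idea, not TensorFlow Quantum’s actual implementation; all names are hypothetical.

```python
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

Z = np.diag([1.0, -1.0])      # Pauli-Z observable
ket0 = np.array([1.0, 0.0])

def f(theta):
    """'Quantum circuit' output: <Z> after Rx(theta) on |0> (equals cos(theta))."""
    psi = rx(theta) @ ket0
    return np.real(np.conj(psi) @ Z @ psi)

def grad(theta):
    """Parameter-shift rule: exact gradient from two shifted evaluations."""
    return 0.5 * (f(theta + np.pi / 2) - f(theta - np.pi / 2))

# Classical outer loop: gradient descent on the circuit's expectation value
theta = 0.5
for _ in range(100):
    theta -= 0.2 * grad(theta)
print(round(f(theta), 3))  # -1.0, the minimum of cos(theta)
```

The same two-evaluation trick works on real hardware, where the expectation values are estimated from repeated measurements rather than computed from the state vector.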

Quantum Optimization

Discrete optimization problems in aerospace, automotive, and other industries may benefit from hybrid quantum-classical optimization; for example, simulated annealing, the quantum approximate optimization algorithm (QAOA), and quantum-enhanced population transfer may have utility with today’s processors.
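Simulated annealing, the classical half of such hybrid solvers, can be sketched in a few lines. Below is a minimal, generic implementation (the toy Ising problem and cooling schedule are illustrative choices, not Google's solver) in which uphill “thermal jumps” are accepted with probability exp(-dE / T):

```python
import math
import random

def energy(spins, J):
    """Ising-style energy: sum of J[i][j] * s_i * s_j over all spin pairs."""
    n = len(spins)
    return sum(J[i][j] * spins[i] * spins[j]
               for i in range(n) for j in range(i + 1, n))

def simulated_annealing(J, n, steps=5000, t_start=2.0, t_end=0.01):
    random.seed(0)  # deterministic run, for reproducibility
    spins = [random.choice([-1, 1]) for _ in range(n)]
    e = energy(spins, J)
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)
        spins[i] *= -1                    # propose flipping one spin
        e_new = energy(spins, J)
        # Metropolis criterion: always accept downhill moves; accept
        # uphill "thermal jumps" with probability exp(-dE / T)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            e = e_new
        else:
            spins[i] *= -1                # reject: undo the flip
    return spins, e

# Fully connected 4-spin antiferromagnet: ground states are 2-vs-2 splits
J = [[0, 1, 1, 1],
     [0, 0, 1, 1],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]
spins, e = simulated_annealing(J, 4)
print(spins, e)  # typically a 2-vs-2 spin configuration with energy -2
```

Quantum-assisted variants aim to replace or augment these thermal jumps with quantum updates, such as tunnelling through energy barriers rather than climbing over them.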

To acquaint yourself in detail with the certification curriculum, you can expand and analyze the table below.

The EITC/AI/TFQML TensorFlow Quantum Machine Learning Certification Curriculum references open-access didactic materials in video form. The learning process is divided into a step-by-step structure (programmes -> lessons -> topics) covering relevant curriculum parts. Unlimited consultancy with domain experts is also provided.

For details on the Certification procedure check How it Works.

### Curriculum Reference Resources

TensorFlow Quantum (TFQ) is a quantum machine learning library for rapid prototyping of hybrid quantum-classical ML models. Research in quantum algorithms and applications can leverage Google’s quantum computing frameworks, all from within TensorFlow. TensorFlow Quantum focuses on quantum data and building hybrid quantum-classical models. It integrates quantum computing algorithms and logic designed in Cirq, and provides quantum computing primitives compatible with existing TensorFlow APIs, along with high-performance quantum circuit simulators. Read more in the TensorFlow Quantum white paper. As additional reference you can check out the overview and run the notebook tutorials.

https://www.tensorflow.org/quantum

#### Cirq

Cirq is an open-source framework for Noisy Intermediate Scale Quantum (NISQ) computers. It was developed by the Google AI Quantum Team, and the public alpha was announced at the International Workshop on Quantum Software and Quantum Machine Learning on July 18, 2018. A demo by QC Ware showed an implementation of QAOA solving an example of the maximum cut problem on a Cirq simulator. Quantum programs in Cirq are represented by "Circuit" and "Schedule" objects, where a "Circuit" represents a quantum circuit and a "Schedule" represents a quantum circuit with timing information. Programs can be executed on local simulators. The following example shows how to create and measure a Bell state in Cirq.

```
import cirq

# Pick two adjacent grid qubits
qubit0 = cirq.GridQubit(0, 0)
qubit1 = cirq.GridQubit(0, 1)

# Create a circuit: Hadamard then CNOT prepares a Bell state,
# and both qubits are measured
circuit = cirq.Circuit(
    cirq.H(qubit0),
    cirq.CNOT(qubit0, qubit1),
    cirq.measure(qubit0, key='m0'),
    cirq.measure(qubit1, key='m1'),
)
```

Printing the circuit displays its diagram:

```
print(circuit)
# prints
# (0, 0): ───H───@───M('m0')───
#                │
# (0, 1): ───────X───M('m1')───
```

Simulating the circuit repeatedly shows that the measurements of the qubits are correlated: the exact bits vary from run to run, but m0 and m1 always match.

```
simulator = cirq.Simulator()
result = simulator.run(circuit, repetitions=5)
print(result)
# prints
# m0=11010
# m1=11010
```
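As a cross-check independent of Cirq, the Bell state prepared by this circuit can be computed directly from the gate matrices with numpy; the two surviving amplitudes show why the measurement records always agree.

```python
import numpy as np

# Gate matrices
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket00 = np.array([1.0, 0.0, 0.0, 0.0])    # |00>
state = CNOT @ np.kron(H, I) @ ket00      # H on qubit 0, then CNOT

print(np.round(state, 3))  # [0.707 0.    0.    0.707] = (|00> + |11>)/sqrt(2)
```

Only |00> and |11> carry nonzero amplitude, so every measurement yields either both zeros or both ones, which is exactly the correlation seen in the simulator output above.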