Tutorials Abstracts

Sunday, Sep 15, 2024 — Tutorials Abstracts


Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Participants will learn about threats to the security of quantum computing systems, the types of attacks to which these systems may be vulnerable, and the types of defenses that can be applied to secure quantum computing systems from both local and remote attacks.
Abstract: This updated, 2nd edition of the tutorial will introduce the audience to the emerging field of quantum computer security, which focuses on research into how to make quantum computing systems secure from attacks. By design, this tutorial will not cover post-quantum cryptography, as that is an important but orthogonal topic. The tutorial focuses on the security of quantum computing systems because, with the rapid advances in quantum computer technologies, quantum computers hold promise to run algorithms for generating novel drugs or material compounds. Once quantum computers are generating or processing sensitive data or valuable intellectual property, they will become a target for attacks that aim to disturb their operation, modify computation, or even steal data or quantum circuit code. Moreover, many quantum computers are already cloud-based, and remote, on-demand cloud access makes them vulnerable to remote security attacks, no different from today's classical cloud computing. First, this tutorial will introduce the audience to classical computer security ideas such as threat modeling; confidentiality, integrity, and availability; information leaks; and side- and covert-channel attacks. Second, it will demonstrate examples of security attacks prototyped on real cloud-based NISQ quantum computers available today. Third, it will present designs for securing cloud-based NISQ quantum computers from these attacks. Lastly, it will present challenges and opportunities in building quantum computer security from the ground up, rather than patching systems later once security attacks have occurred in the wild.
Keywords: Quantum computing, security, security attacks and defenses
Contents Level: Quantum Computing: 75% beginner, 25% intermediate, 0% advanced; Computer Security: 50% beginner, 25% intermediate, 25% advanced
Target Audience: A. Expected background and prerequisites of the tutorial attendees:
Attendees do not require a computer security background. Expected background includes basic knowledge of computers and an interest in learning about cloud-based quantum computers and making them secure.
B. What will target audience learn:
Quantum computing engineers and researchers will learn how to make quantum computers more secure and how to protect these machines from vulnerabilities now that many quantum computers are connected to the internet (e.g., IBM Quantum, Amazon Braket, Microsoft Azure).
 

Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, attendees will learn to code, run, and test standard qubit characterization routines by means of practical exercises that will see them running their code live on an actual quantum computer provided remotely by Quantum Machines. This will allow participants to consolidate their knowledge of qubit characterization and tuning, and to walk away with practical experience of manual and automated qubit calibration routines.
Abstract: Tuning up a qubit can pose a formidable challenge, and making this into an automated process adds another level of complexity. However, with the right blend of hardware and software, we can turn this into a manageable endeavor.

In our comprehensive 3-hour tutorial, we will delve into the intricacies of tuning up a transmon qubit chip in real time. Through practical exercises, attendees will code their own qubit-control programs in a pulse-level language, execute them on an actual quantum processor available live during the session, and then analyze the results to uncover the transmon qubit's properties. In this way, attendees will understand the fundamental building blocks of qubit characterization and calibration by developing code and running it on an actual device.

Once the programs are deemed reliable in extracting the relevant qubit characteristics, the participants will learn to transform their codes into calibration nodes and connect them to one another to form an automated calibration sequence. The automated sequences are then iteratively tested and fine-tuned to improve their robustness and performance, covering the entire process from basic qubit characterization to full-fledged automated calibration and tune-up.
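The node-chaining idea described above can be sketched in plain Python. All node names, parameters, and "measurement" results below are hypothetical placeholders for illustration, not the tutorial's actual API or values:

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Each calibration node consumes the current parameter set, performs a
# (here: simulated) measurement, and returns updated parameters.
@dataclass
class CalibrationNode:
    name: str
    run: Callable[[Dict[str, float]], Dict[str, float]]

def resonator_spectroscopy(params):
    # Simulated measurement: locate the readout resonator frequency.
    params["resonator_freq_GHz"] = 7.2   # would be fitted from data
    return params

def qubit_spectroscopy(params):
    # Depends on the resonator frequency found by the previous node.
    assert "resonator_freq_GHz" in params
    params["qubit_freq_GHz"] = 5.1
    return params

def power_rabi(params):
    # Calibrate the pi-pulse amplitude at the qubit frequency.
    assert "qubit_freq_GHz" in params
    params["pi_amp"] = 0.43
    return params

def run_sequence(nodes, params=None):
    """Run calibration nodes in order, threading the parameter set through."""
    params = dict(params or {})
    for node in nodes:
        params = node.run(params)
    return params

sequence = [
    CalibrationNode("resonator_spectroscopy", resonator_spectroscopy),
    CalibrationNode("qubit_spectroscopy", qubit_spectroscopy),
    CalibrationNode("power_rabi", power_rabi),
]
calibrated = run_sequence(sequence)
```

The design point the tutorial makes is visible even in this toy: each node's output becomes the next node's input, so robustness checks and retries can be attached per node rather than per monolithic script.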

At the end of this tutorial, participants will emerge equipped with valuable hands-on experience in both manual and automated tuning of a transmon qubit, having learned qubit characterization and tuning by creating calibration code and testing it on a real device. Join us as we unravel the complexities of qubit tuning, empowering you to harness the full potential of quantum computing.
Keywords: Qubit calibration, automatic calibration, quantum computing, superconducting qubits
Contents Level: 50% beginner, 30% intermediate, 20% advanced.
Target Audience: We expect researchers, and especially experimentalists, at all levels to greatly benefit from the tutorial. No specific background is required, although we will assume a basic understanding of qubit physics.

Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: During this tutorial, participants will gain hands-on experience in the development of variational quantum algorithms for NISQ devices on Amazon Braket. They can follow the instructor-led demonstration in AWS environments provided free of charge for the duration of the tutorial. All attendees leave with code examples that they can use as a foundation for their own projects.
Abstract: Variational quantum algorithms belong to the class of hybrid classical-quantum computations, leveraging both classical and quantum compute resources. These algorithms are widely believed to be promising candidates for a first demonstration of a useful application of quantum computation in areas such as quantum chemistry, condensed matter simulations, and discrete optimization tasks. Variational algorithms use a parametrized quantum circuit ansatz to estimate the lowest-energy eigenvalue of a Hamiltonian encoding an underlying problem-specific objective function. They progress by iterative execution of the parametrized circuit, passing back the result of each computation to a classical optimizer, which updates the circuit parameters until a convergence or stopping criterion is satisfied. On real-world quantum hardware, device noise effectively limits the number of circuit operations that can be faithfully executed and degrades the quality of the results of the computation, which affects the convergence of the optimization performed on the classical computer. In this tutorial, we walk the participants through the practical steps in the development of a variational quantum algorithm for realistic NISQ devices. We first introduce an example variational algorithm and provide a reference implementation for a specific problem. We discuss in detail how we can conveniently and efficiently execute the algorithm on Amazon Braket, the Amazon Web Services (AWS) quantum computing service. Furthermore, we investigate the impact of noise on the algorithm performance and demonstrate how participants can automate their noise simulations. Finally, we explore error mitigation strategies to improve the algorithm performance in the presence of noise. During the tutorial, participants get free access to Amazon Braket and can follow the guided steps in their own AWS-provided development environment.
All attendees leave with code examples that they can use as a foundation for their own projects. 
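The variational loop described above can be sketched in a library-agnostic way. The toy single-qubit Hamiltonian, RY ansatz, and grid-search "optimizer" below are illustrative stand-ins, not Braket API code:

```python
import numpy as np

# Toy problem Hamiltonian (a stand-in for a molecular or Ising Hamiltonian).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    """Statevector of RY(theta)|0>: a one-parameter variational circuit."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Objective: <psi(theta)| H |psi(theta)> (amplitudes are real here)."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical outer loop; a grid search stands in for the optimizer that
# would normally receive measured expectation values from the QPU.
thetas = np.linspace(0.0, 2 * np.pi, 721)
theta_best = min(thetas, key=energy)
e_best = energy(theta_best)
```

On hardware, `energy(theta)` would be estimated from shots on a noisy device, which is exactly why the abstract emphasizes noise simulation and error mitigation in the optimization loop.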
Keywords: Quantum Computing, Quantum Circuits, Variational Quantum Algorithms, Quantum-Classical Algorithms, Amazon Braket, Hands-On Programming
Contents Level: 30% beginner, 50% intermediate, 20% advanced. 
Target Audience: This tutorial is structured to appeal to diverse audiences that include researchers in quantum computing, attendees from industry, as well as students and general audience. The minimum requirements for attendees are an elementary knowledge of a modern programming language (such as Python), Jupyter notebooks, and familiarity with the basic concepts of quantum computing (quantum gates, quantum circuits, qubit measurement). Industry participants will gain better understanding of the practical applications of hybrid classical-quantum algorithms, and students and the general audience will gain hands-on experience with running quantum workloads on quantum computers in the cloud. 

Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This tutorial is intended to close the knowledge gap between quantum engineers and electrical engineers. The quantum engineers will learn the circuits used to generate the pulses and the electrical engineers will learn the meaning of the pulses they generate and collect.
Abstract: Electrical engineers play a critical role in quantum computers. This is because a quantum computer is just an array of passive qubits controlled by sophisticated classical electronics. Low-power, high-speed, low-noise, and integrated circuits are necessary to build future large-scale quantum computers. However, many electrical engineers are not aware of their pivotal role in quantum computers. Many outstanding circuit designers and microwave engineers are also not aware of the basics of quantum computers and the interactions between the qubits and the control circuits. This tutorial is an effort to educate these engineering communities and bridge the aforementioned knowledge gap by translating complex quantum-mechanical theory into a language familiar to engineers. The tutorial has six sections (30 minutes each). In section I, an overview of qubit operations and the role of control electronics will be discussed. In sections II-IV, the operation of Si qubits and the corresponding classical control and detector circuits (e.g., DAC, TIA, RF reflectometry) will be covered. In sections V-VI, we will discuss superconducting (SC) qubit operations and qubit interaction with classical electronics. In particular, we will highlight common microwave components on an SC qubit integrated chip and their roles. Both lectures and demos are included.
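A minimal numpy sketch (illustrative only, not tutorial material) of the link between an electrical engineer's pulse parameters and the qubit operation: on resonance, the drive amplitude sets the Rabi frequency, and the pulse duration sets the rotation angle.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def drive_propagator(omega_rabi, t):
    """Rotating-frame propagator of a resonant drive H = (Omega/2) sigma_x:
    U = cos(Omega t / 2) I - i sin(Omega t / 2) sigma_x."""
    half = omega_rabi * t / 2
    return np.cos(half) * I2 - 1j * np.sin(half) * sx

omega = 2 * np.pi * 5e6               # 5 MHz Rabi rate, set by drive amplitude
ket0 = np.array([1, 0], dtype=complex)

def excited_population(t):
    """Probability of |1> after driving for time t: sin^2(Omega t / 2)."""
    psi = drive_propagator(omega, t) @ ket0
    return abs(psi[1]) ** 2

t_pi = np.pi / omega                  # pi-pulse duration (100 ns here)
```

A DAC that distorts the pulse envelope, or phase noise on the microwave source, directly shows up as an error in this rotation angle, which is why the control-circuit specifications discussed in the tutorial matter.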
Keywords: Silicon qubits, superconducting qubits, decoherence, entanglement, spin resonance, spin-to-charge conversion, microwave circuits
Contents Level: 60% beginner, 40% intermediate
Target Audience: We expect three types of attendees. The first type is electrical engineers who are familiar with circuit design. They will learn basic qubit operations and appreciate why electrical circuits (digital, analog, and microwave circuits) play a pivotal role in qubit control. They will be ready to design the microwave components on superconducting (SC) chips after the tutorial. The second type is physicists who know qubit physics but not much about electrical circuits. They will learn the basics of electrical circuits and understand what to keep in mind when they design qubits in the future (e.g., why SC qubits need to operate in the GHz range instead of the THz range due to the limitations of microwave circuits). The last type of attendees are those who are familiar with algorithms but not hardware. They will come to appreciate the constraints in the physical realization of qubit operations; although they might not fully understand all the material, the experience of seeing the qubit/classical-circuit interaction can be transformative for them.

Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: The attendees will learn about the Hamiltonian and Lindblad formalisms for treating quantum dynamics, the physical mechanisms that lead to decoherence and relaxation of quantum systems, and the approaches for measuring and mitigating non-Markovian noise sources, including example implementations on realistic qubit platforms.
Abstract: In recent years, quantum systems have emerged as leading candidates for developing future generations of computers, sensors, and networks. However, the performance of quantum systems toward such applications is limited due to their interaction with the environment, which leads to longitudinal relaxation and decoherence of the quantum system. In this Tutorial, I will describe the theory of noise sources that lead to the relaxation and decoherence of quantum systems, and present a few approaches for measuring and mitigating these noise sources toward improved qubit performance. The first part of the Tutorial will be dedicated to the theory of open quantum systems. I will start by describing Hamiltonian and Lindblad formalisms for treating the dynamics of qubits under the application of coherent control. I will then discuss standard methods for modeling Markovian and non-Markovian noise sources, while clarifying the differences between decay timescales of qubits due to these sources (T2*, T2, T1, and others). The second part of the Tutorial will focus on the approaches for measuring and mitigating non-Markovian noise sources. I will describe the process of applying pulsed and continuous coherent control to perform noise spectroscopy for characterizing the noise sources and for (dynamically) decoupling quantum systems from these sources. I will finish the Tutorial by presenting typical examples of realistic qubit platforms such as spin qubits in the solid-state, superconducting qubits, and trapped ions, and how coherent control can be used to improve the coherence of arbitrary states of such systems for quantum applications.
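The Lindblad treatment of relaxation (T1) and pure dephasing (T_phi), and the resulting relation 1/T2 = 1/(2 T1) + 1/T_phi, can be illustrated with a self-contained numpy integration for a single undriven qubit; the rates below are toy values chosen for illustration:

```python
import numpy as np

sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |0><1|
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def dissipator(L, rho):
    """Lindblad dissipator D[L](rho) = L rho L+ - (1/2){L+ L, rho}."""
    LdL = L.conj().T @ L
    return L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)

def rhs(rho, gamma1, gamma_phi):
    # H = 0 in the rotating frame: only relaxation and pure dephasing act.
    return gamma1 * dissipator(sm, rho) + (gamma_phi / 2) * dissipator(sz, rho)

def evolve(rho, t, gamma1, gamma_phi, steps=2000):
    """Midpoint (2nd-order) integration of the Lindblad master equation."""
    dt = t / steps
    for _ in range(steps):
        k1 = rhs(rho, gamma1, gamma_phi)
        rho = rho + dt * rhs(rho + 0.5 * dt * k1, gamma1, gamma_phi)
    return rho

gamma1, gamma_phi = 1.0, 0.5                            # toy rates 1/T1, 1/Tphi
rho0 = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)  # |+><+|
rho_t = evolve(rho0, 1.0, gamma1, gamma_phi)

gamma2 = gamma1 / 2 + gamma_phi                         # 1/T2
```

The off-diagonal element (coherence) decays at the combined rate gamma2, while the excited-state population decays at gamma1 alone, which is exactly the distinction between the T2 and T1 timescales discussed in the first part of the Tutorial.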
Keywords: Open Quantum Systems, Relaxation, Decoherence, Noise Spectroscopy, Dynamical Decoupling
Contents Level: 40% Beginner, 40% Intermediate, 20% Advanced  
Target Audience: The target audience of the Tutorial is students, postdocs, and faculty who wish to extend their knowledge on the physical modeling, characterization, control, and enhancement of quantum systems. The prerequisite knowledge is minimal (basic understanding of quantum states and superposition principle), which makes the Tutorial relevant for students and researchers from a variety of disciplines, including Physics, Electrical and Computer Engineering, Computer Science, Material Science, Chemistry, and Mathematics. 

Date: Sun, Sep 15, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees will learn how to program digital and analog quantum operations on a neutral atom quantum computer through the Python library Pulser. They will also play with hands-on examples related to many-body physics and combinatorial optimization.
Abstract: In recent years, remarkable progress has been achieved in the development and operation of programmable quantum computers made of arrays of Rydberg atoms. When operated in digital mode, these have established themselves as promising platforms for fault-tolerant quantum computing. On the other hand, the analog setting has led to the discovery of new phenomena in many-body physics, while recent algorithms tailored to this mode of operation have shown promise in solving complex combinatorial optimization or machine learning tasks. In this tutorial, we will introduce the Pulser library, an open-source Python package maintained by PASQAL. After a general introduction to Rydberg atom physics, we will explore in the first part the digital mode of operation and learn how single- and two-qubit gates can be programmed at the pulse level. The built-in simulation routines then allow one to optimize the pulses for higher fidelity, be it in a noiseless or noisy environment. A test case on the preparation of Bell pairs and n-body W states will be presented. In the second part, we will focus on the analog mode of operation, where global pulses are used to control the energy levels of the system, allowing for the preparation of large antiferromagnetic ground states or the solution of graph-based combinatorial optimization problems. This tutorial will conclude with examples of real-world contexts in which such combinatorial problems can be found. The tutorial will be hands-on, and we will provide Jupyter notebooks with examples to all participants.
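The blockade mechanism behind the analog mode can be illustrated with a bare-bones two-atom numpy sketch (illustrative parameters, not Pulser code): when the interaction U is much larger than the Rabi frequency, a global resonant pulse drives |gg> to the symmetric singly-excited state while the doubly-excited state stays essentially unpopulated.

```python
import numpy as np

n_r = np.array([[0.0, 0], [0, 1]])     # projector on the Rydberg state |r>
sx = np.array([[0.0, 1], [1, 0]])      # drives the g <-> r transition
I2 = np.eye(2)

def H_rydberg(omega, delta, U):
    """Two-atom analog Hamiltonian: global drive, detuning, interaction."""
    H = (omega / 2) * (np.kron(sx, I2) + np.kron(I2, sx))
    H -= delta * (np.kron(n_r, I2) + np.kron(I2, n_r))
    H += U * np.kron(n_r, n_r)
    return H

def evolve(H, psi0, t):
    vals, vecs = np.linalg.eigh(H)
    return vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi0))

omega, delta, U = 1.0, 0.0, 50.0       # blockade regime: U >> omega
psi0 = np.array([1, 0, 0, 0], dtype=complex)   # both atoms in |g>

# Collective pi-pulse: the gg <-> (gr+rg)/sqrt(2) Rabi rate is sqrt(2)*omega.
t_pi = np.pi / (np.sqrt(2) * omega)
psi_t = evolve(H_rydberg(omega, delta, U), psi0, t_pi)

p_rr = abs(psi_t[3]) ** 2                          # blockaded: stays ~0
p_single = abs(psi_t[1]) ** 2 + abs(psi_t[2]) ** 2  # entangled W-like state
```

This constraint on simultaneous excitations is what makes the antiferromagnetic ground-state preparation and the graph-problem encodings mentioned in the abstract possible.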
Keywords: Neutral atoms, Analog quantum computing, Pulse-level programming
Contents Level: 40% beginner, 40% intermediate, 20% advanced.
Target Audience: The tutorial’s content will resonate most with researchers and students in academia, as well as professionals in industry. Its goal is to empower individual researchers to take advantage of the analog mode and start implementing their own applications on neutral-atom quantum computers. Therefore, a basic knowledge of quantum computing concepts such as "two-qubit gates", "entanglement", and "Hamiltonians", typically to the level of an undergraduate class, is preferred. Attendees should have a working knowledge of Python in order to follow the hands-on component. Advanced knowledge of combinatorial optimization is not required, as the lesson will be self-contained.

Monday, Sep 16, 2024 — Tutorials Abstracts


Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This tutorial upskills users on advanced features of pytket and H-Series. Users will understand how to use unique H-Series features for their use cases. Additionally, users will learn how to use pytket to define and optimise their quantum program before execution on H-Series. Emphasis will be placed on using unique H-Series features through the pytket API, and on use cases that will benefit from those features. This tutorial is ideal for new users interested in benefiting from Quantinuum middleware and hardware. Users of other quantum platforms will learn how pytket enables usage of H-Series with minimal code rewrite.
Abstract: Quantinuum provides H-Series, a premium quantum computer based on trapped-ion technology with the highest gate fidelities in the industry. Users can access H-Series devices and noisy emulators via the cloud. Additionally, end users can benefit from a free, local, and noiseless emulator. TKET is a high-performance quantum software development kit (SDK) developed by Quantinuum that enables access to H-Series. It is accessible via Python, facilitating the construction and execution of quantum programs on H-Series and other quantum platforms. The end user defines the quantum program with pytket and uses a client-side interface to H-Series, pytket-quantinuum, to submit jobs to the H-Series device or emulator. Specifically, this tutorial will demonstrate the usage of special H-Series features with pytket in three parts: (1) using unique features including H-Series TKET compilation, arbitrary-angle two-qubit gates, and mid-circuit measurement and reset (MCMR) with qubit reuse, (2) application use cases run on H-Series devices, and (3) running Quantum Error Correction (QEC) experiments on H-Series.
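As a conceptual, library-agnostic illustration (this is not the pytket API), the MCMR-with-qubit-reuse primitive can be modeled on a single-qubit statevector: measure, conditionally flip back to |0>, and reuse the same physical qubit for the next logical operation.

```python
import numpy as np

rng = np.random.default_rng(0)         # seeded for reproducibility

Hgate = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Xgate = np.array([[0, 1], [1, 0]], dtype=complex)

def measure(psi):
    """Projective Z measurement: return (outcome, collapsed statevector)."""
    p1 = abs(psi[1]) ** 2
    outcome = int(rng.random() < p1)
    post = np.zeros(2, dtype=complex)
    post[outcome] = 1.0
    return outcome, post

def measure_and_reset(psi):
    """MCMR primitive: measure, then apply a classically conditioned X to
    return the qubit to |0> so it can be reused later in the same circuit."""
    outcome, post = measure(psi)
    if outcome == 1:
        post = Xgate @ post
    return outcome, post

# Reuse one physical qubit to produce several logical measurement bits.
psi = np.array([1, 0], dtype=complex)
bits = []
for _ in range(5):
    psi = Hgate @ psi                  # re-prepare |+> on the same qubit
    bit, psi = measure_and_reset(psi)
    bits.append(bit)
```

On hardware, this primitive lets a circuit with many logical qubits run on fewer physical qubits, which is one of the use cases the tutorial covers.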
Keywords: H-Series, ion trap, tket, Python, gate-based quantum computing, cloud quantum computing, Quantinuum
Contents Level: 40% beginner, 40% intermediate, 20% advanced.
Target Audience: The audience is expected to have expertise with Python and quantum computing. The audience is not required to have used H-Series or pytket before. Users of alternative platforms (Qiskit and Cirq) will understand how their workflows can be adapted to access the H-Series device. Current users of H-Series will learn about the latest upgrades.

Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, we will present the current challenges and requirements in setting up a comprehensive quantum software stack and discuss the current state of the art in this growing area. We will then showcase the development and usage of a real-life software stack, the Munich Quantum Software Stack (MQSS).
Abstract: Triggered by the prospects of the technology, quantum computing systems are moving towards production systems. For this to succeed, though, this trend has to be matched with significant efforts in software development. As a consequence, every quantum vendor is investing heavily in this area, and several projects worldwide have been established targeting integrated software stacks for quantum computing. In this tutorial we will discuss the need and requirements for such software environments, highlight existing efforts, including their status, interfaces, and prospects, and discuss their differences in approach. We will also introduce the Munich Quantum Software Stack (MQSS) as one of these efforts to illustrate the challenges software developers are facing when targeting quantum computing systems. The goal of this tutorial is to raise awareness of the challenges, but also opportunities, in developing, deploying, and using quantum software stacks, and their impact on end-user communities and hardware vendors alike, and to help guide users in efficiently utilizing quantum computing systems.
Keywords: Quantum Computing, Development Environments, System Software, Software Stacks 
Contents Level: 60% beginner, 30% intermediate, 10% advanced.
Target Audience: Since the tutorial rests on the experience of a large software initiative (namely the Munich Quantum Valley, comprising over 300 researchers from various disciplines and from both academia and industry), this tutorial will cover a broad range of target groups. Our main audience is developers and users of software for quantum computing. However, we also aim at end users/domain experts (who, eventually, will have to rely on this software to realize their applications) and physicists/experimentalists (who will provide their devices via this software stack). Indeed, we strongly believe that more exchange among these groups is essential and urgently needed, especially in the development of the software intended to connect the different communities. The tutorial provides a perfect opportunity for such an exchange.

Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees will be introduced to Qiskit Patterns as a framework for building quantum workloads, and will see examples of utility-scale workloads built using composable tools from Qiskit.
Abstract: Delivering useful quantum computing requires turning key capabilities into software and effectively stitching them together to explore problems at scales that are hard classically. In this tutorial, we will introduce Qiskit Patterns as a framework for breaking down utility-scale quantum workloads and contextualizing the necessary capabilities into four main steps – mapping domain-specific problems to quantum circuits and operators, optimizing the circuits for the target quantum hardware, executing, and post-processing the results. We will discuss the need for developing software building blocks that address tasks within these steps and that can be composed together to build larger workflows. Then, we will highlight several software building blocks that address these steps, examples of which may include building quantum circuits with the Qiskit circuit library, transpiling circuits through the latest AI and heuristic methods, executing on quantum hardware via the Qiskit Runtime Primitives, applying error mitigation techniques such as TREX, ZNE, and PEC, and using circuit knitting techniques to further optimize circuits for execution. Participants will be introduced to the main concepts behind each of these building blocks, and will be shown how to access and configure them in Qiskit. Finally, we will show through concrete examples how such building blocks can be composed together to construct larger workflows.
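The four steps above can be sketched as a toy, purely classical pipeline (this is not Qiskit code; the brute-force "execute" step merely stands in for sampling circuits via the Qiskit Runtime Primitives, and the Max-Cut instance is an illustrative placeholder):

```python
from itertools import product

# Step 1 -- map: encode a Max-Cut instance as an Ising-style objective,
# where cut edges lower the energy.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4

def ising_energy(bits):
    # spins z_i = +1/-1; each cut edge contributes -1, each uncut edge +1
    z = [1 - 2 * b for b in bits]
    return sum(z[i] * z[j] for i, j in edges)

# Step 2 -- optimize: a placeholder here; in Qiskit this is where circuits
# are transpiled and optimized for the target hardware.
candidates = list(product([0, 1], repeat=n))

# Step 3 -- execute: evaluate every candidate (stands in for running
# circuits and collecting measurement samples).
energies = {bits: ising_energy(bits) for bits in candidates}

# Step 4 -- post-process: recover the best cut from the raw results.
best_bits = min(energies, key=energies.get)
best_cut = sum(1 for i, j in edges if best_bits[i] != best_bits[j])
```

The value of the pattern is that each step is a swappable building block: the same map and post-process steps work whether the execute step is a brute-force loop, a simulator, or error-mitigated hardware runs.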
Keywords: Quantum software, Qiskit, Qiskit Patterns, error mitigation, transpilation, circuit knitting
Contents Level: 50% beginner, 30% intermediate, 20% advanced.
Target Audience: This tutorial is appropriate for a variety of audiences including: researchers and practitioners who are looking to make scientific advances in the quantum computing field, students and educators who are interested in gaining a deeper understanding of quantum computing to prepare for the future, and quantum computational scientists who want to use quantum computers as another tool to improve their fields of study. A background in Python and some basic knowledge of quantum computing will be useful but not necessary for attendees of this session.

Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, we will first present physics-based spin-qubit models employed by quantum scientists from academia and industry to describe their systems. We will then introduce the audience to 3D TCAD simulations of realistic spin-qubit devices through practical examples that illustrate typical design and simulation workflows.
Abstract: As quantum technologies mature and gain industrial relevance, it becomes imperative to accelerate design, prototyping, and manufacturing cycles by reducing trial and error. Akin to the situation that prevails in the semiconductor industry, quantum device design workflows increasingly leverage digital simulation tools to predict hardware performance before fabrication, and analyze characterization outcomes thanks to physics-based modeling. This Tutorial aims to introduce the participants of the IEEE Quantum Week to Technology Computer-Aided Design (TCAD) of quantum devices. We will introduce the spin-qubit technology to the audience, along with the main theoretical techniques used to model these systems. We will then demonstrate how to create 3D models of typical spin qubit systems that are currently explored by the emerging quantum hardware industry in potentially scalable quantum computer architectures and present basic simulation workflows using Nanoacademic’s finite-element modeling tool QTCAD. 
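To give a flavor of what physics-based device modeling computes, here is a deliberately crude 1D numpy sketch (a finite-difference cousin of the finite-element approach; not QTCAD code, and all numbers are illustrative natural units): solving the time-independent Schrodinger equation for an electron confined in a potential well, a toy stand-in for a gate-defined quantum dot.

```python
import numpy as np

hbar, m = 1.0, 1.0                     # natural units, illustrative only
N = 500
x = np.linspace(-8.0, 8.0, N)
dx = x[1] - x[0]

omega = 1.0
V = 0.5 * m * omega**2 * x**2          # harmonic confinement potential

# Kinetic energy via the standard 3-point Laplacian stencil; the
# Hamiltonian matrix is tridiagonal.
diag = hbar**2 / (m * dx**2) + V
off = -hbar**2 / (2 * m * dx**2) * np.ones(N - 1)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

# Bound-state energies and orbitals of the confined electron.
energies, states = np.linalg.eigh(H)
```

For the harmonic well the exact levels are E_n = hbar*omega*(n + 1/2), so the numerics can be checked against theory; in a real TCAD workflow the potential V would instead come from a 3D electrostatic solution of the gated device.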
Keywords: Quantum computing, Quantum hardware, Technology Computer-Aided Design, Digital simulations, Quantum dots, Spin qubits
Contents Level: 50% beginner, 30% intermediate, 20% advanced.
Target Audience: The target audience is all quantum scientists and engineers with an interest in quantum hardware design. This includes quantum device engineers in industry (startups and established companies), government laboratories, and academia. We also expect interest from experimental and theoretical physicists from academia. Finally, based on tutorials given in the past, we expect a large portion of the audience to consist of graduate students who are interested in learning about software tools that are currently used in the industry. We expect the attendees to have a basic knowledge of physics, Python programming, and quantum computing; nevertheless, we do not expect the attendees to be spin-qubit experts. A background in any quantum technology should suffice, as we will introduce the spin-qubit technology in the tutorial. In addition, this tutorial will focus on practical aspects of spin-qubit modeling, and a step-by-step pedagogical approach leveraging 3D visualization will make the presentation intuitive to beginners.

Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial attendees will learn to use Copilot in Azure Quantum to increase their research and development productivity in quantum software development and solving chemistry and materials science problems.
Abstract: Harnessing the power of recent advances in generative AI offers ample opportunities to accelerate research, increase development productivity, and usher in a new era of scientific discovery.

In this tutorial, we introduce Copilot in Azure Quantum, which enables scientists to use natural language to reason through, and orchestrate solutions to, some of the most complex chemistry and materials science problems – from developing safer and more sustainable products to accelerating drug discovery. Furthermore, students can use Copilot to speed up learning quantum computing, quantum programming, and chemistry and materials science.

We show how to use Copilot in Azure Quantum in a variety of learning and research scenarios, and share the lessons we learned from developing this tool and grounding it in domain-specific knowledge.

At the end of the tutorial, the attendees will be ready to use Copilot in their own studies and work.  
Keywords: Generative AI, Copilot, quantum computing, quantum programming, quantum software development, chemistry and materials science
Contents Level: 30% beginner, 20% intermediate, 50% advanced.
Target Audience: This tutorial aims to appeal to diverse audiences that include quantum computing, chemistry, and materials science researchers and industry professionals, as well as students. Each session will start with content useful for those who are only getting started with the topic, and continue to more advanced topics that will be most beneficial to professionals. The audience will learn to use Copilot in Azure Quantum for learning quantum computing, quantum programming, and chemistry and materials science. They will also gain deeper appreciation of the importance of using AI in scientific workflows to increase research and development productivity.

Date: Mon, Sep 16, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees will gain an understanding of the challenges of applying quantum computing to chemistry, the main quantum algorithms in the community, and modeling tips to extend their application to more complex chemical systems; an understanding of the challenges and approaches pertaining to experiments on quantum devices; and an understanding of how researchers can simultaneously leverage various frameworks developed by the community to facilitate their research, teaching, and exploration. They will also gain exposure to modular and extensible software that accelerates R&D around the application of quantum computing to a field.
Abstract: Quantum chemistry simulation is a promising area of application for quantum computing. However, employing this technology to tackle industrially relevant chemical systems will require considerable advances in quantum algorithm development, incorporating classical modeling insights, and developing a better understanding of practical experiments on quantum hardware. We present Tangelo, an open-source Python package enabling chemistry workflows on quantum computers. Built to be an engine to accelerate research in the field, it supports rapid prototyping, benchmarking, and integration of new research to keep up with the state of the art. Its compatibility with many platforms offered in the quantum ecosystem enables researchers to leverage innovation scattered across various software and hardware efforts, and facilitates experimentation on quantum hardware. Tangelo provides a collection of reusable building blocks and algorithms, with the flexibility to let users introduce their own. It features various toolboxes tackling the challenges present in modeling chemical systems and the steps involved in running feasible hardware experiments. In this tutorial, we familiarize attendees with Tangelo's growing collection of chemistry features and algorithms. The first session focuses on the backend-agnostic quantum circuit simulation and execution layer, and introduces the basics of chemical modeling on quantum computers before applying them with variational algorithms. The second session highlights toolboxes and challenges pertaining to hardware experiments, before focusing on fault-tolerant approaches. We first emphasize key differences between fault-tolerant and variational approaches, before introducing some fault-tolerant building blocks and algorithms through hands-on exercises attendees can choose from (Hamiltonian simulation, state preparation, phase estimation).
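To give a flavor of the fault-tolerant building blocks mentioned above, Hamiltonian simulation in particular, here is a library-agnostic numpy sketch of first-order Trotterization on a toy two-term Hamiltonian (illustrative, not Tangelo code):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Toy two-qubit Hamiltonian H = A + B with non-commuting terms.
A = np.kron(Z, Z)                       # Ising-type coupling
B = np.kron(X, I2) + np.kron(I2, X)     # transverse field

def expm_herm(M, t):
    """exp(-i M t) for Hermitian M via eigendecomposition."""
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

def trotter(t, n):
    """First-order Trotter approximation (e^{-iA t/n} e^{-iB t/n})^n."""
    step = expm_herm(A, t / n) @ expm_herm(B, t / n)
    return np.linalg.matrix_power(step, n)

t = 1.0
exact = expm_herm(A + B, t)

def trotter_error(n):
    """Deviation of the n-step Trotter circuit from the exact propagator."""
    return np.linalg.norm(trotter(t, n) - exact)
```

Since A and B do not commute, the product formula is only approximate, and the error shrinks as the number of Trotter steps grows; on a quantum computer each `expm_herm` factor would be compiled into native gates.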
Keywords: Quantum, chemistry, experiments, workflows, algorithms, modeling, NISQ, fault-tolerant
Contents Level: 40% beginner, 40% intermediate, 20% advanced. 
Target Audience: The target audience includes researchers in computational chemistry or physics, quantum developers, and anyone interested in applications of quantum computing. Prerequisites include a basic understanding of the Python programming language, basic knowledge of quantum mechanics (Schrodinger equation, wavefunction, eigenvalues, correlation energy, etc.) as well as the concept of quantum gates and circuits. The audience will further their understanding of the challenges of applying quantum computing to chemistry, and the different steps of end-to-end workflows aimed at leveraging current and future quantum devices. They will learn how to leverage the toolboxes and algorithms present in Tangelo, and how they can be used in conjunction with various frameworks developed by the community to facilitate their research.
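As a taste of the variational workflow this tutorial covers, the sketch below minimizes the energy of a toy one-qubit Hamiltonian over a one-parameter ansatz. It is purely illustrative and does not use Tangelo's API; the Hamiltonian coefficients and function names are made up for this example.

```python
import math

# Toy variational eigensolver: Hamiltonian H = a*Z + b*X on one qubit,
# ansatz |psi(t)> = Ry(t)|0>. Then <Z> = cos(t), <X> = sin(t), so the
# energy landscape is E(t) = a*cos(t) + b*sin(t).

def energy(theta, a=1.0, b=0.5):
    """Expectation value <psi(theta)| a*Z + b*X |psi(theta)>."""
    return a * math.cos(theta) + b * math.sin(theta)

def minimize_energy(a=1.0, b=0.5, steps=10000):
    """Brute-force scan over the single variational parameter."""
    return min((energy(2 * math.pi * k / steps, a, b), 2 * math.pi * k / steps)
               for k in range(steps))

e_min, theta_opt = minimize_energy()
# The exact ground-state energy of a*Z + b*X is -sqrt(a^2 + b^2).
```

A real VQE replaces the brute-force scan with a classical optimizer and evaluates the energy on a quantum backend, which is where packages like Tangelo come in.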

Tuesday, Sep 17, 2024 — Tutorials Abstracts


Date: Tue, Sep 17, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: QubiC 2.0 is an open-source quantum control system with mid-circuit measurement and feedforward capabilities for quantum applications. In this hands-on tutorial, participants can learn and use QubiC infrastructure and its new and novel features through simulation and control hardware.
Abstract: The NISQ era of quantum computing places significant demands on flexible and cost-efficient classical control systems. Developed at Lawrence Berkeley National Laboratory, QubiC is an open-source quantum control system capable of mid-circuit measurement and feedforward for superconducting quantum computing. In this tutorial, we introduce QubiC and provide a deep dive into its architecture, functionalities, and capabilities in quantum control and measurement. Specifically, we discuss advanced control requirements for superconducting qubits and QubiC's approach to satisfying these requirements, including features such as fast parameter updates and real-time decision-making. We provide the audience access to our simulator and conduct pulse- and gate-level experiments.

Following the success of last year’s tutorial, this tutorial aims to provide physicists, engineers, and quantum hobbyists insight into our open-source control system to empower the audience to utilize its capabilities to strengthen the quantum ecosystem.
Keywords: Quantum Computing, Quantum Control  
Contents Level: 50% beginner, 30% intermediate, 20% advanced.
Target Audience: This tutorial targets a multi-disciplinary quantum community of experimentalists, hardware designers, and software engineers interested in using open-source control systems. QubiC being a cost-effective open-source controller for superconducting qubits, we expect interest from superconducting (and non-superconducting) qubit experimentalists requiring a flexible control system. The software infrastructure will also attract developers interested in transpilation and integration into the hardware.  
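The feedforward capability highlighted above can be illustrated with the simplest use case: active qubit reset, where the controller measures a qubit and conditionally applies an X pulse in real time. The sketch below is a classical toy model of that control flow, not the QubiC API; probabilities and function names are illustrative.

```python
import random

# Toy model of measurement-conditioned feedforward (active reset):
# measure the qubit, and if the outcome is 1, apply an X pulse to flip
# it back to |0> before the next circuit starts.

random.seed(7)

def measure(p1):
    """Projective measurement of a qubit with excited-state population p1."""
    return 1 if random.random() < p1 else 0

def active_reset(p1, max_rounds=3):
    """Measure, then flip conditionally until the qubit reads 0."""
    state = measure(p1)
    for _ in range(max_rounds):
        if state == 0:
            break
        state = 0  # an ideal X pulse flips |1> -> |0>
    return state

# Starting from 30% excited-state population, every shot ends in |0>.
results = [active_reset(p1=0.3) for _ in range(1000)]
```

In hardware, the decision branch must complete within the qubit's coherence time, which is why this logic lives in the controller's FPGA rather than in host software.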

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: The tutorial will present the real-time control requirements emerging from QEC and multi-qubit physics and present a step-by-step guide to implementing hybrid classical-quantum compute strategies addressing these challenges on the DGX Quantum platform, the first to integrate QPU control with CPU and GPU computation.
Abstract: Tight integration of classical and quantum compute is becoming a necessity as we scale QPUs to many dozens of qubits. The scale of QPUs as well as the improvement in gate and readout fidelities opens the path to fault-tolerant computations using quantum error correction with ensembles of qubits. This, however, requires the real-time decoding of the logical qubit state via the observation of ancilla qubit states, tightly merging classical and quantum compute. In this tutorial we will show how to set up a control system based on Quantum Machines OPX controllers tightly integrated with an NVIDIA Grace Hopper server. The participants will learn how to set up the qubit drive and readout sequences on the OPX using QUA, a pulse-level programming language, and to use C/C++ to code QEC decoding algorithms running on an adjacent NVIDIA server. Low-latency communication between the two systems is made possible by the DGX Quantum platform, which allows the transfer of data from QUA to the Grace Hopper superchip in under 4 µs (round trip). In addition to the QEC use case, the multitude of degrees of freedom present in a multi-qubit system must be controlled to consistently deliver nominal qubit drive and readout policies resulting in error rates below the threshold required for QEC. In this context we will demonstrate how the DGX Quantum platform can be used to learn optimal policies through reinforcement learning strategies that efficiently explore the large parameter space as well as track parameter drift to deliver consistent results.
Keywords:  Quantum-classical computing, GPU integration, quantum computing 
Contents Level: 20% beginner, 50% intermediate, 30% advanced.
Target Audience: Anyone with interest in hybrid quantum-classical compute will greatly benefit from the tutorial. This includes people in the quantum error correction field, but also HPC and data centers getting ready for quantum technologies. Additionally, this hybrid compute field sees players and use cases in many different qubit modalities, from atoms and ions to superconducting, spin, and more. This will make for a broad audience with diverse backgrounds. No specific background is required. 
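The real-time decoding loop described in the abstract can be shown in its simplest form: a 3-qubit bit-flip repetition code whose ancilla-measured syndrome is mapped to a correction by majority vote. This is a conceptual sketch only; production decoders run on surface-code syndromes streamed from the controller.

```python
# Minimal syndrome decoding for the 3-qubit bit-flip repetition code.
# The two parity checks Z0Z1 and Z1Z2 are what ancilla qubits would
# measure; the decoder maps each syndrome to the most likely single flip.

def syndrome(bits):
    """Parity checks Z0Z1 and Z1Z2 on the data bits."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(syn):
    """Map a syndrome to the data qubit that most likely flipped."""
    return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syn]

def correct(bits):
    """Apply the decoded correction and return the repaired codeword."""
    flip = decode(syndrome(bits))
    if flip is not None:
        bits = list(bits)
        bits[flip] ^= 1
    return tuple(bits)
```

Any single bit flip on a codeword is corrected; the engineering challenge the tutorial addresses is doing this (for much larger codes) within the microsecond round-trip budget.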

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This tutorial introduces the basics of quantum annealing and explores its possible advantages and practical implications in database and data management research. In addition to the theory, we provide three hands-on demonstrations of quantum annealing for join order optimization, transaction scheduling, and virtual machine allocation in cloud infrastructures.
Abstract: Quantum annealing is a metaheuristic approach tailored to solve complex combinatorial optimization problems with quantum annealers. In this tutorial, we provide a fundamental and comprehensive introduction to quantum annealing and modern data management systems and show quantum annealing’s potential benefits and applications in the realm of database optimization. We demonstrate how to apply quantum annealing for selected database optimization problems, which are critical challenges in many data management platforms. The demonstrations include solving hard join order optimization problems in relational databases, optimizing sophisticated transaction scheduling, and allocating virtual machines within cloud-based architectures with respect to sustainability metrics. On the one hand, the demonstrations show how to solve extremely hard but theoretical problems (join order selection, transaction scheduling), and on the other hand, they show how quantum annealing can be integrated as a part of larger and dynamic optimization pipelines (virtual machine allocation). The goal of our tutorial is to provide a centralized and condensed source regarding theories and applications of quantum annealing technology for database researchers, practitioners, and everyone who wants to understand how to optimize data management with quantum computing in practice. Besides, we identify the advantages, limitations, and potentials of quantum computing for future database and data management research.
Keywords: Quantum annealing, database optimization, join order optimization, transaction scheduling, virtual machine, allocation 
Contents Level: 50% beginner, consisting of an introduction to quantum annealing and the basics of database optimization, and 50% intermediate, consisting of the selected database applications.
Target Audience: This tutorial is intended for a wide audience that includes academic researchers and students as well as industrial developers and practitioners who want to understand the impact of quantum annealing on databases. The audience is expected to have basic knowledge of quantum computing, including concepts such as qubits, superposition, and entanglement. It would be advantageous if attendees knew the concept of optimization and its diverse applications. Additionally, following the hands-on demonstration demands a basic knowledge of the Python language.
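The core of every quantum-annealing application above is a QUBO formulation: encode the decision as binary variables and the constraints as penalty terms, then minimize x^T Q x. The toy problem below, loosely in the spirit of the transaction-scheduling demonstration (the values and conflicts are made up), is solved by exhaustive enumeration, which is exactly what the annealer replaces for large instances.

```python
from itertools import product

# Toy QUBO: select transactions to maximize total value, with a large
# penalty when two conflicting transactions are both selected.
# An annealer minimizes x^T Q x; here we simply enumerate all 2^3 states.

values = {0: 3, 1: 2, 2: 2}   # value of scheduling each transaction
conflicts = [(0, 1)]          # transactions 0 and 1 touch the same data
PENALTY = 10                  # must exceed any value gained by violating

def qubo_energy(x):
    """QUBO objective: negative value gained plus conflict penalties."""
    energy = -sum(values[i] * x[i] for i in values)
    energy += PENALTY * sum(x[i] * x[j] for i, j in conflicts)
    return energy

best = min(product([0, 1], repeat=3), key=qubo_energy)
# best == (1, 0, 1): schedule transactions 0 and 2, skip the conflicting 1.
```

On D-Wave hardware the same Q matrix would be handed to the annealer instead of enumerated, but the modeling step is identical.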

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees will gain a comprehensive understanding of quantum networking, including foundational concepts, current developments, and potential real-world applications.
Abstract: This tutorial introduces the concept and practical aspects of Quantum Internet design, covering key principles, technologies, and applications that will shape the next generation of communication networks.
Keywords: Quantum Internet, Quantum Networks, Protocol Stack
Contents Level: 40% beginner, 35% intermediate, 25% advanced.
Target Audience: This tutorial is designed for researchers, engineers, and practitioners interested in quantum computing, communication, or networking. A workshop-style interactive presentation will be adopted to engage the audience in paving the way for the classical communications community to contribute to the quantum communications field. Whilst this overview is ambitious in terms of providing a research-oriented outlook, potential attendees require only a modest background in communications. The mathematical contents are kept to a minimum and a conceptual approach is adopted. Postgraduate students, researchers, and practitioners as well as managers looking for cross-pollination of their experience with other topics may find the coverage of the presentation beneficial. Given the novelty of the topic, the tutorial is targeted at an audience new to the field. Graduate-level knowledge of communications theory and/or computer networks is assumed. Very basic knowledge of quantum information is recommended, and compact, concise introductory material will be provided to the audience before the tutorial.

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees will learn the challenges of compiling programs for quantum computers, how to overcome them utilizing the BQSKit compiler, and how best to navigate the growing diversity in quantum hardware to arrive at the best results for their problems.
Abstract: In this tutorial, we will teach you how to take advantage of the recent advancements in quantum hardware. Quantum hardware is experiencing a boom, leading to more chip variety, configurations with higher fidelities, and distributed interconnects. While ultimately this will benefit the entire field of quantum computing, it presents two problems. First, algorithm designers and users must make more difficult choices between potential hardware vendors. Second, it places more of the overall burden of end-to-end quantum applications on the software stacks, specifically the quantum compiler. The Berkeley Quantum Synthesis Toolkit (BQSKit) is a powerful and portable compiler with a proven ability to alleviate these issues and translate recent hardware successes up to the algorithm level. In this tutorial, we show how every practitioner can benefit from BQSKit using just the default tooling. We then do a deep dive, fine-tuning compilation workflows for end-to-end applications and equipping attendees with the knowledge and ability to better implement algorithms on NISQ devices and beyond.
Keywords: Circuit, Unitary, Synthesis, Gateset, Topology-aware, Mapping, Compilation, Transpilation, Instantiation
Contents Level: 20% beginner, 60% intermediate, 20% advanced.  
Target Audience: This tutorial introduces the practical challenges of compiling quantum programs for current NISQ-era and near-term future devices and teaches how to use BQSKit to overcome these challenges. Beginners, who only need to be familiar with the quantum circuit model, basic linear algebra terms, and the Python programming language, will learn how to utilize circuit transpilation methods to pick and compile for NISQ-era hardware. Advanced quantum computing users with a target application in mind will additionally benefit when we teach how to fine-tune a compilation workflow for their specific application. Additionally, anyone looking to extract the best results from new, distributed computers will find this tutorial helpful.
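One simple idea underlying circuit compilation is illustrated below: a peephole pass that cancels adjacent self-inverse gate pairs. This is a conceptual sketch, not the BQSKit API; real BQSKit workflows go much further (numerical synthesis, topology-aware mapping, instantiation), but the gate-count reduction goal is the same.

```python
# Toy peephole optimizer: adjacent identical self-inverse gates cancel,
# since G * G = I for G in {H, X, CNOT}. A stack-based single pass also
# catches cancellations that become adjacent after earlier removals.

SELF_INVERSE = {"h", "x", "cx"}  # gates equal to their own inverse

def cancel_pairs(circuit):
    """Remove adjacent identical self-inverse gates from a gate list."""
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()          # gate followed by itself is the identity
        else:
            out.append(gate)
    return out

circuit = [("h", 0), ("cx", 0, 1), ("cx", 0, 1), ("x", 1)]
optimized = cancel_pairs(circuit)   # -> [("h", 0), ("x", 1)]
```

Production compilers generalize this to commutation-aware and numerically approximate rewrites, which is where most of the depth savings on NISQ hardware come from.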

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This tutorial will teach prospective users how to use upcoming dynamically programmable qubit array (DPQA) neutral-atom quantum computers based on mid-circuit atom moving. By the end of the tutorial, you will understand the strengths, weaknesses, and differences of DPQA architectures relative to other systems; the commands and structure of programming DPQA; and best practices for hardware-aware co-design of algorithms, including NISQ circuits and the design of error correction codes. The goal of this tutorial is to help the audience develop interest and autonomy for R&D on early error-corrected quantum computing, a topic that is timely and necessary for the community as we advance towards practical demonstrations of quantum advantage.
Abstract: In late 2023, a team led by Harvard and QuEra researchers demonstrated world-record operations of logical qubits. These demonstrations included entangling performance that increases with the size of surface codes, preparation of color codes with break-even fidelities, the creation of fault-tolerant GHZ states, feedforward teleportation of entanglement, operation of 40 color codes, and complex circuit-sampling algorithms (akin to Google's "quantum supremacy" demonstration) with 48 logical qubits. These tour-de-force demonstrations were achieved through intimate co-design of algorithms and hardware capabilities, leveraging parallelized operations, including logical transversal gates, multi-qubit gates, and other features that are exclusive to neutral-atom platforms. This work heralds the rise of early error-corrected quantum computation and underscores the importance of disseminating how to design algorithms and error correction in alignment with hardware architectures. In this two-part tutorial, we introduce the details of gate-based neutral-atom platforms with zoned architectures, explaining the design parameters that should be taken into consideration when developing algorithms and compilers for physical and logical algorithms. We then move to hands-on practice with QuEra's upcoming logical qubit simulator, while teaching the audience about compilation and logical qubit encoding. Participants will come out of this tutorial enabled to start contributing to the era of early error-corrected quantum computing.
Contents Level: This tutorial is geared towards beginners with a minimal understanding of quantum mechanics and quantum information. An undergraduate one-semester course in each topic is the expected level, with mathematical understanding of linear algebra as an added relevant skill.
Target Audience: The contents of this tutorial will be most appreciated by researchers and students from academia, as well as researchers and developers from industry. In particular, it is relevant for application developers to start understanding the true requirements of their solutions at a logical-qubit level.

Date: Tue, Sep 17, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: From this tutorial, attendees will gain an understanding of the parity encoding as a tool for circuit transpilation and, in particular, how it can be leveraged to efficiently implement quantum algorithms on restricted hardware. They will learn how to use various techniques to optimize circuit depth or gate count, and trade off different metrics in circuit optimization against each other.
Abstract: Over the last decade, the Parity Architecture has evolved from its initial purpose of addressing the connectivity problem of solving optimization problems on quantum computers to a versatile tool that is broadly applicable in many quantum computing applications. The main idea is to use parity qubits which represent the relative alignment (parity) of two or more logical qubits. This tutorial begins by introducing the parity formalism, illustrating how quantum (and classical) operations can be expressed in terms of the parity of the involved qubits. Furthermore, it explores how many-body operations can be mapped to single-body operations on physical parity qubits and how this can be leveraged to optimize quantum circuits for gate count or circuit depth. The approach is particularly useful to parallelize the implementation of (potentially non-local and high-order) many-body gates which in principle commute, but usually cannot be performed efficiently due to connectivity limitations of the hardware. We will illustrate this parallelization technique on the examples of the Quantum Approximate Optimization Algorithm (QAOA) and the quantum Fourier transform (QFT). The tutorial discusses different methods for creating, removing, and manipulating parity qubits, and thereby transpiling quantum algorithms to efficient circuits based on CNOT gates and single-qubit rotations, or dynamic circuits, for quantum computing hardware with restricted connectivity.
Keywords: Quantum Circuits, Transpilation, Circuit Depth, Connectivity, Parity, Dynamic Circuits, QAOA, QFT
Contents Level: 20% Beginner, 60% Intermediate, 20% Advanced
Target Audience: The tutorial targets an audience with a circuit optimization or transpilation background, as well as anyone interested in methods to implement algorithms such as the quantum Fourier transform or optimization algorithms under hardware restrictions. A basic knowledge of quantum computing and quantum circuits is required. Familiarity with stabilizer codes is helpful but not necessary. 
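The central claim of the parity formalism, that a many-body term acts as a single-body term on a parity qubit, can be checked numerically on classical spin configurations (the ±1 eigenvalues of Z). The sketch below verifies this for a 3-body Z0 Z1 Z2 term; it is a conceptual illustration, not the Parity Architecture toolchain.

```python
from itertools import product

# A parity qubit stores the relative alignment of several logical qubits,
# so a many-body term like Z0 Z1 Z2 acts as a *single-body* Z on it.
# We verify the equivalence on all classical spin configurations.

def parity_spin(spins, support):
    """Eigenvalue of the parity qubit encoding the given logical qubits."""
    value = 1
    for i in support:
        value *= spins[i]
    return value

for spins in product([+1, -1], repeat=3):
    many_body = spins[0] * spins[1] * spins[2]       # 3-body Z0 Z1 Z2 term
    single_body = parity_spin(spins, (0, 1, 2))      # one Z on parity qubit
    assert many_body == single_body
```

This is what makes non-local, high-order terms implementable on hardware with nearest-neighbor connectivity: the cost moves from gate connectivity to the encoding and its consistency constraints.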

Wednesday, Sep 18, 2024 — Tutorials Abstracts


Date: Wed, Sep 18, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial attendees will learn to develop, test, and evaluate quantum algorithms using Q# programming language and Azure Quantum Development Kit, run them on quantum hardware in the cloud via Azure Quantum, and estimate the resources required to run them on the future fault-tolerant quantum computers.
Abstract: As the domain of quantum computing matures, the programs that implement quantum algorithms of interest grow both in size and in complexity. Quantum software development workflow for a reasonably complex quantum application includes multiple steps: implementing the quantum and the classical portions of the algorithm in code, running the code in simulation to test its correctness, and, depending on the algorithm, either evaluating the resources required to run it on a fault-tolerant quantum computer or running it on a NISQ device.
In this tutorial, we introduce the Azure Quantum Development Kit, an open-source toolkit that supports end-to-end algorithm development, evaluation, and hardware execution. We walk through the implementation of quantum algorithms in Q#, a domain-specific quantum programming language, and their testing using the tools included in the toolkit. Then we show how to use the Azure Quantum Resource Estimator to perform automatic quantum resource estimation and obtain estimates for the time it would take to solve large problem instances using this implementation on a fault-tolerant quantum computer, along with the required characteristics of this computer. Finally, we show how to run quantum code on quantum hardware and/or cloud simulators available in Azure Quantum. At the end of the tutorial, the attendees will have examples of end-to-end algorithm implementations that they can use as a foundation for their own projects, and will be ready to start quantum algorithm development using the Azure Quantum Development Kit.
Keywords: Quantum computing, quantum programming, quantum software development, quantum resource estimation, testing quantum programs
Contents Level: 50% beginner, 25% intermediate, 25% advanced.
Target Audience: This tutorial aims to appeal to diverse audiences that include students, early career industry professionals, and quantum computing researchers. The talks will require basic knowledge of quantum computing and quantum programming, but no knowledge of specific application domains such as quantum chemistry.
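The "run in simulation to test correctness" step of the workflow can be sketched in plain Python rather than Q#: build a 3-qubit GHZ state with H and CNOT and assert on the resulting amplitudes. Q# and the Azure Quantum Development Kit provide this kind of simulation-based testing natively; the statevector code below is a minimal stand-in, not the toolkit's API.

```python
import math

def apply_h(state, q):
    """Apply a Hadamard to qubit q (qubit 0 is the least significant bit)."""
    s = 1 / math.sqrt(2)
    new = [0.0] * len(state)
    for i, amp in enumerate(state):
        if (i >> q) & 1 == 0:
            new[i] += s * amp
            new[i ^ (1 << q)] += s * amp
        else:
            new[i ^ (1 << q)] += s * amp
            new[i] -= s * amp
    return new

def apply_cx(state, ctrl, tgt):
    """Apply a CNOT: flip the target bit wherever the control bit is 1."""
    new = list(state)
    for i, amp in enumerate(state):
        if (i >> ctrl) & 1:
            new[i ^ (1 << tgt)] = amp
    return new

state = [0.0] * 8
state[0] = 1.0                                         # |000>
state = apply_cx(apply_cx(apply_h(state, 0), 0, 1), 1, 2)
# state now has amplitude 1/sqrt(2) on |000> and |111> only.
```

Writing this kind of assertion against a simulator before touching hardware is exactly the testing discipline the tutorial teaches, with Q# doing the heavy lifting.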

Date: Wed, Sep 18, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: The aim of this tutorial is to provide the audience with an understanding of the advantages of the Lattice Boltzmann Method (LBM) and how its variations can be effectively implemented on quantum computers. Concretely, the programme will first introduce participants to the theoretical fundamentals of the (classical) LBM, its applications to fluid flow problems, and its implementation in terms of quantum circuits, before offering the audience first-hand experience in developing, simulating, and benchmarking Quantum-LBM algorithms using our in-house QLBM Python package.
Abstract: Fluid flow problems remain hard to solve with conventional computers when the problem size becomes excessively large, as is for instance the case in the direct numerical simulation (DNS) of turbulent flows, which calls for ultra-high resolution. As the computational space of quantum computers scales exponentially with the number of qubits, they make for a promising hardware platform to simulate flow problems on exponentially large grids. In this tutorial we introduce attendees to the Quantum Lattice Boltzmann Method and to our in-house QLBM package. The Lattice Boltzmann Method (LBM) is a computational method designed to solve flow problems by splitting the governing equations into a non-local and a non-linear part, so that the so-called transport and collision steps can be performed separately. Thanks to its highly parallelizable nature and ability to simulate complex geometries, the LBM has gained significant attention over the past years. The Quantum Lattice Boltzmann Method (QLBM) is designed to exploit this parallelizability by translating it into superposed quantum states. In this tutorial we will show how the LBM can be implemented on a quantum computer and give insight into the main open challenges of the field. This tutorial is aimed at PhD students, postdocs, early career researchers, other interested academics, and practitioners, and will give hands-on experience of using a quantum computer (simulator) to solve fluid flow problems.
Keywords: Lattice Boltzmann Method, Computational Fluid Dynamics, Quantum Computing, Applied Quantum Algorithm
Contents Level: The contents of this tutorial is split as follows: 40% beginner, 40% intermediate, 20% advanced. Basic prior knowledge of quantum circuits and basic programming experience in Python will be helpful. No prior experience with LBM or computational fluid dynamics (CFD) is required.
Target Audience: This tutorial is aimed at students, researchers, and practitioners with a basic understanding of quantum computing and an interest into its application to real-world fluid flow problems. Attendees with prior experience in classical LBM and CFD problems will also benefit from novel insight into the development of quantum computing counterparts, though no prior expertise in fluid dynamics is required. 
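The stream/collide split described in the abstract can be seen in the smallest possible classical example: a D1Q2 lattice (two velocities, +1 and −1) with BGK collision on a periodic 1D grid. This is a classical sketch of the structure the quantum version encodes in superposition; it does not use the QLBM package, and all parameter values are illustrative.

```python
# Minimal classical D1Q2 lattice Boltzmann model: streaming shifts the two
# populations along their velocities, BGK collision relaxes them toward
# the local equilibrium (rho/2, rho/2). Both steps conserve total mass.

N = 16                      # lattice sites (periodic boundary)
TAU = 1.0                   # BGK relaxation time
f_plus = [0.0] * N          # population moving right
f_minus = [0.0] * N         # population moving left
f_plus[N // 2] = 0.5        # initial density bump in the middle
f_minus[N // 2] = 0.5

def step(fp, fm):
    # Streaming (the non-local part): shift along velocities, periodically.
    fp = [fp[(i - 1) % N] for i in range(N)]
    fm = [fm[(i + 1) % N] for i in range(N)]
    # Collision (the local, non-linear part): relax toward equilibrium.
    out_p, out_m = [], []
    for p, m in zip(fp, fm):
        rho = p + m
        out_p.append(p + (rho / 2 - p) / TAU)
        out_m.append(m + (rho / 2 - m) / TAU)
    return out_p, out_m

for _ in range(20):
    f_plus, f_minus = step(f_plus, f_minus)

total_mass = sum(f_plus) + sum(f_minus)   # conserved by both steps
```

The quantum formulation implements the streaming step as a permutation (shift) operator acting on a superposition over all lattice sites at once, which is where the exponential grid-size advantage is sought.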

Date: Wed, Sep 18, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, attendees will learn how to use the open source Quantum Instrumentation Control Kit (QICK) system to control qubits. Attendees will gain understanding of the QICK hardware/firmware/software stack before participating in hands-on demonstrations of qubit control with QICK boards. 
Abstract: The QICK is an open-source qubit controller that is used worldwide by academia and industry. In this tutorial, we hope to engage those interested in quantum controls at any level of the stack, and teach them the QICK hardware, firmware, and software, all of which are publicly available on GitHub. This includes learning about the second-generation QICK timed processor, which adds to QICK's flexibility and flattens the learning curve. Much of the tutorial will consist of live demonstrations, giving attendees firsthand understanding of how QICK works.
Keywords: Quantum computing, quantum sensing, real-time control, field-programmable gate arrays (FPGAs), digital signal processing, superconducting qubit
Contents Level: 20% beginner, 40% intermediate, 40% advanced.
Target Audience: The tutorial targets a broad audience that ranges from researchers to students with some engineering, math, computer science or physics background that are willing to get started using the QICK for quantum experimentation. Attendees will learn how the QICK system works, and be given the tools to be able to describe for themselves single- or multi-qubit experiments using this system. 
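A flavor of the digital signal processing such a controller performs: synthesizing a Gaussian envelope and mixing it onto an intermediate frequency to produce the I/Q waveforms sent to the DACs. The parameter values below are illustrative, not QICK defaults, and this is a conceptual sketch rather than the QICK software API.

```python
import math

# Sketch of drive-pulse synthesis: a Gaussian envelope modulated at an
# intermediate frequency into in-phase (I) and quadrature (Q) components.

F_IF = 100e6            # intermediate frequency, Hz (illustrative)
FS = 1e9                # DAC sample rate, Hz (illustrative)
SIGMA = 10e-9           # Gaussian width, s
LENGTH = 80e-9          # total pulse length, s

n_samples = round(LENGTH * FS)
t0 = LENGTH / 2                       # center the envelope in the pulse
i_wave, q_wave = [], []
for k in range(n_samples):
    t = k / FS
    env = math.exp(-((t - t0) ** 2) / (2 * SIGMA ** 2))
    i_wave.append(env * math.cos(2 * math.pi * F_IF * t))
    q_wave.append(env * math.sin(2 * math.pi * F_IF * t))
# The waveform peaks at the pulse center and stays within [-1, 1].
```

On the FPGA, the equivalent computation is pipelined in fixed-point hardware and triggered by the timed processor, which is the part of the stack the tutorial dives into.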

Date: Wed, Sep 18, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: The attendees will learn: the basic concepts of compilation through interactive exercises in Black Opal and how to apply error suppression techniques to their quantum applications using Fire Opal for deployment on real quantum hardware. The software packages Black Opal and Fire Opal, both from Q-CTRL, will be made available to the attendees of the tutorials.
Abstract: Quantum technologies promise to change our world, boosting our ability to solve optimization and search problems through quantum computing. Errors induced by environmental noise and control procedures limit the community's ability to reliably perform quantum computations at scale and solve meaningful problems using quantum computers. Q-CTRL's error suppression pipeline enables quantum algorithm and application developers to get the most performance out of the quantum hardware without requiring any low-level device knowledge. Through this tutorial you will learn about the building blocks of deterministic error suppression and how it manipulates an input circuit. You will then learn how to apply these control techniques to real problems through coding examples. The concepts that form the foundation of these controls will be taught through interactive visualizations and tasks delivered through the Black Opal learning platform. After building the foundations, we will work through coding challenges to implement these control techniques using the Python package Fire Opal, a zero-config quantum control solution. Demonstrations of these control techniques completed by Q-CTRL will be presented, including boosting the performance of superconducting quantum computers by 9000x and successfully executing utility-era hybrid quantum algorithms on 127-qubit devices. After completing this tutorial you will be able to identify the appropriate algorithm formulation for getting the best performance from error suppression, rapidly find quantum application solutions using Fire Opal, and deploy these controls in real experiments with automated error-suppression techniques.
Keywords: Quantum computing, error suppression, quantum algorithms, quantum applications
Contents Level: 30% beginner, 50% intermediate, 20% advanced. 
Target Audience: The target audience ranges from PhD candidates and postdoctoral fellows who are exploring novel ways to utilize quantum technology for solving real-life problems to professional engineers and algorithm developers looking to improve the performance of their algorithms on real quantum devices. No prior knowledge of quantum control or quantum hardware is required. An understanding of the basics of quantum computation is needed for the introductory materials. The later quantum control exercises are completed in the programming language Python and require a basic understanding of Python. Both Black Opal and Fire Opal have free options available that will ensure the attendees can continue to complete content from the tutorial after the conference.
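The textbook example of deterministic error suppression is the spin echo: a qubit in superposition accumulates an unknown detuning phase, and an X pulse inserted halfway refocuses it. The sketch below demonstrates the idea with explicit state vectors; it illustrates the dynamical-decoupling concept behind such controls, not Fire Opal's actual pipeline, and all values are made up.

```python
import cmath

# Spin-echo demonstration: free evolution under an unknown detuning adds a
# relative phase to |1>; an X pulse between two identical free-evolution
# intervals makes the second interval cancel the first.

def free_evolution(state, phase):
    """Detuning rotates |1> by exp(i*phase) relative to |0>."""
    a, b = state
    return (a, b * cmath.exp(1j * phase))

def x_pulse(state):
    """Ideal X gate: swap the |0> and |1> amplitudes."""
    a, b = state
    return (b, a)

detuning = 0.37                          # unknown phase error per interval
plus = (1 / 2 ** 0.5, 1 / 2 ** 0.5)      # the |+> state

# Without echo: the phase error simply accumulates over both intervals.
no_echo = free_evolution(free_evolution(plus, detuning), detuning)

# With echo: X halfway, then X again at the end; errors refocus.
echoed = x_pulse(free_evolution(x_pulse(free_evolution(plus, detuning)),
                                detuning))

def fidelity(u, v):
    """|<u|v>|^2 for two single-qubit state vectors."""
    return abs(u[0].conjugate() * v[0] + u[1].conjugate() * v[1]) ** 2

fidelity_no_echo = fidelity(plus, no_echo)   # degraded by the detuning
fidelity_echo = fidelity(plus, echoed)       # recovered to 1
```

Real error-suppression pipelines generalize this refocusing idea to multi-pulse sequences and circuit-level transformations chosen automatically per device.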

Date: Wed, Sep 18, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This tutorial highlights the lesser-known field of simulation-based optimization problems, which is expected to allow for exponential quantum speedups of highly relevant industrial use cases. We show how algorithms from quantum optimization and quantum simulation of classical physical systems can be combined to implement solutions for many industrially relevant problems, such as drive train optimization in automobiles.
Abstract: Many industrial optimization problems involve a simulation subroutine that computes the objective value of a potential solution of said optimization problem. A concrete area of application are mechatronic systems and in general dynamical systems that can be discretized or linearized into a linear system. Examples range from optimizing the efficiency of a drive train architecture (i.e., optimal combination of engines, transmissions, batteries) or electrical and heat transmission systems to the optimal distribution of material in a generative design problem. Based on recent advancements in the domain of quantum simulation of classical physical systems, the simulation component of such problems can be solved without expert knowledge of the practitioner while providing up to exponential speedups. In this tutorial, we show how this can be combined with existing quantum optimization routines using basic quantum subroutines such as amplitude estimation. Being structured in a theoretical and a hands-on part, the tutorial first conveys intuition and a technical understanding of quantum algorithms for simulation-based optimization and then shows how these can be combined to solve concrete industrially relevant problems. We target a broad audience of academia and industry interested in the quantum applications in optimization and the simulation of classical physical systems with basic knowledge of quantum gate computing. The central takeaway from this tutorial is the ability to implement solutions to simulation-based optimization problems using the provided toolkit of quantum algorithms.
Keywords: Quantum Optimization, Quantum Simulation, Simulation-based Optimization, QAOA, QSVT
Contents Level: 30% beginner, 50% intermediate, 20% advanced.
Target Audience: The self-contained tutorial is intended to serve a broad audience. We address everyone who is interested in quantum optimization and quantum simulation of classical physical systems and those who would like to experience a well-prepared introduction to this topic. This includes researchers and students from the fields of electrical and computer engineering, computer science, applied physics, and other domains, with a basic knowledge of quantum computing. We would also like to bring interested practitioners from industry closer to this emerging field of research. A readily usable toolbox of quantum algorithms needed for simulation-based optimization may help increase the potential to exploit a quantum advantage for these applications in the companies concerned.
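The defining structure of simulation-based optimization, a simulation subroutine computing the objective value of each candidate design, can be shown with a deliberately tiny classical example. Everything below is illustrative (a one-state discretized linear system and made-up candidate values); the tutorial's quantum approach replaces both the simulation and the search with quantum subroutines.

```python
# Simulation-based optimization in miniature: each candidate design is
# scored by simulating a discretized linear system x_{k+1} = a * x_k and
# accumulating a loss, then the best-scoring design is selected.

def simulate(damping, steps=50):
    """Simulation subroutine: returns the objective (accumulated loss)."""
    x, loss = 1.0, 0.0
    for _ in range(steps):
        nxt = damping * x
        loss += (x - nxt) ** 2   # energy lost in this time step
        x = nxt
    return loss

candidates = [0.5, 0.7, 0.9, 0.99]     # illustrative design parameters
best = min(candidates, key=simulate)   # outer optimization loop
```

In the quantum setting, the inner `simulate` call becomes a quantum simulation of the linearized system and the outer loop is replaced by routines such as amplitude estimation over superposed candidates, which is where the potential speedup enters.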

Thursday, Sep 19, 2024 — Tutorials Abstracts


Date: Thu, Sep 19, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Assuming no prior background, ZX-calculus will be presented with audience participation and prizes. We then show how the new quantum formalism gives rise to Interpretable Quantum NLP.
Abstract: The ZX-calculus has been around since 2007, but it has only become widely used in the past couple of years, especially with the emergence of the quantum industry and the new scientific and technological challenges this poses. Areas where it has become prominent include compilation, circuit optimisation, error-correction, quantum natural language processing, and photonic quantum computing. Another reason for this recent rise to prominence is the availability of the book Picturing Quantum Processes (2017), which presents the ZX-calculus without any reference to category theory, previously an obstacle for many. More recently, the book Quantum in Pictures (2023) has appeared, which has no mathematical prerequisites whatsoever. This book also establishes the ZX-calculus as an educational tool that can contribute importantly to making quantum more inclusive, while at the same time providing an entirely new perspective on it. The tutorial is presented in full by Bob Coecke: creator of the ZX-calculus, author of both of the above textbooks, and Chief Scientist of Quantinuum. It offers history, perspectives, and fun with mathematically rigorous diagrams throughout a two-hour introduction to the ZX-calculus. The tutorial then leads into a one-hour beginner-friendly presentation on cutting-edge developments in quantum natural language processing (QNLP). After this tutorial, you will have learned how to wield a new tool and approach for understanding quantum computing.
Keywords: ZX-calculus, quantum picturalism, quantum natural language processing
Contents Level: Part A of the tutorial is beginner-level and designed to introduce diagrammatic methods for gate-based quantum computing from the ground up. Part B of the tutorial is at the intermediate level, covering topics across quantum natural language processing research and practice in industry.
Target Audience: The tutorial presumes familiarity with qubit states, quantum gates, and the quantum circuit model. Knowledge of linear algebra (Hilbert spaces, Dirac/bra-ket notation, etc.) is not a prerequisite. Mathematical topics used throughout the tutorial include angle addition and simple binary arithmetic. The tutorial introduces a new language for gate-based quantum computing: its target audience consists of all practitioners and researchers, at any level, who want to learn and apply the ZX-calculus or quantum natural language processing. 
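The "angle addition" arithmetic mentioned above can be previewed with a one-line sketch of the ZX spider-fusion rule (a hypothetical mini-illustration, not any particular ZX library): when two spiders of the same colour are connected, they fuse into one spider whose phase is the sum of the phases modulo 2*pi.

```python
import math

def fuse(phase_a, phase_b):
    """Fuse two same-coloured ZX spiders: phases add modulo 2*pi."""
    return (phase_a + phase_b) % (2 * math.pi)

# Two Z-spiders with phase pi/2 fuse into a single pi-phase spider,
# i.e. two S gates compose to a Z gate.
print(fuse(math.pi / 2, math.pi / 2) == math.pi)  # True
```

This single rewrite rule, applied diagrammatically, already accounts for a surprising amount of circuit simplification in the calculus.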

Date: Thu, Sep 19, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: This three-hour tutorial will allow attendees to gain insights into quantum machine learning, focusing on practical applications. Participants will first learn relevant fundamentals of quantum machine learning theory, and how to apply these concepts in a data science setting. This will include being guided through three published papers on quantum machine learning covering popular classes of problems, and being shown how to adapt these methods using the opensource Qiskit Machine Learning library.
Abstract: The open-source Qiskit Machine Learning library has undergone changes and restructuring following the Qiskit 1.0 release. These upgrades include powerful new features, tutorials, and a focus on user experience. In this tutorial, we aim to guide participants in using Qiskit Machine Learning to build practical applications. Our approach is as follows: starting from three well-established quantum machine learning papers, we demonstrate how the Qiskit Machine Learning framework can (i) reproduce these experiments utilizing the full range of features and (ii) enable the user to build custom quantum networks tailored to their task. This will cover quantum feature maps, ansatz architecture, circuit framework, quantum cost functions, and couplings to classical algorithms in the form of both classifiers and regressors. After attending this tutorial, we hope participants will feel confident in applying quantum machine learning principles to their workflows using the Qiskit Machine Learning library. This tutorial is aimed at beginner to intermediate quantum machine learning practitioners, with basic knowledge of quantum computation, classical or quantum machine learning and data science. We require participants to use their own computing platform that supports Python 3.x and the latest Qiskit Machine Learning; quantum simulator backends will be the default method to test quantum machine learning models.
Keywords: Quantum machine learning, Qiskit machine learning, quantum neural networks, quantum circuit, application of quantum computation, data science
Contents Level: 40% beginner, 60% intermediate.
Target Audience: The tutorial content is mainly targeted at novices in the Quantum Machine Learning (QML) field who have experience in mathematics, classical machine learning, data science, quantum mechanics, or quantum circuits. The tutorial will be most beneficial to those with some related expertise and an interest in learning more about QML practices. QML concepts will be covered but basic knowledge of quantum computing and machine learning is a prerequisite to obtain the most value from this tutorial. Required quantum computing knowledge is an understanding of qubits, the Bloch sphere, superposition and entanglement of states, and being able to recognize a quantum logic gate and quantum circuit. Required machine learning knowledge is a rudimentary understanding of what a neural network is and how they are optimised (gradient descent). These prerequisites will be briefly covered in the tutorial. The tutorial will also be helpful to those with QML knowledge but interested in migrating to the Qiskit Software Development Kit (SDK) and Qiskit Machine Learning, as this tutorial will focus on Qiskit Machine Learning features, class structures and customisation options.
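The variational-classifier pattern the tutorial builds on can be sketched classically in a few lines (this is NOT the Qiskit Machine Learning API; all names are illustrative): a data point x is encoded as an RY(x) rotation, a trainable RY(theta) follows, and the model output is the Z expectation value, which for same-axis rotations is simply cos(x + theta).

```python
import math

def model(x, theta):
    """<Z> after RY(x) then RY(theta) applied to |0>: cos(x + theta)."""
    angle = x + theta  # RY rotations about the same axis compose by addition
    return math.cos(angle)

def loss(theta, data):
    """Mean squared error against +/-1 labels."""
    return sum((model(x, theta) - y) ** 2 for x, y in data) / len(data)

# Tiny training set and a crude parameter scan standing in for an optimizer.
data = [(0.1, 1.0), (3.0, -1.0)]
thetas = [i * 0.01 for i in range(-314, 315)]
best_theta = min(thetas, key=lambda t: loss(t, data))
```

In the real library the single rotation becomes a parameterized circuit (feature map plus ansatz) and the parameter scan becomes a gradient-based classical optimizer, but the encode/measure/optimize loop is the same.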

Date: Thu, Sep 19, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Part 2: TUT34 on Fri, Sep 20 at 10:30-14:30 EDT
Summary: Through hands-on exercises, demonstrations, and explanations, attendees will gain experience with CUDA-Q and distributed quantum computing for improved scaling and performance. This is demonstrated through strategic modifications of the standard Quantum Approximate Optimization Algorithm (QAOA) for the max cut problem, including circuit cutting.
Abstract: As quantum computing evolves, the integration of quantum and classical systems will become a cornerstone of high-performance computing strategies, driving advancements in various scientific and commercial fields. CUDA-Q is a programming model integrating heterogeneous QPU, CPU, GPU, and emulated quantum systems in both Python and C++. CUDA-Q streamlines hybrid quantum-classical workflows and provides a platform to experiment with distributed multiple-QPU workflows.

During this two-part tutorial, we introduce attendees to CUDA-Q distributed computing through an implementation of the Quantum Approximate Optimization Algorithm (QAOA) to the max cut problem, focusing on parallel simulation on GPUs for improved performance and scalability. This sequence of tutorials is targeted to a broad audience; those already familiar with QAOA may choose to only attend Part 2.

Part 1 sets the stage by demonstrating QAOA on a small graph instance and by introducing the divide-and-conquer approach. Part 2 targets attendees who have completed Part 1 or who are already familiar with QAOA. In part 2, we examine a recursive divide-and-conquer approach to max cut that uses two separate applications of QAOA at two different steps of the algorithm. Participants are encouraged to experiment with various adjustments to QAOA for improved performance and outcomes. Moreover, Part 2 equips the audience with the ability to execute quantum kernels in parallel on GPUs using CUDA-Q.

While this tutorial focuses on QAOA, the techniques learned can be adapted to a variety of hybrid quantum algorithms. Attendees will take away skills, Jupyter notebooks, and Python code to apply to their own projects.
Keywords: Quantum computing, hands-on programming, distributed computing, QAOA 
Contents Level: 30% beginner, 60% intermediate, 10% advanced.
Target Audience: For Part 1, learners should have a basic understanding of quantum computing (circuits, Hamiltonians, sampling, observations) and be familiar with a few quantum algorithms. Part 2 is appropriate for those that have completed Part 1 or who are already familiar with variational algorithms. We expect learners to have some experience with Python and Jupyter notebooks. 
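The max cut objective that QAOA's cost Hamiltonian encodes is easy to state classically. As a small reference sketch (independent of CUDA-Q; names are illustrative), the following brute-forces the optimal cut of a 4-vertex graph, giving the baseline a QAOA run would be compared against.

```python
from itertools import product

# A cut is a 0/1 assignment of vertices; its value is the number of edges
# crossing the cut. Brute force is feasible only for tiny graphs, which is
# exactly why variational approaches like QAOA are of interest at scale.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # square with one diagonal

def cut_value(assignment):
    """Number of edges whose endpoints fall on opposite sides of the cut."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

best = max(product([0, 1], repeat=4), key=cut_value)
print(cut_value(best))  # 4: the diagonal edge cannot also be cut
```

The divide-and-conquer variants covered in Part 2 apply this same objective to subgraphs and then merge the partial cuts.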

Date: Thu, Sep 19, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Participants will learn how to efficiently transform abstract circuits into circuits obeying an instruction set specified by a target quantum system using Qiskit. Transpilation services and benchmarking will also be presented.
Abstract: In this tutorial we detail the procedures involved with mapping abstract quantum circuits to those executable on a target quantum system. Focus will be on the Qiskit implementation of this transpilation process, how it works, what people can do with it, how to utilize different passes (including transpiler services), and some benchmarks comparing the performance of transpilation in Qiskit against other SDKs. We focus on some celebrated applications of quantum computing, such as simulating the Fermi-Hubbard model, and show how circuit transpilation can be optimized and customized in Qiskit using a variety of techniques. We also highlight some passes that use artificial intelligence for improved performance.
Keywords: Qiskit, Transpiler, Compiler
Contents Level: 30% beginner, 40% intermediate, 20% advanced.
Target Audience: Researchers and practitioners looking to leverage near-term quantum computers for utility-scale computations will particularly benefit from this tutorial.
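The core transpilation step of mapping abstract gates to a target instruction set can be sketched as a toy basis-translation pass (illustrative only, not the Qiskit API): gates outside the native set are rewritten via substitution rules until only native gates remain.

```python
# Substitution rules, valid up to global phase, for a hypothetical backend
# whose native set is {Rz(pi/2), SX, CZ}.
RULES = {
    "h": ["rz_half", "sx", "rz_half"],  # H = Rz(pi/2) SX Rz(pi/2)
    "x": ["sx", "sx"],                  # X = SX * SX
}
NATIVE = {"rz_half", "sx", "cz"}

def translate(circuit):
    """Recursively expand non-native gates until only native gates remain."""
    out = []
    for gate in circuit:
        if gate in NATIVE:
            out.append(gate)
        else:
            out.extend(translate(RULES[gate]))
    return out

print(translate(["h", "x", "cz"]))
# ['rz_half', 'sx', 'rz_half', 'sx', 'sx', 'cz']
```

A production transpiler adds qubit routing for the coupling map, gate cancellation, and cost-aware pass ordering on top of this basic rewriting idea.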

Date: Thu, Sep 19, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Abstract: One of the biggest challenges in building a fault-tolerant quantum computer is the number of additional physical qubits required to implement error correction. Cat qubits are a development focused on tackling this challenge by providing a “hardware-efficient” roadmap towards fault-tolerant, universal quantum computing. Cat qubits are superconducting bosonic qubits which encode quantum information in two coherent states of a microwave resonator or cavity. By carefully tailoring the dissipation of the resonator, it is possible to stabilize the two computational states of the cat qubit without affecting their superpositions. Cat qubits have demonstrated bit-flip times of up to 2 minutes [1]. This strong suppression of one type of quantum error allows for a resource-efficient road to fault-tolerant and universal gate-based quantum computing, in that it allows error correction to be performed with simpler schemes than a surface code. The hardware-efficiency of the cat qubit at the low levels of the architecture naturally results in a dramatic reduction of the final application-relevant size of the quantum processor: a recent study [3] quantifies how this approach might reduce the number of qubits required to run Shor’s algorithm down to 350,000, where it was previously thought that millions of qubits were needed. This tutorial will go over the physics of cat qubits, using classical analogies as well as exposing the quantum equations of the hardware. It will then show how they can be leveraged in a complete roadmap towards fault tolerance. It will also demonstrate tools to simulate cat qubit behavior. The tutorial will also be an opportunity to run some experiments on cat qubits.
Keywords: Fault Tolerant Quantum Computing, Hands on tutorial, Roadmap, Cat qubits, Bosonic qubits
Contents Level: 30% beginner, 60% intermediate, 10% advanced.
Target Audience: The tutorial is aimed at anyone wanting to understand the cat qubit technology. It will not go over the basics of quantum computing. Concepts such as the Bloch Sphere, Schrödinger equation etc. are prerequisite for attendance. Additionally, passing knowledge of error correction scheme is expected. Some familiarity with quantum optics and of quantum development languages such as Qiskit will help. 
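A back-of-the-envelope sketch (illustrative, not the tutorial's simulation tools) shows why cat qubits suppress bit flips: the two computational states |alpha> and |-alpha> are coherent states whose squared overlap |<-alpha|alpha>|^2 = exp(-4 |alpha|^2) shrinks exponentially with the mean photon number |alpha|^2, so tunneling between them becomes exponentially rare as the cat is made larger.

```python
import math

def overlap(alpha_sq):
    """Squared overlap of the two cat-qubit computational states,
    as a function of the mean photon number |alpha|^2."""
    return math.exp(-4 * alpha_sq)

for n_bar in [1, 2, 4, 8]:  # mean photon number |alpha|^2
    print(n_bar, overlap(n_bar))  # drops by e^{-4} per added photon
```

This exponential separation is the quantitative content behind the "hardware-efficient" claim: one error channel is suppressed at the physical level instead of being corrected by extra qubits.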

Friday, Sep 20, 2024 — Tutorials Abstracts


Date: Fri, Sep 20, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, attendees will learn about quantum error mitigation and get hands-on programming experience with some of the techniques available in Mitiq, an open-source quantum circuit-level Python toolkit.
Abstract: We demonstrate how Mitiq, an open-source Python package for quantum error mitigation (QEM), can be used to effectively reduce the impact of noise on near-term quantum computers when the error rate is below a certain threshold. Compared to the qubit overhead of quantum error correction, error mitigation techniques require a minimal qubit overhead by shifting the overheads to a mixture of quantum sampling on noisy quantum devices and classical post-processing techniques. Mitiq (over 100,000 downloads, 65 project contributors) is an extensible toolkit of different QEM methods and noise-tailoring techniques. It is interoperable with a majority of quantum software frameworks (frontends) and real or simulated noisy devices (backends). We provide a two-part tutorial through interactive Jupyter notebooks which illustrate the use of zero-noise extrapolation (ZNE) and Pauli twirling (PT). The first half of the tutorial will consist of an overview of the available functionality in Mitiq and establish the theoretical grounding of QEM, ZNE, and PT with the aid of code excerpts. We will also report on the interoperability of the Mitiq-compatible frontends and noisy backends. In the second half of the tutorial, we will apply error mitigation by combining ZNE and PT on a frontend and simulated noisy backends. We conclude with an overview of Mitiq’s calibration module designed to help users find an optimal quantum error mitigation strategy, including the choice of noise scaling methods, extrapolation techniques, and further technique-specific hyperparameters.
Keywords: Quantum error mitigation, quantum software, quantum utility, open source, Python
Contents Level: 10% beginners, 60% intermediate, 30% advanced. 
Target Audience: The tutorial is designed for practitioners and researchers with a knowledge of linear algebra, a basic understanding of classical neural networks for image classification, speech recognition, natural language processing, or other ML tasks.
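The zero-noise extrapolation idea can be previewed with a minimal self-contained sketch (Mitiq's actual API differs; the "measurements" here are synthetic): measure an expectation value at several artificially amplified noise levels, fit a model, and extrapolate to the zero-noise limit.

```python
def linear_zne(scale_factors, expectation_values):
    """Least-squares linear fit of value vs. noise scale, evaluated at 0."""
    n = len(scale_factors)
    mean_x = sum(scale_factors) / n
    mean_y = sum(expectation_values) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(scale_factors, expectation_values)) \
        / sum((x - mean_x) ** 2 for x in scale_factors)
    return mean_y - slope * mean_x  # intercept = value at zero noise

# Synthetic data: an ideal value of 1.0 degraded linearly with noise scale.
scales = [1.0, 2.0, 3.0]
values = [0.8, 0.6, 0.4]
print(round(linear_zne(scales, values), 6))  # recovers 1.0
```

Mitiq supports richer extrapolants (polynomial, exponential, Richardson) and automates the noise scaling itself, e.g. by unitary folding of the circuit.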

Date: Fri, Sep 20, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Attendees of this tutorial will gain a basic understanding of mixed-dimensional quantum computing, starting from the underlying differences with qubit-based quantum computers to recent developments in the field, encompassing technology platforms, software, and methods for compiling and simulating applications for these architectures. They will grasp the importance of this novel approach and acquire experience with programming techniques tailored for this type of quantum computing.
Abstract: Quantum computing holds the promise of transforming high-performance computing by potentially solving specific computational problems much faster than classical computers. Current quantum architectures primarily focus on two-level systems known as qubits. However, the underlying technology inherently supports more than two levels, referred to as qudits. Leveraging qudits not only broadens the range of gate sets but also enhances performance and enables more efficient circuit design, addressing problems beyond the capabilities of current qubit-based quantum computers and classical machines. While these advantages present exciting prospects for new applications, they also pose new challenges. This tutorial aims to provide users and developers with a comprehensive overview of this rapidly evolving field. We will delve into essential background information, addressing the unique challenges and opportunities posed by mixed-dimensional quantum computing. Specifically, we will explore how new quantum software must adapt for compiling and simulating quantum circuits and algorithms designed for these architectures. Throughout the tutorial, practical examples and hands-on demonstrations will be provided using MQT Qudits, an open-source quantum computing software development framework tailored for working with mixed-dimensional quantum circuits. This framework facilitates the simulation of quantum circuits on classical computers and supports quantum research and education.
Keywords: Quantum computing, qudits, compilation, simulation, quantum education
Contents Level: 60% beginners, 30% intermediate, 10% advanced. 
Target Audience: Our primary audience comprises developers and users of software for quantum computing. However, we also target end-users and domain experts, such as theoretical physicists specializing in condensed matter or particle physics who seek to implement quantum simulations, or users who want to explore new alternatives in quantum optimization. Additionally, physicists and experimentalists who need to operate their devices via this software stack are included, as well as quantum information specialists aiming to easily verify their methods. We strongly believe that increased exchange among these groups is essential and urgently needed. This tutorial presents a perfect opportunity for achieving this goal. A background in Python and some basic knowledge of quantum computing will be useful but not necessary for attendees of this tutorial.
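A tiny illustration of a gate beyond qubits (a hand-rolled sketch; frameworks such as MQT Qudits provide this natively): the generalized Pauli-X "shift" gate on a qutrit (d = 3) maps |k> to |k+1 mod 3>, so three applications return to the identity rather than two.

```python
def shift(state, d=3):
    """Apply the qudit shift gate X_d: amplitude of |k> moves to |k+1 mod d>.
    States are plain amplitude lists of length d."""
    return [state[(k - 1) % d] for k in range(d)]

ket0 = [1, 0, 0]                      # |0> on a qutrit
print(shift(ket0))                    # [0, 1, 0]  -> |1>
print(shift(shift(shift(ket0))))      # [1, 0, 0]  -> X_3^3 = identity
```

The order-d cyclic structure (versus order 2 for the qubit X gate) is one concrete way larger gate sets arise in mixed-dimensional architectures.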

Date: Fri, Sep 20, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Quantum tensor networks, along with their applications in classification, time-series modeling, and natural language processing, signify a burgeoning and interdisciplinary field within the realms of quantum computing and machine learning. This tutorial strives to empower researchers in the fields of machine learning and artificial intelligence by offering insights and tools to explore this cutting-edge domain, presenting accessible entry points and illustrative code samples.
Abstract: This tutorial proposes a comprehensive exploration of the integration of Tensor Networks (TNs) and Quantum Neural Networks (QNNs) within the realm of Quantum Machine Learning (QML). With quantum computing advancing rapidly, TNs have emerged as powerful tools for managing complex data structures like quantum states. The tutorial aims to bridge the gap between classical neural networks and quantum computing platforms by elucidating the principles of TNs, QNNs, and their combined applications. The tutorial begins with foundational knowledge, covering core principles of quantum information processing and multi-linear algebra, catering to attendees with varying levels of expertise. It then delves into the innovative fusion of TNs and QNNs, addressing challenges related to their trainability and showcasing their practical utility across diverse machine learning tasks. Attendees will gain insights into the potential applications of TN-QNNs in image classification, speech recognition, natural language processing, and reinforcement learning. Through real-world examples and demonstrations, the tutorial aims to inspire participants to explore and implement TN-QNNs in their projects, fostering innovation in the intersection of quantum computing and artificial intelligence. Overall, this tutorial provides a succinct yet thorough exploration of TN-QNN integration, empowering researchers and students to actively engage in the burgeoning field of Quantum Machine Learning.
Keywords: Quantum machine learning, tensor networks, variational quantum circuits, quantum neural networks
Contents Level: 10% beginners, 60% intermediate, 30% advanced. 
Target Audience: The tutorial is designed for practitioners and researchers with a knowledge of linear algebra, a basic understanding of classical neural networks for image classification, speech recognition, natural language processing, or other ML tasks.
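The tensor-network idea underlying TN-QNNs can be previewed in its simplest case (a hand-rolled sketch, not any TN library): a large vector of amplitudes is stored as a chain of small per-site tensors, and contracting the chain reproduces the full state. With bond dimension 1 the matrix product state is just an outer product of per-site vectors.

```python
import math

def contract_product_mps(sites):
    """Contract a bond-dimension-1 MPS (a list of per-site vectors)
    into the full amplitude vector via repeated outer products."""
    amplitudes = [1.0]
    for vec in sites:
        amplitudes = [a * v for a in amplitudes for v in vec]
    return amplitudes

plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # |+> on each site
state = contract_product_mps([plus, plus])   # |++> over two qubits
print([round(a, 3) for a in state])          # four equal amplitudes 0.5
```

Larger bond dimensions add internal indices between neighbouring tensors, which is what lets an MPS capture entanglement while keeping the per-tensor storage small.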

Date: Fri, Sep 20, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: The tutorial reviews error mitigation methods and presents a workflow for implementing them for utility-scale applications on IBM Quantum hardware through Qiskit Runtime. It also discusses how to configure optimal error mitigation settings using a scalable classical simulation approach.
Abstract: The utility-scale quantum computers with 100+ qubits, now available through IBM Quantum, have provided a cutting-edge platform for researchers and practitioners to demonstrate their quantum applications of interest. A suite of error mitigation methods has been developed for near-term quantum applications, but how to apply them to large-scale quantum computational tasks with optimal settings is yet to be understood and established. This tutorial will present state-of-the-art error mitigation methods and focus on applying those methods to use cases involving quantum circuits at utility scales, using the latest Qiskit Runtime capabilities. The tutorial addresses a central challenge in error mitigation, which is to decide on an optimal error mitigation setting to achieve the desired accuracy and precision for the computation results. For utility-scale circuits, it is generally hard to validate or predict the results of the quantum circuits using direct classical simulation methods due to the exponential overhead of representing the qubit state and the noise channels. On the other hand, testing the computational workflow on the actual hardware by trial and error is also expensive. The tutorial presents a workflow to classically simulate the effect of noise on a given quantum computational task under an error mitigation setting. This workflow is made scalable using a circuit Cliffordization procedure and standard assumptions about hardware noise. The workflow presented can be used to validate whether the results from quantum hardware execution are expected to meet the desired accuracy and precision, and it can be further used to optimize error mitigation settings.
Keywords: Quantum error mitigation, Qiskit, Qiskit Runtime, Quantum development kit, Quantum algorithms, Quantum-classical programming, Quantum applications
Contents Level: 20% beginners, 60% intermediate, 20% advanced. 
Target Audience: This tutorial targets quantum researchers and practitioners who are interested in learning and leveraging error mitigation within their utility-scale quantum programs in order to obtain the best results from today’s quantum hardware. This tutorial is appropriate for a variety of audiences, including quantum researchers and practitioners who are looking to optimize the accuracy and reliability of their quantum computing experiments, students and educators who are interested in gaining a deeper understanding of error mitigation, and software engineers who want to learn about specific software implementation for error mitigation in quantum computing. A background in Python and some basic knowledge of quantum computing will be helpful but not necessary for attendees of this session.
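A rough sketch of the validation idea described above (all numbers are illustrative and the noise model is deliberately crude): under a uniform depolarizing-noise assumption, an ideal expectation value E is damped to roughly (1 - p)^n_gates * E. Comparing the predicted surviving signal against the target precision indicates, before spending hardware time, whether a mitigation setting can plausibly succeed.

```python
def predicted_signal(e_ideal, p_gate, n_gates):
    """Expected noisy expectation value under uniform depolarizing noise
    of strength p_gate applied once per gate."""
    return e_ideal * (1 - p_gate) ** n_gates

def feasible(e_ideal, p_gate, n_gates, required_precision):
    """Is the surviving signal still larger than the precision target?"""
    return abs(predicted_signal(e_ideal, p_gate, n_gates)) > required_precision

print(feasible(1.0, 0.001, 500, 0.1))   # True: roughly 0.61 of the signal survives
print(feasible(1.0, 0.001, 5000, 0.1))  # False: the signal decays below the target
```

The Cliffordization procedure in the tutorial serves the same purpose with far more fidelity: it produces a classically simulable proxy circuit whose noisy behaviour tracks the real one.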

Date: Fri, Sep 20, 2024
Time: 10:00-14:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: Through hands-on exercises, demonstrations, and explanations, attendees will gain experience with CUDA-Q and distributed quantum computing for improved scaling and performance. This is demonstrated through strategic modifications of the standard Quantum Approximate Optimization Algorithm (QAOA) for the max cut problem, including circuit cutting.
Abstract: As quantum computing evolves, the integration of quantum and classical systems will become a cornerstone of high-performance computing strategies, driving advancements in various scientific and commercial fields. CUDA-Q is a programming model integrating heterogeneous QPU, CPU, GPU, and emulated quantum systems in both Python and C++. CUDA-Q streamlines hybrid quantum-classical workflows and provides a platform to experiment with distributed multiple-QPU workflows.

During this two-part tutorial, we introduce attendees to CUDA-Q distributed computing through an implementation of the Quantum Approximate Optimization Algorithm (QAOA) to the max cut problem, focusing on parallel simulation on GPUs for improved performance and scalability. This sequence of tutorials is targeted to a broad audience; those already familiar with QAOA may choose to only attend Part 2.

Part 1 sets the stage by demonstrating QAOA on a small graph instance and by introducing the divide-and-conquer approach. Part 2 targets attendees who have completed Part 1 or who are already familiar with QAOA. In part 2, we examine a recursive divide-and-conquer approach to max cut that uses two separate applications of QAOA at two different steps of the algorithm. Participants are encouraged to experiment with various adjustments to QAOA for improved performance and outcomes. Moreover, Part 2 equips the audience with the ability to execute quantum kernels in parallel on GPUs using CUDA-Q.

While this tutorial focuses on QAOA, the techniques learned can be adapted to a variety of hybrid quantum algorithms. Attendees will take away skills, Jupyter notebooks, and Python code to apply to their own projects.
Keywords: Quantum computing, hands-on programming, distributed computing, QAOA 
Contents Level: 30% beginner, 60% intermediate, 10% advanced.
Target Audience: For Part 1, learners should have a basic understanding of quantum computing (circuits, Hamiltonians, sampling, observations) and be familiar with a few quantum algorithms. Part 2 is appropriate for those that have completed Part 1 or who are already familiar with variational algorithms. We expect learners to have some experience with Python and Jupyter notebooks. 

Date: Fri, Sep 20, 2024
Time: 13:00-16:30 Eastern Time (EDT) — UTC-4
Duration: 3 hours (2 x 1.5 hours)
Summary: In this tutorial, attendees will learn about the ARTIQ realtime control system, widely used in trapped ion and cold atom experiments. In addition, they will learn about the additional features provided by the Duke ARTIQ Extensions (DAX) packages to more efficiently develop experiment control software and interface their systems with high-level representations. All presentations will be followed by a hands-on demo with Sinara hardware.
Abstract: ARTIQ (Advanced Real-Time Infrastructure for Quantum physics) is an open-source Python-based software suite for controlling quantum experiments, particularly those involving trapped ions and neutral atoms. ARTIQ provides tools for scheduling, sequencing, and executing complex experiments with high precision and timing accuracy, making it suitable for various quantum computing and quantum simulation applications. It aims to provide researchers with a flexible and scalable platform for implementing and running experiments for quantum information processing. Sinara is an open-source hardware platform that accompanies ARTIQ. Sinara provides a modular architecture for building control systems, including modules housing DACs, ADCs, DDSs, and other RF control hardware. All hardware designs in Sinara are open-source and modules are readily available from commercial vendors. DAX (Duke Artiq eXtensions) is an open-source framework that extends the functionality of ARTIQ to provide a modular control software stack that is portable across systems. It also implements features including dataset access, plotting, scheduling, system simulation, circuit-level representation, and a high-level language interface. By the end of this tutorial, which involves hands-on interaction with live ARTIQ systems, attendees will gain a basic understanding of Sinara, ARTIQ, and DAX. They’ll learn how these technologies work together to build experiment control systems with real-time control at nanosecond resolution. Users will also see the ARTIQ control system running remote experiments live on a real trapped-ion Quantum Computing testbed. Additionally, the workshop will explore best practices for structuring control code using DAX for optimal scalability, portability, and maintainability.
Keywords: ion trap, ARTIQ, Sinara, DAX, FPGA, quantum instrumentation, DAC, DDS
Contents Level: 70% beginners, 30% intermediate. 
Target Audience: The target audience for this tutorial is practitioners and researchers working with experimental Quantum Computing setups who are interested in exploring ARTIQ for their experiment control systems. Attendees are expected to be familiar with typical devices common in Quantum Information laboratories, such as oscilloscopes and DDS. Attendees should bring their own laptops to fully participate in the hands-on demos; a limited supply of laptops will be available for participants. Basic skills in the Python programming language are also needed for this tutorial.