University of Waterloo
200 University Avenue West
Waterloo, Ontario, Canada N2L 3G1
Phone: (519) 888-4567 ext 32215
Fax: (519) 746-8115
The Canadian Undergraduate Physics Conference (CUPC, https://cupc.ca/) is an annual conference for undergraduate physics students to share their research experiences with their peers, and to learn about the array of research opportunities awaiting them.
A delegation of our own undergraduates presented talks at the conference in Hamilton, October 17-20.
Supervisor: Dr. Adam Garnsworthy (TRIUMF)
Gamma-Ray Infrastructure For Fundamental Investigations of Nuclei (GRIFFIN) is a new high-efficiency gamma-ray spectrometer being developed to replace the 8π spectrometer at TRIUMF's Isotope Separator and Accelerator (ISAC) facility for decay spectroscopy. GRIFFIN will consist of an array of up to 16 unsegmented hyper-pure germanium (HPGe) clover detectors, read out by a state-of-the-art, custom-designed digital data acquisition (DAQ) system. The completed GRIFFIN project will support research in nuclear structure, fundamental symmetries, and nuclear astrophysics. This presentation covers the optimisation of the firmware algorithms to be implemented on the GRIFFIN electronics modules to extract energy and time values from the preamplifier signals, with the goal of achieving energy resolutions equivalent, or superior, to those achieved by traditional analogue systems. The system will also have an improved ability to detect low-energy signals and to handle high count rates, with each HPGe crystal operating in excess of 50 kHz. Offline analysis of real waveforms collected with TIGRESS TIG10 modules shows that, by optimising the algorithm parameters, a digital system can achieve energy resolutions comparable to those of analogue shaping methods. Through the use of additional filtering techniques, the signal-to-noise ratio of the waveforms was improved significantly, such that pulses equivalent to gamma-ray energies as low as 5 keV can be detected with good efficiency. The next steps are to implement these algorithms in firmware and to perform online testing to obtain the same quality of results in real-time signal processing.
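The abstract does not name the specific firmware algorithms; one standard digital energy-extraction method for HPGe preamplifier signals is the trapezoidal (moving-window deconvolution) filter in the recursive form of Jordanov and Knoll. A minimal offline sketch on a noiseless synthetic pulse (all parameter values are illustrative, not GRIFFIN settings):

```python
import numpy as np

def shifted(x, k):
    """x delayed by k samples, zero-padded at the start."""
    y = np.zeros_like(x)
    y[k:] = x[:len(x) - k]
    return y

def trapezoidal_filter(v, rise, flat, tau):
    """Trapezoidal shaping of a digitised preamplifier trace v with
    exponential decay constant tau (in samples). The returned waveform's
    flat-top height equals the input pulse amplitude."""
    k, l = rise, rise + flat
    d = v - shifted(v, k) - shifted(v, l) + shifted(v, k + l)
    M = 1.0 / (np.exp(1.0 / tau) - 1.0)   # pole-zero (decay) correction
    p = np.cumsum(d)                      # first accumulator
    s = np.cumsum(p + M * d)              # second accumulator -> trapezoid
    return s / (k * (M + 1.0))            # normalise flat top to amplitude

# Synthetic preamplifier pulse: amplitude-100 step at sample 1000,
# decaying with tau = 5000 samples (no noise added).
tau = 5000.0
t = np.arange(20000)
pulse = np.where(t >= 1000, 100.0 * np.exp(-(t - 1000) / tau), 0.0)

shaped = trapezoidal_filter(pulse, rise=500, flat=200, tau=tau)
print(round(shaped.max(), 2))   # 100.0 -- flat-top height recovers the amplitude
```

In practice the energy would be sampled on the flat top after pile-up and baseline checks; the rise and flat-top lengths trade off resolution against rate capability.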
Supervisor: Dr. Kevin Resch
Bell’s inequalities are important to our understanding of quantum foundations and critical to several quantum technologies. A recent work [E. Wolfe and S. F. Yelin, Phys. Rev. A 86, 012123 (2012)] derived three parametrized families of two-particle, two-setting Bell inequalities. These inequalities are important because, by exploiting marginal (single-particle) measurements, they theoretically explore a larger volume of quantum correlations beyond local hidden-variable models than previous results [A. Cabello, Phys. Rev. A 72, 012113 (2005)]. In this work we subject those predictions to experimental test, using non-maximally entangled photon pairs to optimize the expected violation. We find excellent agreement with the upper bounds predicted by quantum mechanics, with violations of the limits imposed by local hidden-variable models as large as nearly 30σ for the optimal parameters, and a significant violation over a wide range of parameters.
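The Wolfe-Yelin inequalities are parametrized families beyond the scope of a short example, but the way a violation and its significance are quantified can be illustrated with the familiar CHSH inequality, using the quantum singlet-state correlation E(a,b) = -cos(a-b) and a hypothetical per-correlator uncertainty (both assumptions of this sketch, not values from the experiment):

```python
import numpy as np

def E(a, b):
    """Quantum correlation for the singlet state at analyser angles a, b."""
    return -np.cos(a - b)

# CHSH combination: |S| <= 2 for any local hidden-variable model.
a, ap = 0.0, np.pi / 2
b, bp = np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(round(abs(S), 3))   # 2.828 -- the Tsirelson bound 2*sqrt(2)

# Significance of the violation for a hypothetical measured uncertainty
# of 0.01 on each of the four correlators (uncorrelated errors assumed):
sigma_S = np.sqrt(4) * 0.01
n_sigma = (abs(S) - 2) / sigma_S
print(round(n_sigma, 1))  # 41.4 sigma for this illustrative uncertainty
```

The experiment reports its violations in exactly these units: the distance of the measured Bell parameter above the local-hidden-variable bound, divided by the measured statistical uncertainty.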
Supervisor: Dr. Adrian Lupascu
Graphene is a wonder material that has attracted much attention since its discovery in 2004. Among its many remarkable properties, graphene has been shown to exhibit the Josephson effect, which is central to the construction of superconducting qubits, the basis of quantum computation. We obtained high-quality graphene flakes through the standard mechanical exfoliation method and characterized them with Raman spectroscopy. Since suspended graphene is expected to have superior transport properties, we designed a simple procedure to fabricate suspended graphene junctions. Samples of suspended-graphene Josephson junctions have been successfully made and preliminary measurements taken; the field effect has been demonstrated at liquid-helium temperature. The next steps are to demonstrate a supercurrent in such junctions and to further explore their potential applications in various quantum devices.
Supervisor: Dr. Joseph Sanderson
Coulomb explosion imaging is used to produce images of simple molecules as they undergo ultrafast changes, yielding "molecular movies" with frame rates of one frame per femtosecond (10^-15 seconds). The pulse length acts like the shutter speed of a camera, allowing us to take a snapshot of a molecule in motion, and the intense laser radiation gives us a means to make the image: the laser light develops a momentary electric field stronger than the one that binds the electrons to the atoms. This can remove up to six electrons from a typical triatomic molecule and causes the molecule to explode, because there are not enough electrons left to bind the positively charged ions together. We call this process a Coulomb explosion. To use this explosion as a way of imaging the molecule, we need to detect all of the fragment ions created by the explosion and measure their momenta; we can then run a simulation of the explosion to determine the original geometry of the molecule. You will learn how the position-sensitive spectrometer at the heart of the imaging apparatus works and where the high-intensity femtosecond laser pulses come into play. You will get to know how a simplex algorithm can be used to reconstruct molecular geometries, and some of the challenges associated with this approach. You will be presented with published and unpublished images from this research, specifically for CO2 and OCS, and you will get to see these molecules explode over a time scale of femtoseconds!
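The forward model described above, simulating the explosion of point-charge ions and extracting their asymptotic momenta, can be sketched as follows; the geometry, charge states, and integration settings are illustrative choices, not the published analysis (the real work wraps such a simulation in a simplex search over trial geometries):

```python
import numpy as np

KE  = 8.988e9          # Coulomb constant (N m^2 C^-2)
Q   = 1.602e-19        # elementary charge (C)
AMU = 1.661e-27        # atomic mass unit (kg)
ANG = 1e-10            # Angstrom (m)

def explode(r0, charges, masses, dt=1e-17, steps=30000):
    """Integrate the Coulomb explosion of point-charge ions and return
    their (nearly) asymptotic momentum vectors in kg m/s."""
    r = np.array(r0, float) * ANG
    q = np.array(charges, float) * Q
    m = np.array(masses, float) * AMU
    v = np.zeros_like(r)
    for _ in range(steps):
        d = r[:, None, :] - r[None, :, :]          # pairwise separations
        dist = np.linalg.norm(d, axis=-1)
        np.fill_diagonal(dist, np.inf)             # no self-force
        f = KE * (q[:, None] * q[None, :])[..., None] * d / dist[..., None]**3
        v += (f.sum(axis=1) / m[:, None]) * dt
        r += v * dt
    return m[:, None] * v

# Linear OCS-like geometry (O-C ~1.16 A, C-S ~1.56 A), one charge per
# fragment -- illustrative numbers, not fitted values from the experiment.
p = explode(r0=[[-1.16, 0, 0], [0, 0, 0], [1.56, 0, 0]],
            charges=[1, 1, 1], masses=[16, 12, 32])
print(np.round(p[:, 0] / 1e-22, 1))   # x-momenta in units of 1e-22 kg m/s
```

A geometry-reconstruction loop would adjust the trial positions until the simulated momenta match the measured ones; total momentum is conserved by construction, which provides a useful sanity check.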
Supervisor: Dr. Robert Mann
Earth-Moon interactions form a complex astrophysical system. Due to subtle effects of the tidal gravitational field of the Moon, the Earth's spin angular momentum is decreasing over time, and conservation of angular momentum requires that this be compensated by a change in the Moon's orbit. I present a novel method of analyzing this interaction using a quasilocal approach that produces a highly general conservation law. Through the use of rigid quasilocal frames (RQFs), we obtain physically relevant equations by taking a more natural approach to gravitational energy, while retaining the ability to make accurate predictions. This is demonstrated by applying the method to the Earth-Moon system and calculating the recession of the Moon to be 3.8 cm/year, in agreement with the measured value.
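The angular-momentum bookkeeping described above can be checked with a Newtonian back-of-envelope estimate (not the RQF calculation itself): the measured recession rate fixes the tidal torque on the orbit, and the same torque spins the Earth down. A sketch with standard parameter values, assuming a circular orbit and the lunar tide only:

```python
import math

G     = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
M_E   = 5.972e24       # Earth mass (kg)
M_M   = 7.342e22       # Moon mass (kg)
A     = 3.844e8        # Earth-Moon distance (m)
I_E   = 8.034e37       # Earth's moment of inertia (kg m^2)
OMEGA = 7.292e-5       # Earth's spin rate (rad/s)
YEAR  = 3.156e7        # seconds per year

da_dt = 0.038 / YEAR   # lunar recession, 3.8 cm/yr in m/s

# Orbital angular momentum L = m*sqrt(G*M*a), so dL/da = (m/2)*sqrt(G*M/a);
# the torque is the rate at which the orbit gains angular momentum.
torque = 0.5 * M_M * math.sqrt(G * M_E / A) * da_dt   # N m

# The Earth's spin loses the same angular momentum: dOmega/dt = -torque/I.
dOmega_dt = -torque / I_E
# Day length P = 2*pi/Omega, so dP/dt = -(2*pi/Omega**2) * dOmega/dt.
dP_dt = -(2 * math.pi / OMEGA**2) * dOmega_dt
ms_per_century = dP_dt * 100 * YEAR * 1e3

print(round(torque / 1e16, 1))    # tidal torque in units of 1e16 N m
print(round(ms_per_century, 1))   # day lengthening, ms per century
```

With these inputs the implied torque is about 4.5e16 N m and the day lengthens by roughly 2.1 ms per century, comparable to the historically inferred rate (which also includes solar tides and other effects).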
Supervisor: Dr. Avery Broderick
The accretion disks around supermassive black holes at the centers of some galaxies launch ultra-relativistic, highly collimated, outflowing jets. These jets extend to intergalactic scales and have a significant effect on the evolution of the host galaxy. Using very long baseline interferometry, the Event Horizon Telescope will soon be able to make horizon-scale observations of M87. This will allow us to see the jet-launching region in detail and provide information on how the jets are launched, their relationship to the supermassive black hole, and the source of the non-thermal electrons in the jet. Current models for these jets ignore the variability seen both in jet-formation simulations and in observations of M87's jet. We will look at how outflowing hot spots (i.e., over-densities of non-thermal electrons) launched from different locations near the black hole affect variability in the jet.
Supervisor: Dr. Joseph Emerson
The Pusey-Barrett-Rudolph (PBR) theorem aims to rule out the possibility of a purely statistical interpretation of the quantum state under a set of reasonable assumptions. Here we show that, within the context of non-local hidden-variable models, the PBR theorem assumes a notion of independence that is unnecessarily strong, because it is not justified by the empirical constraints imposed by locality. In particular, a weaker formulation of the rule for composing the preparations of two independent systems is possible, one which accommodates the possibility of correlations with non-local hidden variables but still preserves the empirically verifiable notion of independence. Under this weaker principle, purely statistical models are not ruled out. Our results suggest that the PBR argument, rather than being a no-go theorem for purely statistical hidden-variable models, can be understood as an insight that non-local hidden variables are a necessary kinematic feature of any realist model underlying quantum mechanics.
Supervisor: Dr. Christine Kraus (Laurentian University)
SNO+ is a neutrino detector being commissioned at SNOLAB. One of its most important physics goals is the search for neutrinoless double beta decay, whose observation would show that the neutrino is its own antiparticle, contrary to the Standard Model of particle physics. This talk will focus on the background due to polonium-210 in the SNO+ experiment. Events from this background have energies similar to those expected for neutrinoless double beta decay, leading to concerns about misreconstruction. Methods for estimating the rate of polonium-210 events reconstructed within the neutrinoless double beta decay energy window will be discussed, along with possible techniques for differentiating this background.
Supervisor: Dr. Vadim Makarov
The relatively recent focus on quantum computing research has led to the commercialization of quantum key distribution (QKD) by companies such as ID Quantique. Protecting the information carried by these systems is a very real issue, addressed by discovering the possible attacks that a hacker, Eve, can carry out and by creating defences against them. The photon detectors used in QKD schemes often present the greatest vulnerability, so characterizing them is a primary research objective. In my talk, I will briefly outline the characteristics and operation of typical single-photon avalanche detectors (SPADs), what vulnerabilities exist, and how these vulnerabilities have been demonstrated, and I will give a brief summary of my work on constructing an automated setup for this purpose.
Supervisor: Dr. Muhammad Kohandel (Applied Mathematics, University of Waterloo)
Hypoxia, a lack of oxygen, is a feature of many solid malignant tumours and strongly influences the response to conventional treatments such as radiotherapy. The lower efficacy of radiotherapy under hypoxic conditions is often quantified by the so-called oxygen enhancement ratio (OER), defined as the ratio of the radiation dose under a given hypoxic condition to the radiation dose under fully oxygenated conditions that results in the same survival. We apply a modified linear-quadratic model to numerically calculate the overall survival of cancer cells and the OER for various distributions of oxygen in one and two dimensions. We discuss how these results may help in designing efficient radiotherapy protocols.
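The OER definition above can be made concrete with a small numerical sketch. The Alper-Howard-Flanders oxygen dependence and all parameter values below are common illustrative choices, not necessarily those used in this work:

```python
import numpy as np

# Linear-quadratic (LQ) survival: S(D) = exp(-(alpha*D + beta*D**2)).
ALPHA, BETA = 0.3, 0.03   # Gy^-1, Gy^-2 -- illustrative values only
M, K = 3.0, 3.0           # max OER and half-effect O2 tension (mmHg)

def hrf(p):
    """Hypoxia reduction factor: dose multiplier at O2 tension p (mmHg)
    relative to full oxygenation (Alper-Howard-Flanders form)."""
    return M * (p + K) / (M * p + K)

def dose_for_survival(S, p):
    """Invert the LQ quadratic for the dose giving survival S at tension p,
    with hypoxia modelled as alpha -> alpha/hrf, beta -> beta/hrf**2."""
    a, b = ALPHA / hrf(p), BETA / hrf(p)**2
    return (-a + np.sqrt(a**2 - 4 * b * np.log(S))) / (2 * b)

# OER at p = 1 mmHg for 10% survival: hypoxic dose / fully oxic dose
p, S = 1.0, 0.1
oer = dose_for_survival(S, p) / dose_for_survival(S, 1e9)
print(round(oer, 3))   # 2.0 -- equals hrf(1) in this dose-scaling model
```

In this particular scaling the OER is independent of the survival level; more general modifications of the LQ parameters make the OER survival- and fractionation-dependent, which is where numerical calculation over realistic oxygen distributions becomes necessary.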