Post #2: Correcting for Beam Effects in the Qweak Experiment

In this post, I will describe in greater detail the underpinnings of both the Qweak Experiment and my work. The material is extracted from a paper I wrote for the REU (Research Experience for Undergraduates) program in physics. I'll spare anyone reading this from the math in the actual paper and instead describe the material qualitatively here:

The purpose of the Qweak experiment is to measure the weak charge of the proton (or "Qweak") to high precision using the parity-violating asymmetry of polarized electron scattering off liquid hydrogen. This value is then used to determine the weak mixing angle, a parameter explicitly predicted by the Standard Model of particle physics.

The Standard Model is our best current description of both the strong and electroweak forces, but it is incomplete: it does not account for known phenomena such as dark matter, dark energy, gravity, and neutrino masses. A major goal in physics today is the pursuit of 'new physics' in the hope of finding a more complete model that accurately explains what is observed. The Qweak Experiment is part of this search. Qweak probes 'new physics' through a precision measurement of the weak charge, a different approach than the brute-force one used at large accelerators such as the Large Hadron Collider. Through this precise measurement, we hope either to confirm elements of the Standard Model or to reveal it to be flawed, aiding the search for a better theory. Data-taking for Qweak is now over, and the apparatus is being dismantled so that its housing at Jefferson Lab (Experimental Hall C) can be used for other experiments. The experiment is now in an analysis phase. After all error corrections and statistical analyses have been completed (the analysis should last another one to two years), Qweak should be able to determine the weak charge to within about 4%.

——————————————————–

There are four fundamental forces in nature: the electromagnetic force, gravity, and the strong and weak nuclear forces. Of these, the weak force is the least tangible; it is responsible for beta decay and for initiating nuclear fusion in stars, and it arises from the exchange of W and Z bosons, the carriers of the weak force. The weak charge describes how a particle participates in the neutral weak interaction and is analogous to the electric charge in the electromagnetic interaction.

The Qweak experiment measures the weak charge of the proton by finding the parity-violating asymmetry of polarized electrons scattered off liquid hydrogen. This asymmetry arises from a phenomenon called parity violation.

A parity transformation is a transformation in which an object is flipped to its mirror image. In particle physics, reversing a particle's helicity is analogous to this transformation (i.e. a left-handed electron is the mirror image of a right-handed electron). Under this kind of transformation, a symmetry may exist between the original object and its image. In parity violation, however, the mirror image behaves differently than the original. In particle physics, this is illustrated by the weak nuclear force, which differentiates between right- and left-handed particles: the W bosons interact only with left-handed particles, and the Z boson couples with different strengths to left- and right-handed particles. This builds an inherent asymmetry into the weak interaction, which the Qweak Experiment exploits in its measurements.

Helicity is another concept central to the Qweak experiment: it is the orientation of a particle's spin with respect to its momentum. In the experiment, the electron beam produced by Jefferson Lab is spin-polarized along the beam direction (to about 89% polarization), so nearly all of the electrons have the same helicity. The helicity is then alternated; because left- and right-handed electrons interact differently with the hydrogen target through the weak interaction, this alternation is the source of the measured asymmetry.

——————————————————–

The Qweak experiment was housed in Hall C at Jefferson Lab, a particle accelerator laboratory in Newport News, Virginia operated for the US Department of Energy. Jefferson Lab's accelerator, the Continuous Electron Beam Accelerator Facility (CEBAF), accelerates electrons to speeds very near the speed of light. At Jefferson Lab, the electron beam is directed at stationary targets in order to study the fundamental nature of atoms (a different approach than at CERN, where protons are accelerated into head-on proton-proton collisions). After acceleration in the facility's linacs (linear particle accelerators), electrons are directed down the beamline into Hall C, one of three underground experimental halls and the one that housed the Qweak Experiment.

Upon delivery, the beam is about 89% longitudinally spin-polarized. The asymmetry calculation used to find the weak charge (discussed later) depends on alternating between incoming left- and right-handed electrons, so the helicity of the incoming electrons is flipped at a very high rate of 960 Hz. Sets of four helicity windows called quartets (following the patterns RLLR or LRRL) are completed at a rate of 240 Hz. Additionally, the sign of the polarization is reversed on a nearly weekly basis by a device called the Wien flipper.
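To make the quartet structure concrete, here is a minimal sketch (in C++, the language of the analysis scripts discussed later; the names are hypothetical, not actual Qweak code) of how a single asymmetry value could be formed from one RLLR quartet of detector yields:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: form an asymmetry from one RLLR quartet of
// helicity windows (y1..y4 are detector yields). Hypothetical names.
double quartetAsymmetry(double y1, double y2, double y3, double y4) {
    double yR = 0.5 * (y1 + y4);   // windows 1 and 4: right-handed (R)
    double yL = 0.5 * (y2 + y3);   // windows 2 and 3: left-handed (L)
    return (yR - yL) / (yR + yL);  // difference over sum
}
```

Pairing the windows symmetrically like this helps cancel slow linear drifts in beam conditions over the quartet.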

The beam passes through a number of beamline elements before reaching the main apparatus. These include the Compton and Møller polarimeters, which measure the beam polarization noninvasively and invasively, respectively, as well as many beam position and current monitors.

The target is liquid hydrogen, chosen because its simple nucleus, a single proton, is ideal for the purposes of this experiment. The electron beam strikes the target and scatters off it. Collimators (devices that select which scattered particles are accepted) let elastically scattered electrons through and absorb inelastically scattered ones. QTOR, the Qweak toroidal magnet, then selects the scattered electrons that should be measured, bending and focusing them toward the main detectors. In this experiment, we are interested only in the electrons that scatter elastically off protons. Electrons that scatter off other electrons in the target (Møller scattering) have insufficient momentum and are bent away by QTOR. Neutral particles, which are unaffected by QTOR, are not directed toward the main detectors and pass through the apparatus. Electrons that do not interact with the target pass straight through into the beam dump behind the experiment.

The main detectors are a ring of eight synthetic quartz detectors, and the parity-violating asymmetry is calculated from their data. Each detector is made from two halves (called positive and negative) that are optically glued together. A photomultiplier tube, a device sensitive to the light created when particles pass through the detector, sits at each end of each detector.

Drift chambers are located before QTOR (the horizontal drift chambers, or HDCs) and after it (the vertical drift chambers, or VDCs). These chambers can measure the trajectories of particles passing through them to extreme precision without significantly perturbing their paths, and they are used to determine the momentum transfer, a value important to the calculation of the weak charge.

Luminosity detectors are located both upstream and downstream of the target and are used to measure the rate at which interactions occur.

——————————————————–

The purpose of the Qweak Experiment is to measure the weak charge of the proton and use it to find the weak mixing angle. However, the weak charge must first be extracted using the parity-violating asymmetry and the momentum transfer, values measured by the experiment. The parity-violating asymmetry is calculated as the difference over the sum of the cross sections for right- and left-handed electrons, and it is proportional to the weak charge.

The cross section of a particle is an effective area that describes the probability of its having an interaction. As the beam helicity is alternated, left- and right-handed electrons interact with the protons in the target slightly differently: both feel the electromagnetic force equally, but the weak force contributes differently for each handedness. This inherent asymmetry is minute, but it is measurable in the ratio between cross sections.

In the experiment, the momentum transfer is kept small so that the 'B term' of the Qweak equation (a hadronic term constrained by the results of other experiments) does not become the dominant term.
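For readers who do want a little of the math: in the notation commonly used in the Qweak literature (sketched here from standard references, not copied from my paper), the parity-violating asymmetry and its leading-order relation to the weak charge take roughly the form

```latex
A_{PV} = \frac{\sigma_R - \sigma_L}{\sigma_R + \sigma_L}
       \approx \frac{-G_F\,Q^2}{4\sqrt{2}\,\pi\alpha}
         \left[\, Q_W^p + Q^2\, B(Q^2) \,\right]
```

where $\sigma_R$ and $\sigma_L$ are the cross sections for right- and left-handed electrons, $G_F$ is the Fermi constant, $\alpha$ is the fine-structure constant, $Q^2$ is the momentum transfer, and $B(Q^2)$ is the hadronic 'B term.' Keeping $Q^2$ small suppresses the $Q^2 B(Q^2)$ piece relative to $Q_W^p$.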

The physics asymmetry used to find the weak charge must be extracted from the measured value by accounting for false asymmetries and background asymmetries.

——————————————————–

The Qweak experiment seeks to study exclusively the parity-violating asymmetry of electrons scattered off protons. Any other interactions or phenomena will detract from this result and must be corrected for. The purpose of the research described here is to use linear regression to correct for the effects of beam properties that may bias the weak charge calculated by Qweak, and then to study the remaining residual correlations with other beam properties. These beam effects are predominantly due to natural motion of the electron beam, a side-to-side jitter resulting from inherent beam imperfections, but many measures of background were also considered. The linear regression software currently used by Qweak, called LinRegBlue, corrects for only a limited number of beam properties and does not recalculate residual correlations after the correction. The properties corrected for by LinRegBlue are known to affect the measured asymmetry, but there may be other parameters that we should also correct for.

——————————————————–

Correlation is a measure of the statistical relationship between two sets of data. If two data sets are correlated, they tend to vary together; if they are anticorrelated, they tend to vary opposite to one another. Either way, the two data sets may share some common factor. In some cases this relationship implies that one variable causes the other, but correlation alone does not establish causation.

A concept related to correlation is covariance, a statistical measure of how two variables vary with one another. Variance is the covariance of a variable with itself and shows how spread out the data are.
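These definitions can be written out in a few lines of code. The sketch below (illustrative C++, not the Qweak analysis software) uses the textbook two-pass formulas:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Textbook two-pass variance, covariance, and correlation coefficient.
// Illustrative only -- the actual analysis uses single-pass update
// formulas for numerical precision.
double mean(const std::vector<double>& v) {
    double sum = 0.0;
    for (double x : v) sum += x;
    return sum / v.size();
}

// Population covariance: average product of deviations from the means.
double covariance(const std::vector<double>& a, const std::vector<double>& b) {
    double ma = mean(a), mb = mean(b), sum = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        sum += (a[i] - ma) * (b[i] - mb);
    return sum / a.size();
}

// r = cov(a,b) / sqrt(var(a) * var(b)); variance is cov(a,a).
double correlation(const std::vector<double>& a, const std::vector<double>& b) {
    return covariance(a, b) / std::sqrt(covariance(a, a) * covariance(b, b));
}
```

Perfectly correlated data give r = 1, perfectly anticorrelated data give r = -1, and unrelated data give r near 0.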

In the Qweak Experiment, we need to be able to correct the measured asymmetry data for natural beam motion and other sources of error that might confound our measurement of the weak charge. Ideally, there should be no correlation between data from unrelated beam elements and data from the main detectors. For example, the variable targetX describes the x position at which the beam strikes the target. The asymmetry data taken by the main detectors should ideally result only from the inherent asymmetry of the weak interaction; in practice, however, the measured asymmetry is also sensitive to motion of the beam in the +x or -x direction across the target, something that must be accounted for. If there were natural beam motion in the +x or -x direction (and there is), the measured asymmetry would be correlated (or anticorrelated, respectively) with targetX. Using this correlation, the main detector asymmetry data can be corrected for targetX (as well as for many other variables). For simplicity, the main detector asymmetry data will be referred to as the 'main detectors,' and anything else used for correction or comparison will be referred to as 'beam properties.'

Over the course of the experiment, Qweak collected some 150 terabytes of asymmetry data. This data is organized by runs and run segments, which indicate sequentially when the data was taken. Runs are also differentiated by which analyzer version was applied to them (Qweak software that corrects the data for known errors). In this research, only data from two analyzer versions (QwPass3 and QwPass5beta) were used. Data are organized into ROOT trees, data structures with hundreds of variables including the main asymmetry data and data from many different beamline elements. Each tree contains data from roughly 90,000 'events': one-millisecond segments of data collection, each containing data from approximately 600,000 electrons reaching the target. The data are stored at Jefferson Lab in a seven-petabyte magnetic tape silo and are accessed using a utility called jcache, which copies data from tape to a more accessible cache.

The asymmetry data are blinded by a hidden offset of up to +/- 60 ppb in order to prevent human bias toward a particular outcome during the statistical analysis.

——————————————————–

For the purposes of this research, preexisting methods for correcting the main detector data for correlations using linear regression were used, drawing from a Sandia National Laboratories report (Pebay, 2008) and a thesis by Damon Spayde (2001). Calculations are done using update formulas, in which values such as the mean and covariance are updated with each successive loop through the tree. Covariance must be calculated this way because a naive calculation would lose digits of precision to truncation. To perform the correction described at the beginning of this methodology, one must calculate the covariances of the beam properties with one another and with the asymmetries measured by the main detectors; the variances of each beam property and main detector must also be calculated.

The mean value is updated with each successive loop through the tree for use in the covariance calculation. Covariances between each of these beam properties and the main detectors are calculated in the same way, and the variance of a single variable or main detector asymmetry is calculated similarly, as its covariance with itself.
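The update formulas can be sketched as a one-pass accumulator in the spirit of the Pebay (2008) formulas. This is a simplified illustration, not the actual Qweak analysis code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// One-pass "update formula" accumulator for the means and covariance of
// two streamed variables, after Pebay (2008). Simplified sketch only.
struct CovAccumulator {
    std::size_t n = 0;
    double meanX = 0.0, meanY = 0.0;
    double coMoment = 0.0;  // running co-moment; covariance = coMoment / n

    void add(double x, double y) {
        ++n;
        double dx = x - meanX;         // deviation from the *old* x mean
        meanX += dx / n;               // update the running means...
        meanY += (y - meanY) / n;
        coMoment += dx * (y - meanY);  // ...then the co-moment, using the
                                       // updated y mean (preserves precision)
    }
    double covariance() const { return coMoment / n; }
};
```

Feeding entries one at a time this way avoids the catastrophic cancellation that the naive sum-of-products formula suffers when the mean is large compared to the fluctuations, which is exactly the situation for beam-position data.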

These covariances are then used to find r-values, a normalized measure of correlation where 1 corresponds to 100% correlation and -1 corresponds to 100% anticorrelation.

The r-value between variables j and k is equal to their covariance divided by the square root of the product of their variances. Matrices of these r-values are then used to compute the matrix D_k used for performing the correction, and D_k in turn is used to find the matrix C_k of asymmetry slopes using a ratio of variances.

The correction is summed over all of the entries and applied for each beam property being corrected for.
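The correction step itself amounts to subtracting, from each measured detector asymmetry, the contribution attributed to each corrected-for beam property: its fitted slope times its deviation from the mean. A sketch (hypothetical names, not taken from the Qweak software):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Apply a linear-regression correction to one measured asymmetry value.
// slopes[i] is the fitted sensitivity dA/dX_i to beam property i;
// deviations[i] is that property's deviation X_i - <X_i> for this entry.
// Illustrative sketch only.
double correctedAsymmetry(double measured,
                          const std::vector<double>& slopes,
                          const std::vector<double>& deviations) {
    double corrected = measured;
    for (std::size_t i = 0; i < slopes.size(); ++i)
        corrected -= slopes[i] * deviations[i];
    return corrected;
}
```

If the slopes capture the beam-motion sensitivity correctly, what remains after subtraction is (ideally) only the physics asymmetry plus statistical noise.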

——————————————————–

The software written for this research corrects the asymmetry data for an initial set of beam properties with known effects, then calculates and displays the remaining residual correlations with many other beam properties to see whether any correlations remain. The scripts were written using ROOT, a C++-based data analysis package developed at CERN.

The program begins by requesting data from the magnetic tape silo at Jefferson Lab. In a first pass through the data tree, the program reads the main detector asymmetry data and the data from six beam properties that are initially corrected for, as well as the error flags for the corresponding entries and devices. These six beam properties are referred to as the '5+1' beam properties: targetX and targetY (the position in x and y at which the beam strikes the target), targetXSlope and targetYSlope (the rates of change of those positions), beam energy, and beam charge. The '+1' element is the beam charge, which is sometimes omitted. The main detector asymmetry data are contained in sums for each of the eight detectors over each entry, called md1barsum, md2barsum, md3barsum, and so on; a sum over all of the main detectors, called mdallbars, is also used. Using the formulas previously described, correlations are calculated between each of the six beam properties and the main detector asymmetry values, with cuts made for bad entries and device errors where appropriate. Matrices of these correlation values are produced with color-coded cells.

In a second pass, the main detector data are corrected for the correlations with the 5+1 beam properties calculated in the first pass. Residual correlations are then calculated between the corrected data and data from many other beam properties and beamline elements, to be discussed in my results. Additional color-coded matrices of these residual correlation values are created, again with appropriate cuts for bad events and device errors. A separate program collects the output of many of these runs and compares correlations for the same main detector data side by side, showing how the results change with time.
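The two-pass structure can be sketched end to end in miniature, with plain vectors standing in for the ROOT tree (illustrative only, not the real program): pass 1 fits the slope of a detector asymmetry against one beam property, and pass 2 subtracts that contribution and recomputes the residual correlation.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Miniature two-pass sketch. Pass 1: fit a regression slope of detector
// asymmetry vs. one beam property. Pass 2: correct each entry and
// recompute the correlation, which should then be consistent with zero
// for that property. Vectors stand in for the ROOT tree.
double meanOf(const std::vector<double>& v) {
    double sum = 0.0;
    for (double x : v) sum += x;
    return sum / v.size();
}

double residualCorrelation(const std::vector<double>& beamProp,
                           std::vector<double> mdAsym) {
    // Pass 1: means, covariance, variance -> regression slope.
    double mx = meanOf(beamProp), my = meanOf(mdAsym);
    double cov = 0.0, varX = 0.0;
    for (std::size_t i = 0; i < beamProp.size(); ++i) {
        cov  += (beamProp[i] - mx) * (mdAsym[i] - my);
        varX += (beamProp[i] - mx) * (beamProp[i] - mx);
    }
    double slope = cov / varX;

    // Pass 2: correct each entry, then recompute the correlation.
    for (std::size_t i = 0; i < mdAsym.size(); ++i)
        mdAsym[i] -= slope * (beamProp[i] - mx);
    double myc = meanOf(mdAsym);
    double covc = 0.0, varY = 0.0;
    for (std::size_t i = 0; i < mdAsym.size(); ++i) {
        covc += (beamProp[i] - mx) * (mdAsym[i] - myc);
        varY += (mdAsym[i] - myc) * (mdAsym[i] - myc);
    }
    return covc / std::sqrt(varX * varY);
}
```

In the real analysis, the interesting quantity is not the residual correlation with the corrected-for property (which vanishes by construction) but the residual correlations with all the *other* beam properties, which is what the matrices described above display.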