Research

The following projects were completed during my undergraduate career. More astrophysics research to come soon!

Using SNEWPY to Analyze Neutrinos from the Black Hole Formation Stage

Core-collapse supernovae (CCSNe) are well-known multi-messenger astronomy candidates that can be studied through several messengers, such as neutrinos, electromagnetic radiation, and gravitational waves, to aid our understanding of the Universe. Studying neutrinos from CCSNe in the Milky Way and the local Universe provides insight into the mechanisms behind these violent explosions’ core collapse and shock wave. Supernova neutrinos escape the star and reach detectors before light, which means that neutrinos are the first detectable messenger from a supernova. Since current neutrino detectors are capable of detecting neutrinos from CCSNe, understanding how well existing supernova simulations can probe particular stages of the explosion provides insight into the behavior of neutrinos during these processes. For my project, I used SNEWPY, a software package that utilizes supernova simulations to study neutrinos. In particular, I focused on neutrinos from the black-hole formation stage of a CCSN using the Nakazato model, examining energy spectra, event rates, and fluxes for a 40-kiloton liquid argon detector. I also studied the observable parameters and time bin sizes for resolving the black hole cutoff.
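As a rough illustration of the kind of calculation involved (this is a toy NumPy sketch, not SNEWPY's actual API), the black-hole formation stage can be pictured as a neutrino luminosity curve that cuts off abruptly at the collapse time, with events counted in time bins whose size determines how sharply the cutoff is resolved:

```python
import numpy as np

def fermi_dirac_spectrum(E, T):
    """Unnormalized Fermi-Dirac-like neutrino energy spectrum
    (E and temperature T in MeV), a common parametrization for
    supernova neutrino spectra."""
    return E**2 / (1.0 + np.exp(E / T))

def binned_event_rate(times, luminosity, t_bh, dt):
    """Bin a neutrino luminosity time series into time bins of width
    dt, zeroing emission after black hole formation at t_bh to mimic
    the abrupt flux cutoff."""
    lum = np.where(times < t_bh, luminosity, 0.0)
    edges = np.arange(times[0], times[-1] + dt, dt)
    counts, _ = np.histogram(times, bins=edges, weights=lum)
    return edges, counts

# Toy luminosity curve: rising emission truncated at t_bh = 0.5 s
t = np.linspace(0.0, 1.0, 1000)
lum = 1.0 + 5.0 * t
edges, counts = binned_event_rate(t, lum, t_bh=0.5, dt=0.05)
```

Smaller values of `dt` localize the cutoff more precisely but put fewer events in each bin, which is the trade-off studied when choosing time bin sizes for the black hole cutoff.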

Determining the Feasibility of Matched Filter Searches for Core-Collapse Supernovae

With the efforts of the Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration, gravitational waves (GWs) have been successfully detected from binary black hole mergers, binary neutron star mergers, and neutron star-black hole binaries. However, other violent phenomena, such as core-collapse supernovae (CCSNe), are also potential candidates for gravitational-wave studies. CCSNe are of particular interest because they also emit other astrophysical messengers, such as neutrinos and electromagnetic radiation. I studied the feasibility of matched filter searches for CCSNe using a phenomenological GW model designed to be representative of CCSN waveforms. I examined the impact of stochasticity on the g-mode dominated emission of CCSNe, investigated whether the randomness of the waveforms is manageable enough to generate a parameter space, and designed a template bank of CCSN gravitational waveforms. I concluded that matched filtering is feasible because I could generate a template bank with high match values, such that a significant fraction of the supernova signal was retained.
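The match value at the heart of this feasibility question can be sketched in a few lines. The snippet below is a simplified white-noise version (a real search weights the inner product by the detector noise power spectral density): the match is the normalized overlap between two waveforms, maximized over relative time shifts via the FFT.

```python
import numpy as np

def match(h1, h2):
    """Normalized overlap between two waveforms, maximized over time
    shifts using the FFT. Simplified: assumes white noise, so no
    weighting by the detector noise PSD."""
    H1, H2 = np.fft.rfft(h1), np.fft.rfft(h2)
    corr = np.fft.irfft(H1 * np.conj(H2), n=len(h1))  # circular cross-correlation
    norm = np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))
    return np.max(np.abs(corr)) / norm

# A toy damped-sinusoid "template" and the same signal at an unknown
# arrival time: a pure time shift should give a match of 1.
t = np.linspace(0.0, 1.0, 4096)
template = np.sin(2 * np.pi * 200 * t) * np.exp(-4 * t)
shifted = np.roll(template, 300)
m = match(template, shifted)  # ≈ 1.0 for a pure time shift
```

A template bank is then a set of such templates chosen so that any plausible signal in the parameter space achieves a match above some minimum with at least one template; high achievable matches are what make the search feasible despite the stochastic waveforms.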

The Gaussian Process vs. Log Gaussian Cox Process: Comparing Two Methods for High Energy Physics Data

High energy physics (HEP) data from particle colliders are studied using statistical methods to compare the data we observe with what we expect. HEP collider data are often arranged into histograms counting the number of events observed with a given feature value, which can be modeled with the Poisson distribution. In many analyses, the signal appears as a localized excess on top of a smooth background that must be modeled before a signal can be observed. For this project, we use toy data that mimics the falling exponential behavior of HEP data. An effective method for modeling smooth backgrounds in HEP data is the Gaussian process. However, this method requires the data to be binned, which discards information, and while Gaussian processes generally yield meaningful uncertainties, they fail to capture the Poissonian uncertainties of HEP data. The log Gaussian Cox process is a novel method that we expect to improve on those shortcomings. We compare the ability of the Gaussian process and the log Gaussian Cox process to reproduce a known intensity function when modeling the smooth background events. We find that the log Gaussian Cox process is promising; however, further exploration is needed to build an optimal model and develop a deeper understanding of the method.
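To make the model concrete, the generative picture of a log Gaussian Cox process can be sketched directly (a minimal NumPy illustration, not our analysis code): the log-intensity is a Gaussian process draw, so the intensity is positive by construction, and the observed events are an inhomogeneous Poisson process given that intensity, which is exactly the Poissonian behavior of HEP event counts.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, lengthscale=0.5, variance=1.0):
    """Squared-exponential (RBF) covariance matrix on the grid x."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Log Gaussian Cox process: log-intensity f is a GP sample, so the
# intensity lam = exp(f) is guaranteed positive.
x = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(x) + 1e-8 * np.eye(len(x))          # jitter for stability
f = rng.multivariate_normal(np.zeros(len(x)), K)   # GP draw (log-intensity)
lam = np.exp(f + 3.0)                              # events per unit x

# Simulate events from the inhomogeneous Poisson process by thinning
# a homogeneous process with rate lam_max over the unit interval.
lam_max = lam.max()
n_prop = rng.poisson(lam_max)
props = rng.uniform(0.0, 1.0, n_prop)
keep = rng.uniform(0.0, lam_max, n_prop) < np.interp(props, x, lam)
events = props[keep]
```

Because the events are modeled directly as points rather than bin counts, no binning is required, which is precisely the information loss the LGCP is meant to avoid relative to a Gaussian process fit to a histogram.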

Finding Transient Artifacts with Deep Learning in the Dark Energy Survey

Cosmological surveys increasingly rely on deeper astronomical imaging, with tens to hundreds of single-epoch images combined into coadds. These images contain many non-cosmological transient artifacts, such as cosmic rays, satellites, and asteroids, which can hinder astronomers’ ability to accurately infer a representative sample of cosmological objects. The Dark Energy Survey (DES) Year 3 results relied on volunteers manually inspecting all deep-field coadd images to detect and mask residual transient artifacts. As the quantity of data from cosmological surveys increases, machine learning techniques provide an opportunity to supplement and assist these time-consuming stages of data processing. Using deep learning architectures trained on DES deep-field image data masked by volunteers, we develop methods for both patch-based classification and direct localized detection of transient artifacts. We find that the volunteer-provided artifact labels contain a number of errors, so we introduce a human-in-the-loop relabeling step to correct these mislabels. The models reach high precision and recall and show promise for automated artifact detection in DES Year 6 deep coadd image stacks and other cosmological surveys.
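The preprocessing step behind patch-based classification can be sketched as follows (a toy NumPy illustration, not the DES pipeline; the `0.01` mask-fraction threshold is a hypothetical choice): the coadd image is tiled into fixed-size patches, and each patch inherits an artifact / no-artifact label from the volunteer-provided pixel mask.

```python
import numpy as np

def extract_patches(image, patch_size, stride):
    """Tile a 2D image into square patches for patch-based
    classification, returning the patches and their corner coordinates."""
    patches, coords = [], []
    h, w = image.shape
    for i in range(0, h - patch_size + 1, stride):
        for j in range(0, w - patch_size + 1, stride):
            patches.append(image[i:i + patch_size, j:j + patch_size])
            coords.append((i, j))
    return np.stack(patches), coords

def patch_labels(mask, coords, patch_size, threshold=0.01):
    """Label a patch positive when the masked-artifact pixel fraction
    exceeds a threshold, turning a pixel mask into patch labels."""
    labels = []
    for i, j in coords:
        frac = mask[i:i + patch_size, j:j + patch_size].mean()
        labels.append(frac > threshold)
    return np.array(labels)

# Toy 64x64 image with one satellite-like streak marked in the mask
image = np.random.default_rng(1).normal(size=(64, 64))
mask = np.zeros((64, 64))
mask[30, :] = 1.0
patches, coords = extract_patches(image, patch_size=16, stride=16)
labels = patch_labels(mask, coords, patch_size=16)
```

A classifier trained on such (patch, label) pairs flags artifact-bearing regions, while the direct localized detection approach instead predicts the mask itself; the human-in-the-loop step corrects the labels this procedure inherits from imperfect volunteer masks.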