lsstdarkmatter / dark-matter-paper

Repo for tracking LSST dark matter whitepaper

Constraints on Primordial Black Holes #8

Open wadawson opened 6 years ago

wadawson commented 6 years ago

In particular, at the outset, we discussed stellar-dynamical, microlensing, and CMB constraints on primordial black holes (largely due to the parties involved). We are open to broadening consideration to the full range of primordial black hole constraints.

wadawson commented 6 years ago

Here is the link to the simulations of LSST's sensitivity to intermediate-mass black holes, including the paralensing signal: https://my.syncplicity.com/share/rkyf6hfhooe1kt1/sim_truths_alt_sched_RA0Decn15.csv (password: lsstsimdata)

[Attached plots: simfull, simplax, unknown]

alihaimoud commented 6 years ago

Explanation of fields in Will's file:

- fieldID: irrelevant
- seq_n: ID number
- lens_signal: irrelevant
- y_0: impact parameter [Einstein radii]
- t_e: Einstein crossing time [days]
- t_0: peak time [MJD, modified Julian date]
- m_lens: lens mass
- d_d = d_lens: lens distance [kpc]
- theta_e: Einstein radius [radians]
- v_t: transverse velocity
- lambda: celestial longitude of the source star [degrees]
- beta: celestial latitude of the source star [degrees]
- t_c: time of closest approach [when Earth is closest to the Sun-source line]
- t_p: time of perihelion of Earth's orbit
- snr_full: full SNR of all lensing signal
- snr_plax: SNR due to parallax only

Observations are 30-second exposures, and all of this is for a 23rd-magnitude star at 8 kpc (in the bulge). A typical photometric error is ~0.08-0.3 magnitudes, mostly due to sky background. LSST expects to observe 1-8 billion stars this faint or brighter.
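For reference, here is a minimal loader for these files, assuming the columns appear in the order listed above (the field order is an assumption; verify against an actual row before relying on it). With this ordering, m_lens sits at column index 6, consistent with the `data[:, 6]` used in the analysis code below:

```python
import csv

# Column names as described above; the ordering is our assumption.
FIELDS = ["fieldID", "seq_n", "lens_signal", "y_0", "t_e", "t_0",
          "m_lens", "d_d", "theta_e", "v_t", "lambda", "beta",
          "t_c", "t_p", "snr_full", "snr_plax"]

def load_sim(path):
    """Read a simulation CSV into a list of dicts keyed by field name."""
    with open(path) as f:
        reader = csv.reader(f)
        return [dict(zip(FIELDS, map(float, row))) for row in reader]
```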

nrgolovich commented 6 years ago

Optical depth ref: https://arxiv.org/abs/astro-ph/0502363

sbird commented 6 years ago

A computation of the LSST PBH constraints as a function of mass:

[Attached figure: lsst_pbh_constraints]

sbird commented 6 years ago

The code to make the figure:

```python
import csv

import numpy as np
import matplotlib.pyplot as plt

def observed_fraction(csvdata, snrcut=5, full=True, nm=60):
    """Find the fraction of events detected per lens-mass bin.
    snrcut is the SNR required for a detection.
    If full == True, use the full SNR; otherwise use the parallax SNR."""
    with open(csvdata) as csvfile:
        snr = csv.reader(csvfile)
        data = np.array([[float(elem) for elem in row] for row in snr])
    mlens = data[:, 6]
    if full:
        snr_full = data[:, -2]
    else:
        snr_full = data[:, -1]
    ii = np.where(snr_full > snrcut)
    mbins = np.logspace(np.log10(np.min(mlens)), np.log10(np.max(mlens)), nm)
    hist = np.histogram(mlens[ii], mbins)
    hist2 = np.histogram(mlens, mbins)
    mbins2 = (hist[1][1:] + hist[1][:-1]) / 2
    return mbins2, hist[0] / hist2[0]

def number_of_events(f_pbh):
    """Find the expected number of events from PBHs.
    NOTE: this neglects any dependence of the optical depth on PBH mass
    for large PBHs, whose Einstein radius is large enough that the
    crossing time is longer than the survey time."""
    # This is a lower bound (Will Dawson, private comm. :) )
    number_of_stars = 1e9
    # Paczynski 1996 says 5e-7 to the LMC.
    # We use Sumi et al. 2005 (OGLE).
    optical_depth = 4.48e-6
    return optical_depth * number_of_stars * f_pbh

def make_fig():
    """Make the figure!"""
    hist_full = observed_fraction("sim_truths_alt_sched_RA0Decn15.csv", full=True)
    hist_par = observed_fraction("sim_truths_alt_sched_RA0Decn15.csv", full=False)
    ev = number_of_events(1)
    plt.semilogx(hist_full[0], 1e2 / (ev * hist_full[1]), label="Full SNR")
    plt.semilogx(hist_par[0], 1e2 / (ev * hist_par[1]), label="Parallax SNR")
    plt.ylabel(r"$f_\mathrm{PBH}$ (x100)")
    plt.xlabel(r"$M_\mathrm{PBH}$")
    plt.title("LSST constraints")
    plt.legend(loc='upper right')
```

sbird commented 6 years ago

We note that a strong upper bound on the PBH mass is the mass of the smallest dwarf galaxy, which is around 10^7 - 10^8 solar masses.
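Relatedly, the crossing-time caveat in `number_of_events` can be made quantitative with a rough estimate (a sketch only; the default lens distance and transverse velocity here are illustrative assumptions, not values from the simulations):

```python
import numpy as np

G = 6.674e-11      # gravitational constant [m^3 kg^-1 s^-2]
C = 2.998e8        # speed of light [m/s]
MSUN = 1.989e30    # solar mass [kg]
KPC = 3.086e19     # kiloparsec [m]

def einstein_crossing_time(m_lens_msun, d_l_kpc=4.0, d_s_kpc=8.0, v_t_kms=200.0):
    """Einstein-radius crossing time [days] for a point lens, using
    theta_E = sqrt(4 G M / c^2 * (D_s - D_l) / (D_l * D_s)).
    The default distances and velocity are illustrative fiducial values."""
    d_l, d_s = d_l_kpc * KPC, d_s_kpc * KPC
    theta_e = np.sqrt(4 * G * m_lens_msun * MSUN / C**2 * (d_s - d_l) / (d_l * d_s))
    r_e = theta_e * d_l                       # physical Einstein radius at the lens
    return r_e / (v_t_kms * 1e3) / 86400.0    # crossing time in days
```

Since t_E scales as sqrt(M), a 10 M_sun lens at these fiducial values crosses in of order a hundred days, while a 10^6 M_sun lens takes of order a century, far longer than LSST's 10-year survey, which is why the constant-optical-depth treatment breaks down at high masses.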

wadawson commented 6 years ago

Here is a simulation with an extended mass range of 0.01-1e8 solar masses:

2000 samples: https://my.syncplicity.com/share/ifcuz25ybkvdhsg/sim_truths_OpSimRA0Decn15_2000samp
20000 samples: https://my.syncplicity.com/share/wy8xfwj2gamiavw/sim_truths_OpSimRA0Decn15_20000samp

password: lsstsimdata

wadawson commented 6 years ago

Some notes on assumptions in the simulations that will affect the validity of the ultimate conclusions:

sbird commented 6 years ago

Extended mass range constraints with 20000 points:

[Attached figure: lsst_pbh_constraints_extended]

nrgolovich commented 6 years ago

I zipped together 8 files that each have 25k PBHs.

These assume a triangular density distribution for the lens distance r (ideally we'd use a 1/r distribution), with p(r=0) = 0 and p(r=8 kpc) = 0.25.

data.zip

EDIT: forgot to set seed, so rerunning now
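The triangular distance distribution above can be sampled directly by inverse-CDF (a minimal sketch; it assumes p(r) rises linearly from 0 at r=0 to 0.25 at r_max = 8 kpc, i.e. p(r) = 2r/r_max^2):

```python
import numpy as np

def sample_triangular_r(n, r_max=8.0, seed=1):
    """Draw lens distances r [kpc] from the triangular density p(r) = 2r/r_max^2,
    rising linearly from p(0) = 0 to p(r_max) = 2/r_max (= 0.25 for r_max = 8).
    Inverse-CDF sampling: CDF(r) = (r/r_max)^2, so r = r_max * sqrt(u)."""
    rng = np.random.default_rng(seed)
    return r_max * np.sqrt(rng.uniform(size=n))
```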

nrgolovich commented 6 years ago

new zip:

data2.zip

sbird commented 6 years ago

Plot from the full dataset:

[Attached figure: lsst_pbh_constraints_extended]

sbird commented 6 years ago

and this was the code:


```python
import csv
import glob

import numpy as np
import matplotlib.pyplot as plt

def observed_fraction(csvdata, snrcut=5, full=True, nm=60):
    """Find the fraction of events detected per lens-mass bin.
    csvdata may be a glob pattern matching multiple files.
    snrcut is the SNR required for a detection.
    If full == True, use the full SNR; otherwise use the parallax SNR."""
    files = glob.glob(csvdata)
    mlens = np.array([])
    snr_full = np.array([])
    for cvd in files:
        with open(cvd, 'r') as csvfile:
            # Skip the header line
            csvfile.readline()
            snr = csv.reader(csvfile)
            data = np.array([[float(elem) for elem in row] for row in snr])
        mlens = np.concatenate([data[:, 6], mlens])
        if full:
            tsnrs = data[:, -2]
        else:
            tsnrs = data[:, -1]
        snr_full = np.concatenate([tsnrs, snr_full])
    ii = np.where(snr_full > snrcut)
    mbins = np.logspace(np.log10(np.min(mlens)), np.log10(np.max(mlens)), nm)
    hist = np.histogram(mlens[ii], mbins)
    hist2 = np.histogram(mlens, mbins)
    mbins2 = (hist[1][1:] + hist[1][:-1]) / 2
    return mbins2, hist[0] / hist2[0]

def number_of_events(f_pbh):
    """Find the expected number of events from PBHs.
    NOTE: this neglects any dependence of the optical depth on PBH mass
    for large PBHs, whose Einstein radius is large enough that the
    crossing time is longer than the survey time."""
    # This is a lower bound (Will Dawson, private comm. :) )
    number_of_stars = 1e9
    # Paczynski 1996 says 5e-7 to the LMC.
    # We use Sumi et al. 2005 (OGLE).
    optical_depth = 4.48e-6
    return optical_depth * number_of_stars * f_pbh

def make_fig(csvdata, nbins=60, snrcut=5):
    """Make the figure!"""
    hist_full = observed_fraction(csvdata, snrcut=snrcut, full=True, nm=nbins)
    hist_par = observed_fraction(csvdata, snrcut=snrcut, full=False, nm=nbins)
    ev = number_of_events(1)
    plt.semilogx(hist_full[0], 1e2 / (ev * hist_full[1]), label="Full SNR")
    plt.semilogx(hist_par[0], 1e2 / (ev * hist_par[1]), label="Parallax SNR")
    plt.ylabel(r"$f_\mathrm{PBH}$ (x100)")
    plt.xlabel(r"$M_\mathrm{PBH}$")
    plt.title("LSST constraints")
    plt.legend(loc='upper right')
    plt.tight_layout()
```

wadawson commented 6 years ago

There is something weird below a mass of 10^-2 solar masses, where the parallax-only signal is larger than the full signal. This could easily be investigated by plotting the light curves of some of these low-mass microlensing events and visualizing the data.
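A quick way to do that visual check is the standard point-source, point-lens (Paczynski) magnification curve, fed with y_0 and t_e from the simulation files (a sketch; the example parameters below are made up, not drawn from the data):

```python
import numpy as np

def paczynski_magnification(t, t_0, t_e, y_0):
    """Point-source, point-lens magnification A(u) with
    u(t) = sqrt(y_0^2 + ((t - t_0) / t_e)^2); times in days."""
    u = np.sqrt(y_0**2 + ((t - t_0) / t_e)**2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

# Example light curve around the peak (illustrative parameters)
t = np.linspace(-3, 3, 301)          # days relative to t_0
mag = paczynski_magnification(t, t_0=0.0, t_e=1.0, y_0=0.3)
delta_m = -2.5 * np.log10(mag)       # magnitude change vs. baseline
```

Overplotting delta_m for a handful of the sub-10^-2 solar mass events against the simulated photometry should show whether the full-SNR column is being computed sensibly there.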

wadawson commented 6 years ago

Exploring a case at the very low mass regime.

[Attached figures: unknown-2, withdata, withdata_zoom]