Example of a spectral analysis with 3ML


Giacomo Vianello (Stanford University) giacomov@stanford.edu

IPython Notebook setup (only if you are using the IPython Notebook)

This line activates support for inline display of matplotlib images:

In [1]:
%matplotlib inline

Import 3ML and see the available plug-ins

In [2]:
from threeML import *
getAvailablePlugins()
Plugins:

* FermiLATUnbinnedLike for Fermi LAT (standard classes)        available
* FermiGBMLike for Fermi GBM (all detectors)                   available

Define some general features for the source of interest

Here we define a name and the coordinates (Equatorial J2000). In this case, we are analyzing a GRB:

In [3]:
triggerName = 'bn090217206'
ra = 204.9
dec = -8.4

Setup of LAT data

Here we instantiate a plugin to handle Fermi/LAT data:

In [4]:
datadir = '/home/giacomov/science/hawc/jointLikelihood/examples/090217206/'
eventFile   = os.path.join(datadir,'gll_ft1_tr_bn090217206_v00_filt.fit')
ft2File = '/home/giacomov/FermiData/bn090217206/gll_ft2_tr_bn090217206_v00.fit'
expomap = os.path.join(datadir,'gll_ft1_tr_bn090217206_v00_filt_expomap.fit')
ltcube = os.path.join(datadir,'gll_ft1_tr_bn090217206_v00_filt_ltcube.fit')
xmlModel = os.path.join(datadir,"bn090217206_LAT_xmlmodel.xml")
irf = "P7REP_TRANSIENT_V15"

fl = FermiLATUnbinnedLike("LAT_p7rep_transient",eventFile,ft2File,expomap,ltcube,xmlModel,irf)

Setup of GBM data

Here we create three instances of the plugin that handles GBM data, one for each of three GBM detectors (NaI6, NaI9, BGO1). We also select the channels (or energies) we want to use, with setActiveMeasurements():

In [5]:
obsSpectrum = os.path.join(datadir,"bn090217206_n6_srcspectra.pha{1}")
bakSpectrum = os.path.join(datadir,"bn090217206_n6_bkgspectra.bak{1}")
rspFile = os.path.join(datadir,"bn090217206_n6_weightedrsp.rsp{1}")
#Plugin instance
NaI6 = FermiGBMLike("NaI6",obsSpectrum,bakSpectrum,rspFile)
#Choose channels to use
NaI6.setActiveMeasurements("10.0-30.0 40.0-950.0")


obsSpectrum = os.path.join(datadir,"bn090217206_n9_srcspectra.pha{1}")
bakSpectrum = os.path.join(datadir,"bn090217206_n9_bkgspectra.bak{1}")
rspFile = os.path.join(datadir,"bn090217206_n9_weightedrsp.rsp{1}")
#Plugin instance
NaI9 = FermiGBMLike("NaI9",obsSpectrum,bakSpectrum,rspFile)
#Choose channels to use
NaI9.setActiveMeasurements("10.0-30.0 40.0-950.0")


obsSpectrum = os.path.join(datadir,"bn090217206_b1_srcspectra.pha{1}")
bakSpectrum = os.path.join(datadir,"bn090217206_b1_bkgspectra.bak{1}")
rspFile = os.path.join(datadir,"bn090217206_b1_weightedrsp.rsp{1}")
#Plugin instance
BGO1 = FermiGBMLike("BGO1",obsSpectrum,bakSpectrum,rspFile)
#Choose channels to use
BGO1.setActiveMeasurements("2-126")
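
The three nearly identical blocks above could be condensed with a small helper. This is a sketch, not part of the 3ML API: gbm_files is a hypothetical function that reproduces the file-naming scheme used in this example, and the loop over detectors is shown as comments because it needs the FermiGBMLike plugin:

```python
import os

def gbm_files(datadir, trigger, detector):
    """Return the (observed, background, response) file triplet for one
    GBM detector, following the naming scheme of the files used above."""
    obs = os.path.join(datadir, "%s_%s_srcspectra.pha{1}" % (trigger, detector))
    bak = os.path.join(datadir, "%s_%s_bkgspectra.bak{1}" % (trigger, detector))
    rsp = os.path.join(datadir, "%s_%s_weightedrsp.rsp{1}" % (trigger, detector))
    return obs, bak, rsp

# Hypothetical loop replacing the three blocks above:
# for name, det, channels in [("NaI6", "n6", "10.0-30.0 40.0-950.0"),
#                             ("NaI9", "n9", "10.0-30.0 40.0-950.0"),
#                             ("BGO1", "b1", "2-126")]:
#     obs, bak, rsp = gbm_files(datadir, "bn090217206", det)
#     plugin = FermiGBMLike(name, obs, bak, rsp)
#     plugin.setActiveMeasurements(channels)
```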

Create a dataList object to collect all our data

In [6]:
datalist = dataList(fl,NaI6,NaI9,BGO1)

Define our model

Here we define our spectral model and the initial values for its parameters. Each value can be either a plain number, or a full specification as a list [value, initial delta, hard minimum, soft minimum, soft maximum, hard maximum]. For further details, see the description of the newpar command in the Xspec manual.

In [7]:
mod = ModelManager(triggerName,ra,dec,"grbm",datalist)
mod.setParameterValue("grbm.alpha","-1.01482")
mod.setParameterValue("grbm.beta","-2.21990")
mod.setParameterValue("grbm.tem","526.247")
mod.setParameterValue("grbm.norm",[0.0173822,0.001,1e-12,1e-12,1e5,1e5])
print mod

Model:                                   grbm      

Parameters:
-----------
grbm.alpha                               -1.01482   
grbm.beta                                -2.2199    
grbm.tem                                 526.247    
grbm.norm                                0.0173822  

Nuisance parameters:
--------------------
IsotropicTemplate-Normalization          1.0        (in dataset LAT_p7rep_transient)
bn090217206-Normalization                1.0        (in dataset LAT_p7rep_transient)

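
For reference, the six-element form used for grbm.norm above unpacks as follows (a plain-Python illustration, not 3ML code):

```python
# [value, initial delta, hard min, soft min, soft max, hard max]
spec = [0.0173822, 0.001, 1e-12, 1e-12, 1e5, 1e5]
value, delta, hardMin, softMin, softMax, hardMax = spec

# The limits must bracket the value, with the soft limits inside the hard ones
assert hardMin <= softMin <= value <= softMax <= hardMax
```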

Create a JointLikelihood object

In [8]:
jl = JointLikelihood(mod)

Profile likelihood maximization

If you want to use the profile likelihood, just perform the fit:

In [9]:
(p_mle, p_1SigmaErr, logLikeMin) = jl.fit()
Minimum of -logLikelihood is: 1534.09748191
Contributions to the -logLikelihood at the minimum:
LAT_p7rep_transient : 84.8253976923
NaI6                : 512.863149834
NaI9                : 461.247646696
BGO1                : 475.161287692

Values for the parameters at the minimum are:
grbm.alpha           = -0.805 +/- 0.0452
grbm.beta            =  -2.56 +/- 0.0654
grbm.tem             =    493 +/-   60.6
grbm.norm            = 0.0181 +/- 0.000745

Full Bayesian analysis

  • Define your model and its initial values (as before):
In [10]:
mod = ModelManager(triggerName,ra,dec,"grbm",datalist)
mod.setParameterValue("grbm.alpha","-1.01482")
mod.setParameterValue("grbm.beta","-2.21990")
mod.setParameterValue("grbm.tem","526.247")
mod.setParameterValue("grbm.norm",[0.0173822,0.001,1e-12,1e-12,1e5,1e5])
  • Define the priors for all your parameters:
In [11]:
#Define priors for model parameters
mod.setParameterPrior("grbm.alpha",Priors.UniformPrior(-10,10))
mod.setParameterPrior("grbm.beta",Priors.UniformPrior(-10,10))
mod.setParameterPrior("grbm.tem",Priors.UniformPrior(10,1e4))

#Use a log-uniform (Jeffreys) prior for the normalization
mod.setParameterPrior("grbm.norm",Priors.LogUniformPrior(1e-3,1e6))

#Define priors for the nuisance parameters
mod.setNuisanceParameterPrior("IsotropicTemplate-Normalization",Priors.UniformPrior(0.85,1.15))
mod.setNuisanceParameterPrior("bn090217206-Normalization",Priors.UniformPrior(0.85,1.15))
mod.setNuisanceParameterValue("IsotropicTemplate-Normalization",1.0)

print mod

Model:                                   grbm      

Parameters:
-----------
grbm.alpha                               -1.01482   
grbm.beta                                -2.2199    
grbm.tem                                 526.247    
grbm.norm                                0.0173822  

Nuisance parameters:
--------------------
IsotropicTemplate-Normalization          1.0        (in dataset LAT_p7rep_transient)
bn090217206-Normalization                1.0        (in dataset LAT_p7rep_transient)

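A log-uniform prior assigns equal probability mass to each decade, which makes it a natural choice for a positive normalization whose order of magnitude is unknown. A minimal sketch of such a density (an illustration, not the 3ML Priors implementation):

```python
import math

def log_uniform_pdf(x, lo, hi):
    """Density proportional to 1/x on [lo, hi], normalized to integrate to 1."""
    if x < lo or x > hi:
        return 0.0
    return 1.0 / (x * math.log(hi / lo))

# With bounds 1e-3..1e6 there are 9 decades, so each decade carries mass 1/9
```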

  • Create a JointLikelihood object:
In [12]:
jl = JointLikelihood(mod)
  • Run the Markov Chain Monte Carlo (MCMC) and explore the parameter space (this will take some time, depending on the number of samples):
In [13]:
nwalkers = 100
samplesPerWalker = 3000
jl.explore(nwalkers,samplesPerWalker)
Using default burn of nsamples/10 = 300
Performing profile-likelihood minimization to get init values...
Minimum of -logLikelihood is: 1534.09748191
Contributions to the -logLikelihood at the minimum:
LAT_p7rep_transient : 84.8253976923
NaI6                : 512.863149834
NaI9                : 461.247646696
BGO1                : 475.161287692

Values for the parameters at the minimum are:
grbm.alpha           = -0.805 +/- 0.0452
grbm.beta            =  -2.56 +/- 0.0654
grbm.tem             =    493 +/-   60.6
grbm.norm            = 0.0181 +/- 0.000745

Now sampling posterior distribution with MCMC...
done

Getting some useful information from the Bayesian analysis

Now that we have our MCMC sample of parameters, we can extract a lot of useful information. We can start with the usual median values and the credibility intervals (the Bayesian equivalent of confidence intervals). This is done with the getPercentiles method, which by default uses 1-sigma intervals (i.e., the 16th and 84th percentiles):

In [19]:
percentiles = jl.getPercentiles()
Percentiles: [50, 16, 84]
grbm.alpha                               = -0.8065 -0.02452 0.02425
grbm.beta                                = -2.565 -0.04893 0.04425
grbm.tem                                 = 495.2 -30.8 34.57
grbm.norm                                = 0.01804 -0.0004231 0.000438
IsotropicTemplate-Normalization          = 1.001 -0.1006 0.09882
bn090217206-Normalization                = 1.002 -0.1031 0.1002
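
The same summary can be computed directly from an MCMC sample array with NumPy; here on a synthetic chain for a single parameter (the mock values are assumptions for illustration, not the fit results above):

```python
import numpy as np

# Mock posterior samples for one parameter (e.g. something like grbm.alpha)
rng = np.random.RandomState(0)
samples = rng.normal(loc=-0.8, scale=0.025, size=10000)

# Median with asymmetric 1-sigma credibility interval
median, low, high = np.percentile(samples, [50, 16, 84])
# Report as: median, minus (median - low), plus (high - median)
```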

If we want instead the 90% interval, we can request the appropriate percentiles:

In [20]:
percentiles = jl.getPercentiles(levels=[50,5,95])
Percentiles: [50, 5, 95]
grbm.alpha                               = -0.8065 -0.03983 0.04029
grbm.beta                                = -2.565 -0.08377 0.07096
grbm.tem                                 = 495.2 -49.69 59.87
grbm.norm                                = 0.01804 -0.0006905 0.0007352
IsotropicTemplate-Normalization          = 1.001 -0.135 0.1331
bn090217206-Normalization                = 1.002 -0.1369 0.1333

We can also easily produce a triangle plot showing all the one-dimensional and two-dimensional marginal distributions, the latter being the equivalent of contour plots in a frequentist analysis:

In [21]:
import triangle
fig = triangle.corner(jl.samples,
                      labels=[r'$\alpha$',r'$\beta$','tem',r'A$_{Band}$','N','N1'],
                      quantiles=[0.16, 0.5, 0.84],
                      show_titles=True, label_args={"fontsize": 16})
Quantiles:
[(0.16, -0.8309711910484926), (0.5, -0.80645116728362176), (0.84, -0.78219738645098524)]
Quantiles:
[(0.16, -2.6138930644236442), (0.5, -2.5649595866576336), (0.84, -2.5207103814777243)]
Quantiles:
[(0.16, 464.43799327633155), (0.5, 495.23732372667916), (0.84, 529.80622952088288)]
Quantiles:
[(0.16, 0.017613188321541354), (0.5, 0.01803627667428416), (0.84, 0.018474302418454228)]
Quantiles:
[(0.16, 0.90008240897250791), (0.5, 1.0006364836552399), (0.84, 1.0994604190289932)]
Quantiles:
[(0.16, 0.89870680583665818), (0.5, 1.0018255769768944), (0.84, 1.1020282739230052)]

By using the MCMC samples we can also plot the best-fit model along with its "1-sigma contour":

In [24]:
bestFitParam = map(lambda x:x[0],percentiles.values())
nufnu = mod.plotWithChain(bestFitParam,jl.samples,kind="nufnu",ymin=1e-19,emin=1e-3,emax=1e12,nbins=1000)